FlightEye relies purely on ADS-B data, so it needs no external internet connection and integrates easily into any aircraft.
FlightEye works out where the planes around you are and places their markers accordingly, giving you more time to focus on what matters most.
FlightEye analyzes each aircraft's previous behavior to give you an accurate prediction of where it will go next, keeping you ahead of the game.
ADS-B data is the backbone of FlightEye, as it makes us aware of the aircraft around us. On average, aircraft broadcast ADS-B data once per second, reporting their position, speed, altitude, and more. Using the Raspberry Pi, we gather this data and convert it to JSON.
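To make the data concrete, here is a minimal sketch that turns one decoded aircraft record into a small typed structure. The field names (`hex`, `flight`, `lat`, `lon`, `alt_baro`, `gs`, `track`) follow common dump1090-style JSON output and are an assumption, not FlightEye's exact schema.

```python
from dataclasses import dataclass

# Field names follow common dump1090-style JSON output; the exact
# schema varies by build, so treat this as an illustrative sketch.
@dataclass
class Aircraft:
    icao: str         # 24-bit ICAO address (hex string)
    callsign: str     # flight / tail identifier
    lat: float        # latitude in degrees
    lon: float        # longitude in degrees
    alt_ft: float     # barometric altitude in feet
    speed_kt: float   # ground speed in knots
    track_deg: float  # track over ground in degrees

def parse_aircraft(record: dict) -> Aircraft:
    """Convert one raw JSON record into a typed Aircraft object."""
    return Aircraft(
        icao=record.get("hex", ""),
        callsign=record.get("flight", "").strip(),
        lat=record.get("lat", 0.0),
        lon=record.get("lon", 0.0),
        alt_ft=record.get("alt_baro", 0.0),
        speed_kt=record.get("gs", 0.0),
        track_deg=record.get("track", 0.0),
    )
```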
Once the ADS-B data has been converted to JSON, FlightEye sends it from the Raspberry Pi to the HoloLens 2 over a private WiFi connection. As long as the Raspberry Pi is powered on and the HoloLens 2 is connected to the Pi's network, data transmission begins automatically.
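As one minimal sketch of this hand-off, the Pi could expose the JSON file over plain HTTP on its own WiFi network and let the HoloLens poll it. HTTP, the directory path, and the port below are placeholder assumptions, not FlightEye's actual transport or configuration.

```python
# Sketch: serve the decoded ADS-B JSON over the Pi's WiFi network.
# HTTP polling is shown only as one simple option; the directory and port
# are placeholders, not FlightEye's actual configuration.
import http.server
import socketserver

JSON_DIR = "/run/adsb"   # hypothetical directory holding aircraft.json
PORT = 8080              # placeholder port

class JsonHandler(http.server.SimpleHTTPRequestHandler):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, directory=JSON_DIR, **kwargs)

if __name__ == "__main__":
    # Bind to all interfaces; on a Pi acting as an access point the
    # HoloLens would reach this at the Pi's WiFi address.
    with socketserver.TCPServer(("0.0.0.0", PORT), JsonHandler) as server:
        print(f"Serving {JSON_DIR} on port {PORT}")
        server.serve_forever()
```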
Our predictive algorithm uses linear-quadratic estimation to approximate the location of each nearby aircraft between ADS-B transmissions. The algorithm is a derivation of the Kalman filter, lightweight enough to keep up with the computational demands of a real-time augmented reality system. When activated, the UI markers move smoothly across the viewport, modeling the instantaneous predicted location of each aircraft until a new ADS-B frame is received.
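A minimal sketch of the idea, assuming a constant-velocity model along a single axis: between ADS-B frames only the predict step runs each rendering tick, and when a new frame arrives the reported position is folded back in. The real filter runs in our Unity code on the HoloLens with its own state, model, and tuning; the noise values below are purely illustrative.

```python
import numpy as np

class ConstantVelocityKF:
    """Minimal 1-D constant-velocity Kalman filter sketch.

    State is [position, velocity]. Between ADS-B frames we only call
    predict() each rendering tick; when a new frame arrives we call
    update() with the reported position. Noise values are illustrative.
    """

    def __init__(self, pos: float, vel: float):
        self.x = np.array([pos, vel], dtype=float)  # state estimate
        self.P = np.eye(2) * 100.0                  # state covariance
        self.Q = np.diag([0.5, 1.0])                # process noise (tuning)
        self.R = np.array([[25.0]])                 # measurement noise (tuning)
        self.H = np.array([[1.0, 0.0]])             # we observe position only

    def predict(self, dt: float) -> float:
        F = np.array([[1.0, dt], [0.0, 1.0]])       # constant-velocity model
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + self.Q
        return self.x[0]                            # predicted position

    def update(self, measured_pos: float) -> None:
        y = measured_pos - (self.H @ self.x)[0]     # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K.flatten() * y
        self.P = (np.eye(2) - K @ self.H) @ self.P
```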
Once all of the aircraft data is updated, we are ready to render the marker for each aircraft within the headset. Our rendering algorithm allows exactly 500 ft of positional error in all directions, which in turn makes closer aircraft appear larger in perspective. Additionally, an aircraft icon displayed on the marker models the aircraft's full six-degree-of-freedom pose in space, allowing the pilot to quickly evaluate each aircraft's orientation. For those who require more precise information, the numerical data for each aircraft is displayed alongside the icon.
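One way to read this geometrically, under assumptions of ours: each aircraft's latitude, longitude, and altitude are converted to east/north/up offsets from the pilot with a flat-earth approximation, and the marker is drawn at true world scale with a 500 ft radius so that perspective alone shrinks distant aircraft. The actual rendering runs in Unity on the HoloLens; this Python sketch only illustrates the math.

```python
import math

FT_PER_M = 3.28084
MARKER_RADIUS_FT = 500.0       # fixed error allowance in every direction
EARTH_RADIUS_M = 6_371_000.0

def enu_offset_ft(pilot_lat, pilot_lon, pilot_alt_ft,
                  ac_lat, ac_lon, ac_alt_ft):
    """Flat-earth approximation of the aircraft's east/north/up offset
    from the pilot, in feet. Accurate enough at typical ADS-B ranges."""
    dlat = math.radians(ac_lat - pilot_lat)
    dlon = math.radians(ac_lon - pilot_lon)
    north_m = dlat * EARTH_RADIUS_M
    east_m = dlon * EARTH_RADIUS_M * math.cos(math.radians(pilot_lat))
    up_ft = ac_alt_ft - pilot_alt_ft
    return east_m * FT_PER_M, north_m * FT_PER_M, up_ft

def marker_transform(pilot_fix, aircraft_fix):
    """Place a true-scale marker: position is the ENU offset, radius is a
    constant 500 ft, so closer aircraft naturally look larger on screen."""
    east, north, up = enu_offset_ft(*pilot_fix, *aircraft_fix)
    distance_ft = math.hypot(east, north, up)
    return {"position_ft": (east, north, up),
            "radius_ft": MARKER_RADIUS_FT,
            "distance_ft": distance_ft}
```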
Using its orientation relative to the pilot, FlightEye shows which direction an aircraft is heading by moving the aircraft icon.
Depending on how far away an aircraft is, the UI increases or decreases the size of its marker to bring attention to closer aircraft.
Airspeed, relative elevation difference, heading, distance, and tail number are all prominently shown for each aircraft.
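As a rough sketch of how the readout above could be derived from the pilot's own state and an aircraft's ADS-B report: the relative bearing drives how the icon is oriented, and the remaining fields are formatted into the marker's label. The label format and helper names here are our illustrative assumptions, not the exact FlightEye UI.

```python
import math

def relative_bearing_deg(pilot_heading_deg: float, bearing_to_ac_deg: float) -> float:
    """Bearing to the aircraft relative to the pilot's nose, in (-180, 180]."""
    return (bearing_to_ac_deg - pilot_heading_deg + 180.0) % 360.0 - 180.0

def marker_label(tail: str, airspeed_kt: float, ac_alt_ft: float,
                 pilot_alt_ft: float, heading_deg: float, distance_nm: float) -> str:
    """Format the per-aircraft readout: tail number, airspeed, relative
    elevation difference, heading, and distance."""
    dalt = ac_alt_ft - pilot_alt_ft
    return (f"{tail}  {airspeed_kt:.0f} kt  "
            f"{dalt:+.0f} ft  hdg {heading_deg:03.0f}  "
            f"{distance_nm:.1f} nm")

# Example: an aircraft 30 degrees right of the nose, 1,200 ft above, 4.2 nm away.
print(relative_bearing_deg(90.0, 120.0))                 # -> 30.0
print(marker_label("N123AB", 145, 6200, 5000, 120, 4.2))
```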
Simply extend your left arm and the FlightEye Menu appears. No need for physical buttons or complex menus; everything is available at once.
To calibrate FlightEye, users must enter their heading at startup: pressing the Calibrate button prompts them to enter their current heading.
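Conceptually, the entered heading establishes an offset between the headset's arbitrary yaw frame and true compass north. A minimal sketch of that correction is below; the real calibration lives in our Unity code, and the function names here are illustrative.

```python
def heading_offset_deg(entered_heading_deg: float, headset_yaw_deg: float) -> float:
    """Offset between true heading and the headset's internal yaw frame,
    captured once when the user presses Calibrate."""
    return (entered_heading_deg - headset_yaw_deg) % 360.0

def world_bearing_to_headset_yaw(bearing_deg: float, offset_deg: float) -> float:
    """Convert a true compass bearing (e.g., bearing to an aircraft) into
    the headset's yaw frame using the stored calibration offset."""
    return (bearing_deg - offset_deg) % 360.0

# Example: the user enters heading 270 while the headset reports yaw 10.
offset = heading_offset_deg(270.0, 10.0)            # 260 degrees
print(world_bearing_to_headset_yaw(300.0, offset))  # aircraft bearing in headset frame: 40 degrees
```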
Using the buttons and sliders, users can adjust the filter range and UI brightness, toggle the UI and the predictive algorithm, and reconnect to or ping the Pi.
Unity is the primary development environment for deploying software on the Microsoft HoloLens. The platform allows us to generate renderings that correspond directly to our geospatial orientation.
FlightEye makes use of two separate components: our custom ADS-B enclosure (code-named ADS-Box) and the Microsoft HoloLens 2. The ADS-B enclosure consists of a 1090MHz antenna, a GPS dongle, and a Raspberry Pi. The enclosure gathers all relevant data about nearby aircraft and the user's position and sends it wirelessly to the HoloLens 2 via a WiFi network hosted on the Raspberry Pi.
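To make the data flow concrete, here is a hedged sketch of the kind of payload the ADS-Box could assemble: the user's GPS fix plus the current aircraft list, combined into one JSON document for the HoloLens. The field names and structure are illustrative assumptions, not our exact wire format.

```python
import json
import time

def build_payload(gps_fix: dict, aircraft: list[dict]) -> str:
    """Combine the user's GPS fix with the current aircraft list into a
    single JSON document for the HoloLens. Field names are an illustrative
    assumption, not FlightEye's exact wire format."""
    payload = {
        "timestamp": time.time(),
        "ownship": {                  # from the GPS dongle
            "lat": gps_fix["lat"],
            "lon": gps_fix["lon"],
            "alt_ft": gps_fix["alt_ft"],
        },
        "aircraft": aircraft,         # decoded ADS-B records
    }
    return json.dumps(payload)

# Example with placeholder values:
print(build_payload({"lat": 40.0, "lon": -105.0, "alt_ft": 5300},
                    [{"hex": "a1b2c3", "lat": 40.1, "lon": -105.1, "alt_baro": 7500}]))
```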
MRTK provides us with the ability to interact with the tracking and sensing capabilities of the Microsoft HoloLens. In our project, MRTK allows us to generate hand menus, utilize eye tracking, and build our project for deployment.
In order to receive and decode ADS-B transmissions, FlightEye uses an industry-standard, open-source program called dump1090. This program runs on the Raspberry Pi, configures the gain of the 1090MHz antenna, and decodes the incoming ADS-B packets. dump1090 then writes the decoded data to a simple JSON file, which is sent to the HoloLens 2 for display.
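As a sketch of how the Pi side could consume that output: common dump1090 forks (e.g. dump1090-fa started with `--write-json <dir>`) periodically rewrite an `aircraft.json` file containing the tracked aircraft. The exact flags and paths depend on the build, so the path below is a placeholder.

```python
# Sketch: poll the aircraft.json file that dump1090 periodically rewrites.
# The "aircraft" key and file layout match common dump1090 forks; the path
# below is a placeholder and varies by installation.
import json
import time
from pathlib import Path

AIRCRAFT_JSON = Path("/run/dump1090/aircraft.json")   # placeholder path

def read_aircraft():
    """Return the list of currently tracked aircraft, or [] if unreadable."""
    try:
        data = json.loads(AIRCRAFT_JSON.read_text())
        return data.get("aircraft", [])
    except (OSError, json.JSONDecodeError):
        return []

if __name__ == "__main__":
    while True:
        tracked = read_aircraft()
        print(f"{len(tracked)} aircraft currently tracked")
        time.sleep(1.0)   # dump1090 rewrites the file roughly once per second
```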
Unity/XR Lead
Hardware/Linux Lead
Research Lead
Software Engineering Lead
Our team members all have backgrounds in VR/AR technology, computer networks, and algorithm development.