When navigating and planning routes, unmanned systems and vehicles rely on GPS. Unfortunately, GPS is often unreliable in dense urban settings and entirely unavailable in enclosed spaces such as rooms or tunnels. Even a two-meter difference between a drone's actual and calculated location can prevent it from landing precisely on a charging base or placing a package in the correct spot.

Local navigation can remedy these complications. For instance, the vehicle can use its onboard camera to scan QR codes placed along the landing area. However, this approach, too, has its drawbacks: it requires proper lighting and a great number of QR codes of various sizes that can be detected from different distances. Another local navigation approach involves radio beacons and ultrasound: using its receivers, the drone detects sources of radio waves or sound and thus locates itself. However, radio waves reflect off walls, while ultrasound is drowned out by the sound of the drone's blades, which hinders the calculation of coordinates.

Scientists from ITMO’s Faculty of Secure Information Technologies have developed VOSTOK, a local navigation system that helps drones navigate, land, and take off in areas without GPS access. Unlike its analogs, VOSTOK requires no camera, radar, or lidar, and the system can be assembled from cheap, easily accessible components. The new solution is also faster and more accurate: coordinates are calculated to within four millimeters, several thousand times per second, surpassing real-time kinematic positioning (GPS RTK), while speed is tracked ten times per second to within two centimeters per second, matching the accuracy of GPS. Thanks to these parameters, a drone can land on or launch from a moving vehicle even in dense fog.

“Our method is based on the classic principles of radio navigation and radio direction finding (RDF) and works not unlike talking to two people standing in one place, each holding a megaphone. If you want to hear one of them, you have to face their megaphone; and if you want to hear both, you need to put yourself between the two. In VOSTOK, the role of the megaphones is played by two LEDs placed at an angle to each other. Each module has two pairs of LEDs. They shine on the drone’s light detectors with signals modulated at different frequencies. The light detectors receive the signals while a special module amplifies and processes them, calculating the drone’s coordinates and sharing the data with the onboard computer. In real time, we can see the drone’s exact location, as well as any changes in its altitude or angle of tilt,” explains Timofey Melnikov, the head of the project and a first-year PhD student at ITMO’s Faculty of Secure Information Technologies.
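The amplitude-ratio principle Melnikov describes can be sketched in code. The following is a minimal, hypothetical model, not the actual VOSTOK firmware: it assumes each LED has a cosine radiation lobe, separates the two modulation frequencies with a simple lock-in detector, and recovers the bearing from the ratio of the two recovered amplitudes. All frequencies, geometry, and names here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of one bearing measurement; all parameters are assumed.
FS = 10_000                   # photodetector sample rate, Hz
F1, F2 = 1_000, 1_500         # modulation frequencies of the two LEDs, Hz
HALF_ANGLE = np.radians(30)   # half the angle between the two LED axes

def lockin_amplitude(signal, freq, fs):
    """Recover the amplitude of one modulated carrier via lock-in detection."""
    t = np.arange(len(signal)) / fs
    return 2 * np.abs(np.mean(signal * np.exp(-2j * np.pi * freq * t)))

def bearing_from_amplitudes(a1, a2, half_angle):
    """Solve a1/a2 = cos(theta + half_angle) / cos(theta - half_angle)
    for the bearing theta (cosine-lobe LED model)."""
    r = a1 / a2
    return np.arctan((1 - r) / (1 + r) / np.tan(half_angle))

# Simulate a 0.1 s detector reading for a true bearing of 10 degrees:
# each LED contributes a carrier scaled by its lobe in the drone's direction.
theta_true = np.radians(10)
t = np.arange(int(0.1 * FS)) / FS
signal = (np.cos(theta_true + HALF_ANGLE) * np.sin(2 * np.pi * F1 * t)
          + np.cos(theta_true - HALF_ANGLE) * np.sin(2 * np.pi * F2 * t))

a1 = lockin_amplitude(signal, F1, FS)
a2 = lockin_amplitude(signal, F2, FS)
theta_est = bearing_from_amplitudes(a1, a2, HALF_ANGLE)
print(round(np.degrees(theta_est), 2))  # → 10.0
```

Because the two carriers sit at different frequencies, the lock-in step isolates each LED's contribution even though both arrive at the same detector, which is what lets a single photodiode stand in for "facing" one megaphone or the other.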

VOSTOK can have various applications. For instance, if used along the Northern Sea Route, the system can ensure automatic landing of unmanned helicopters onto moving icebreakers. It can also automate the routes of driverless trucks in quarries, where dust hinders navigation, and enable drone deliveries of groceries to the countryside.

Currently, the signal from two pairs of modules covers an 8x8-meter room with ceilings up to 3 meters high, but the system can easily be scaled up by adding modules, increasing their power, or changing their configuration.

In the future, the developers are planning to test the system inside a sports hall and to configure it so that vehicles can position themselves relative to one another, and not just within a fixed space. They hope that their solution will become part of a robodrone arena, a playground where developers can collect data and polish their drone control algorithms.

“Using VOSTOK, we are planning to create a 3D autopilot that will be able to fly a drone as well as a human pilot. Typically, to create such an autopilot, you would need to record several hours of flight, process camera data, and use it to train an AI model. With our system, such a dataset can be collected from the coordinates received by the drone in flight. Thus, the AI model will be more precise, as we collect more data per second than other systems, while also being significantly cheaper because you don’t need a motion capture system. With a 3D autopilot, it will be possible to reduce the time required for routine tasks, such as monitoring agricultural land or assessing bridges, tunnels, and similar structures, as the drone will be able to turn and avoid obstacles at higher speeds,” explained Andrey Boyko, the project’s research consultant and a senior researcher at the Faculty of Secure Information Technologies.

Calculation system demonstration. Andrey Boyko is holding a mobile module with three photo receivers; data from the module is passed to a computer. The screen on the left shows the raw received signal amplitudes, while the one on the right shows three spiral shapes corresponding to the movement of the three receivers on the mobile module. Video by Dmitry Grigoryev / ITMO.NEWS