
In a recent study published in Science Robotics, TU Delft researchers drew inspiration from ants to develop an autonomous navigation strategy for tiny, lightweight robots. This approach allows the robots to find their way back home after long journeys while requiring minimal computation and memory: only 0.65 kilobytes per 100 meters travelled.
Scientists have long marveled at the remarkable navigation skills of ants, despite their relatively simple sensory and neural systems. Previous research, such as a study carried out at the Universities of Edinburgh and Sheffield, led to an artificial neural network that helps robots recognize and remember routes through complex natural environments by imitating the navigation prowess of ants.
In the recent study, the researchers focused on tiny robots, weighing from a few tens to a few hundred grams, which hold enormous potential for a wide range of applications. Their light weight keeps them safe even if they accidentally collide with something, and their small size lets them maneuver easily in confined spaces. Moreover, if low-cost production is established, such robots could be deployed in large numbers, quickly covering large areas such as greenhouses to detect pests or plant diseases early.
However, enabling these tiny robots to operate on their own poses significant challenges, because their resources are far more limited than those of larger robots. A major obstacle is autonomous navigation. Robots can rely on external infrastructure such as GPS satellites outdoors or wireless communication beacons indoors, but depending on such infrastructure is often undesirable. GPS signals are unavailable indoors and can be inaccurate in cluttered environments such as urban areas. Installing and maintaining beacons can be expensive or simply impractical, especially in search-and-rescue scenarios.
To overcome these challenges, the researchers turned to nature. Insects, ants in particular, operate over distances relevant to many real-world applications while using minimal sensing and computing resources. They combine odometry (keeping track of their own motion) with visually guided behaviors based on their low-resolution but omnidirectional visual system (view memory). This combination inspired the researchers to develop a new navigation strategy.
One theory of insect navigation, the "snapshot" model, suggests that insects occasionally take snapshots of their environment. Later, they compare their current visual perception to these snapshots to find their way home, correcting any drift that builds up when relying on odometry alone. The researchers' key insight was that the snapshots could be spaced much further apart if the robot travelled between them based on odometry. Guido de Croon, professor of bio-inspired drones and co-author of the study, explained that homing works as long as the robot ends up close enough to the snapshot location, that is, as long as the robot's odometry drift stays within the snapshot's "catchment area". This also allows the robot to travel much further, because the robot flies much slower when homing to a snapshot than when flying from one snapshot to the next based on odometry.
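To make the idea concrete, the toy simulation below sketches this scheme: a robot flies outward on drifting odometry, stores a coarse "view" at each waypoint, and on the way home retraces the odometry route leg by leg, then "snaps in" on each stored view to cancel the drift accumulated on that leg. This is a minimal illustration only, not the authors' algorithm; the "view" here is just a vector of landmark distances standing in for the drone's real low-resolution omnidirectional images, and all names and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
landmarks = rng.uniform(-80.0, 80.0, size=(30, 2))   # hypothetical environment

def view(pos):
    """Coarse stand-in for an omnidirectional snapshot: distances to landmarks."""
    return np.linalg.norm(landmarks - pos, axis=1)

def visual_homing(pos, snapshot, step=0.3, iters=40):
    """Greedy local search: keep moving in whichever direction makes the
    current view most similar to the stored snapshot (staying put is allowed)."""
    for _ in range(iters):
        candidates = [pos] + [pos + step * np.array([np.cos(a), np.sin(a)])
                              for a in np.linspace(0.0, 2 * np.pi, 8, endpoint=False)]
        errors = [np.sum((view(c) - snapshot) ** 2) for c in candidates]
        pos = candidates[int(np.argmin(errors))]
    return pos

# Outbound leg: fly by odometry, dropping a snapshot at every waypoint.
true_pos = np.zeros(2)               # where the robot actually is
odo_pos = np.zeros(2)                # where odometry thinks it is (drifts)
odo_route, snapshots = [odo_pos.copy()], [view(true_pos)]
for _ in range(6):
    commanded = rng.uniform(-10.0, 10.0, size=2)
    true_pos = true_pos + commanded + rng.normal(0.0, 0.5, size=2)   # unmodelled drift
    odo_pos = odo_pos + commanded
    odo_route.append(odo_pos.copy())
    snapshots.append(view(true_pos))

# Homing leg: retrace the route waypoint by waypoint using odometry, then
# snap in on each stored view, cancelling the drift accumulated on that leg.
for i in range(len(odo_route) - 2, -1, -1):
    commanded = odo_route[i] - odo_pos
    true_pos = true_pos + commanded + rng.normal(0.0, 0.5, size=2)   # drift again
    odo_pos = odo_route[i].copy()
    true_pos = visual_homing(true_pos, snapshots[i])                 # drift corrected
print("final distance from home:", round(float(np.linalg.norm(true_pos)), 2))
```

The behavior the sketch reproduces is that drift never accumulates beyond a single leg: each visual homing step pulls the robot back onto the stored route, so the final distance from home stays on the order of the homing step size instead of growing with the length of the journey.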
The proposed navigation strategy was tested on a 56-gram "Crazyflie" drone equipped with an omnidirectional camera. The drone successfully covered distances of up to 100 meters using only 0.65 kilobytes of memory. All visual processing was handled by a microcontroller, the kind of tiny computer commonly found in cheap electronic devices.
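For a sense of scale, the back-of-the-envelope arithmetic below spreads that budget over a route. The snapshot spacing is an assumption made purely for illustration; the article only reports the 0.65 kB per 100 m total.

```python
# Rough arithmetic on the reported memory figure (assumed spacing, illustration only).
route_length_m = 100
memory_budget_bytes = 0.65 * 1000        # ~650 bytes for the whole 100 m route
snapshot_spacing_m = 10                  # assumption: one snapshot every 10 m
n_snapshots = route_length_m // snapshot_spacing_m
print(f"{n_snapshots} snapshots -> about {memory_budget_bytes / n_snapshots:.0f} bytes each")
# => 10 snapshots -> about 65 bytes each: only a tiny, highly compressed
#    representation of each view can fit, not a full camera image.
```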
According to Guido de Croon, this new insect-inspired navigation strategy is an important step toward applying tiny autonomous robots in the real world. Although the strategy offers more limited functionality than state-of-the-art navigation methods, it may be sufficient for many applications. For example, drones could be used for stock tracking in warehouses or crop monitoring in greenhouses. They could fly out, collect data and return to a base station, storing mission-relevant images on a small SD card for post-processing by a server, without needing those images for navigation itself.
In related research and development, significant progress has also been made on autonomous navigation systems for drones in GPS-denied environments. Our own innovative approach uses advanced AI algorithms, computer vision and on-board sensors to enable drones to navigate and operate effectively without relying on external GPS signals. This technology is particularly useful for indoor environments, urban or rural areas and other challenging settings where traditional GPS navigation fails.
This progress marks a step forward in the deployment of tiny robots and autonomous drones, expanding their potential uses and improving their operational efficiency in real-world scenarios.
