
In an era of rapidly evolving artificial intelligence (AI), the integration of AI into agriculture is taking center stage. Among the latest innovations, Ecorobotix, a solar-powered, GPS-guided unit, glides elegantly through crop fields, targeting and eradicating weeds with an impressive 95% precision and effectively reducing waste. Meanwhile, Energid and Universal Robots are radically changing citrus harvesting with a combination of flexible robotic cameras and arms. LettuceBot scans crop geometry to optimize growth and minimize pesticide use, distinguishing weeds from crops to prevent overcrowding and disease.
However, a key challenge remains: navigating complex, constantly changing natural environments such as dense forests or tall grass. How can robots effectively remember where they have been, and recognize previously visited places, in visually repetitive surroundings?
The inspiration came from an unlikely source: ants. These tiny creatures have remarkable navigation skills despite their relatively simple sensory and neural systems. Researchers led by Zhu at the Universities of Edinburgh and Sheffield sought to imitate the navigational prowess of ants in a new artificial neural network. This network helps robots recognize and remember routes through complex natural environments, especially in agriculture, where dense vegetation poses a significant challenge.
Ants use a unique neural structure in their brain, known as the "mushroom body," to detect visual patterns and store spatio-temporal memories, which allows them to navigate effectively in visually repetitive environments. Zhu and his team used this biological mechanism as the inspiration for their research.
Their approach involved mounting a bio-inspired event camera on a ground robot to capture visual sequences along routes in natural outdoor environments. To enable route recognition, they developed a spatio-temporal memory neural algorithm that closely mirrors the circuitry of the insect mushroom body.
Crucially, they used neuromorphic computing, which imitates the structure and function of biological neurons, to encode the memory in a spiking neural network running on a low-power neuromorphic computer. The result was a robotic system that could assess visual familiarity in real time from event-camera images, supporting route recognition for visual navigation.
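To make the mushroom-body idea concrete, here is a minimal sketch of a familiarity network of the kind described above: visual input is expanded through fixed sparse random connections into a large layer of "Kenyon cells," and an anti-Hebbian rule depresses the output weights of cells active during learning, so that previously seen views produce a low output (familiar) and novel views a high one. This is an illustrative toy, not the authors' actual implementation; all sizes, the sparsity level, and the learning rule details are assumptions, and it uses a rate-based stand-in rather than a true spiking network.

```python
import numpy as np

rng = np.random.default_rng(0)

N_INPUT = 100    # assumed: visual input channels (e.g., downsampled camera view)
N_KC = 2000      # assumed: Kenyon cells, a large sparse expansion layer
SPARSITY = 0.05  # assumed: fraction of Kenyon cells active per view

# Fixed sparse random projection from input to Kenyon cells
proj = (rng.random((N_KC, N_INPUT)) < 0.1).astype(float)
# Plastic Kenyon-cell -> output weights, all initially 1
w_out = np.ones(N_KC)

def kenyon_code(view):
    """Sparse binary code: only the most strongly driven Kenyon cells fire."""
    drive = proj @ view
    k = int(SPARSITY * N_KC)
    active = np.zeros(N_KC)
    active[np.argsort(drive)[-k:]] = 1.0
    return active

def familiarity(view):
    """Output activity: LOW means familiar (weights were depressed in learning)."""
    return w_out @ kenyon_code(view)

def learn(view):
    """Anti-Hebbian update: silence output weights of the active Kenyon cells."""
    w_out[kenyon_code(view) > 0] = 0.0

# Learn views along a route, then compare a learned view with a novel one
route_views = [rng.random(N_INPUT) for _ in range(20)]
for v in route_views:
    learn(v)

novel_view = rng.random(N_INPUT)
print(familiarity(route_views[0]))  # 0.0: fully familiar
print(familiarity(novel_view))      # > 0: unfamiliar
```

The sparse expansion is what makes this work in visually repetitive settings: even similar views tend to recruit distinct Kenyon-cell subsets, so familiarity is judged per view rather than per coarse feature.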
In rigorous tests across different settings, including grasslands, woodland, and farmland, the ant-inspired neural model proved its effectiveness. It outperformed SeqSLAM, an established route-learning method, when evaluated on repeated runs along the same route or along routes with small lateral offsets. SeqSLAM works by matching image sequences to find similarities between different runs.
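For context on the baseline, the core idea of SeqSLAM can be sketched as follows: build a pairwise difference matrix between a reference run and a query run, then match each query frame not to its single most similar reference frame but to the reference position whose surrounding sequence of frames matches best. This is a simplified illustration under assumed parameters (fixed sequence length, equal-speed trajectories, toy random descriptors), not the full published algorithm, which also includes local contrast normalization and a speed search.

```python
import numpy as np

rng = np.random.default_rng(1)

def seqslam_match(ref, query, ds=5):
    """Match each query frame to a reference frame by comparing whole
    sequences of length ds. ref/query: (n_frames, n_features) arrays."""
    # Pairwise difference matrix: sum of absolute descriptor differences
    diff = np.abs(ref[:, None, :] - query[None, :, :]).sum(axis=2)
    n_ref, n_query = diff.shape
    half = ds // 2
    matches = []
    for q in range(half, n_query - half):
        best_r, best_score = -1, np.inf
        for r in range(half, n_ref - half):
            # Score a straight, same-speed trajectory through the matrix
            score = sum(diff[r + o, q + o] for o in range(-half, half + 1))
            if score < best_score:
                best_r, best_score = r, score
        matches.append((q, best_r))
    return matches

# Toy data: a second run of the same route is the first run plus sensor noise
route = rng.random((30, 16))
second_run = route + 0.05 * rng.standard_normal((30, 16))
for q, r in seqslam_match(route, second_run):
    print(q, r)  # matched reference index equals the query index
```

Matching over sequences rather than single frames is what lets SeqSLAM tolerate individual ambiguous views, which is also why visually repetitive vegetation, where whole stretches look alike, is such a hard case for it.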
The implications of this research extend well beyond robotics. The ant-inspired neural model promises to transform agricultural robotics, making robots more efficient and effective at navigating through dense vegetation. The researchers also suggest that the model's principles could be extended to other sensory modalities, such as olfaction or sound, improving a robot's perception of its environment.
This study represents a significant step forward in harnessing the navigational wisdom of nature to advance our technology. As we continue to draw inspiration from the natural world, AI-driven robotics may find ever more innovative solutions to complex challenges, ultimately benefiting a wide range of industries.
Read the rest of the news on robot navigation here.
