Have you ever wondered how insects can travel far from their habitat and still find their way? The answer to this question has implications not only for biology, but also for creating AI for small, autonomous robots.
Inspired by the biological finding that ants combine visual recognition of their surroundings with step counting to find their way safely back to the nest, drone researchers at Delft University of Technology have developed an autonomous navigation strategy for small, lightweight, insect-inspired robots.
This strategy allows the robot to return home after long trajectories with extremely low computational and memory requirements (0.65 kilobytes per 100 meters). In the future, such small autonomous robots could be used for a wide range of applications, from monitoring inventory in warehouses to detecting gas leaks in industrial sites.
The results were published in Science Robotics on July 17, 2024.
The case for tiny robots
Small robots weighing just a few tens to a few hundred grams have interesting potential applications in the real world. Because they are so light, they are extremely safe even if they accidentally bump into someone.
Because they are small, they can maneuver in tight spaces. And if they can be manufactured cheaply, they can be deployed in large numbers to quickly cover a wide area, such as a greenhouse, detecting pests and diseases at an early stage.
However, making such small robots operate on their own is difficult, since they have far more limited resources than larger robots. A major obstacle is that the robot must be able to navigate by itself. For this, it can receive assistance from external infrastructure: outdoors it can use position estimates from GPS satellites, and indoors it can use radio beacons.
However, relying on such infrastructure is often undesirable: GPS is unavailable indoors and can be highly inaccurate in cluttered environments such as urban canyons, and installing and maintaining beacons in indoor spaces is either prohibitively expensive or simply impossible, such as in search and rescue scenarios.
The AI required for autonomous navigation using only on-board resources was developed with large robots such as self-driving cars in mind. Some approaches rely on heavy, power-hungry sensors such as LiDAR laser rangers that cannot be carried or powered by small robots.
Other approaches use vision, a very power-efficient sensor that provides a wealth of information about the environment. But these approaches typically try to create a highly detailed 3D map of the environment, which requires processing power and memory that can only be provided by computers too large and power-hungry for small robots.
Step counting and visual breadcrumbs
For this reason, some researchers are looking to nature for inspiration. Insects are particularly interesting because they operate over distances that could be relevant for many real-world applications while using very limited sensor and computing resources.
Biologists have an increasingly good understanding of how insects do this. Specifically, insects combine tracking their own movements (called "odometry") with visually guided behavior (called "view memory") based on a low-resolution but nearly omnidirectional visual system.
Although odometry is increasingly understood down to the neuronal level, the exact mechanisms underlying view memory remain unclear.
One of the earliest theories of how this works is the "snapshot" model, which proposes that insects such as ants take occasional snapshots of their surrounding environment.
Later, when close to a snapshot location, the insect can compare its current view with the snapshot and move so as to minimize the difference. This allows the insect to navigate, or "home," to the snapshot location, eliminating the drift that inevitably builds up when relying on odometry alone.
“Snapshot-based navigation can be compared to how Hansel and Gretel tried to avoid getting lost in the fairy tale. When Hansel dropped stones on the ground, he could find his way back home. But when he instead dropped bread crumbs, which the birds ate, Hansel and Gretel got lost. In our case, the stones are the snapshots,” says Tom van Dijk, lead author of the study.
“Just like with the stones, for a snapshot to work, the robot needs to be close enough to the snapshot location. If the robot is too far away and its current view differs too much from the one in the snapshot, it may move in the wrong direction and never get back. So you need to use enough snapshots, or, in Hansel’s case, drop enough stones.
“On the other hand, if the stones are placed too close together, Hansel will quickly run out of stones. For a robot, using too many snapshots leads to excessive memory consumption. Previous work in this area has typically placed snapshots very close together, so that the robot could first visually home to one snapshot and then move on to the next.”
“The main insight underlying our strategy is that if a robot travels between snapshots based on odometry, the interval between snapshots can be much greater,” said Guido de Croon, professor of Bio-Inspired Drones and co-author of the paper.
“Homing works as long as the robot gets close enough to the snapshot location — that is, as long as the robot’s odometry drift falls within the snapshot’s ‘catchment area.’ And because the robot flies much slower when homing to a snapshot than it would if it were flying from one snapshot to the next based on odometry, the robot can travel farther.”
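The interplay the researchers describe, odometry drifting between snapshots and visual homing correcting that drift, can be sketched in a toy 2D simulation. This is not the authors' implementation; the `view_at` signature (landmark bearings and distances standing in for an omnidirectional image) and the greedy `home_to_snapshot` descent are illustrative assumptions:

```python
import numpy as np

def view_at(pos, landmarks):
    # Toy "omnidirectional view": bearings and distances to fixed landmarks.
    diffs = landmarks - pos
    return np.concatenate([np.arctan2(diffs[:, 1], diffs[:, 0]),
                           np.linalg.norm(diffs, axis=1)])

def home_to_snapshot(pos, snapshot, landmarks, step=0.05, iters=200):
    # Greedy descent: try small moves in 8 directions and keep the one that
    # makes the current view most similar to the stored snapshot.
    for _ in range(iters):
        candidates = [pos + step * np.array([np.cos(a), np.sin(a)])
                      for a in np.linspace(0, 2 * np.pi, 8, endpoint=False)]
        errs = [np.linalg.norm(view_at(c, landmarks) - snapshot)
                for c in candidates]
        best = candidates[int(np.argmin(errs))]
        if min(errs) >= np.linalg.norm(view_at(pos, landmarks) - snapshot):
            break  # no move improves the match; assume we are "home"
        pos = best
    return pos

landmarks = np.array([[3.0, 0.0], [0.0, 4.0], [-2.0, -1.0]])
goal = np.array([0.0, 0.0])
snapshot = view_at(goal, landmarks)   # taken when leaving the goal location

# Odometry drift has left the robot some distance from the snapshot location,
# but still inside its "catchment area":
drifted = goal + np.array([0.3, -0.25])
homed = home_to_snapshot(drifted, snapshot, landmarks)
print(np.linalg.norm(homed - goal))   # much smaller than the initial drift
```

The sketch only captures the core idea: homing succeeds whenever the drifted position still lies in the snapshot's catchment area, which is what lets the real system space snapshots far apart and keep memory use so low.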
This insect-inspired navigation strategy enables the 56-gram Crazyflie drone, equipped with an omnidirectional camera, to cover distances of up to 100 meters using just 0.65 kilobytes. All the vision processing is done by tiny computers called "microcontrollers," which are found in many inexpensive electronic devices.
Putting tiny robots to work
“The proposed insect-inspired navigation strategy is an important step toward applying tiny autonomous robots in the real world,” says Guido de Croon.
“The capabilities of the proposed strategy are more limited than those offered by state-of-the-art navigation methods: it does not generate a map and can only guide the robot back to its starting point.”
“Still, this may be more than enough for many applications. For inventory tracking in a warehouse or crop monitoring in a greenhouse, for example, drones could fly out, collect data, and then return to a base station. Mission-relevant images could be stored on a small SD card and post-processed on a server afterward; they are not needed for the navigation itself.”
For more information:
Tom van Dijk et al., "Visual route following for tiny autonomous robots," Science Robotics (2024). DOI: 10.1126/scirobotics.adk0310. www.science.org/doi/10.1126/scirobotics.adk0310
Provided by
Delft University of Technology
Citation: Researchers develop insect-inspired autonomous navigation strategy for tiny, lightweight robots (July 17, 2024) Retrieved July 18, 2024 from https://techxplore.com/news/2024-07-insect-autonomous-strategy-tiny-lightweight.html
This document is subject to copyright. It may not be reproduced without written permission, except for fair dealing for the purposes of personal study or research. The content is provided for informational purposes only.