ISR-affiliated Professor Yiannis Aloimonos (CS/UMIACS) and Associate Research Scientist Cornelia Fermüller (UMIACS) imagine the future of insect-scale, active vision sensing systems in a July 15, 2020 Science Robotics commentary.
In “A bug’s eye view,” the well-known computer vision specialists consider the ramifications of research appearing in the same issue of the journal: “Wireless steerable vision for live insects and insect-scale robots,” by V. Iyer, A. Najafi, J. James, S. Fuller, and S. Gollakota. That paper reports a wireless, Bluetooth-enabled, low-power, rotatable vision system that can be mounted on insect-scale robots and live insects. It demonstrates that it is now feasible to equip insect-scale robots with an “active vision” system that allows scene selection, manipulation of the field of view, and the capability to zoom the camera.
Aloimonos and Fermüller note that this system is superior to existing microrobot vision schemes, which rely on fixed parameters.
Because of their size, weight, area, and power constraints, tiny robots currently cannot reconstruct scenes using simultaneous localization and mapping (SLAM) algorithms. What Iyer and colleagues have done is create an alternative: a steerable camera system that can perform a specific set of tasks more efficiently.
Aloimonos and Fermüller imagine a future of microrobots with such steerable cameras that could find holes, successfully fly through them, avoid obstacles, and recognize a variety of objects without computing 3D world models—an alternative to SLAM.
“If the insect robotics community succeeds with this SLAM alternative,” they write, “then the results will permeate all robotics.”
Related Articles:
Autonomous drones based on bees use AI to work together
Chahat Deep Singh named a Future Faculty Fellow
Zampogiannis, Ganguly, Aloimonos and Fermüller author "Vision During Action," chapter in new Springer book
CSRankings places Maryland robotics at #10 in the U.S.
The Falcon and the Flock
Aloimonos, Sandini contribute chapter to MIT Press book, Cognitive Robotics
'OysterNet' + underwater robots will aid in accurate oyster count
New system uses machine learning to detect ripe strawberries and guide harvests
Game-theoretic planning for autonomous vehicles
Which way should I go?