Wireless steerable vision for live insects and insect-scale robots
Vision serves as an essential sensory input for insects but consumes substantial energy resources. The cost of supporting sensitive photoreceptors has led many insects to develop high visual acuity in only small retinal regions and to evolve the ability to move their visual systems independently of their bodies through head motion.
By understanding the trade-offs made by insect vision systems in nature, we can design better vision systems for insect-scale robotics in a way that balances energy, computation, and mass. Here, we report a fully wireless, power-autonomous, mechanically steerable vision system that imitates head motion in a form factor small enough to mount on the back of a live beetle or a similarly sized terrestrial robot.
Our electronics and actuator weigh 248 milligrams and can steer the camera over 60° based on commands from a smartphone. The camera streams “first person” 160-by-120-pixel monochrome video at 1 to 5 frames per second (fps) to a Bluetooth radio from up to 120 meters away.
We mounted this vision system on two species of freely walking live beetles, demonstrating that triggering image capture using an onboard accelerometer achieves operational times of up to 6 hours with a 10-milliamp-hour battery.
We also built a small, terrestrial robot (1.6 centimeters by 2 centimeters) that can move at up to 3.5 centimeters per second, support vision, and operate for 63 to 260 minutes.
Our results demonstrate that steerable vision can enable object tracking and wide-angle views for 26 to 84 times lower energy than moving the whole robot.
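The accelerometer-based trigger mentioned above, which captures frames only when the beetle is moving, can be sketched in a few lines. This is a minimal illustrative model, not the authors' firmware: the threshold value, window layout, and function names are all assumptions.

```python
# Sketch of accelerometer-gated frame capture: skip frames while the
# insect is at rest to extend battery life. All values are illustrative.

MOTION_THRESHOLD_G = 0.05  # assumed variation (in g) that indicates walking

def should_capture(accel_window, threshold=MOTION_THRESHOLD_G):
    """Return True if the spread of accelerometer magnitudes in this
    window is large enough to suggest the insect is moving."""
    if len(accel_window) < 2:
        return False
    return max(accel_window) - min(accel_window) > threshold

def gate_frames(accel_windows):
    """Given one accelerometer sample window per candidate frame,
    return the indices of frames that would actually be captured."""
    return [i for i, w in enumerate(accel_windows) if should_capture(w)]

# Example: three windows; only the middle one shows motion.
windows = [
    [1.00, 1.00, 1.01],  # resting (~1 g gravity, little variation)
    [0.90, 1.10, 1.05],  # walking: larger variation
    [1.00, 1.00, 1.00],  # resting
]
print(gate_frames(windows))  # → [1]
```

Gating capture this way means the camera, image sensor readout, and radio stay idle during the long stretches when the insect is stationary, which is what stretches a 10-milliamp-hour battery to multi-hour operation.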
More: https://robotics.sciencemag.org/content/5/44/eabb0839
Scientific publication: https://www.sciencemag.org/about/science-licenses-journal-article-reuse
I am a final-year PhD student in Electrical and Computer Engineering at the University of Washington, where I work in the Network and Mobile Systems Lab with Shyam Gollakota. I also work closely with Sawyer Fuller, who runs the Autonomous Insect Robotics Lab. My research focuses on wireless technologies such as communication, power, and localization for a variety of resource-constrained platforms, including low-power sensors and insect-scale robots. Recently, I have focused on developing bio-integrative systems such as cameras and sensors small enough to ride on the backs of live insects like bumblebees and beetles. I am also part of the Urban Innovation Initiative at Microsoft Research, working on Project Eclipse, a low-cost, cloud-connected air quality monitoring platform for cities.
Before coming to UW, I earned my Bachelor's in Electrical Engineering and Computer Sciences at UC Berkeley, where I worked on a chip-scale flow cytometer with Bernhard Boser.
I will be applying for faculty positions this year. I expect to graduate in spring 2021.