The world of science buzzed in 2012 when Harvard researchers announced they had created a robot insect. Smaller than a penny, lighter than a paper clip, the RoboBee could fly and land. A year later, it could follow a preprogrammed path. More recently, it became capable of swimming underwater.
Despite these advancements, the diminutive drones are years away from pollinating crop fields, searching collapsed buildings or performing the numerous other tasks that researchers envision swarms of the tiny machines doing. This is mainly because the RoboBee cannot sense the size, shape or distance of approaching objects. In other words, it can’t see.
To solve the problem, a UB-led team of researchers, supported by a $1.1 million grant from the National Science Foundation, is testing ways to shrink lidar—the laser-based sensor system currently used in driverless cars.
“Essentially, it’s the same technology that automakers are using to ensure that driverless cars don’t crash into things,” says Karthik Dantu, a computer science professor in UB’s School of Engineering and Applied Sciences, who is leading the project. “Only we need to shrink that technology so it works on robot bees that are no bigger than your fingertip.”
Developed in the 1960s, lidar works like radar, except that it emits invisible laser beams instead of microwaves. Mounted on a car, the system sends out pulses of laser light that bounce off distant objects. Sensors then measure the time it takes the reflected light to return and use it to calculate the distance and shape of those objects. Computer algorithms analyze this information to form an image of the car’s path, which enables the car to “see” its environment, follow traffic signals, avoid obstacles and make other adjustments.
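The underlying calculation is surprisingly simple. The sketch below, a minimal Python illustration of the time-of-flight principle (the function name and example timing are illustrative, not drawn from any particular lidar product or the RoboBee project), shows how a pulse’s round-trip time becomes a distance.

```python
# Minimal sketch of the time-of-flight principle behind lidar.
# Names and example values are illustrative only.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_return_time(round_trip_seconds: float) -> float:
    """Distance to an object from a laser pulse's round-trip time.

    The pulse travels to the object and back, so the one-way
    distance is half the total path the light covers.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 200 nanoseconds indicates an object
# roughly 30 meters away.
print(distance_from_return_time(200e-9))  # ~29.98 meters
```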
These systems, which are typically mounted on the car roof, are about the size of a camping lantern. The team Dantu is leading, which includes researchers from Harvard and the University of Florida, wants to make a much smaller version, called microlidar.
“The smallest commercial lidar systems weigh just over 800 grams, or nearly two pounds, but the robot bees are just 80 milligrams. To make this work, we need to shrink the entire sensing system,” says Dantu, who worked on the RoboBee project as a postdoctoral researcher at Harvard before joining UB in 2013.
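A quick back-of-the-envelope calculation makes that gap concrete. This snippet simply restates the figures from Dantu’s quote in consistent units; nothing about the project itself is assumed.

```python
# The scale gap in Dantu's figures, worked out directly.
lidar_mg = 800 * 1000   # 800 grams, converted to milligrams
robobee_mg = 80         # the RoboBee's total mass, in milligrams

print(lidar_mg / robobee_mg)  # 10000.0: the sensor outweighs the robot 10,000 to 1
```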
Over the next three years, the Florida researchers will develop tiny, lightweight sensors that detect the reflected laser light. Meanwhile, Dantu will create algorithms that enable the bees to process the sensor data and map the world around them. Harvard researchers will then incorporate the technology into the bees.
The technology is years, if not decades, away from commercial viability. For example, the bees currently do not fly solo; they are tethered by a wire to an external power source.
But the potential applications are inspiring. For example, swarms of robot bees could be deployed to monitor air quality or the health of crops. They could be sent out to conduct search missions during landslides and other disasters, or examine buildings damaged by earthquakes.
Microlidar could be used to improve endoscopic tools, the wand-like probes that doctors use in surgery to visualize internal organs, and wearable technology, including sensors that monitor our bodies for signs of illness or stress. It could also help people use mobile devices in a way similar to Microsoft Kinect, which lets users interact with computers through hand and body gestures.
“The technology could have so many uses, which have the potential to help so many people in need,” says Dantu. “That’s what really excites me about the work we’re doing.”