
Drones learn to observe their environment and steer clear

July 17, 2014
Late in 2012, maker-tinkerer and technical writer Paul Wallich created a buzz when he wrote about the drone he built to walk his son to the bus stop each morning. The unmanned aerial vehicle (UAV) took the form of a quadrotor (four-rotor) copter, and did the job ... mostly. But Wallich admitted to having trouble getting the UAV to power through windy days and avoid collisions.

Well, drone technology advances by leaps and bounds. Now researchers think they have a new way to make such UAVs and other vehicles more nimble in the face of environmental variables. The design uses neuromorphic sensors (triggered by sudden events) instead of standard inertia-measuring sensors (such as accelerometers and gyroscopes) to track motion.

Even autonomous vehicles with cameras and controls need time to interpret camera data about the environment. Here, state-estimation algorithms first identify image features (usually boundaries between objects, found through differences in shade and color) and then select a subset unlikely to change with new perspectives. A dozen or so msec later, the cameras fire again and the algorithm attempts to match the current features to the previous ones. Once the features are matched, the algorithm calculates the vehicle's change in position. The sampling takes 50 to 250 msec depending on how dramatically the environment changes, and the whole control cycle to correct course takes 0.2 sec or more, not fast enough to react to sudden changes in a vehicle's surroundings.
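To make that bottleneck concrete, here is a minimal sketch of such a frame-based pipeline, using OpenCV's Shi-Tomasi corner detector and Lucas-Kanade tracker as stand-ins for the feature-detection and matching steps described above. The parameter values and the final motion-estimation call are illustrative, not taken from any particular vehicle's code.

```python
# Minimal sketch of a conventional frame-based state-estimation loop.
import cv2

cap = cv2.VideoCapture(0)            # any frame-based camera (~30 fps)
_, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# 1. Identify image features (corners: strong shade/color boundaries).
prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                   qualityLevel=0.01, minDistance=7)

while True:
    ok, frame = cap.read()           # next frame arrives ~33 msec later
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # 2. Match current features to the previous ones.
    pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray,
                                              prev_pts, None)
    good_new = pts[status.flatten() == 1]
    good_old = prev_pts[status.flatten() == 1]

    # 3. Estimate the vehicle's change in pose from matched features.
    #    (A real system would recover full rotation/translation here.)
    motion, _ = cv2.estimateAffinePartial2D(good_old, good_new)

    prev_gray, prev_pts = gray, good_new.reshape(-1, 1, 2)
```

Every pass through that loop waits on the next frame, which is exactly the latency the new approach attacks.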

Neuromorphic-sensor-based designs give autonomous vehicles ultra-fast reaction times — something current models lack.

To address this limitation, researcher Andrea Censi of MIT’s Laboratory for Information and Decision Systems and others have developed a way to supplement cameras with a neuromorphic sensor that takes measurements a million times a second.

Censi and colleagues presented the new algorithm at the International Conference on Robotics and Automation earlier this year. Vehicles running the algorithm can update their position estimate every 0.001 sec, enabling nimble maneuvers. "Other cameras have sensors and a clock, so with a 30-frames-per-sec camera, the clock freezes all the values every 33 msec," says Censi, and then the values are read out. In contrast, neuromorphic sensors let each pixel act as an independent sensor. "When a change in luminance is larger than a threshold, the pixel ... communicates this information as an event and then waits until it sees another change."
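Censi's description of an event-based pixel maps naturally onto a tiny simulation. The sketch below models one such pixel under the stated assumptions; the Event fields and the 0.15 contrast threshold are illustrative choices, not specifications of the actual sensor.

```python
# Minimal model of a neuromorphic (event-based) pixel, per Censi's
# description: fire an event when the change in log-luminance exceeds
# a threshold, then wait for the next change.
from dataclasses import dataclass
import math

@dataclass
class Event:
    x: int          # pixel column
    y: int          # pixel row
    t: float        # timestamp, microsecond resolution
    polarity: int   # +1 brighter, -1 darker

class EventPixel:
    def __init__(self, x, y, threshold=0.15):
        self.x, self.y = x, y
        self.threshold = threshold   # contrast threshold (log units, assumed)
        self.ref = None              # log-luminance at last event

    def observe(self, luminance, t):
        """Return an Event if log-luminance changed enough, else None."""
        level = math.log(luminance + 1e-9)
        if self.ref is None:
            self.ref = level
            return None
        delta = level - self.ref
        if abs(delta) > self.threshold:
            self.ref = level         # reset and wait for the next change
            return Event(self.x, self.y, t, 1 if delta > 0 else -1)
        return None
```

Because each pixel reports asynchronously, the sensor produces a sparse stream of events instead of full frames, which is what makes microsecond-scale updates feasible.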

The algorithm tracks changes in luminance at 1-µsec resolution and supplements camera data with these events, so it doesn't need to identify features. Comparing a scene before and after a change is easier because even dynamic environments don't change much over a µsec. Nor does the algorithm match all the features in the previous and current scenes at once; instead, it generates hypotheses about how far the vehicle moved. Then, over time, the algorithm uses a statistical construct called a Bingham distribution to pick the hypothesis that's confirmed most often and track vehicle orientation more efficiently than other approaches.
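One way to picture the hypothesis-tracking idea is the sketch below, in which each incoming event (reusing the Event type from the previous sketch) reweights a small set of candidate motions. A plain weighted vote stands in for the Bingham-distribution machinery, and predict_polarity() is a hypothetical consistency check, so this illustrates only the general filtering pattern, not the researchers' algorithm.

```python
# Sketch of the event-driven update idea: keep small-motion hypotheses
# and let each incoming event vote for those consistent with it.
import numpy as np

# Candidate per-µsec motions (dx, dy in pixels) -- tiny, because even
# dynamic scenes barely change over a microsecond.
hypotheses = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
weights = np.ones(len(hypotheses)) / len(hypotheses)

def predict_polarity(hyp, event, image_gradient):
    """Hypothetical: sign of luminance change this motion would cause."""
    gx, gy = image_gradient[event.y, event.x]
    return 1 if (hyp[0] * gx + hyp[1] * gy) > 0 else -1

def update(event, image_gradient):
    """Reweight hypotheses by whether they explain this one event."""
    global weights
    likely = np.array([
        0.9 if predict_polarity(h, event, image_gradient) == event.polarity
        else 0.1
        for h in hypotheses
    ])
    weights = weights * likely
    weights /= weights.sum()          # keep it a probability distribution
    return hypotheses[int(np.argmax(weights))]   # most-confirmed motion
```

The appeal of this pattern is that each event costs almost nothing to process, so the estimate can be refreshed thousands of times between camera frames.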

Recent experiments with a small vehicle fitted with a camera and an event-based sensor show the algorithm is as accurate as existing state-estimation algorithms. With state estimation handled, Censi says, the next step is to develop controls that decide what to do based on those estimates.

What's most interesting is that the algorithm is said to work particularly well for making quadrotors with only onboard perception and control nimbler. So maybe it's time for Wallich to perfect his son-walking UAV at last.

About the Author

Elisabeth Eitel

Elisabeth is Senior Editor of Machine Design magazine. She has a B.S. in Mechanical Engineering from Fenn College at Cleveland State University. Over the last decade, Elisabeth has worked as a technical writer — most recently as Chief Editor of Motion System Design magazine.

