No More Fender Benders

Feb. 23, 2006
Advanced video-imaging sensors and driver-assistance systems may make auto accidents increasingly rare.

Gun Poespowidjojo
Automotive Manager
Eastman Kodak Co.
Rochester, N.Y.

Cameras mounted in the side rearview mirrors check for other vehicles in the driver's blind spot. The driver-assistance system alerts drivers to the vehicle's presence if they attempt to change lanes, preventing a possible fender bender.


Cameras in automotive vision systems use sensors similar to this KAC-9618 CMOS Active Pixel Sensor from Kodak. The large rectangular area is a 648 × 488 active-pixel array. The array snaps 30 monochrome pictures/sec and converts them to a digital data stream for processing by an external controller.


Today's cars satisfy an astonishing array of consumer priorities. Our vehicles have become extensions of our working and living environments.

Safety is, of course, a consumer priority. The World Health Organization says traffic accidents cause 1.2 million deaths and 50 million injuries annually. In 2002, traffic accidents were the ninth leading cause of death worldwide. And the frequency of traffic accidents rises with population density. Experts project traffic-related incidents will be the third leading cause of worldwide deaths by the year 2020.

Engineers have greatly enhanced both vehicle and road safety over the past 25 years. Vehicles withstand crash impacts more effectively while safety systems such as air bags and seat belts protect vehicle occupants.

Automotive engineers agree that there is little more they can do to improve passive-safety devices. Instead, they're shifting focus to active-safety systems that promote accident prevention rather than just protection. The overall goal is to help drivers avoid accidents altogether. Examples of active-safety technologies currently deployed include antilock brakes, traction control, and electronic-stability systems. As good as these systems are, newer innovations promise to be even better.

Driver-assistance systems (DAS) analyze data from a combination of cameras trained on the driver and the surrounding area. Cameras positioned at various locations throughout the car monitor the exterior front, sides, and rear of the vehicle while interior cameras watch the driver and front-seat passenger.

Onboard vision-sensing processors identify both fixed and moving objects while calculating their speed, position, and distance. The DAS uses this information to identify potential hazards or events and triggers appropriate safety actions to either mitigate or eliminate the problem. Actions may take the form of driver alerts, such as warning sounds, vibrations, and signal lights; or they may include defensive countermeasures such as applying brakes, tightening seat belts, or deploying air bags.
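To make the escalation from alert to countermeasure concrete, it can be sketched as a simple mapping from estimated time-to-collision to a response level. The thresholds and response names in this Python sketch are invented for illustration and are not taken from any production DAS:

# Minimal sketch of a graded DAS response; thresholds are invented.
def choose_action(time_to_collision_s: float) -> str:
    """Map an estimated time-to-collision to an escalating response."""
    if time_to_collision_s > 3.0:
        return "none"                      # no hazard detected
    if time_to_collision_s > 1.5:
        return "warn"                      # chime, icon, seat vibration
    if time_to_collision_s > 0.6:
        return "prepare"                   # light braking, tighten seat belts
    return "full countermeasures"          # hard braking, ready the air bags

print(choose_action(2.0))                  # prints "warn"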

Driver-assistance systems work in three areas relating to automobile safety. First, the systems help speed reactions in critical situations. Drivers alerted a few extra seconds before a potential crash may have enough time to maneuver the vehicle out of danger.

Second, a DAS reduces driver stress and potential accidents by taking over monotonous tasks. For example, many commuters know the mind-numbing experience of driving in "stop-and-go" traffic. At best it creates an environment where drivers are less alert. By monitoring the surrounding traffic, the DAS reduces the risk of preventable driver errors such as rear-impact collisions.

And finally, driver-assistance systems can monitor a driver's workload and state of awareness. Interior cameras determine whether a driver is alert, attentive, and in a proper state behind the wheel.

Calculating a car's speed and acceleration is trivial. Gathering information about the area surrounding the vehicle is more difficult. Early driver-assistance systems tried numerous forms of sensing technology including radar, lidar, ultrasonic, thermal, and imaging. Of these, imaging has emerged as the most promising. Benefits that image sensors offer include high spatial resolution for object recognition, color information, and relatively low cost at high volumes.

For example, high spatial resolution lets the system identify various objects as vehicles, pedestrians, and lane markings. Color capture discerns differences between signs, traffic lights, and car headlights and stoplights.

Image sensors do not emit signals that may create interference or become distorted on return as can happen with radar or ultrasonics. Image sensors also offer flexible control and easy configuration for multiple applications.

Driver-assistance systems can take many forms. From simple lane-departure warnings to imminent crash detection, the goal is to keep the driver and occupants of the vehicle safe.

Lane-departure warning is an example of a DAS application best handled through imaging technology. A camera mounted near the rearview mirror faces the front of the vehicle. From there it tracks lane markings on the left and right sides of the car. Software uses the sensor data to calculate wheel-to-lane distance, time-to-lane crossing, and road curvature. It can also distinguish between different types of lane markings.

If the driver appears to be veering out of the lane without applying the turn signal, the system triggers a warning. The warning could be a voiced alert, an audible rumbling sound, or vibration of the driver's seat.
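A rough Python sketch shows how the time-to-lane-crossing estimate and warning might fit together, assuming the vision software already reports the car's lateral offset from the lane center and its drift rate toward the marking; the inputs and the warning threshold are hypothetical:

# Illustrative time-to-lane-crossing estimate; inputs are hypothetical.
def time_to_lane_crossing(lane_half_width_m: float,
                          lateral_offset_m: float,
                          lateral_speed_mps: float) -> float:
    """Seconds until a wheel reaches the lane marking (inf if not drifting)."""
    if lateral_speed_mps <= 0.0:
        return float("inf")
    remaining_m = max(lane_half_width_m - lateral_offset_m, 0.0)
    return remaining_m / lateral_speed_mps

# Drifting right at 0.7 m/s, 0.6 m from the marking in a 3.6-m-wide lane:
ttlc = time_to_lane_crossing(1.8, 1.2, 0.7)
if ttlc < 1.0:                             # arbitrary warning threshold
    print(f"Lane-departure warning: crossing in {ttlc:.1f} s")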

Adaptive-cruise control is another DAS function that takes over the constant acceleration and braking necessary when there is other traffic on the road. It's another front-facing application where imaging provides special benefits. The image sensor tracks the vehicle located directly ahead of the car. The DAS uses information from the image sensor, as well as vehicle speed and acceleration gleaned from other onboard sensors, to calculate a safe following distance. Cruise control then automatically maintains this distance.

Image sensors are ideal for adaptive-cruise control because they not only track vehicles but also classify them. By capturing details such as vehicle shape, width, and height, image sensors can distinguish between a small car, a medium SUV, or a heavy truck, and adjust the following distance appropriately.
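One way to picture this is a following-distance rule that scales a base time headway with the vehicle's own speed and adds margin for heavier lead vehicles. The headway values and class adjustments in this Python sketch are purely illustrative:

# Rough sketch of an adaptive-cruise following-distance rule; numbers are invented.
BASE_HEADWAY_S = 1.8                       # nominal gap, in seconds of travel
CLASS_EXTRA_S = {"car": 0.0,               # extra margin by lead-vehicle class,
                 "suv": 0.2,               # as inferred from its imaged shape
                 "truck": 0.5}

def target_gap_m(own_speed_mps: float, lead_class: str) -> float:
    """Desired distance to the vehicle ahead, in meters."""
    headway_s = BASE_HEADWAY_S + CLASS_EXTRA_S.get(lead_class, 0.2)
    return headway_s * own_speed_mps

print(target_gap_m(27.0, "truck"))         # about 62 m at roughly 100 km/h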

Imaging systems can track vehicles through turns. Neither radar nor lidar-based systems perform that task well because they measure distance by reflected signals. A turning vehicle may not reflect enough signal for proper sensing. The wide field of view of the image sensor also detects and warns against cars cutting into the lane.

Another application where imaging plays a vital role concerns the rear view. Cameras mounted in the bumper, license plate, or brake light of the car capture scenes directly behind the vehicle. Small objects detected low to the ground and outside the field of view of the driver trigger both an aural and visual alert. An audio tone varying in pitch or loudness signals the vehicle's approach to the object. A video monitor then displays the object for the driver to review. This way the driver can check areas not otherwise visible through rear or side mirrors.
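The distance-to-pitch behavior described above can be captured in a few lines; the frequencies and detection range in this Python sketch are arbitrary examples:

# Toy mapping from rear-obstacle distance to warning-tone pitch; values are arbitrary.
def warning_tone_hz(distance_m: float,
                    max_range_m: float = 2.0,
                    low_hz: float = 400.0,
                    high_hz: float = 2000.0) -> float:
    """Closer objects map to higher-pitched tones; beyond range, no tone (0 Hz)."""
    if distance_m >= max_range_m:
        return 0.0
    closeness = 1.0 - max(distance_m, 0.0) / max_range_m   # 0 = far, 1 = touching
    return low_hz + closeness * (high_hz - low_hz)

for d in (1.5, 0.8, 0.2):
    print(f"{d} m -> {warning_tone_hz(d):.0f} Hz")         # pitch rises as the gap shrinks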

Rearview systems also help drivers park their vehicles. Imaging-based rearview systems give drivers added peace of mind by displaying exactly where the car is relative to its surroundings.

Current "smart air-bag systems" use weight sensing to trigger air-bag deployment. Multiple weight sensors reside in the seat cushions. Rough calculations based on the amount and distribution of weight determine the size and position of passengers.

Instead of making rough calculations, image sensors would actually capture a picture of the front passenger area. Two image sensors mounted near the dashboard operate in synchronized stereovision. By capturing shape, width, and height, the sensors discern people from other heavy objects such as boxes. These systems also distinguish between adults and small children. This data lets the system determine whether or not to launch the air bag in a collision and if so, how much force to deploy.
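The range measurement behind such a two-camera setup follows the standard stereo relation: depth equals focal length times baseline divided by disparity. The camera parameters in this Python sketch are made-up examples:

# Basic stereo range equation; camera parameters are made-up examples.
def stereo_depth_m(focal_length_px: float,
                   baseline_m: float,
                   disparity_px: float) -> float:
    """Distance to a feature matched between the two dashboard cameras."""
    if disparity_px <= 0.0:
        raise ValueError("feature must appear shifted between the two views")
    return focal_length_px * baseline_m / disparity_px

# A feature shifted 30 pixels between cameras 12 cm apart, imaged with an
# 800-pixel focal length, sits about 3.2 m away, farther than any seat.
print(stereo_depth_m(800.0, 0.12, 30.0))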

The image sensors also identify and track the heads of passengers to determine the best angle of airbag deployment, maximizing protection and minimizing injury. Similar systems using weight sensors to determine passenger position cannot offer the same degree of accuracy.

The benefits of using imaging technology grow as imaging becomes more sophisticated. One example is the signal-to-noise ratio, a measure of how well the sensor gathers accurate imaging data. More-sensitive devices can monitor objects farther away under adverse conditions. For example, "night-vision" sensors can identify potential hazards at night well beyond the range of headlights.

Improvements in dynamic range let sensors extract details from dark and light regions. For instance, drivers exiting a tunnel into direct sunlight may miss dangerous conditions like stopped traffic as their eyes adjust to the brightness. An image sensor compensates immediately for the intensity change, assesses the dangerous situation, and alerts the driver.
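A simplified auto-exposure step illustrates the kind of adjustment involved; the target level and rate limit in this Python sketch are illustrative only:

# Simplified auto-exposure update; target level and rate limit are illustrative.
def next_exposure_ms(current_exposure_ms: float,
                     measured_mean_level: float,
                     target_mean_level: float = 128.0) -> float:
    """Scale exposure so the average pixel level moves toward the target."""
    measured = max(measured_mean_level, 1.0)       # avoid division by zero
    scale = target_mean_level / measured
    scale = min(max(scale, 0.5), 2.0)              # limit change per frame to avoid flicker
    return current_exposure_ms * scale

# Scene jumps from a dim tunnel (mean level 40) to daylight (mean level 240):
print(next_exposure_ms(16.0, 240.0))               # exposure drops to about 8.5 ms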

Automotive engineers have identified 20 different driver-assistance systems suitable for image-sensor technologies, requiring up to 10 cameras per car. First-generation DAS gives drivers only audible sounds with some visual data. Future systems may assume monotonous tasks and provide higher levels of support during critical situations. For example, cutting-edge rearview systems automatically guide cars into parking spots. They might even help identify suitable parking spaces.

MAKE CONTACT:
Eastman Kodak Co., (585) 477-1470,
kodak.com
