Today's cars are full of things that can distract the driver, and they go beyond the ones cited by government safety campaigns, such as arguing kids, fiddling with the radio and roadside diversions. As a result, drivers can miss the signs of an accident that is about to happen.
Active safety systems such as forward collision prevention and lane keeping assist can automatically take over a car’s brakes and steering when sensors detect that an accident is imminent. These so-called “driver assist” systems use cameras and other sensors as well as software to detect and then respond to potentially dangerous situations that drivers may miss.
While driver assist systems look at external factors to determine whether to take action, researchers at Cornell and Stanford working on a project called Brain4Cars are building a prototype that also takes into account internal factors, namely the driver's body language. The system uses some of the same cameras and sensors employed by driver assist systems, along with a new computer algorithm, to predict what a driver will do and then issue a warning or take corrective action.
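To make the idea concrete, here is a minimal sketch of how interior cues (head movements, mirror glances) might be combined with exterior sensing to decide when to warn, in the spirit of the Brain4Cars prototype. The cue names, scores and threshold are illustrative assumptions, not the team's actual algorithm.

```python
# Hypothetical sketch: predict an imminent left lane change from
# interior cues, then warn only if the exterior sensors say the lane
# is occupied. The features and weights are invented for illustration.

def lane_change_likelihood(head_turns_left, mirror_glances):
    """Crude score for how likely the driver is about to change lanes."""
    score = 0.0
    if head_turns_left >= 2:   # repeated glances over the left shoulder
        score += 0.5
    if mirror_glances >= 1:    # checked the side mirror
        score += 0.3
    return score

def should_warn(head_turns_left, mirror_glances, lane_clear, threshold=0.5):
    """Warn when a lane change looks likely AND the lane is not clear."""
    likely = lane_change_likelihood(head_turns_left, mirror_glances)
    return (not lane_clear) and likely >= threshold
```

The key design point the researchers describe is the combination: interior monitoring predicts the driver's intent a few seconds ahead, while the exterior sensors decide whether that intent is dangerous.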
“There are many systems now that monitor what’s going on outside the car,” said Ashutosh Saxena, an assistant professor of computer science at Cornell who spearheaded the project. “Internal monitoring of the driver will be the next leap forward.”
Systems such as the Driver Attention Monitor found in some Lexus vehicles already keep an eye on drivers by using a small infrared camera mounted on the steering column to detect their head position. If the system senses that a driver has been looking away from the road for a certain length of time, a warning sounds to draw attention forward. I've also tested prototype systems from Continental and Volvo that can track drivers' head and eye movements to determine whether they are looking away from the road.
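The timer logic behind such a warning can be sketched in a few lines. This is a hypothetical illustration, assuming a per-frame head-pose result and a made-up two-second limit, not the actual Lexus implementation.

```python
# Hypothetical sketch of a look-away timer: each camera frame reports
# whether the driver's head is facing the road; a warning fires once
# the driver has looked away longer than the limit.

class AttentionMonitor:
    def __init__(self, limit_seconds=2.0):  # limit is an assumed value
        self.limit = limit_seconds
        self.lookaway_start = None  # timestamp when the driver looked away

    def update(self, facing_road, now):
        """Feed one frame's head-pose result; return True to warn."""
        if facing_road:
            self.lookaway_start = None  # attention restored, reset timer
            return False
        if self.lookaway_start is None:
            self.lookaway_start = now   # driver just looked away
        return (now - self.lookaway_start) >= self.limit
```

In a real system the `facing_road` flag would come from a head-pose estimator running on the infrared camera feed; here it is simply passed in per frame.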