When you are no longer the driver, but the car is, do the rules of the road still apply the same way? And do you still suffer the consequences of how it drives?
For example when the Florida Highway Patrol pulled him over this month for driving too fast, Brooks Weisblat didn’t bother telling the officer that his Tesla Model S had been driving itself.
“That would have definitely got me a ticket,” said Weisblat, who got a warning notice instead.
Florida doesn’t have a driver’s handbook dictating robot rules of the road. No state does, but California could become the global model next year when it publishes first-in-the-world consumer rules for self-driving cars.
Those regulations are already a year behind schedule. Among the problems vexing officials with the Department of Motor Vehicles is how to handle not just the machines but their over-trusting owners.
“The technology is ready. I’m not sure the people are ready,” said Weisblat, who, along with his Model S running its new Autopilot feature, didn’t notice the sign warning that the freeway speed limit had dropped by 10 miles per hour as the car approached Miami. “You still need to pay attention.”
Google has for years been testing vehicles near its Mountain View headquarters that are meant to be fully autonomous, requiring no human intervention except a rider’s voice saying “Take me to the supermarket.” But most carmakers developing self-driving technology are working on tools that relieve but don’t entirely replace human drivers.
Can you program a conscience into a car? Can ethics be broken down into data and code?
Skeptics of driverless cars have a variety of criticisms, from technical to demand-based, but perhaps the most curious is the supposed ethical trolley problem the cars create. While the question of how driverless cars will behave in ethical situations is interesting and will ultimately have to be answered by programmers, critics greatly exaggerate its importance. In addition, they assume that driverless cars have to be perfect rather than just better.
The basic trolley problem involves being put in a situation where you have to choose between killing some people and killing others. For example, imagine you are driving your car and another car is heading right toward you, and you have to either hit it head on or swerve into a group of pedestrians. What does a robot do? This, it is argued, presents a big issue for driverless cars. How do we program them? How will they react in such situations?
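In practice, the "how do we program them" question usually reduces to scoring candidate maneuvers with some explicit cost function and picking the least-bad one. Here is a minimal, purely hypothetical sketch of that idea; the option names, probabilities, and people-at-risk counts are invented for illustration and do not reflect any manufacturer's actual logic.

```python
# Toy illustration of choosing the lowest-expected-harm maneuver.
# All names and numbers are hypothetical, for illustration only.

def expected_harm(option):
    """Expected casualties = probability of collision * people at risk."""
    return option["collision_probability"] * option["people_at_risk"]

def choose_action(options):
    """Pick the maneuver whose expected harm is lowest."""
    return min(options, key=expected_harm)

# Two candidate maneuvers in a trolley-style scenario (made-up figures):
options = [
    {"name": "brake_in_lane",     "collision_probability": 0.9, "people_at_risk": 2},
    {"name": "swerve_to_sidewalk", "collision_probability": 0.5, "people_at_risk": 4},
]

best = choose_action(options)
print(best["name"])  # prints "brake_in_lane" (0.9*2 = 1.8 < 0.5*4 = 2.0)
```

The hard part, of course, is not this arithmetic but deciding what belongs in the cost function in the first place, which is exactly the ethical question the critics raise.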
The first problem with this is that humans are assumed to be doing a pretty good job at driving already, including in so-called trolley car situations. For example, here is Patrick Lin writing at the Atlantic with a paean to humans’ driving abilities:
“But there are important differences between humans and machines that could warrant a stricter test. For one thing, we’re reasonably confident that human drivers can exercise judgment in a wide range of dynamic situations that don’t appear in a standard 40-minute driving test; we presume they can act ethically and wisely. Autonomous cars are new technologies and won’t have that track record for quite some time.”
The idea that humans will act ethically and wisely while driving is an absurd and false assumption. For starters, in 2013 over 10,000 people were killed in alcohol-impaired driving crashes, which accounted for 31% of vehicle-related deaths. So from the start we have a third of all driving deaths resulting from humans exercising poor judgment and making unethical, unwise decisions.
The recent dream for car companies is to be able to sell a car that can drive itself. Which company will get there first?
Last week’s “exclusive” in the Guardian claiming to “confirm Apple is building self-driving car” raised quite a buzz. Much of that buzz was skeptical, with many pointing out that the facts failed to support the Guardian’s conclusion.
The logical leap the Guardian made was that an Apple engineer’s interest in the GoMentum Station vehicle test track confirmed Apple’s driverless car program. This is too big a leap, as a range of Apple car-related aspirations—self-driving or not—might have use for such a test track.
Let’s assume, however, that the Guardian is right and Apple does have a driverless car ready for testing. (This is possible, as Apple has hired many automotive engineers, including the former CEO of Mercedes Benz’s Silicon Valley research center.) What would that say about the relative state of Apple’s driverless car?
It would tell us that Apple is millions of miles behind Google, and falling further behind every day.
As one of the few companies in the world richer than Google, Apple can match the cars, sensors, processors, navigational systems and other pieces of hardware that Google might deploy. It can replicate the sophisticated maps that Google has compiled. It will have a very hard time, however, catching up with Google’s on-the-road learning.
Apple is taking precautions to keep its progress secret as it tests its autonomous vehicles.
Apple is looking into using a former military base northeast of San Francisco as a high-security proving ground for autonomous vehicles it is developing, according to an online report by British newspaper The Guardian.
Engineers from the technology giant’s Special Projects group have been in contact with representatives of GoMentum Station, a 2,100-acre facility on the site of what used to be the Concord Naval Weapons Station, in Concord, Calif.
Correspondence obtained by The Guardian through public records requests shows Apple is interested in using the sprawling site, which has more than 20 miles of paved roads, city streets, railroad crossings and tunnels, to test self-driving vehicles.
Both Honda and Mercedes-Benz have been using GoMentum Station for testing their own autonomous cars.
News of Apple’s interest in the former base is the latest glimpse into Apple’s secretive autonomous-car program. The maker of iPhones and MacBooks had said little publicly about its vehicle-development efforts, but in recent months it has hired some well-known executives from automakers.
The race has begun for car companies everywhere to manufacture an autonomous car. The concept is simple, but the execution has been difficult. Everyone is eager for Mercedes to release its new E-Class for the world to see.
What has so far only been shown in test situations will be available as of about March next year, when Daimler’s new model goes on sale. The technology packed into the vehicle shows how quickly automated driving systems have advanced since 1998, when the Mercedes S-Class first featured cruise control that could adjust its speed to follow a car in front.
“Innovations in this area are coming thick and fast,” Thomas Weber, Daimler’s head of development, said in his office in Sindelfingen, near Stuttgart, Germany. “While we don’t want to feed wrong expectations such as sleeping in the car, autonomous driving is set to become a reality much more quickly than the public thinks.”
Self-driving systems are among many areas in which Mercedes is working to gain an edge on rivals Audi and BMW. Currently No. 3 in luxury-car sales, Daimler is fighting to take the lead in the segment by 2020.
It’s also testing the limits of what’s allowed under current regulations, which in most places require the driver to be in a position to control the vehicle at all times.