Tesla's Autopilot

I’m not saying I’d be distracted or not monitoring. I’m saying I don’t trust that the system is reliable enough to make, or not make, a decision that would create or fail to prevent an accident. I just don’t feel comfortable putting my safety in the hands of something I don’t trust.

http://i.huffpost.com/gen/1792244/thumbs/o-ROBOT-CAR-570.jpg?6

In the end, too, it’s basically impossible to completely eliminate accidents, even with autonomous vehicles. There are SO MANY environmental things that could occur while on the road that NO manufacturer could ever anticipate, let alone develop for. Seriously, who would have thought that the sun glaring off the side of the truck would cause a detection issue? There are probably countless other fringe scenarios that would trip this thing up. Heck, if a deer runs out in front of an autonomous vehicle on a busy highway, there is still bound to be a collision. Even if you’re confident the deer will be detected, your braking distance may be too long to stop the car. And what if different cars have different braking distances (as they do now)? Maybe a Tesla can stop before rear-ending the car in front of it, but the large autonomous SUV behind it can’t. There you’ve got a collision.
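To put rough numbers on that last scenario, here’s a back-of-the-envelope sketch using plain kinematics (d = v²/2a, ignoring reaction time; the deceleration figures are made-up illustrative values, not measured specs for any real car):

```python
# Rough stopping-distance comparison: d = v^2 / (2a), constant deceleration,
# no perception/reaction delay. All numbers are illustrative assumptions.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_distance_m(speed_mph: float, decel_ms2: float) -> float:
    """Distance in meters to brake from speed_mph down to zero."""
    v = speed_mph * MPH_TO_MS
    return v * v / (2 * decel_ms2)

# Hypothetical braking capabilities (m/s^2): a grippy sedan vs. a heavy SUV.
SEDAN_DECEL, SUV_DECEL = 9.0, 7.0

for speed in (30, 70):
    sedan = stopping_distance_m(speed, SEDAN_DECEL)
    suv = stopping_distance_m(speed, SUV_DECEL)
    print(f"{speed} mph: sedan ~{sedan:.0f} m, SUV ~{suv:.0f} m, gap ~{suv - sedan:.0f} m")
```

With those toy numbers the gap between the two cars is only ~3 m at 30 mph but ~16 m at 70 mph, so even if the lead car stops clean, the heavier one behind it plows right into that gap.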

If you significantly reduce the overall accident and death rates, I think you’re still winning. However, I think there is a psychological issue at hand here. If a human crashes their car, it’s considered an accident. An accidental mistake made by a human. As serious as accidents are, we can usually forgive a human for this action. Or we at least punish them, as a one-off, using existing law enforcement. If an autonomous device causes an accident, then how do you place blame? “OMG ALL TESLAS ARE DEATH TRAPS!” How do we as a society cope with the thought that a vehicle or machine has caused a collision and death? It’s no longer thought of as a human accident, it’s thought of as a computer system error.

As a software engineer I cringe thinking about writing the routines that have to deal with no-win situations in an autopilot.

Popular Mechanics had a great article explaining it.

Cliffs: your autopilot car is cruising down the road and 3 pedestrians jump out in front of you. In a millisecond the car knows it can’t stop in time. Should the autopilot swerve off the road? What if the only swerve option it has puts it into a wall, likely an extreme crash for you? What if the only “out” left is into oncoming traffic?

Yeah, good point. But most pedestrian accidents happen in cities, where the speed limits are, say, 25-35 mph. Stopping a car at those speeds is a lot easier. How many times do you see people trying to cross a highway? My Volvo stops itself for pedestrians and cyclists at speeds under 35 mph. I’ve tried it, it works.

So just thinking here. Your car strikes / kills a pedestrian. Who’s responsible / liable for the incident?

Right, but the city is also where pedestrians are far more likely to step out from between cars and leave you no time to stop. Besides that, the Volvo system is a driver aid, not an autopilot. There’s a huge difference there. Ultimately the driver is still responsible for every action in a car with driver aids.

This is the difference between looking at it from a software engineering point of view and not. Coding for how it will need to react the majority of the time is easy, and that’s how most people think. Programming for all the exceptions is the part that takes a different mindset to write good code. When you’re dealing with a 4,000 lb object going 30-70 mph, you can’t just throw up an error message that says “invalid input” when some lady pushes her baby carriage out from between two cars right into the path of your autopilot-driven car just because there’s an oncoming car and no option to swerve. The human driver gets away with “I didn’t have time to react,” but the autopilot car is going to have a log that shows it did have time, and someone’s code decided to either hit the pedestrian or the oncoming car.
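To make that concrete, here’s a deliberately toy sketch of why those routines are so uncomfortable to write. Everything in it is invented for illustration (the names, the options, the harm scores); no real autopilot works like this, but the structural problem is the same: there is no error-message exit, so every branch is a choice someone coded in advance.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    expected_harm: float  # toy 0-1 score; real systems have nothing this tidy

def choose_maneuver(options: list[Maneuver]) -> Maneuver:
    """Pick the least-bad option. Note there is no 'invalid input' branch:
    the car is still moving, so something must be returned and executed."""
    # The uncomfortable part: this min() encodes a human's prior decision
    # about whose harm counts for how much, run at machine speed and logged.
    return min(options, key=lambda m: m.expected_harm)

# The no-win frame from above: brake (too late), swerve left (oncoming car),
# or swerve right (wall).
options = [
    Maneuver("full_brake_straight", expected_harm=0.9),  # strikes the pedestrian
    Maneuver("swerve_left", expected_harm=0.8),          # head-on with oncoming car
    Maneuver("swerve_right", expected_harm=0.7),         # into the wall, occupant risk
]
print(choose_maneuver(options).name)  # -> swerve_right
```

The scores are meaningless made-up numbers; the point is that the fall-through case has to be written by somebody, and the log will show exactly which branch fired and why.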

Tesla openly says autopilot mode isn’t fully autonomous; you are supposed to monitor road conditions.

I love driving for the sake of driving. But we will have fully autonomous driving cars in the future and it will be safer. But accidents will still happen. And then hopefully after that we will have some Jetsons type tube shit to transport us around. And even then someone will get cut in half from time to time when the blinker fluid in the regional flux capacitor goes low.

Buy Tesla
Buy put options on Tesla stock
Crash Tesla
PROFIT $$$$$$$

Either way, Tesla is still the safest way to get road head on the market.

I still think Comma.ai is gonna change the game a bit with how they’re building learning into their app rather than a defined set of rules. Started by the old iPhone and PlayStation jailbreaker Geohot.

http://comma.ai/
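For what it’s worth, the contrast being drawn is roughly the one below. This is a hand-wavy caricature, not Comma.ai’s (or anyone’s) actual code; the function names, the model, and the gain constant are all made up:

```python
# Two caricatures of lane keeping, for contrast only.

def rule_based_steering(lane_offset_m: float) -> float:
    """Hand-written rule: a fixed proportional correction toward lane center.
    Every constant here is something an engineer had to pick and defend."""
    GAIN = 0.5  # hand-tuned; toy units
    return -GAIN * lane_offset_m  # steering command

def learned_steering(camera_frame, model) -> float:
    """Learned approach: a trained model maps raw pixels to a steering command.
    The 'rules' live implicitly in the weights, shaped by driving data."""
    return model.predict(camera_frame)
```

The trade-off people argue about: the rule-based version is auditable line by line, while the learned one can generalize to fringe cases nobody enumerated, at the cost of being able to explain exactly why it did what it did.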

This, EXACTLY.

Even if you reduce the risks significantly, you still deal with the psychological issue of assigning fault.

Someone will buy it for a bunch and roll it into something else

Yea. We’ll see how profitable the consumer market is, but eventually it’s gonna be driven by tech being licensed or bought into the manufacturers.

Woz says not happening:

That article would be a lot better if it explained what Level 5 actually was.

And yes, I agree with Woz that Level 5 (full automation on the SAE scale: no human driver needed under any conditions) is probably a long ways off.

Lol… whoops.


Tesla’s autopilot seems to have a blind spot for giant white trucks and trailers. :slight_smile:

yowch.

Related:

I was just coming here to post this video. There’s NO WAY this should be legal to have in beta like this on public roads. I love the one comment, “Seems like a viable alternative to having a toddler drive you home”.

Yeah this isn’t good in snow yet…


Hello

I came back here and logged in to bump the Indian thread; it came up in conversation today and the examples I remembered got a good laugh from coworkers. I was sad to see it was closed.

Anyway, autopilot is one of those things I was also super leery of until I experienced it firsthand. It’s fantastic. I do a lot of drives to Rochester and it feels like it goes by so much faster now. I’ve been driving a Model 3 for a few weeks for Uber, and I can stay out later, longer, and it’s so nice. You still have to pay attention and be ready to take over at any time. Full self-driving isn’t quite there yet (but I’ve never used it).

As far as snow performance, let’s just say I’m really happy I still have my AWD manual Element on Falken Wildpeaks as well :grimacing:
