Will you buy a self-driving car?

vadimax

Flashlight Enthusiast
Joined
Dec 28, 2015
Messages
2,273
Location
Vilnius, Lithuania
This is my car. What do you think?

~ Chance :cool:
[photo]

And what's wrong with your car? :)

BTW, nice surroundings you've got.
 

vadimax

Flashlight Enthusiast
Joined
Dec 28, 2015
Messages
2,273
Location
Vilnius, Lithuania
Maybe only after the tech has been out for a few years. Maybe. Sounds like something that can be hacked, though.

The number of years on the market has nothing to do with the issue. Those "autopilots" should be standardized, certified, and regularly inspected on par with the equivalent equipment in aviation. Otherwise they remain nothing more than dangerous gadgets.
 

tab665

Flashlight Enthusiast
Joined
May 8, 2009
Messages
1,212
Location
north carolina
The number of years on the market has nothing to do with the issue. Those "autopilots" should be standardized, certified, and regularly inspected on par with the equivalent equipment in aviation. Otherwise they remain nothing more than dangerous gadgets.
That's why I would wait a few years.
 

Chicken Drumstick

Flashlight Enthusiast
Joined
Dec 9, 2011
Messages
1,651
Location
UK
The number of years on the market has nothing to do with the issue. Those "autopilots" should be standardized, certified, and regularly inspected on par with the equivalent equipment in aviation. Otherwise they remain nothing more than dangerous gadgets.
Any work on an aeroplane costs HUGE money though.

Would you really be happy paying £2000+ every 6 months just for an inspection?
 

stfc69

Newly Enlightened
Joined
Jan 22, 2016
Messages
163
Location
Wiltshire, UK
On a motorway or freeway it might work, but only if all the cars were the same; it would probably reduce accidents as well.
On single-carriageway roads I can't see it ever working, to be honest. How could a computer decide who has priority on a single-track road? And will it avoid all the bl**dy potholes we have over here?!
 

Burgess

Flashaholic
Joined
Apr 10, 2006
Messages
6,548
Location
USA
Self-driving car might be
Just the Ticket for driving the kids to school
each morning, when YOU don't wanna' get
yourself outta' Bed to do so !


Just sayin' . . . . .
 

vadimax

Flashlight Enthusiast
Joined
Dec 28, 2015
Messages
2,273
Location
Vilnius, Lithuania
Any work on an aeroplane costs HUGE money though.

Would you really be happy paying £2000+ every 6 months just for an inspection?

Driving a device that weighs a metric ton or more IS a responsibility. A driver is effectively an armed person: if he is irresponsible, he can kill people. Would you let armed robots onto the streets while paying nothing to ensure their reliability and proper condition? Vehicles ARE dangerous objects. They are not coffee machines, where the worst that can happen is a spoiled drink.

That's why certification and regular inspection of any autopilot is a must.
 

mattheww50

Flashlight Enthusiast
Joined
Jun 24, 2003
Messages
1,048
Location
SW Pennsylvania
The number of years on the market has nothing to do with the issue. Those "autopilots" should be standardized, certified, and regularly inspected on par with the equivalent equipment in aviation. Otherwise they remain nothing more than dangerous gadgets.
I think the problem is far more complex than an aircraft autopilot. If an aircraft autopilot doesn't like the situation, it simply disengages, and aircraft are designed so that they will usually continue to fly for some time with no control inputs from any source. There are no 'safe' failure modes for an automobile autopilot. Usually the reason for disengaging is a situation the software cannot deal with, and generally that means immediate action is required to prevent something bad from happening.

The most dramatic such failure in recent history was probably Air France 447, where the autopilot disengaged because a pitot tube iced up. The crew were far better at managing the automation than they were at flying the aircraft, and the situation rapidly went from bad to catastrophic. There is an important lesson buried in the AF447 accident: someone has to be available who can take over and who actually knows how to operate the aircraft (or, in the case of a self-driving vehicle, the vehicle). In an aircraft, at least in theory, there is always a competent human to back up the autopilot. Self-driving vehicles will still require someone in the driver's seat to take over when the software cannot deal with the situation. It may be a very rare event, but with several hundred million vehicles on the road, low-probability events happen with surprising frequency. In an aircraft, an event that occurs once per million flight hours may never occur in the entire life of an aircraft fleet. In a vehicle, it would probably happen several times a day on the roads of North America. It is the management of these rare (and often totally unexpected) events that concerns me.

Simply bringing the vehicle to a halt in the middle of traffic on an expressway, with traffic traveling at 70+ MPH, is likely to get you rear-ended at very high speed. When I lived in Kuwait I used to see the end results of these crashes frequently, and they weren't pretty. (The locals believed the local laws didn't apply to them, which was generally true, but there are no exemptions from the laws of physics, so when someone driving down the road at 120 km/h was hit by a local doing 240 km/h, the laws of physics dictated a very unpleasant outcome.)

Software can be viewed as a giant state engine with a near-infinite number of states, and it is virtually impossible to create software that covers all of the possibilities. In my career working on operating systems, the most common comment next to the Halt instruction was 'This cannot possibly happen.' Obviously it did happen, and in ways the manufacturer and/or the software's authors never imagined were possible. The main problem is that users invariably think of things to do with the product that the manufacturer never imagined, and the results are usually very ugly.
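To make that "state engine" point concrete, here is a rough sketch in Python of how the "this cannot possibly happen" branch ends up being the one that matters. This is only a toy lane-keeping controller; the state names, events, and transitions are all invented for illustration, not taken from any real system.

# Toy state machine for a hypothetical lane-keeping controller.
# All states, events, and transitions here are invented for illustration.
TRANSITIONS = {
    ("cruising", "slow_car_ahead"): "following",
    ("following", "gap_opens"): "changing_lane",
    ("changing_lane", "lane_clear"): "cruising",
}

def next_state(state, event):
    """Return the controller's next state for a (state, event) pair."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        # The catch-all branch for inputs the designers assumed
        # "cannot possibly happen". All this version can do is hand
        # control back to the human, and that hand-off is exactly
        # the dangerous moment described above.
        return "disengaging"

# A combination nobody enumerated, e.g. an object rolling into the lane
# mid lane-change, falls straight through to the catch-all branch.
print(next_state("changing_lane", "object_in_road"))  # -> disengaging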


I don't see self-driving vehicles operating in unrestricted environments anytime soon. My other question is whether the companies creating the software have sufficient assets to cover the potential liability. Obviously companies like Apple and Google do, but most software companies have little in the way of 'hard' assets; their value lies in their intellectual property, and that doesn't pay for jury awards. I know in my career I walked away from several potentially lucrative contracts because I didn't like the potential liabilities. While we would have made good money, if there had been a liability problem the costs would have been in the millions, if not tens of millions, of dollars.

So while self-driving vehicles on limited-access highways may be available within the next decade, the general-purpose solution for all kinds of traffic, roads, and environments is much further down the road. Without doubt, the real world will throw up many traffic situations that the original software never contemplated, and that almost always leads to unpleasant outcomes.
 

bykfixer

Flashaholic
Joined
Aug 9, 2015
Messages
20,477
Location
Dust in the Wind
^^ We'll put you in the "no way they should be allowed" category.

[photo]
I hope mine will stop for this little dude.
If they don't stop for squirrels, cats and balls rolling in the street... no thanx.

Interstate driving? I ain't letting my car do that for me. No way.
 

Thetasigma

Flashlight Enthusiast
Joined
Nov 10, 2015
Messages
1,197
Location
Michigan, USA
Self-driving requires superhuman precision and reliability to be practical, so no, the technology is nowhere near ready. Even once it is, I would much prefer to be in control. As has been mentioned, maintenance is also a problem: many people don't have the time or the funds for the full range of repairs and work required to keep an automobile in perfect operating condition, and adding automation would significantly complicate that unless manufacturers were footing the bill for maintenance.
 

martinaee

Flashlight Enthusiast
Joined
Sep 16, 2012
Messages
1,495
Location
Ohio
I think seeing how good some of the tech already is in cars like Teslas shows how good it could potentially become, even if it's not perfected yet. With smaller and smaller chipsets that have orders of magnitude more processing power in the future, it will only get better and better.

What I'm not sure of is whether it should ever be seen as "total automation" rather than as an autopilot mode that a human can quickly override. When the human "driver" is always ready to take over if necessary, it only makes things safer. I don't think most people should see this (at least right now) as tech that lets them not drive at all. I also have a feeling circumstances will dictate where it works best: sunny weather on well-paved streets will be great, while in a mega snowstorm with horrible visibility the human and the car will need to work together.
 

scs

Flashlight Enthusiast
Joined
Feb 9, 2015
Messages
1,803
I'm for either more driver assistance, where I still do all the driving, or complete automation, where I do none of it and don't have to keep an eye on the car at all. I'm not a fan of the in-between state that's being marketed right now. If I have to monitor another party doing the driving at all times, I might as well drive myself.
 

P_A_S_1

Flashlight Enthusiast
Joined
Jul 1, 2010
Messages
1,271
Location
NYC
Yes. I don't enjoy driving and I hate traffic. While it won't happen in my lifetime, I would love to see all vehicles self-driving: it would reduce accidents caused by driver error, negligence, and fatigue; reduce traffic, since much of it is the result of individual driving habits (phantom traffic jams, I believe, is the term); allow more vehicles on the road with less congestion via automated traffic control; and simply be more efficient overall. There was a time when every elevator had an operator; doing for the automobile what automation did for the elevator would be a great improvement over the norm.




bykfixer, great photo.
 