BBC News Item re Self Driving Cars

The place to discuss everything else..
User avatar
Tom 2000
Posts: 1049
Joined: Sat Aug 19, 2017 7:23 am
Location: Norn Iron

Post by Tom 2000 »

Macan SD Volcano Grey. LEDs, Pano Roof, PSE, Sports Chrono, PASM, Sports Design Mirrors, 21" Sports Classics in Black, lots of other extras.
http://www.porsche-code.com/PJ2XHAR5 for the day that this works again.

987 Boxster 2.7 (2006)

Dandock
Posts: 4096
Joined: Sat Jun 06, 2015 7:29 pm

Post by Dandock »

Ditto.

Moral 1... never underestimate the stupidity and gullibility of Joe & Josie Driver.
Moral 2... never underestimate the willingness of providers to promote and defend their offerings.

https://www.theguardian.com/technology/ ... SApp_Other
VG Petrol S http://www.porsche-code.com/PHIVCQU7           And a GT3 RS... by Lego! Not crash-tested! 😀
Deleted User 1874

Post by Deleted User 1874 »

Dandock wrote: Tue Jun 12, 2018 8:58 am Ditto.

Moral 1... never underestimate the stupidity and gullibility of Joe & Josie Driver.
Moral 2... never underestimate the willingness of providers to promote and defend their offerings.

https://www.theguardian.com/technology/ ... SApp_Other
You forgot to add:

Moral 3... never underestimate the media in exaggerating a story for effect.

Most people are not as dumb as the media makes out. It's only the handful of idiots who make the headlines, and those are the kind of people who don't need a semi-autonomous driving aid to drive dangerously. People crash their cars into all kinds of objects every single day while simply not paying attention or deliberately driving dangerously.

The question is how many crashes could these driving aids prevent vs crashes they may cause? It's a hard question to answer since prevented accidents are never reported, but there are an awful lot of crashes where some form of semi-autonomous driving aid may have prevented the accident.

What we do know is that any significant incident involving a Tesla (even if Autopilot was not engaged) will be major news, and there are hundreds of thousands of them on the roads. I've heard of maybe half a dozen crashes where Autopilot may have been a factor, i.e. the system was engaged but the driver was clearly paying no attention to the road or using Autopilot somewhere obviously inappropriate. There have also been news stories where Autopilot was initially blamed in the press and then later found to be switched off, but the media never seem to go back and correct those headline grabbers!
Dandock
Posts: 4096
Joined: Sat Jun 06, 2015 7:29 pm

Post by Dandock »

I think all the above are true. What I would also say is that people are very capable of being that dumb, and that capacity should not under any circumstances be underestimated. As the Guardian piece says, there are inherent dangers in suggesting full autonomy; doing so creates a new level of risk. Those manufacturers are rightly proud of their respective systems and naturally talk them up. It's simple marketing, but it mustn't be over-egged.
Yes, the media loves a good story, but I'm sure you'd agree that the Guardian, for eco reasons, is on the autonomous side.
It's also to be remembered that Tesla are just like any other big business when it comes to protecting their reputation. They/he will talk their product up at every opportunity but have been, it appears, less than cooperative and truthful in respect of some investigations.
We must also remember that what accidents there have been are embryonic, more akin to air accidents than conventional car crashes.
What we do now know is that there are some common factors in the current systems' fallibility, which relate to following a vehicle that then moves aside to reveal a parked vehicle. Or something like that.
I think there’s a huge degree of learning ahead. Just look at mobile phone use in cars. You’d imagine the message would have got through by now but...!
VG Petrol S http://www.porsche-code.com/PHIVCQU7           And a GT3 RS... by Lego! Not crash-tested! 😀
Deleted User 1874

Post by Deleted User 1874 »

Dandock wrote: Tue Jun 12, 2018 1:37 pm
What we do now know is that there are some common factors in the current systems' fallibility, which relate to following a vehicle that then moves aside to reveal a parked vehicle. Or something like that.
That's a good example as shown in the video. Now imagine an average driver in the same situation without any form of driver aid. The car in front swerved literally at the last fraction of a second without even braking first (look how close it gets to the back of the stationary car) and left the following Tesla totally unsighted until it was too late to react. At least the Autopilot slammed on the brakes before hitting the back of the stationary car. I think quite a few manual drivers would have crashed anyway in that specific scenario, unless they were being extremely attentive. In fact that is quite a common accident to have when approaching a motorway queue. Just because the semi-autonomous system can't deal with marginal cases like that doesn't mean humans can. I'd love to see that test repeated with ordinary drivers without telling them what was coming up. Would they all swerve to miss the stationary car? I very much doubt it!
Col Lamb
Posts: 9323
Joined: Fri Oct 30, 2015 8:38 pm
Location: Lancashire

Post by Col Lamb »

All it showed to me was that the Tesla system leaves a lot to be desired; no way should the S have crashed into the stationary car. No amount of "at least it applied the brakes" can ever justify a control system that fails to work, which the Tesla's did.

The other clip, of a Beamer, was potentially dangerous as the BMW system totally failed to keep the car in the lane when there was a quick deviation in the road layout; it moved over to its left by more than a full lane's width, definitely a Mway pile-up in the making.

Put a dumb driver aid in a car and some driver will be dumber.

No autopilot system will be safe until they are all integrated with each other and each vehicle has sensors on the steering wheel to ensure hands are where they should be, integrated with an eye-position recognition system that checks the driver is looking where they should be looking. Even then crashes will occur, as there are just too many variables to make the system safe.
Col
Macan Turbo
Air, 20” wheels, ACC, Pano, SurCam, 14w, LEDs, PS+, Int Light Pack, Heated seats and Steering, spare wheel, SC, Privacy glass, PDK gear, SD mirrors, Met Black, rear airbags
Dandock
Posts: 4096
Joined: Sat Jun 06, 2015 7:29 pm

Post by Dandock »

Peteski wrote: Tue Jun 12, 2018 2:27 pm
Dandock wrote: Tue Jun 12, 2018 1:37 pm
What we do now know is that there are some common factors in the current systems' fallibility, which relate to following a vehicle that then moves aside to reveal a parked vehicle. Or something like that.
That's a good example as shown in the video. Now imagine an average driver in the same situation without any form of driver aid. The car in front swerved literally at the last fraction of a second without even braking first (look how close it gets to the back of the stationary car) and left the following Tesla totally unsighted until it was too late to react. At least the Autopilot slammed on the brakes before hitting the back of the stationary car. I think quite a few manual drivers would have crashed anyway in that specific scenario, unless they were being extremely attentive. In fact that is quite a common accident to have when approaching a motorway queue. Just because the semi-autonomous system can't deal with marginal cases like that doesn't mean humans can. I'd love to see that test repeated with ordinary drivers without telling them what was coming up. Would they all swerve to miss the stationary car? I very much doubt it!
That of course is speculation and subject to too many variables. You could equally well argue that a good driver would leave sufficient space out of natural caution. As I said, these situations are embryonic but, judging by the number of Autopilot updates issued, still a work in progress. It's all very well Tesla blaming drivers for what may or may not have happened immediately prior to any given situation, but what is clear, and what the updates tell us, is that the system has, by Tesla's own admission, been found to be fallible.
And now Mr M is claiming ‘full’ autonomy later this year. Yet greater licence for abuse or greater prescience? Probably both! But if he were building planes I don’t think he’d get the approvals - yet.
As Colin says... still work in progress.
VG Petrol S http://www.porsche-code.com/PHIVCQU7           And a GT3 RS... by Lego! Not crash-tested! 😀
User avatar
Guy
Posts: 2148
Joined: Tue Nov 04, 2014 11:06 am
Location: Warwickshire

Post by Guy »

Two observations from the video:

- The Tesla didn't apply the 2 second rule.
- The Tesla didn't appear to 'see' that the car in front was indicating (it didn't need to brake) and slow as a precaution.

Easy software fix?
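For what it's worth, the 2-second rule mentioned above is trivial to express in code, which is partly why it seems like an easy software fix. A minimal sketch (the function name and speeds are just for illustration, not anything from Tesla's software):

```python
def two_second_gap_m(speed_kmh: float, gap_s: float = 2.0) -> float:
    """Minimum following distance (metres) implied by a time-gap rule."""
    speed_ms = speed_kmh / 3.6   # convert km/h to m/s
    return speed_ms * gap_s

# At roughly UK motorway speed (70 mph ≈ 113 km/h) the 2-second rule
# implies a gap of about 63 metres:
print(round(two_second_gap_m(113), 1))
```

Nothing like the gap the Tesla was leaving in the video.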
Dandock
Posts: 4096
Joined: Sat Jun 06, 2015 7:29 pm

Post by Dandock »

Guy wrote: Tue Jun 12, 2018 3:32 pm Two observations from the video:

- The Tesla didn't apply the 2 second rule.
- The Tesla didn't appear to 'see' that the car in front was indicating (it didn't need to brake) and slow as a precaution.

Easy software fix?
Beta version issues?
VG Petrol S http://www.porsche-code.com/PHIVCQU7           And a GT3 RS... by Lego! Not crash-tested! 😀
Deleted User 1874

Post by Deleted User 1874 »

Dandock wrote: Tue Jun 12, 2018 3:24 pm
That of course is speculation and subject to too many variables. You could equally well argue that a good driver would leave sufficient space out of natural caution. As I said, these situations are embryonic but, judging by the number of Autopilot updates issued, still a work in progress. It's all very well Tesla blaming drivers for what may or may not have happened immediately prior to any given situation, but what is clear, and what the updates tell us, is that the system has, by Tesla's own admission, been found to be fallible.
And now Mr M is claiming ‘full’ autonomy later this year. Yet greater licence for abuse or greater prescience? Probably both! But if he were building planes I don’t think he’d get the approvals - yet.
As Colin says... still work in progress.
With the Tesla system you can adjust the following distance, and I always use the maximum distance on a motorway to provide a safe gap. This was definitely not set to the maximum following distance. They could make it more foolproof by not allowing close following at higher speeds, but as it stands they leave it to the driver's discretion. Tesla owners often complain that when they do set a safe following distance everyone just cuts in front of them, and I've had this issue myself. Basically, a LOT of people don't understand what a safe following distance actually is!
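The "don't allow close following at higher speeds" idea could be as simple as clamping the driver's chosen time gap to a speed-dependent floor. A hypothetical sketch (the thresholds and function name are made up for illustration, not Tesla's actual logic):

```python
def effective_gap_s(driver_setting_s: float, speed_kmh: float) -> float:
    """Clamp the driver-selected time gap to a speed-dependent minimum.

    Hypothetical policy: above 100 km/h insist on at least a 2 s gap,
    above 60 km/h at least 1.5 s, otherwise honour the driver's choice.
    """
    if speed_kmh > 100:
        floor_s = 2.0
    elif speed_kmh > 60:
        floor_s = 1.5
    else:
        floor_s = 0.0
    return max(driver_setting_s, floor_s)

# A driver who dials in a 1-second gap still gets 2 seconds at motorway speed:
print(effective_gap_s(1.0, 113))  # 2.0
```

Of course, as soon as you enforce a safe gap, other drivers will cut into it, which is exactly the complaint owners already make.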