
[Autonomous Transportation] When the cars have all the jobs, the poor will walk the earth


Posts

  • Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    Aioua wrote: »
    Ok another example.

    Most of the fancier car computers these days will severely restrict their functionality while the car is moving. Maybe you can change the song, but you can't sync a new phone. You can pull up the map but not enter an address for the GPS directions.
    Why? They could just say the driver should never use the system while the car is moving and leave it at that.

    Because humans gonna human. If the driver assist does such a good job that the average person is going to (eventually) forget to pay attention to the road, but the driver assist isn't good enough to be trusted to drive the car unsupervised, then it is a dangerous feature and shouldn't be allowed.

    We're not wasting time blaming the guy because he paid the price. We're trying to prevent an accident like this from happening again.

    I'm not even sure that they shouldn't be allowed. If driver assist systems like this result in net fewer severe accidents, with the exception of a tiny increase in decapitations, bring on the driver assist. What really upsets me here is that we don't have data on that, because Tesla chose to beta-test potentially fatal equipment on public roads. I also find it hard to believe this got much testing; with even a little more they would have realized that they should probably have radar that sweeps the full outline of the car in front. But if it results in fewer accidents even with the crappy hardware, then I'm still pro driver assist.
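    (To make the radar-coverage point concrete: a toy sketch, in Python, of the kind of gap being described. This is not Tesla's actual stack; every class, height, and threshold here is invented for illustration.)

        # Toy model of the claimed coverage gap: radar that only sweeps a low
        # band, with camera image processing as the fallback for everything
        # above it. All heights and thresholds are invented.
        from dataclasses import dataclass

        RADAR_MAX_SCAN_HEIGHT_M = 1.0  # assumed: radar sweeps roughly bumper level
        CAMERA_MIN_CONTRAST = 0.15     # assumed: below this, vision can't segment

        @dataclass
        class Obstacle:
            bottom_edge_m: float  # height of the obstacle's lowest point above the road
            contrast: float       # 0 = blends into the background, 1 = obvious

        def detected(ob: Obstacle) -> bool:
            # Radar catches anything that intrudes into its low scan band.
            if ob.bottom_edge_m <= RADAR_MAX_SCAN_HEIGHT_M:
                return True
            # Anything riding higher falls back on the camera, which fails
            # when the obstacle blends into the sky (white-on-white).
            return ob.contrast >= CAMERA_MIN_CONTRAST

        # A car bumper: inside the radar band, detected regardless of color.
        print(detected(Obstacle(bottom_edge_m=0.4, contrast=0.05)))  # True
        # A high-riding white trailer against a bright sky: the radar band
        # passes underneath it and the camera sees white-on-white. Missed.
        print(detected(Obstacle(bottom_edge_m=1.2, contrast=0.05)))  # False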

  • honovere Registered User regular
    shryke wrote: »
    KetBra wrote: »
    shryke wrote: »
    Aioua wrote: »
    Dedwrekka wrote: »
    Aioua wrote: »
    Visible-light 2D image processing isn't really an acceptable way for an auto-driving system to watch what's in front of it.

    Tesla said it's because the trailer blended in with the sky. Which means its radar rangefinder doesn't sweep an area large enough to actually avoid a collision.

    Like it's not some crazy circumstance that a vehicle was colored similarly to its background, the problem is that they were relying on inadequate systems to map the environment.

    I don't think that in the time Tesla's Autopilot has been running this is the first time that it has ever come across a white 18 wheeler.

    most of the time that stuff is going to be caught by the radar

    my problem is that their rangefinder isn't adequate to actually detect things in front of the car, and you can't rely solely on image processing to detect shit

    EDIT: like, to further explain

    by design, it sounds like their rangefinding systems only scan at about bumper level, which means detecting anything higher than that falls back onto camera image processing
    I don't think this is a case of a freak accident where it couldn't see white-on-white. I think it can hardly ever see white-on-white; this was just a time when it really mattered

    By design the driver is there to see the truck.

    See above references to attention spans for things you aren't controlling

    See the Tesla documentation on the thing where they say you have to pay attention.

    This whole "OMG, it didn't see the truck!" stuff is a bunch of nonsense. It's based on blaming the system for something it is explicitly not designed to do.

    We don't blame LG because some moron crashed his car because he was texting while driving. "We know people can't pay attention to the road while using a cellphone and here they went and made cellphones for people and even made them accept texts and calls while driving! And look, most manufacturers have in-built hands-free-calling features to encourage cellphone use in the car even though that's still very distracting to the driver! Those bastards!" This argument is nonsense.

    The system tells you how it's designed to be used. If you can't use it properly, then don't. But ya can't blame the system for failing at something it explicitly tells you it doesn't do.


    PS - And please, stop waterboarding statistics in here. It doesn't know the launch codes.

    I think the comparison with cell phones doesn't work that well. The cell phone isn't advertised as improving the driving and safety of your car. Tesla's Autopilot is.

    And judging from videos of the autopilot in action, it clearly gives drivers the impression that it can do those things it isn't designed to do, like driving without constant driver input. It seems like it is just too easy to use it in an improper way.

  • Nyysjan Finland Registered User regular
    There is what a feature can do, what a feature is advertised to do, and what an idiot in a hurry could reasonably expect it to do.
    And I think Tesla's autopilot may have gone wrong in the last part.

  • Dedwrekka Metal Hell adjacent Registered User regular
    edited July 2016
    Nyysjan wrote: »
    There is what a feature can do, what a feature is advertised to do, and what an idiot in a hurry could reasonably expect it to do.
    And I think Tesla's autopilot may have gone wrong in the last part.

    Idiot in a hurry isn't a legal term in the US like it is in the UK, but it probably should be.

    Though that only applies to copyright law and not to expectations of what a product can do.

  • Quid Definitely not a banana Registered User regular
    Dedwrekka wrote: »
    Nyysjan wrote: »
    There is what a feature can do, what a feature is advertised to do, and what an idiot in a hurry could reasonably expect it to do.
    And I think Tesla's autopilot may have gone wrong in the last part.

    Idiot in a hurry isn't a legal term in the US like it is in the UK, but it probably should be.

    We have speeding and the increasingly common distracted driving as offenses though.

    Meanwhile I prefer companies to thoroughly test stuff like "autopilot" before they release it. They don't have to, obviously, but it definitely helps me decide which brand to go with next.

  • japan Registered User regular
    edited July 2016
    The legal term in England and Wales is "the man on the Clapham omnibus".

    Applied in this case in the sense of what that person would understand from the sum of Tesla's marketing and other communication.

  • Nyysjan Finland Registered User regular
    Dedwrekka wrote: »
    Nyysjan wrote: »
    There is what a feature can do, what a feature is advertised to do, and what an idiot in a hurry could reasonably expect it to do.
    And I think Tesla's autopilot may have gone wrong in the last part.

    Idiot in a hurry isn't a legal term in the US like it is in the UK, but it probably should be.

    Though that only applies to copyright law and not to expectations of what a product can do.
    I was not talking about legal liability; I used "idiot in a hurry" mainly because I like the term.

  • Dedwrekka Metal Hell adjacent Registered User regular
    japan wrote: »
    The legal term in England and Wales is "the man on the Clapham omnibus".

    Applied in this case in the sense of what that person would understand from the sum of Tesla's marketing and other communication.

    In this case Tesla was incredibly consistent with its communication on the subject: keep hands on the wheel, be ready to take over from the computer.

  • japan Registered User regular
    edited July 2016
    Dedwrekka wrote: »
    japan wrote: »
    The legal term in England and Wales is "the man on the Clapham omnibus".

    Applied in this case in the sense of what that person would understand from the sum of Tesla's marketing and other communication.

    In this case Tesla was incredibly consistent with its communication on the subject: keep hands on the wheel, be ready to take over from the computer.

    I think they may be running into fundamental limitations of the human attention span, though, so arguments about the reasonableness of their communication may be beside the point.

    A reasonable person may have understood the advice and yet been incapable of complying with it.

    I tend to think that, if litigated, it would be approached from a product safety angle - that is, there is no safe way to use the product as it is intended to be used - rather than taking the tack that the user of the product wasn't sufficiently informed of the correct way to use it.

    It's also worth bearing in mind that the driver behaving negligently doesn't preclude the product being unsafe, and finding that the product is unsafe doesn't mean that the driver wasn't behaving negligently. They're two separate and unconnected questions.

    The part that sticks out for me, and my issue with the autopilot system as it exists, is that there doesn't seem to be a clear boundary where the driver is expected to take over. Even assuming that the driver was paying attention, chances are that the point at which you realise that the car has failed to recognise a hazard could very well be after the point of realistically being able to do anything about it.

    Edit: most of that has little to do with what I quoted

  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    kime wrote: »
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware that the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high for that system, and Tesla doesn't make it a secret that, just as with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require our attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

    This line of reasoning is ridiculous. We are talking about a full grown human being. Not a baby. Not a chimp. Not a goldfish.

    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

  • japan Registered User regular
    It's telling that most other manufacturers pitch their systems, which don't appear to be any less advanced than Tesla's, as aids that intervene when the driver fails to detect a hazard that the system does detect.

    Tesla turns that the other way around and says to leave everything to the system, but be ready to intervene if it misses something.

    In respect of this:
    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    A situation that relies on a human operator responding perfectly at all times is an inherently hazardous one.

    An example that's been used in litigation in the UK is the hypothetical of parking a car in the fast lane of a motorway: in principle, any driver that hits it should be deemed negligent because they should be driving within their own braking distance and be alert to any hazards.

    It is nevertheless also the case that if someone does park a car in such a position then it simply becomes a matter of time before someone hits it.

    You can act to remove a foreseeable hazard even if in theory anyone falling victim to it is negligent.

  • tinwhiskers Registered User regular
    japan wrote: »
    It's telling that most other manufacturers pitch their systems, which don't appear to be any less advanced than Tesla's, as aids that intervene when the driver fails to detect a hazard that the system does detect.

    Tesla turns that the other way around and says to leave everything to the system, but be ready to intervene if it misses something.

    In respect of this:
    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    A situation that relies on a human operator responding perfectly at all times is an inherently hazardous one.


    An example that's been used in litigation in the UK is the hypothetical of parking a car in the fast lane of a motorway: in principle, any driver that hits it should be deemed negligent because they should be driving within their own braking distance and be alert to any hazards.

    It is nevertheless also the case that if someone does park a car in such a position then it simply becomes a matter of time before someone hits it.

    You can act to remove a foreseeable hazard even if in theory anyone falling victim to it is negligent.

    aka driving a car the old-fashioned way with no driver assistance.

  • japan Registered User regular
    japan wrote: »
    It's telling that most other manufacturers pitch their systems, which don't appear to be any less advanced than Tesla's, as aids that intervene when the driver fails to detect a hazard that the system does detect.

    Tesla turns that the other way around and says to leave everything to the system, but be ready to intervene if it misses something.

    In respect of this:
    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    A situation that relies on a human operator responding perfectly at all times is an inherently hazardous one.


    An example that's been used in litigation in the UK is the hypothetical of parking a car in the fast lane of a motorway: in principle, any driver that hits it should be deemed negligent because they should be driving within their own braking distance and be alert to any hazards.

    It is nevertheless also the case that if someone does park a car in such a position then it simply becomes a matter of time before someone hits it.

    You can act to remove a foreseeable hazard even if in theory anyone falling victim to it is negligent.

    aka driving a car the old-fashioned way with no driver assistance.

    We have very carefully spent decades designing vehicles and transport infrastructure around the assumption that drivers won't respond perfectly most of the time.

    The point of the parked-car-in-the-fast-lane analogy is that people respond poorly to uncommon events - in theory every driver should be aware at all times of the possibility of unexpected obstructions, but in reality they occur so rarely that it is probable that, on encountering one, a driver will not be able to react properly. Accordingly, the task of making sure that drivers don't encounter such obstructions is taken pretty seriously.

    That's what makes the Tesla system not really good enough to be what they paint it as. Given that Tesla brought up the comparison of fatality rates, it would be more interesting and/or useful to compare stats with cars that have the same or similar systems that act passively instead of actively. My hunch is that they would be at least as effective (assuming they are effective - there really isn't enough data to draw a conclusion) in reducing rates of accidents vs a car with no driver aids, because they can intervene to prevent the same classes of accidents without presenting a hazard arising from inattentiveness.
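    (To put a number on "not enough data": with a single fatality observed, the uncertainty swamps any comparison. A rough sketch using the figures Tesla itself cited in 2016 - treat them as Tesla's claims, not ground truth - of roughly one fatality in 130 million Autopilot miles against a US average of about one per 94 million vehicle miles:)

        # How little one observed fatality tells you: an exact 95% Poisson
        # confidence interval around the Autopilot rate. Mileage figures are
        # the ones Tesla cited publicly; treat them as claims.
        from scipy.stats import chi2

        k = 1               # observed Autopilot fatalities
        miles = 130e6       # claimed miles driven on Autopilot
        us_rate = 1 / 94e6  # claimed US average, fatalities per mile

        # Garwood exact interval for a Poisson count.
        lo = 0.5 * chi2.ppf(0.025, 2 * k)
        hi = 0.5 * chi2.ppf(0.975, 2 * (k + 1))

        print(f"Autopilot rate: {k / miles * 1e8:.2f} per 100M miles")
        print(f"95% CI:         {lo / miles * 1e8:.2f} to {hi / miles * 1e8:.2f} per 100M miles")
        print(f"US average:     {us_rate * 1e8:.2f} per 100M miles")
        # The interval runs from ~0.02 to ~4.3 per 100M miles, comfortably
        # containing the US average: this data can't distinguish "much
        # safer" from "much worse".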

  • Emissary42 Registered User regular
    edited July 2016
    Is there a more general Automation thread? Because this is now happening: Momentum Machines is now hiring for their first restaurant in San Francisco. The tl;dr on the company is that they've been developing a machine that can produce custom burgers - even down to custom-blended patties - at a dispensing rate of one every 60 seconds at its fastest. It looks like they've completed development.

  • Paladin Registered User regular
    edited July 2016
    kime wrote: »
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware that the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high for that system, and Tesla doesn't make it a secret that, just as with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require our attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

    This line of reasoning is ridiculous. We are talking about a full grown human being. Not a baby. Not a chimp. Not a goldfish.

    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    More likely you'd be dead. Which is why this sort of stuff is resolved by adding restrictions on the general driving public and not just resolving each case as it comes along.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • The Ender Registered User regular
    edited July 2016
    kime wrote: »
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware that the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high for that system, and Tesla doesn't make it a secret that, just as with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require our attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

    This line of reasoning is ridiculous. We are talking about a full grown human being. Not a baby. Not a chimp. Not a goldfish.

    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    ...I missed the part of this post that was an actual argument?

    Yes, we're talking about a fully grown-up human being. They also have limitations on their attention span, and on what it is or is not reasonable to expect them to do in a given situation.


    Real, grown-up human beings were tested in a controlled environment by Google to see how they would respond to being asked to serve as a back-up to an early-model self-driving system. Guess what? All of them failed to be effective back-up drivers, their attention ultimately fading, because that's apparently a limitation for most people.


    Saying, 'WELL IT SHOULDN'T BE THAT WAY, HMPH!!' is an ideological / religious opinion that has no place in automotive design.

    With Love and Courage
  • Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    The Ender wrote: »
    kime wrote: »
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware that the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high for that system, and Tesla doesn't make it a secret that, just as with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require our attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

    This line of reasoning is ridiculous. We are talking about a full grown human being. Not a baby. Not a chimp. Not a goldfish.

    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    ...I missed the part of this post that was an actual argument?

    Yes, we're talking about a fully grown-up human being. They also have limitations on their attention span, and on what it is or is not reasonable to expect them to do in a given situation.


    Real, grown-up human beings were tested in a controlled environment by Google to see how they would respond to being asked to serve as a back-up to an early-model self-driving system. Guess what? All of them failed to be effective back-up drivers, their attention ultimately fading, because that's apparently a limitation for most people.


    Saying, 'WELL IT SHOULDN'T BE THAT WAY, HMPH!!' is an ideological / religious opinion that has no place in automotive design.

    At this point I'm just ignoring people who are outraged that we aren't angry at people for not bootstrapping the responsibility to pay attention, kind of like how they ignored all the posts referencing actual data suggesting this is a physical limitation of the way our brains are wired.

  • Cambiata Commander Shepard The likes of which even GAWD has never seen Registered User regular
    The Ender wrote: »
    kime wrote: »
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware that the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high for that system, and Tesla doesn't make it a secret that, just as with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require our attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

    This line of reasoning is ridiculous. We are talking about a full grown human being. Not a baby. Not a chimp. Not a goldfish.

    Let's say you were driving a normal car, and you crossed the center line because you weren't paying attention when you should have been. You plow into a minivan and kill an entire family. Do you think the courts would let you walk because "humans can't be expected to pay attention"? No, your stupid fucking ass would be in prison.

    ...I missed the part of this post that was an actual argument?

    Yes, we're talking about a fully grown-up human being. They also have limitations on their attention span, and on what it is or is not reasonable to expect them to do in a given situation.


    Real, grown-up human beings were tested in a controlled environment by Google to see how they would respond to being asked to serve as a back-up to an early-model self-driving system. Guess what? All of them failed to be effective back-up drivers, their attention ultimately fading, because that's apparently a limitation for most people.


    Saying, 'WELL IT SHOULDN'T BE THAT WAY, HMPH!!' is an ideological / religious opinion that has no place in automotive design.

    Yeah, I gotta say, AbsoluteZero, your argument makes no sense.

    It's not a regular driver not paying enough attention while driving.

    It's a specific situation that the human brain is incapable of handling.

    "If you divide the whole world into just enemies and friends, you'll end up destroying everything" --Nausicaa of the Valley of Wind
  • Commander Zoom Registered User regular
    edited July 2016
    I suspect that the part of the argument that's setting people off is that it's being presented as an absolute: no human is capable of performing this task, period.
    And it doesn't even have to be every single human, or every single driver, to make it an intractable problem in actual use; merely a majority.
    But you dangle an absolute in front of people, especially (IMO) those somewhere on "the spectrum", and they have to have a go at it. "Well, what about...?"
    "Crazy Eddie", remember? Tell a human, "X is impossible", and they get positively mental about it.

  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    edited July 2016
    You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny for drawing that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

  • mcdermott Registered User regular
    You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny for drawing that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

    Funny thing about that...

    http://www.newyorker.com/science/maria-konnikova/hazards-automation

    This issue has been well known in relation to airplane autopilot for a while now.

    I think you can be safely ignored, really. You clearly can't be reasoned with.

  • Paladin Registered User regular
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Right. People who, when they pay attention to something and are not externally reinforced, will continue to pay the same amount of attention to that thing.

    We just need obsessive-compulsively attentive drivers, or maybe we can associate paying attention to the road with acute chemical dependence.

    Then we will have people for whom using this system won't diminish the amount of attention they pay.

    They moistly come out at night, moistly.
  • mcdermott Registered User regular
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

    But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

  • shryke Member of the Beast Registered User regular
    mcdermott wrote: »
    You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny for drawing that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

    Funny thing about that...

    http://www.newyorker.com/science/maria-konnikova/hazards-automation

    This issue has been well known in relation to airplane autopilot for a while now.

    I think you can be safely ignored, really. You clearly can't be reasoned with.

    And yet planes are still one of the safest ways to travel and no one is tearing autopilots out.

    Nor, again, are they tearing out hands-free calling systems (and not-so-hands-free everything else systems) that are in basically every car these days.

  • Paladin Registered User regular
    mcdermott wrote: »
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

    But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is a certification. As the holder of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one who gets dinged should you forget any part of it.

    Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    Paladin wrote: »
    mcdermott wrote: »
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

    But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is a certification. As the holder of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one who gets dinged should you forget any part of it.

    Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

    uh... his point is it isn't really something you can test for. After someone is used to the system and trusts it, and they are trying to pay attention, what the person had for breakfast will probably be a better indicator of their success than any test that could be administered in whatever amount of time you want to imagine.

    So either the cars need to be safe without driver attention, or the cars need to successfully ensure the driver is paying attention, or people are going to die at a rate only somewhat less than they would in an average car.
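    (For the middle option, "successfully ensure the driver is paying attention" would amount to something like this hypothetical nag-and-disengage loop; every timing and interface here is invented for illustration.)

        # Hypothetical attention enforcer: escalate warnings when no driver
        # input is sensed, then disengage. Timings are invented.
        import time

        NAG_AFTER_S = 15        # assumed: chime after 15 s without hands detected
        DISENGAGE_AFTER_S = 30  # assumed: hand back control after 30 s

        def monitor_driver(hands_on_wheel, warn, disengage, now=time.monotonic):
            """Poll a hands-on-wheel sensor; nag, then disengage on inattention."""
            last_input = now()
            while True:
                if hands_on_wheel():  # e.g. steering torque sensor or gaze camera
                    last_input = now()
                idle = now() - last_input
                if idle >= DISENGAGE_AFTER_S:
                    disengage()       # slow down and return control to the driver
                    return
                if idle >= NAG_AFTER_S:
                    warn()            # chime/flash until the driver responds
                time.sleep(0.1)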

    They moistly come out at night, moistly.
  • kime Queen of Blades Registered User regular
    redx wrote: »
    Paladin wrote: »
    mcdermott wrote: »
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

    But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is a certification. As the holder of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one who gets dinged should you forget any part of it.

    Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

    uh... his point is it isn't really something you can test for. After someone is used to the system and trusts it, and they are trying to pay attention, what the person had for breakfast will probably be a better indicator of their success than any test that could be administered in whatever amount of time you want to imagine.

    So either the cars need to be safe without driver attention, or the cars need to successfully ensure the driver is paying attention, or people are going to die at a rate only somewhat less than they would in an average car.

    Which, honestly, is something we as a society may decide we're ok with.

    At least until the really good automation comes out

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    mcdermott wrote: »
    You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny for drawing that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

    Funny thing about that...

    http://www.newyorker.com/science/maria-konnikova/hazards-automation

    This issue has been well known in relation to airplane autopilot for a while now.

    I think you can be safely ignored, really. You clearly can't be reasoned with.

    Big surprise. You don't want to take responsibility for not paying attention when you should, and you don't want to take responsibility for your ineffective argument.

  • Khavall British Columbia Registered User regular
    The thing that always gets me during discussions of autonomous cars and ethical dilemmas and all that is that people are really bad at driving cars.

    Like people are really bad at driving cars. They lose concentration on the road. They especially lose concentration on the road if it's a route they've driven before. And people like to drink, and they often don't make plans and drive drunk. And that's bad, but it turns out even when you put penalties on that it still happens.

    People have blind spots that radar doesn't. People often don't follow the rules of the road for a bunch of reasons. People kill other people often because they're really, really, really bad at driving cars.

    If robots are even just 1% better at driving cars than people, then what's the problem? And the data is that they're way better at driving cars than people; even Tesla's autonomous mode has killed, like, one person so far.

    Sure, "Robots killing people" might sound bad, but why is it worse for people than "People killing people"?

  • Paladin Registered User regular
    redx wrote: »
    Paladin wrote: »
    mcdermott wrote: »
    Paladin wrote: »
    Well, the attitude here has shifted me to the other side. There are people perfectly capable of safely using the technology as it is now. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

    But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is a certification. As the holder of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one who gets dinged should you forget any part of it.

    Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

    uh... his point is it isn't really something you can test for. After someone is used to the system and trusts it, and they are trying to pay attention, what the person had for breakfast will probably be a better indicator of their success than any test that could be administered in whatever amount of time you want to imagine.

    So either the cars need to be safe without driver attention, or the cars need to successfully ensure the driver is paying attention, or people are going to die at a rate only somewhat less than they would in an average car.

    If I had an engineering / psychology degree, I could probably get a million dollars to spearhead a training program that is basically Desert Bus, except you have to dodge a truck at random intervals. There are some other cognitive psych tests that have a really high attention performance threshold. External reinforcements, like you talked about earlier, could also be included, as well as other interventions that could possibly increase attention span. It's basically turning what was supposed to be a convenient feature into Chinese water torture, but that's usually how technology goes.

    It isn't an unsolvable problem, because there are people who can pay attention and make split-second reactions after long periods of not doing anything. Medical students are expected to scrub in to surgeries where they aren't allowed to touch anything or even scratch their nose for hours at a time, yet they are required to answer challenging questions about the stage of the procedure sporadically.

    I don't understand the second part, because if people die at a rate somewhat less than in the average car, isn't that the ball game? Sure, if you can increase the effect size, you waste less money, but that's Elon Musk's problem, not mine.
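    (A toy version of that dodge-the-truck screening test - essentially a psychomotor vigilance task - with all timings and pass criteria invented:)

        # Toy vigilance screen in the spirit of "Desert Bus, but dodge a truck
        # at random intervals": measure reaction to rare, unpredictable events.
        import random
        import time

        def vigilance_trial(respond, min_wait_s=60, max_wait_s=600, limit_s=1.5):
            """Wait a random interval, fire a hazard, time the response.

            `respond` blocks until the subject reacts (e.g. a key press)."""
            time.sleep(random.uniform(min_wait_s, max_wait_s))  # long dull stretch
            start = time.monotonic()
            respond()  # subject must react to the simulated truck
            reaction_s = time.monotonic() - start
            return reaction_s <= limit_s  # pass only if fast enough to matter

        def certify(respond, trials=20):
            """Demand every trial pass - the 'no fails allowed' standard,
            which is exactly what makes real-world screening unrealistic."""
            return all(vigilance_trial(respond) for _ in range(trials))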

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • redx I(x)=2(x)+1 whole numbers Registered User regular
    People have blind spots that radar doesn't

    hilarious.

    They moistly come out at night, moistly.
  • Khavall British Columbia Registered User regular
    redx wrote: »
    People have blind spots that radar doesn't

    hilarious.

    Radar also has blind spots that people don't, but in the last week I saw an accident that involved someone changing lanes directly into somebody in their blind spot. Which wouldn't have happened with even the current autonomous mode.

kime Queen of Blades Registered User regular
    I don't think the medical student/doctor ability fits. The whole problem with cars is that it's often something you've done for hundreds of hours in your life where nothing interesting or unexpected happens. It's not just the long period of having to pay attention, it's having to pay attention to something exceedingly dull.

    That probably doesn't match most surgeries. I'm not a doctor though, so maybe it does!

redx I(x)=2(x)+1 whole numbers Registered User regular
    edited July 2016
    kime wrote: »
    I don't think the medical student/doctor ability fits. The whole problem with cars is that it's often something you've done for hundreds of hours in your life where nothing interesting or unexpected happens. It's not just the long period of having to pay attention, it's having to pay attention to something exceedingly dull.

    That probably doesn't match most surgeries. I'm not a doctor though, so maybe it does!

No, it matches being an ICU nurse.
Which is informative, because that field has dealt with all of these issues: how much alerting is the right amount to maintain attention, checklists to ensure attention. It's a huge thing, with lots of studies, because as morally fibered and well-meaning as people can be, people die when the proper response doesn't occur in an emergency.

hippofant ティンク Registered User regular
    shryke wrote: »
    mcdermott wrote: »
You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny to be making that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

    Funny thing about that...

    http://www.newyorker.com/science/maria-konnikova/hazards-automation

This issue has been well known in relation to airplane autopilot for a while now.

    I think you can be safely ignored, really. You clearly can't be reasoned with.

    And yet planes are still one of the safest ways to travel and no one is tearing autopilots out.

    Nor, again, are they tearing out hands-free calling systems (and not-so-hands-free everything else systems) that are in basically every car these days.

    I legitimately wonder how much of that is financially motivated.

Take-offs and landings are the most dangerous parts of commercial jet flights, and those are handled manually by pilots. Hell, IIRC, taxiing is more dangerous than regular high-altitude flight if you count by number of incidents. I'm not sure how much argument there is that having autopilot for high-altitude flights has improved safety. It has definitely lowered the amount of training pilots receive and increased the smoothness of flights, as well as fuel efficiency.

Cambiata Commander Shepard The likes of which even GAWD has never seen Registered User regular
    mcdermott wrote: »
You're telling me humans are incapable of paying attention despite enormous evidence to the contrary. Planes aren't dropping out of the sky. Surgeons operate for hours on end. Etc etc. You can't tell me that humans aren't capable of paying attention because a handful of mooks decided to ignore basic instructions. Your sample size is positively tiny to be making that conclusion, and as I said earlier, the courts likely wouldn't look so kindly on an accident that could have been prevented had the operator paid attention as instructed.

    Funny thing about that...

    http://www.newyorker.com/science/maria-konnikova/hazards-automation

This issue has been well known in relation to airplane autopilot for a while now.

    I think you can be safely ignored, really. You clearly can't be reasoned with.

    Big surprise. You don't want to take responsibility for not paying attention when you should, and you don't want to take responsibility for your ineffective argument.

    Let me try to couch it a different way.

Let's say (improbably) that a technology was devised that required the driver to pay attention after having stayed awake for 40 straight hours. Do you think it would be the driver's fault if in that circumstance they simply could not pay enough attention to drive a car? Would you scold them as you are doing in the other circumstance, as being (I guess?) undisciplined because they couldn't pay attention while not having had enough sleep?

    "If you divide the whole world into just enemies and friends, you'll end up destroying everything" --Nausicaa of the Valley of Wind
  • Options
tinwhiskers Registered User regular
    redx wrote: »
    Paladin wrote: »
    mcdermott wrote: »
    Paladin wrote: »
Well the attitude here has shifted me to the other side. There are people perfectly capable of using the technology safely as it now stands. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is certification. As an owner of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one that gets dinged should you forget any part of it.

Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

uh... his point is that it isn't really something you can test for. After someone is used to the system and trusts it, and they are trying to pay attention, what the person had for breakfast will probably be a better indicator of their success than any test that could be administered in whatever amount of time you want to imagine.

So, either the cars need to be safe without driver attention, or the cars need to successfully ensure the driver is paying attention, or people are going to die at a rate only somewhat less than they would in an average car.

    And that's a problem how?

redx I(x)=2(x)+1 whole numbers Registered User regular
    redx wrote: »
    Paladin wrote: »
    mcdermott wrote: »
    Paladin wrote: »
Well the attitude here has shifted me to the other side. There are people perfectly capable of using the technology safely as it now stands. Maybe not anybody here, but they can be identified.

    Individuals exist, sure.

But apparently even a commercial pilot's license may not be an effective filter. A driver's license certainly isn't.

    A license is as much a contract as it is certification. As an owner of a license to operate dangerous equipment, you are not programmed with immutable prime directives guaranteeing your compliance with all certification requirements. A license is the best effort to ensure that you understand every clause of a detailed list of demands that you sign, and you're the one that gets dinged should you forget any part of it.

Perhaps a Driver's License with Automation Privileges should have more stringent and effective testing, but instituting a no-fails-allowed policy isn't realistic.

uh... his point is that it isn't really something you can test for. After someone is used to the system and trusts it, and they are trying to pay attention, what the person had for breakfast will probably be a better indicator of their success than any test that could be administered in whatever amount of time you want to imagine.

So, either the cars need to be safe without driver attention, or the cars need to successfully ensure the driver is paying attention, or people are going to die at a rate only somewhat less than they would in an average car.

    And that's a problem how?

well... Teslas could save even more people, obviously.

The Wolfman Registered User regular
I'm willing to believe the notion that, unless you are physically and consciously driving a car and in complete control of it, it is impossible to dedicate the same level of attentiveness as if you were. That the act of directly doing something has a fundamental effect on your attention. Probably because there are a bunch of actions a driver is juggling subconsciously: keeping the wheel steady, even pressure on the pedals, quick glances at the speedometer, and so on. It's all vital sensory input that goes in, and it's all input you do not have when simply sitting in the seat.

    "The sausage of Green Earth explodes with flavor like the cannon of culinary delight."