
[Autonomous Transportation] When the cars have all the jobs, the poor will walk the earth


Posts

  • Options
    a5ehren Atlanta Registered User regular
    Jubal77 wrote: »
    That sounds like a weird area. Where left turns are allowed on a divided highway? Edit: And from the sounds of it without a stop light.

    Not really? I've driven hundreds of miles of 4-lane US and state highways in semi-rural areas that are divided with occasional intersections that look pretty much just like the ones in the picture.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    a5ehren wrote: »
    Jubal77 wrote: »
    That sounds like a weird area. Where left turns are allowed on a divided highway? Edit: And from the sounds of it without a stop light.

    Not really? I've driven hundreds of miles of 4-lane US and state highways in semi-rural areas that are divided with occasional intersections that look pretty much just like the ones in the picture.

    Yeah, this is mad common in California.

  • Options
    Knuckle Dragger Explosive Ovine Disposal Registered User regular
    Elvenshae wrote: »
    honovere wrote: »
    Elvenshae wrote: »
    zepherin wrote: »
    zepherin wrote: »
    honovere wrote: »
    Unrelated to the autopilot part of the crash, the car hit the trailer in the worst possible way, and from the sound of it the trailer didn't have underride protection, which is something that really should be standard equipment by now in my opinion.

    Not going to happen until it is mandated by law. And that probably won't happen until they exhaust every possible study, from optimal mounting points and height to how they interact with the aerodynamic devices mounted under the trailer. The carriers, themselves, aren't about to strap an extra thousand pounds of steel to the trailer for safety devices that, at the gruesome bottom line, are going to make these accidents more costly to the company.
    Underride guards mount on the rear and aren't foolproof, and they don't help if a tractor trailer swings sideways into you. Most trucks have them though, because some states do mandate it, but if you don't hit the guard square it can be bad.
    Mansfield bars are federally mandated. And even hitting them at an angle is better than running under the rear of the trailer.
    I wasn't trying to say they weren't; I was just saying they aren't foolproof. Although the skirts you see on the side of trucks are for aerodynamic purposes, not to keep people from driving under them, it's nice when something happens to be dual purpose.

    The trailer wings really aren't, unless they have changed the design since I last saw one up close. They are just fiberglass sheets, maybe with a few thin struts if they are really fancy.

    Yeah - they're designed to change the airflow and to otherwise be as light as possible to maximize fuel savings. They aren't properly reinforced safety equipment by any stretch of the imagination.

    I was talking about something like this. That's not for aerodynamics, but maybe not a thing in the US?
    [image: Mehrachsiger_Anhaenger.jpg (multi-axle trailer)]


    I'd better stop with that tangent though. Don't want to stray too far.

    I think Knuckle and I are talking about these:

    [image: atd_1.jpg (aerodynamic trailer side skirts)]
    Yeah, those are what I am talking about. The metal bars I have never seen in the US. I'm not sure how they would implement them, since the law is trending towards requiring the aerodynamic trailer wings.

    Let not any one pacify his conscience by the delusion that he can do no harm if he takes no part, and forms no opinion.

    - John Stuart Mill
  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    Ironically the lower clearance of the wings would have saved the person in question even though us plebes without autopiloting cars would still be boned.

  • Options
    AbsoluteZero The new film by Quentin Koopantino Registered User regular
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

  • Options
    mcdermott Registered User regular
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.
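
    A rough geometry sketch of that trade-off (every number and name below is invented for illustration, nothing to do with Tesla's actual sensor specs): with a narrow vertical beam, something sitting well above the radar's mounting height can drop out of the beam exactly when the car gets close to it.

        # Illustrative only: how a narrow vertical field of view can lose sight of a
        # high obstacle (like the flat side of a trailer) at short range.
        # Mount height, beam angle, and trailer height are all made-up numbers.
        import math

        def in_vertical_fov(target_height_m, distance_m,
                            mount_height_m=0.5, half_angle_deg=5.0):
            """True if a point at this height and distance sits inside the vertical beam."""
            elevation_deg = math.degrees(
                math.atan2(target_height_m - mount_height_m, distance_m))
            return abs(elevation_deg) <= half_angle_deg

        # Bottom edge of a trailer ~1.2 m up, radar bumper-mounted at ~0.5 m:
        for d in (60, 30, 10, 5):
            print(f"{d} m: {in_vertical_fov(1.2, d)}")
        # Prints True, True, True, False: the target leaves the beam right
        # around the range where hard braking matters most.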

  • Options
    Nyysjan Finland Registered User regular
    mcdermott wrote: »
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.

    If it was just for automatic braking, then sure.
    But if it's a "self-driving" system, then it needs to be able to detect anything that does not clear the car's profile; otherwise it's just asking to hit anything that protrudes from the cars ahead.

  • Options
    Aioua Ora Occidens Ora Optima Registered User regular
    My guess is the radar was intended to be for adaptive cruise control... You know the kind that will see a car directly in front of you and match its speed and stay back a given distance. Not one that's trying to build a 3D model of its surroundings.
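
    For what it's worth, the gap-keeping part of adaptive cruise control is a pretty simple control loop. A minimal sketch (the function name, gains, and distances are all made up, not any manufacturer's actual logic):

        # Toy adaptive-cruise-control target-speed picker: cruise at the set speed
        # unless a lead vehicle is inside the desired gap, then slow toward its speed.
        def acc_target_speed(set_speed, lead_speed=None, gap_m=None,
                             desired_gap_m=40.0, gain=0.2):
            """Return the speed (m/s) to aim for on this control tick."""
            if lead_speed is None or gap_m is None:
                return set_speed                       # open road: hold the set speed
            if gap_m >= desired_gap_m:
                # far enough back: close the gap gently, never exceed the set speed
                return min(set_speed, lead_speed + gain * (gap_m - desired_gap_m))
            # too close: drop below the lead car's speed until the gap opens up
            return max(0.0, min(set_speed, lead_speed - gain * (desired_gap_m - gap_m)))

        print(acc_target_speed(30.0))                                # 30.0, nothing ahead
        print(acc_target_speed(30.0, lead_speed=25.0, gap_m=35.0))   # 24.0, backing off

    None of that needs a 3D model of the world; it only needs range and closing speed to the one object directly ahead, which is exactly why it's a long way from self-driving.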

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Options
    kime Queen of Blades Registered User regular
    Nyysjan wrote: »
    mcdermott wrote: »
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.

    If it was just for automatic braking, then sure.
    But if it's a "self-driving" system, then it needs to be able to detect anything that does not clear the car's profile; otherwise it's just asking to hit anything that protrudes from the cars ahead.

    It sounds like they were mostly just counting on the camera or such to catch anything higher, since they specifically said the way the light coloring of the vehicle against the bright sky made it hard to see. Which seems like an obvious oversight, but eh.

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    kime wrote: »
    Nyysjan wrote: »
    mcdermott wrote: »
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.

    If it was just for automatic braking, then sure.
    But if it's a "self-driving" system, then it needs to be able to detect anything that does not clear the car's profile; otherwise it's just asking to hit anything that protrudes from the cars ahead.

    It sounds like they were mostly just counting on the camera or such to catch anything higher, since they specifically said the way the light coloring of the vehicle against the bright sky made it hard to see. Which seems like an obvious oversight, but eh.

    Honestly without having had a chance to talk to the people I know at the company, my take on it is that the thing was kludged together at the lowest available cost and this is the stupidest West Coast Libertarian disruptiest thing Musk has ever done. There's a reason they beat Google to market despite getting into the game later.

  • Options
    Aioua Ora Occidens Ora Optima Registered User regular
    kime wrote: »
    Nyysjan wrote: »
    mcdermott wrote: »
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.

    If it was just for automatic braking, then sure.
    But if it's a "self-driving" system, then it needs to be able to detect anything that does not clear the car's profile; otherwise it's just asking to hit anything that protrudes from the cars ahead.

    It sounds like they were mostly just counting on the camera or such to catch anything higher, since they specifically said the way the light coloring of the vehicle against the bright sky made it hard to see. Which seems like an obvious oversight, but eh.

    Honestly without having had a chance to talk to the people I know at the company, my take on it is that the thing was kludged together at the lowest available cost and this is the stupidest West Coast Libertarian disruptiest thing Musk has ever done. There's a reason they beat Google to market despite getting into the game later.

    again, this is a system they pushed out via a software update

    I'm like 95% certain they took the existing sensors designed for parking cameras/cruise control radar/etc and made a 'good enough' autopilot out of it

    compared to the actual self-driving cars with their roof-mounted LIDAR arrays

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    edited July 2016
    Aioua wrote: »
    kime wrote: »
    Nyysjan wrote: »
    mcdermott wrote: »
    I find it odd that Tesla's radar is not equipped to deal with objects that are off the ground only a couple feet, but not high enough to clear the vehicle entirely. You'd think it would be designed such that it would detect obstructions to any part of the vehicle's passage, and not just the bottom half. Seems like a good way to get decapitated.

    I trust there is some engineering or technical hurdle I'm not aware of, rather than this being a case of negligence.

    Not so much negligence as a trade-off. To achieve the detection performance they needed at a price point they'd accept, they had to limit the vertical scanning angle. That this radar will still function and increase overall safety the other 99.999% of the time is the benefit.

    If it was just for automatic braking, then sure.
    But if it's a "self-driving" system, then it needs to be able to detect anything that does not clear the car's profile; otherwise it's just asking to hit anything that protrudes from the cars ahead.

    It sounds like they were mostly just counting on the camera or such to catch anything higher, since they specifically said the way the light coloring of the vehicle against the bright sky made it hard to see. Which seems like an obvious oversight, but eh.

    Honestly without having had a chance to talk to the people I know at the company, my take on it is that the thing was kludged together at the lowest available cost and this is the stupidest West Coast Libertarian disruptiest thing Musk has ever done. There's a reason they beat Google to market despite getting into the game later.

    again, this is a system they pushed out via a software update

    I'm like 95% certain they took the existing sensors designed for parking cameras/cruise control radar/etc and made a 'good enough' autopilot out of it

    compared to the actual self-driving cars with their roof-mounted LIDAR arrays

    I am 100% certain that is what they did, and it was a bad call. Their hubris is really gonna fuck them here, which is a shame; Tesla's a great company for the most part.

    EDIT: Autopilot press kit which is still up whoopslol https://www.teslamotors.com/presskit/autopilot

  • Options
    shryke Member of the Beast Registered User regular
    Hubris seems rather an odd term for this.

    Like, at the end of the day, it seems like all that happened is the driver wasn't paying enough attention to the road. Tesla's system says you need to still be watching the road afaik.

  • Options
    redx I(x)=2(x)+1 whole numbers Registered User regular
    edited July 2016
    Aioua wrote: »
    My guess is the radar was intended to be for adaptive cruise control... You know the kind that will see a car directly in front of you and match its speed and stay back a given distance. Not one that's trying to build a 3D model of its surroundings.

    Booooo...

    Our highway system should be a massive distributed synthetic aperture radar.

    They moistly come out at night, moistly.
  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    shryke wrote: »
    Hubris seems rather an odd term for this.

    Like, at the end of the day, it seems like all that happened is the driver wasn't paying enough attention to the road. Tesla's system says you need to still be watching the road afaik.

    Hubris is thinking having an Autosteer public beta via a software push and existing instruments is a good idea when Google's been working at this shit for years and still hasn't released it. Also calling a consumer driver assist product Autopilot and not expecting disaster. Call it Driver Assist, Cruise Control+, literally anything that doesn't set the expectation that the car is now hands and attention free.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    Really, why the fuck would we want a public beta anywhere near 5,000-pound hunks of steel and glass hurtling through our streets at high speeds? This was negligent, IMO. What if somebody uncovered a bug that wasn't related to faulty instrumentation? On the freeway. Or maybe the Pacific Coast Highway. The company is in CA, and Musk has a place in Malibu. Dude knows damn well how fucked our roads are.

  • Options
    Dedwrekka Metal Hell adjacent Registered User regular
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    Dedwrekka wrote: »
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

    I think if the dude was relying on Autopilot to drive for him, it's a safe bet that the driver wasn't fully engaged and might have seen the trailer otherwise. Unless they have a black box that was taking brain scans and eye-movement captures, it's pretty hard to say whether the human eye had problems seeing the trailer or not, because the driver died. Either way, I think it's pretty irresponsible to ask live bodies to participate in a potentially lethal beta of a technology whose name is incredibly misleading to anybody not intimately familiar with aviation.

  • Options
    electricitylikesme Registered User regular
    Dedwrekka wrote: »
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

    I think the point is that if the radar had enough vertical scan height, then it would be able to see a giant object moving across the profile of the car and would not depend on image processing to see it.

    But like, maybe? This is pretty much a textbook example of the types of crashes and decision-making autodriving cars are actually going to be involved in all the time: edge cases where events conspire to fool sensors, not idiot "no-win" scenarios.

  • Options
    Aioua Ora Occidens Ora Optima Registered User regular
    edited July 2016
    Dedwrekka wrote: »
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

    Well, no, we knew that, but it shouldn't be relying on visible-light cameras to image the area directly in front of the car, for just this reason.

    Had it a proper radar/lidar 3D imaging system, the color of the truck wouldn't have mattered.


    EDIT: what I'm saying is their sensor package is inadequate for the amount of automation provided by the system, the number of disclaimers and warnings be damned

  • Options
    electricitylikesme Registered User regular
    I will say that I'd be curious to know how other systems with features like this perform in the same conditions. Tesla's autopilot isn't that unique - Mercedes have their cruise control and lane-follow system, for example, which does a lot of the same things.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    Aioua wrote: »
    Dedwrekka wrote: »
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

    Well, no, we knew that, but it shouldn't be relying on visible-light cameras to image the area directly in front of the car, for just this reason.

    Had it a proper radar/lidar 3D imaging system, the color of the truck wouldn't have mattered.


    EDIT: what I'm saying is their sensor package is inadequate for the amount of automation provided by the system, the number of disclaimers and warnings be damned

    Which is why betas for anything that has high crossover with public safety are fucking stupid.

    Can't wait until the water treatment plant gets new beta software for their PLC controllers. Shit's gonna be so cash.

    I am reminded of the Farsight reservation from Transmetropolitan: "Well, the life expectancy's low as shit, but what're you gonna do? Can't stop technology lol."

  • Options
    Xeddicus Registered User regular
    I so want this crap to happen, but look at this. "People need to pay attention." negates the entire point. On the other side "He should be paying attention" is no shit. So again why have the system at all?

    We need really strong vacuum tubes connecting everything.

  • Options
    mcdermott Registered User regular
    edited July 2016
    I will repeat myself: people get distracted while driving regular old dumb cars. They get engrossed in a conversation with the passenger, they daydream, if they're particularly stupid they text or just straight up fall asleep.

    If you expect that to get anything but worse once you start designing the car to handle all the moment-to-moment tasks on a regular basis, you're delusional. "People just need to pay attention" is a laughable position here, because it completely ignores how people work. Even good, smart, caring, attentive people may find themselves lulled into a situation where their brain isn't capable of taking back over the task of driving on a moment's notice in an emergency. Now apply that to the median driver, who is none of those things.

    Doing a "beta test" of this is hilariously negligent. Well, it's less hilarious and more terrifying, because I share the road with these things.

  • Options
    Jubal77 Registered User regular
    edited July 2016
    http://www.reuters.com/article/us-tesla-autopilot-dvd-idUSKCN0ZH5BW

    You know what is more negligent? People doing people things.
    DVD player found in Tesla car in May crash: Florida officials

    "As to the video, there was a witness who came to the scene immediately after the accident occurred, and we can't verify it at this point," said Paul Weekley of Tampa, the lawyer for the truck driver. "But what we have been told is that he saw a Harry Potter video still playing when he got to the scene.

  • Options
    kime Queen of Blades Registered User regular
    Xeddicus wrote: »
    I so want this crap to happen, but look at this. "People need to pay attention." negates the entire point. On the other side "He should be paying attention" is no shit. So again why have the system at all?

    We need really strong vacuum tubes connecting everything.

    You may be conflating self-driving cars, which will be awesome, with Tesla's current "Autopilot" feature, which sounds like it should be but is not the same as the first.

    I agree, actually, that Autopilot doesn't have a place on the roads, really. Either the driver needs to pay attention or they don't, not this halfway thing.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    kime wrote: »
    Xeddicus wrote: »
    I so want this crap to happen, but look at this. "People need to pay attention." negates the entire point. On the other side "He should be paying attention" is no shit. So again why have the system at all?

    We need really strong vacuum tubes connecting everything.

    You may be conflating self-driving cars, which will be awesome, with Tesla's current "Autopilot" feature, which sounds like it should be but is not the same as the first.

    I agree, actually, that Autopilot doesn't have a place on the roads, really. Either the driver needs to pay attention or they don't, not this halfway thing.

    I don't even use the cruise control if I'm sleepy, because I know it makes me less attentive. I can't imagine trying to respond to an emergency after hours of zoning out doing nothing (optimally). I'm a pretty good driver too! Used to do it commercially. Did whoever was responsible for this not see all the studies showing that fiddling with the radio reduces awareness for a little bit after you're done fiddling?

    I guess if you are a terrible driver it could be a good thing; I would rather just have emergency braking though... Which, come to think of it, I don't see working terribly well on some of the hairpin mountain roads I learned to drive on... Anybody driven a Mercedes and know how it would react to, say, approaching a wall at 30-40 MPH?

  • Options
    Dedwrekka Metal Hell adjacent Registered User regular
    edited July 2016
    mcdermott wrote: »
    I will repeat myself: people get distracted while driving regular old dumb cars. They get engrossed in a conversation with the passenger, they daydream, if they're particularly stupid they text or just straight up fall asleep.

    If you expect that to get anything but worse once you start designing the car to handle all the moment-to-moment tasks on a regular basis, you're delusional. "People just need to pay attention" is a laughable position here, because it completely ignores how people work. Even good, smart, caring, attentive people may find themselves lulled into a situation where their brain isn't capable of taking back over the task of driving on a moment's notice in an emergency. Now apply that to the median driver, who is none of those things.

    Doing a "beta test" of this is hilariously negligent. Well, it's less hilarious and more terrifying, because I share the road with these things.

    Tesla's "autopilot" is stupidly named, because it isn't, and isn't intended to be, fully autonomous. It's a slightly smarter cruise control, not an autonomous driving system. Which is why they launch a big warning screen when you activate it.

    It also isn't a beta test; at worst, a "public beta" is multiple steps away from a beta. That's a complete misunderstanding of what a beta test is.
    Aioua wrote: »
    Dedwrekka wrote: »
    From the sounds of it, it had nothing to do with the radar not being able to scan above a certain height, and everything to do with the program hitting an issue differentiating the white trailer from the sky in bright conditions.
    "What we know is that the vehicle was on a divided highway with Autopilot engaged when a tractor trailer drove across the highway perpendicular to the Model S. Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied. The high ride height of the trailer combined with its positioning across the road and the extremely rare circumstances of the impact caused the Model S to pass under the trailer, with the bottom of the trailer impacting the windshield of the Model S."
    It was the deadliness of the crash that was impacted by the height of the trailer, not the Tesla's ability to see it.

    Well, no, we knew that, but it shouldn't be relying on visible-light cameras to image the area directly in front of the car, for just this reason.

    Had it a proper radar/lidar 3D imaging system, the color of the truck wouldn't have mattered.


    EDIT: what I'm saying is their sensor package is inadequate for the amount of automation provided by the system, the number of disclaimers and warnings be damned

    It isn't relying purely on visible-light cameras; it's a camera, a radar, and acoustic sensors. But from the press releases from the NHTSA and Tesla, this is a case where all of the sensors failed to recognize the truck because of the circumstances, not because of the height of the truck.
    As for the automation, this is apparently only labeled a 3 out of 5 by the NHTSA, which means it isn't fully autonomous and requires people to be paying attention to the road while driving, much the same as with cruise control. That's pretty forgiving of human nature, but there's also been a complete misunderstanding by people of what this thing can do.
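
    Speculating about the sort of gating logic involved (everything below, the names, thresholds, and the overhead-object idea, is invented for illustration, not Tesla's actual code): if automatic braking requires multiple sensors to agree on a credible obstacle, then one washed-out camera plus one ambiguous radar return is enough to suppress the brake.

        # Toy sketch of a conservative sensor-fusion brake gate.
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class Detection:
            confidence: float    # 0..1, how sure the sensor is that something is there
            is_overhead: bool    # classified as an overhead sign/bridge, i.e. ignorable

        def should_brake(camera: Optional[Detection], radar: Optional[Detection],
                         threshold: float = 0.6) -> bool:
            """Brake only when both sensors report a credible, non-overhead obstacle."""
            for det in (camera, radar):
                if det is None or det.confidence < threshold or det.is_overhead:
                    return False
            return True

        # White trailer against a bright sky: the camera barely sees it, and a high,
        # flat return could plausibly get filtered as overhead structure.
        print(should_brake(Detection(0.1, False), Detection(0.8, True)))   # False: no braking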

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    At what point does most people being confused about what the new feature in their space car can do become deceptive marketing though?

  • Options
    Dedwrekka Metal Hell adjacent Registered User regular
    At what point does most people being confused about what the new feature in their space car can do become deceptive marketing though?

    Everything about their autopilot reiterates that drivers have to keep their hands on the wheel and be ready to take over if they see a problem.

    This is a Tesla commercial for the system
    https://www.youtube.com/watch?v=AGA7vtatiqo

  • Options
    OneAngryPossum Registered User regular
    I'm not sure it's enough to compensate for wavering attention caused by a car that largely drives itself, but if I'm not mistaken Tesla's system will in fact warn the user and eventually pull to the side of the road if hands aren't present on the wheel.
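
    A small sketch of that escalation, as described (the stages and timings below are guesses for illustration, not Tesla's actual behavior):

        # Toy hands-off escalation: nag, then warn louder, then give up and pull over.
        def autosteer_response(seconds_hands_off):
            if seconds_hands_off < 15:
                return "continue"                  # normal operation
            if seconds_hands_off < 30:
                return "visual warning"            # dash nag to put hands on the wheel
            if seconds_hands_off < 45:
                return "audible warning"           # chime, prepare to disengage
            return "slow down and pull over"       # driver unresponsive

        for t in (5, 20, 40, 60):
            print(t, "s hands-off ->", autosteer_response(t))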

  • Options
    SiliconStew Registered User regular
    If anyone's curious about the stats: collisions with the sides of tractor-trailers resulted in about 500 deaths each year, and many of these deaths involved side underride. This is out of an estimated 15,000 side-impact collisions with tractor-trailers each year.
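
    Quick back-of-envelope on those figures, taking them at face value:

        # 500 deaths out of ~15,000 side-impact collisions with tractor-trailers per year
        deaths_per_year = 500
        side_impacts_per_year = 15_000
        print(f"{deaths_per_year / side_impacts_per_year:.1%} of side impacts are fatal")
        # -> about 3.3%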

    Just remember that half the people you meet are below average intelligence.
  • Options
    kime Queen of Blades Registered User regular
    I'm not sure it's enough to compensate for wavering attention caused by a car that largely drives itself, but if I'm not mistaken Tesla's system will in fact warn the user and eventually pull to the side of the road if hands aren't present on the wheel.

    I'd say no.

    This is just unsafe, period. People already get distracted and zone out while driving. And brains are not suited for the task of "pay attention for long periods of time to this thing that doesn't actually need your attention."

    Autopilot combines both of these in the worst way.

    I say this as someone who really really really wants self driving cars, but the more I think about Autopilot the more I want it off the road

  • Options
    Veagle Registered User regular
    kime wrote: »
    Xeddicus wrote: »
    I so want this crap to happen, but look at this. "People need to pay attention." negates the entire point. On the other side "He should be paying attention" is no shit. So again why have the system at all?

    We need really strong vacuum tubes connecting everything.

    You may be conflating self-driving cars, which will be awesome, with Tesla's current "Autopilot" feature, which sounds like it should be but is not the same as the first.

    I agree, actually, that Autopilot doesn't have a place on the roads, really. Either the driver needs to pay attention or they don't, not this halfway thing.

    Yeah, people have spent this entire thread saying that it needs to be all or nothing with these cars. You can't have a system that lets the car do most of the driving, but still expect the driver to take over at a moment's notice for unexpected dangers. It also shows why we need to get started on legislating this stuff, because otherwise we'll get more companies willing to run their 'beta tests' on public roads to save a few dollars.

  • Options
    Void Slayer Very Suspicious Registered User regular
    Hell, since the cars are automated, couldn't they set up completely human-free testing ranges to drive them on, and then have risk management people run them through lots of likely and rare scenarios?

    He's a shy overambitious dog-catcher on the wrong side of the law. She's an orphaned psychic mercenary with the power to bend men's minds. They fight crime!
  • Options
    OneAngryPossum Registered User regular
    kime wrote: »
    I'm not sure it's enough to compensate for wavering attention caused by a car that largely drives itself, but if I'm not mistaken Tesla's system will in fact warn the user and eventually pull to the side of the road if hands aren't present on the wheel.

    I'd say no.

    This is just unsafe, period. People already get distracted and zone out while driving. And brains are not suited for the task of "pay attention for long periods of time to this thing that doesn't actually need your attention."

    Autopilot combines both of these in the worst way.

    I say this as someone who really really really wants self driving cars, but the more I think about Autopilot the more I want it off the road

    Yeah, I've got no intention of absolving or judging anything here. Just something I've seen mentioned elsewhere that seems relevant, not exculpatory.

  • Options
    spool32 Contrary Library Registered User regular
    It seems to me that anything that delivers a net decrease in accidents should be fine, even if it isn't 100% perfect.

    Or is there data to suggest that Tesla drivers get in more accidents than the average? A lot of these objections seem like speculation about what future bad thing might happen, when current good things are already happening...

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    spool32 wrote: »
    It seems to me that anything that delivers a net decrease in accidents should be fine, even if it isn't 100% perfect.

    Or is there data to suggest that Tesla drivers get in more accidents than the average? A lot of these objections seem like speculation about what future bad thing might happen, when current good things are already happening...

    My objections are more that they should have done the beta internally, with smaller numbers of cars and more controlled conditions initially, because right now they've just kind of rolled the dice on the driver assistance being more powerful at preventing accidents than the increased inattentiveness. That is not a thing I am okay with corporations rolling the dice on. They only got those (limited, initial) stats about miles per accident after they turned the fucker on. Will it hold up once the novelty wears off and people start being more reckless with their smart cars?

    Who the fuck knows? Tesla sure doesn't. They (hopefully) have some better-educated guesses than me, but what I said about the water treatment plant doing a beta with public infrastructure wasn't entirely facetious. Fuck, man, anybody that's ever worked in IT long enough to see how badly an untested change to prod can go should see what a retrograde, myopic fucking idea a public beta for autocars is.

  • Options
    shryke Member of the Beast Registered User regular
    edited July 2016
    shryke wrote: »
    Hubris seems rather an odd term for this.

    Like, at the end of the day, it seems like all that happened is the driver wasn't paying enough attention to the road. Tesla's system says you need to still be watching the road afaik.

    Hubris is thinking having an Autosteer public beta via a software push and existing instruments is a good idea when Google's been working at this shit for years and still hasn't released it. Also calling a consumer driver assist product Autopilot and not expecting disaster. Call it Driver Assist, Cruise Control+, literally anything that doesn't set the expectation that the car is now hands and attention free.

    Google hasn't been working at this shit for years though. They've been working at a completely different, much more complicated and automated system. Which is exactly the whole point.

    Like, you linked the presskit directly above my post but apparently did not read it:
    Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

    They make it quite clear that this ain't driving for you.

    This isn't hubris. The feature works really well as far as we know. It just doesn't do things they never claimed it does, even though some people in this thread keep acting like it does. This isn't a "beta test gone wrong" because the system is not meant to do what you claim it failed at.

    Tesla's system says you gotta pay the fuck attention to the road. The driver not noticing a giant fuck-off trailer right in front of him is the problem here. Tesla explicitly says you need to be doing that.

    This is not a result of this feature failing, it's the result of the human driver failing. Cause, again, "Tesla requires drivers to remain engaged and aware". This person didn't. And that's why this happened.

  • Options
    Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    shryke wrote: »
    shryke wrote: »
    Hubris seems rather an odd term for this.

    Like, at the end of the day, it seems like all that happened is the driver wasn't paying enough attention to the road. Tesla's system says you need to still be watching the road afaik.

    Hubris is thinking having an Autosteer public beta via a software push and existing instruments is a good idea when Google's been working at this shit for years and still hasn't released it. Also calling a consumer driver assist product Autopilot and not expecting disaster. Call it Driver Assist, Cruise Control+, literally anything that doesn't set the expectation that the car is now hands and attention free.

    Google hasn't been working at this shit for years though. They've been working at a completely different, much more complicated and automated system. Which is exactly the whole point.

    Like, you linked the presskit directly above my post but apparently did not read it:
    Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

    They make it quite clear that this ain't driving for you.

    This isn't hubris. The feature works really well as far as we know. It just doesn't do things they never claimed it does, even though some people in this thread keep acting like it does.

    Tesla's system says you gotta pay the fuck attention to the road. The driver not noticing a giant fuck-off trailer right in front of him is the problem here. Tesla explicitly says you need to be doing that.

    This is not a result of this feature failing, it's the result of the human driver failing. Cause, again, "Tesla requires drivers to remain engaged and aware". This person didn't. And that's why this happened.

    Tesla is guessing at what is within the bounds of human ability with a public beta.
