
[Autonomous Transportation] When the cars have all the jobs, the poor will walk the earth


Posts

  • shryke Member of the Beast Registered User regular
    Xeddicus wrote: »
    I so want this crap to happen, but look at this: "People need to pay attention" negates the entire point. On the other side, "He should be paying attention" is no shit. So again, why have the system at all?

    For the same reason you have cruise control even though you could just hold your foot on the gas pedal. Cause it's fun and cool and convenient. But that does not absolve you of responsibility for your own conduct behind the wheel.

    Shitty driver gonna shitty driver.

  • Aioua Ora Occidens Ora Optima Registered User regular
    shryke wrote: »
    For the same reason you have cruise control even though you could just hold your foot on the gas pedal. Cause it's fun and cool and convenient. But that does not absolve you of responsibility for your own conduct behind the wheel.

    Shitty driver gonna shitty driver.

    We don't let people talk on their cell phones while driving because it's too distracting. In theory a good driver won't get distracted, but we don't blame it on bad drivers; we outlaw the distraction.

    I don't think it's unreasonable to suggest that a driver-assist that is too good (yet not good enough to operate independently) might also end up being too distracting to be allowed.

  • Marty81 Registered User regular
    Aioua wrote: »
    We don't let people talk on their cell phones while driving because it's too distracting.

    Man, if only. Cell phone use while driving is currently legislated on a state-by-state (and in some places city-by-city) basis.

  • Krieghund Registered User regular
    And I can talk on my cell through my car with both hands on the wheel. My car doesn't let me see my texts while it's moving, though.

  • shryke Member of the Beast Registered User regular
    Aioua wrote: »
    We don't let people talk on their cell phones while driving because it's too distracting. In theory a good driver won't get distracted, but we don't blame it on bad drivers; we outlaw the distraction.

    I don't think it's unreasonable to suggest that a driver-assist that is too good (yet not good enough to operate independently) might also end up being too distracting to be allowed.

    Actually we do both. Both legally and otherwise, we 100% blame the distracted driver on his or her phone for not paying the fuck attention.

  • Aioua Ora Occidens Ora Optima Registered User regular
    shryke wrote: »
    Actually we do both. Both legally and otherwise, we 100% blame the distracted driver on his or her phone for not paying the fuck attention.

    fair enough

  • kime Queen of Blades Registered User regular
    spool32 wrote: »
    It seems to me that anything that delivers a net decrease in accidents should be fine, even if it isn't 100% perfect.

    Or is there data to suggest that Tesla drivers get in more accidents than the average? A lot of these objections seem like speculation about what future bad thing might happen, when current good things are already happening...

    That's a good point... I feel like instinctively it's more dangerous, but it actually maybe isn't. I wonder if there is real data.

  • Aioua Ora Occidens Ora Optima Registered User regular
    kime wrote: »
    That's a good point... I feel like instinctively it's more dangerous, but it actually maybe isn't. I wonder if there is real data.

    Well we have the numbers from Tesla for fatalities but the sample size isn't really big enough.

    If they had numbers for all accidents (not just fatalities) that would be something I'd like to see.

  • shryke Member of the Beast Registered User regular
    shryke wrote: »
    shryke wrote: »
    Hubris seems rather an odd term for this.

    Like, at the end of the day, it seems like all that happened is the driver wasn't paying enough attention to the road. Tesla's system says you need to still be watching the road afaik.

    Hubris is thinking that an Autosteer public beta, delivered via a software push to existing instruments, is a good idea when Google's been working at this shit for years and still hasn't released it. Also, calling a consumer driver-assist product Autopilot and not expecting disaster. Call it Driver Assist, Cruise Control+, literally anything that doesn't set the expectation that the car is now hands- and attention-free.

    Google hasn't been working at this shit for years though. They've been working at a completely different, much more complicated and automated system. Which is exactly the whole point.

    Like, you linked the presskit directly above my post but apparently did not read it:
    Tesla requires drivers to remain engaged and aware when Autosteer is enabled. Drivers must keep their hands on the steering wheel.

    They make it quite clear that this ain't driving for you.

    This isn't hubris. The feature works really well as far as we know. It just doesn't do things Tesla never claimed it could do, even though some people in this thread keep acting like it does.

    Tesla's system says you gotta pay the fuck attention to the road. The driver not noticing a giant fuck-off trailer right in front of him is the problem here. Tesla explicitly says you need to be doing that.

    This is not a result of this feature failing, it's the result of the human driver failing. Cause, again, "Tesla requires drivers to remain engaged and aware". This person didn't. And that's why this happened.

    Tesla is guessing at what is within the bounds of human ability with a public beta.

    No, they aren't. WTF are you even talking about?

    Tesla, again from your own goosing link, is saying "Pay the fuck attention to the road". So no, they aren't guessing at what is within the bounds of human ability because we already know "driving" is well within those bounds.

  • mcdermott Registered User regular
    shryke wrote: »
    No, they aren't. WTF are you even talking about?

    Tesla, again from your own goosing link, is saying "Pay the fuck attention to the road". So no, they aren't guessing at what is within the bounds of human ability because we already know "driving" is well within those bounds.

    It's entirely possible that "pay the fuck attention to the road" isn't actually within the bounds of human ability once you're no longer actively driving.

  • shryke Member of the Beast Registered User regular
    mcdermott wrote: »
    It's entirely possible that "pay the fuck attention to the road" isn't actually within the bounds of human ability once you're no longer actively driving.

    We know paying the fuck attention to the road is "out of the bounds of human ability" while talking on a cellphone, but we still have cellphones around. We just expect people to, you know, still drive.

  • Paladin Registered User regular
    shryke wrote: »
    We know paying the fuck attention to the road is "out of the bounds of human ability" while talking on a cellphone, but we still have cellphones around. We just expect people to, you know, still drive.

    And they will

    Once we severely limit the features of assisted driving and implement additional restrictions to make sure the driver is paying full attention to the road and has their hands on the wheel at all times

  • Knight_ Dead Dead Dead Registered User regular
    Yea, if Tesla wants you to be engaged, then the system should shut off if the driver isn't holding on to the wheel. I've seen enough videos of people using that Tesla autopilot with absolutely zero input that it's clearly not requiring it.

  • Dedwrekka Metal Hell adjacent Registered User regular
    Paladin wrote: »
    And they will

    Once we severely limit the features of assisted driving and implement additional restrictions to make sure the driver is paying full attention to the road and has their hands on the wheel at all times

    I don't think "take the system and make it less responsive" is the answer here. Especially because this seems to also have been a failure of the sensors as well as driver inattention. This isn't all one thing or the other.

    The truck driver failed to yield the right of way (according to at least one article).

    The sensors on the car failed to detect the truck.

    The driver failed to detect the truck.

    The height of the truck undercarriage made the accident immediately deadly.

    All of these need to be addressed, not just one or the other. Using this as an example of only a failure of one thing is premature.

  • honovere Registered User regular
    Knight_ wrote: »
    Yea, if Tesla wants you to be engaged, then the system should shut off if the driver isn't holding on to the wheel. I've seen enough videos of people using that Tesla autopilot with absolutely zero input that it's clearly not requiring it.

    I think I read at least one Tesla tester say that Tesla might use a system where the software adjusts the interval at which it checks for your hands on the wheel, for example checking less often on long stretches of highway.
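    As a rough sketch of what that adaptive check might look like (the vehicle API and all intervals below are invented for illustration; this is only the idea being described, not Tesla's actual implementation):

        import time

        def check_interval_s(speed_kmh: float, on_highway: bool) -> float:
            """Longer gaps between hands-on-wheel checks on steady highway stretches."""
            if on_highway and speed_kmh > 80:
                return 120.0  # relaxed cadence on long highway runs (invented value)
            return 15.0       # frequent checks in more complex traffic (invented value)

        def monitor_driver(vehicle) -> None:
            # Poll at the adaptive interval; nag the driver when hands are off.
            while vehicle.autosteer_active():
                time.sleep(check_interval_s(vehicle.speed_kmh(), vehicle.on_highway()))
                if not vehicle.hands_on_wheel():
                    vehicle.alert_driver()  # hypothetical call; escalate to disengaging if ignored

    The relaxed highway cadence is exactly the part Phoenix-D objects to below.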

  • Phoenix-D Registered User regular
    honovere wrote: »
    I think I read at least one Tesla tester say that Tesla might use a system where the software adjusts the interval at which it checks for your hands on the wheel, for example checking less often on long stretches of highway.

    That would be a really, really bad idea, because it invites less attention, which is dangerous even on boring highway drives.

  • The Ender Registered User regular
    shryke wrote: »
    We know paying the fuck attention to the road is "out of the bounds of human ability" while talking on a cellphone, but we still have cellphones around. We just expect people to, you know, still drive.

    Do we build cellphones right into the damn cars and advertise them as a great car-centric feature, while also attaching a little disclaimer about how you should also totes pay attention to the road?


    Google did this testing already (Tesla didn't): they repeatedly had their own drivers tasked with paying attention to the road while in an autocar, and then monitored the drivers. They even let the drivers know they were being monitored.

    The drivers didn't pay attention.


    It had nothing to do with being a shitty driver or not. Everyone is a shitty driver. Asking people to pay attention isn't good enough; people just won't, even if they believe they are.

    spool32 wrote: »
    It seems to me that anything that delivers a net decrease in accidents should be fine, even if it isn't 100% perfect.

    Or is there data to suggest that Tesla drivers get in more accidents than the average? A lot of these objections seem like speculation about what future bad thing might happen, when current good things are already happening...

    Tesla Autopilot currently has about 130 million miles driven, 1 fatality.

    National average in the U.S. with shitty human drivers in normal cars is about 95 million miles driven per fatality.


    That is an absolutely unacceptable, completely shit statistic, IMHO (particularly when you consider that Tesla's cars only operate in one specific region of the U.S.). Autocars have a potential for decreasing road accidents by an order of magnitude; Tesla's product does marginally better than shitty human drivers, at an exorbitant price. If that's all we want, there's no point in investing in AI; just load everyone's cars with impact dampening technology and call it a day.


    EDIT: I mean, consider that Tesla's futuristic wonder cars are driving on some of the best-maintained roads in America, and that the current 95-million-miles-per-fatality statistic is improving mostly because old vehicles are being taken out of circulation. If we compare Tesla's score with the average score of modern vehicles using the same roads, how does Tesla do? My money is on 'about on par with everyone else, maybe worse'.

    Also consider that without autopilot, the driver would have probably been paying attention to the road & would not have died. It's not like a semi truck is easy for a person to miss if their eyes are on the road.

  • kime Queen of Blades Registered User regular
    The Ender wrote: »
    Also consider that without autopilot, the driver would have probably been paying attention to the road & would not have died. It's not like a semi truck is easy for a person to miss if their eyes are on the road.

    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd maybe have caught that ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents.

  • honovere Registered User regular
    Yeah, the mileage should really be compared against something like a modern S-Class or similar vehicles that use comparable systems but currently top out at level-two autonomy.

  • The Ender Registered User regular
    kime wrote: »
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


  • Coinage Heaviside Layer Registered User regular
    shryke wrote: »
    We know paying the fuck attention to the road is "out of the bounds of human ability" while talking on a cellphone, but we still have cellphones around. We just expect people to, you know, still drive.
    There is absolutely no way I could pay attention to the road if I'm not actually in control of the car.

  • The Ender Registered User regular
    Coinage wrote: »
    There is absolutely no way I could pay attention to the road if I'm not actually in control of the car.

    Literally nobody can, according to what testing we've done.

    So don't feel too bad about that.


    EDIT: This is the bedrock of Google's entire program. It's the reason they decided against a graduated approach, from driver assistance to eventual automation; the car must be able to handle the whole job, because humans are not acceptable back-up drivers.

    Tesla's now managed to kill someone because they didn't take the research seriously. I've always thought of Musk as a great engineer but a shitty scientist, and this highlights why.

  • Giggles_Funsworth Blight on Discourse Bay Area Sprawl Registered User regular
    The Ender wrote: »
    Tesla's now managed to kill someone because they didn't take the research seriously. I've always thought of Musk as a great engineer but a shitty scientist, and this highlights why.

    Musk isn't even a great engineer. He's a fantastic futurist and a generally ethical, forward-thinking businessman. Even his knowledge of programming, where he in theory made his bones with PayPal, is only very high-level these days. I can say this with a fair amount of certainty: one of my peers and fairly good friends was referred to as "crazy smart" by Musk in an interview. My friend is definitely a sharp guy, but not really top talent in his field.

  • Dedwrekka Metal Hell adjacent Registered User regular
    The Ender wrote: »
    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

  • Aioua Ora Occidens Ora Optima Registered User regular
    Visible-light 2D image processing isn't really an acceptable way for an auto-driving system to look to its front.

    Tesla said it's because the trailer blended in with the sky, which means its radar rangefinder doesn't sweep an area large enough to actually avoid a collision.

    Like, it's not some crazy circumstance that a vehicle was colored similarly to its background; the problem is that they were relying on inadequate systems to map the environment.

  • Dedwrekka Metal Hell adjacent Registered User regular
    Aioua wrote: »
    Visible-light 2D image processing isn't really an acceptable way for an auto-driving system to look to its front.

    Tesla said it's because the trailer blended in with the sky, which means its radar rangefinder doesn't sweep an area large enough to actually avoid a collision.

    Like, it's not some crazy circumstance that a vehicle was colored similarly to its background; the problem is that they were relying on inadequate systems to map the environment.

    I don't think that in the time Tesla's Autopilot has been running this is the first time that it has ever come across a white 18 wheeler.

  • Aioua Ora Occidens Ora Optima Registered User regular
    Dedwrekka wrote: »
    I don't think that in the time Tesla's Autopilot has been running this is the first time that it has ever come across a white 18 wheeler.

    most of the time that stuff is going to be caught by the radar

    my problem is that their rangefinder isn't adequate to actually detect things in front of the car, and you can't rely solely on image processing to detect shit

    EDIT: like, to further explain

    by design, it sounds like their rangefinding systems only scan at about bumper level, which means detecting anything higher than that falls back on camera image processing.
    I don't think this is a case of a freak accident where it couldn't see white-on-white. I think it can hardly ever see white-on-white; this was just a time it really mattered.
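    To make that coverage-gap argument concrete, here is a toy sketch (the Obstacle fields and both thresholds are invented for illustration; this is not Tesla's actual detection logic):

        from dataclasses import dataclass

        @dataclass
        class Obstacle:
            bottom_m: float  # height of the lowest solid surface above the road
            contrast: float  # 0..1 visual contrast against the background

        RADAR_MAX_HEIGHT_M = 1.0   # hypothetical bumper-level sweep ceiling
        CAMERA_MIN_CONTRAST = 0.3  # hypothetical image-processing threshold

        def detected(o: Obstacle) -> bool:
            by_radar = o.bottom_m <= RADAR_MAX_HEIGHT_M    # rangefinder coverage
            by_camera = o.contrast >= CAMERA_MIN_CONTRAST  # visual detection
            return by_radar or by_camera

        print(detected(Obstacle(bottom_m=0.3, contrast=0.8)))  # True: car ahead, radar catches it
        print(detected(Obstacle(bottom_m=1.2, contrast=0.1)))  # False: high white trailer, bright sky

    A high trailer with low contrast falls through both checks, which is the white-on-white case described above.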

  • The Ender Registered User regular
    Dedwrekka wrote: »
    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!', and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing in a controlled environment how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually turn around something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    The Ender wrote: »
    Tesla Autopilot currently has about 130 million miles driven, 1 fatality.

    National average in the U.S. with shitty human drivers in normal cars is about 95 million miles driven per fatality.

    The problem is that drawing any conclusions from n=1 data is a terrible way to do it. We don't know that their fatality rate really is 1 per 130 million miles, and it's impossible to really know until more data is obtained.
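    To put numbers on the n=1 point: a minimal sketch (assuming Python with scipy installed; the mileage figures are the ones quoted above) of the exact Poisson interval around "1 fatality in 130 million miles":

        # 95% exact (Garwood) Poisson interval for 1 observed fatality.
        from scipy.stats import chi2

        k = 1          # observed fatalities on Autopilot
        miles = 130e6  # Autopilot miles driven (figure quoted above)

        lower = chi2.ppf(0.025, 2 * k) / 2        # ~0.025 fatalities
        upper = chi2.ppf(0.975, 2 * (k + 1)) / 2  # ~5.57 fatalities

        print(f"miles per fatality: {miles / upper:,.0f} to {miles / lower:,.0f}")
        # Roughly 23 million to 5 billion miles per fatality. The ~95 million
        # national average sits well inside that range, so one data point
        # cannot tell us whether Autopilot is safer or more dangerous.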

  • shryke Member of the Beast Registered User regular
    Aioua wrote: »
    most of the time that stuff is going to be caught by the radar

    my problem is that their rangefinder isn't adequate to actually detect things in front of the car, and you can't rely solely on image processing to detect shit

    EDIT: like, to further explain

    by design, it sounds like their rangefinding systems only scan at about bumper level, which means detecting anything higher than that falls back on camera image processing.
    I don't think this is a case of a freak accident where it couldn't see white-on-white. I think it can hardly ever see white-on-white; this was just a time it really mattered.

    By design the driver is there to see the truck.

  • zakkiel Registered User regular
    Giggles_Funsworth wrote: »
    Musk isn't even a great engineer. He's a fantastic futurist and a generally ethical, forward-thinking businessman. Even his knowledge of programming, where he in theory made his bones with PayPal, is only very high-level these days. I can say this with a fair amount of certainty: one of my peers and fairly good friends was referred to as "crazy smart" by Musk in an interview. My friend is definitely a sharp guy, but not really top talent in his field.

    If by "futurist" you mean "increasingly unhinged purveyor of bullshit," then yes, I suppose he is fantastic at that.
    In a freewheeling talk before shareholders Tuesday, Musk said he and his Tesla team will completely rethink the factory process, hoping to bring “factors of 10 or even 100 times” in improvements in efficiency to the manner in which “you build the machines that build the machine.”

    Musk, returning repeatedly to the idea of “physics-first principles,” said he no longer uses an office at Tesla, but spends all of his time on the production line.

    That exercise has shown him methods by which production capacity could be increased exponentially, he said, by applying those principles.
    Technology entrepreneur Elon Musk is really excited about getting the first humans to land on Mars in 2025 with the view to establishing a colony, but in case you didn't realise this already, he is warning that pioneering a new planet probably won't be much fun.

    "It's dangerous and probably people will die – and they'll know that. And then they'll pave the way, and ultimately it will be very safe to go to Mars, and it will be very comfortable. But that will be many years in the future," Musk told the Washington Post in a new interview detailing how the Mission to Mars technical journey is likely to evolve.
    "There's a billion to one chance we're living in base reality," Elon Musk said tonight on stage at Recode's Code Conference, meaning that one of the most influential and powerful figures in tech thinks that it's overwhelmingly likely we're just characters living inside a simulation.
    This is the sort of grandiosity you expect from Pixar villains, not actual CEOs. Dude has clearly bought into the nerd-cult surrounding him in a major way. I wonder how long his companies will survive his personal descent into almost Trumpian crackpottedness. I mean, on the one hand, he's assembled a lot of really exceptional talent. On the other, that talent is organized in a strict dictatorship with Dear Leader at the top.

  • Goumindong Registered User regular
    The Ender wrote: »
    Tesla Autopilot currently has about 130 million miles driven, 1 fatality.

    National average in the U.S. with shitty human drivers in normal cars is about 95 million miles driven per fatality.

    N=1 (and/or estimated P is very low, same difference), so it's hard to say that this is an accurate number.

  • KetBra Dressed Ridiculously Registered User regular
    shryke wrote: »
    By design the driver is there to see the truck.

    See the above references to attention spans for things you aren't controlling.

  • AbsoluteZero The new film by Quentin Koopantino Registered User regular
    The Ender wrote: »
    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations for that system are way too high, and Tesla doesn't make it a secret that, just like with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

  • Knight_ Dead Dead Dead Registered User regular
    I think he certainly gets some of the blame and I don't know that people are trying to absolve him of such.

    But Tesla certainly is also at fault for marketing the system like this, and for not enforcing controls on the system that would prevent this sort of thing from happening.

    So basically, like most car accidents, everyone's at fault!

  • kime Queen of Blades Registered User regular
    AbsoluteZero wrote: »
    Autopilot is glorified cruise control. I think expectations for that system are way too high, and Tesla doesn't make it a secret that, just like with cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    Human beings are not capable of paying attention to boring things that don't (apparently) require their attention.

    Admitting that feels like we are just removing blame for something that people should be able to control (if they weren't lazy or whatever), but it looks like our brains just literally don't work like that.

    So do we lay responsibility on someone for not doing something they can't do?

  • Paladin Registered User regular
    The Ender wrote: »
    Dedwrekka wrote: »
    The Ender wrote: »
    kime wrote:
    That last paragraph doesn't really count if we're looking at the data. Like, sure, he'd have caught that maybe ("if his eyes are on the road"), but Autopilot may have prevented other, earlier accidents

    While that may be true, 'this car cannot detect something as conspicuous as an 18 wheeler crossing the highway' is waaaay below the threshold I think we should be willing to accept for any sort of autopilot functionality.


    That seems like an oversimplification or a misunderstanding of the situation. Even Tesla has admitted that there was something specific about the exact set of circumstances that led to the trailer not being recognized by the system. The facts we have right now say that for some reason that system was unable to recognize the 18 wheeler under those specific circumstances, but the way you're wording it is that the system can't recognize an 18 wheeler under any circumstances. That's simply not the case.

    The Tesla Autosteer needs to be modified to require continuous driver input, yes. The system needs to be examined to find out exactly what went wrong under those circumstances, yes. We don't need to pretend that the system is incapable of functioning, or inflate the issue to do that.

    Tesla claims that the car couldn't see the trailer because its color scheme made it blend into the sky. That's completely unacceptable for any sort of autopilot system. As others have said, the hardware the car is using is simply insufficient for building an accurate model of the environment.


    The trade-off for driver assistance features is that drivers will rely on those systems. So, the question to ask before implementing them is, 'Do these systems have a positive enough impact on driving that the reduction in driver awareness / participation is offset?'

    Here the answer was, 'Nope!' and the cost of learning that answer was a fatality, because they decided to test the product on the highway before testing, in a controlled environment, how people would actually use it. It was more important to Tesla to make an immediate return on their technology investment than to actually deliver something safe.


    Honestly their reaction to what happened is extremely scummy to me, absolving themselves of all blame because 'well gee whiz we told people to watch the road!'


    EDIT: Also, I think you are exaggerating how complex or rare the incident was.

    Are 18 wheelers rare? No. Is it unusual to come across an 18 wheeler doing a U-turn (or any number of other dumb things) on a highway? No. Are 18 wheelers hard to spot? No.

    Is it acceptable that the autopilot then slammed into an 18 wheeler because it couldn't properly see a relatively common vehicle (a high ride trailer) doing something quite typical and thus didn't react at all? Of course not.

    Autopilot is glorified cruise control. I think expectations are way too high on that system, and Tesla doesn't make it a secret that just like if you were using cruise control, you need to be ready to take control of the vehicle at any moment.

    Apparently this guy was watching a Harry Potter movie on a portable DVD player at the time of the crash. At what point do we start laying some of the responsibility on him?

    When we find anything out that can verify this circumstantial evidence.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty-five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • Options
    shrykeshryke Member of the Beast Registered User regular
    KetBra wrote: »
    shryke wrote: »
    Aioua wrote: »
    Dedwrekka wrote: »
    Aioua wrote: »
    Visible-light 2D image processing isn't really an acceptable way for an auto-driving system to be looking to its front.

    Tesla said it's because the trailer blended in with the sky. Which means its radar rangefinder doesn't sweep an area large enough to actually avoid a collision.

    Like it's not some crazy circumstance that a vehicle was colored similarly to its background, the problem is that they were relying on inadequate systems to map the environment.

    I don't think that, in all the time Tesla's Autopilot has been running, this is the first time it has ever come across a white 18 wheeler.

    most of the time that stuff is going to be caught by the radar

    my problem is that their rangefinder isn't adequate to actually detect things in front of the car, and you can't rely solely on image processing to detect shit

    EDIT: like, to further explain

    by design, it sounds like their rangefinding systems only scan at about bumper level, which means detecting anything higher than that falls back onto camera image processing
    I don't think this is a case of a freak accident where it couldn't see white-on-white. I suspect it can hardly ever see white-on-white; this was just a time it really mattered
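
    To make that failure mode concrete, here's a minimal Python sketch of that kind of two-sensor fusion logic. Everything in it is invented for illustration — the beam height, the contrast threshold, and the function names are assumptions, not Tesla's actual design:

    # Toy model of the fusion described above: a radar that only sweeps near
    # bumper level, with camera image contrast as the fallback.
    # All numbers and names are invented; this is not Tesla's design.
    RADAR_BEAM_TOP_M = 1.0            # assumed top of the radar sweep, in meters
    CAMERA_CONTRAST_THRESHOLD = 0.15  # assumed minimum contrast for a camera hit

    def radar_sees(obstacle_bottom_m):
        # Radar only registers obstacles that extend down into its sweep.
        return obstacle_bottom_m <= RADAR_BEAM_TOP_M

    def camera_sees(contrast):
        # Camera detection is gated on contrast against the background.
        return contrast >= CAMERA_CONTRAST_THRESHOLD

    def should_brake(obstacle_bottom_m, contrast):
        # Brake if either sensor reports the obstacle.
        return radar_sees(obstacle_bottom_m) or camera_sees(contrast)

    # High-ride trailer, white against a bright sky: underside above the
    # radar sweep, contrast near zero -> neither sensor fires.
    print(should_brake(obstacle_bottom_m=1.3, contrast=0.05))  # False
    # Ordinary car ahead: bumper-level, radar catches it regardless of color.
    print(should_brake(obstacle_bottom_m=0.3, contrast=0.05))  # True

    In a sketch like this, anything above the radar sweep lives or dies by camera contrast alone, which is the point being made about white-on-white.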

    By design the driver is there to see the truck.

    See above references to attention spans for things you aren't controlling

    See the Tesla documentation on the thing where they say you have to pay attention.

    This whole "OMG, it didn't see the truck!" stuff is a bunch of nonsense. It's based on blaming the system for something it is explicitly not designed to do.

    We don't blame LG because some moron crashes his car cause he was texting while driving. "We know people can't pay attention to the road while using a cellphone and here they went and made cellphones for people and even made them accept texts and calls while driving! And look, most manufacturers have in-built hands-free-calling features to encourage cellphone use in the car even though that's still very distracting to the driver! Those bastards!" This argument is nonsense.

    The system tells you how it's designed to be used. If you can't use it properly, then don't. But ya can't blame the system for failing at something it explicitly tells you it doesn't do.


    PS - And please, stop waterboarding statistics in here. It doesn't know the launch codes.

  • Options
    AiouaAioua Ora Occidens Ora OptimaRegistered User regular
    Ok another example.

    Most of the fancier car computers these days will severely restrict their functionality while the car is moving. Maybe you can change the song but you can't sync a new phone. You can pull up the map but not enter an address for the GPS directions.
    Why? They could just say the driver should never use the system while the car is moving and leave it at that.

    Because humans gonna human. If the driver assist does such a good job that the average person is going to (eventually) forget to pay attention to the road, but the driver assist isn't good enough to be trusted to drive the car unsupervised, then it is a dangerous feature and shouldn't be allowed.
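
    For what it's worth, that kind of lockout usually amounts to a simple speed gate. A minimal sketch, assuming made-up feature names and an invented speed threshold — no particular head unit works exactly like this:

    LOCKOUT_SPEED_KPH = 8  # assumed threshold for treating the car as "moving"

    # Features split by how much attention they demand; the names are invented.
    ALLOWED_WHILE_MOVING = {"change_song", "view_map", "answer_call"}
    LOCKED_WHILE_MOVING = {"sync_new_phone", "enter_gps_address", "read_texts"}

    def feature_enabled(feature, speed_kph):
        # Gate each feature on vehicle speed instead of trusting the driver.
        if speed_kph < LOCKOUT_SPEED_KPH:
            return feature in (ALLOWED_WHILE_MOVING | LOCKED_WHILE_MOVING)
        return feature in ALLOWED_WHILE_MOVING

    print(feature_enabled("change_song", 100))        # True: fine while moving
    print(feature_enabled("enter_gps_address", 100))  # False: locked out
    print(feature_enabled("enter_gps_address", 0))    # True: parked

    The design choice is the same one being argued for here: the system enforces the limit itself rather than trusting the driver to follow the manual.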

    We're not wasting time blaming the guy because he paid the price. We're trying to prevent an accident like this from happening again.

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Options
    Giggles_FunsworthGiggles_Funsworth Blight on Discourse Bay Area SprawlRegistered User regular
    zakkiel wrote: »
    The Ender wrote: »
    Coinage wrote: »
    shryke wrote: »
    mcdermott wrote: »

    No, they aren't. WTF are you even talking about?

    Tesla, again from your own goosing link, is saying "Pay the fuck attention to the road". So no, they aren't guessing at what is within the bounds of human ability because we already know "driving" is well within those bounds.

    It's entirely possible that "pay the fuck attention to the road" isn't actually within the bounds of human ability once you're no longer actively driving.

    We know paying the fuck attention to the road is "out of the bounds of human ability" while talking on a cellphone, but we still have cellphones around. We just expect people to, you know, still drive.
    There is absolutely no way I could pay attention to the road if I'm not actually in control of the car.

    Literally nobody can, according to what testing we've done.

    So don't feel too bad about that.


    EDIT: This is the bedrock of Google's entire program. It's the reason they decided against a graduated approach, from driver assistance to eventual automation; the car must be able to handle the whole job, because humans are not acceptable back-up drivers.

    Tesla's now managed to kill someone because they didn't take the research seriously. I've always thought of Musk as a great engineer but a shitty scientist, and this highlights why.

    Musk isn't even a great engineer. He's a fantastic futurist and a generally ethical, forward-thinking businessman. His knowledge of even programming, where he in theory made his bones with PayPal, is very high-level these days. I can say this with a fair amount of certainty; one of my peers and fairly good friends was referred to as "crazy smart" by Musk in an interview. My friend is definitely a sharp guy, but not really top talent in his field.

    If by "futurist" you mean "increasingly unhinged purveyor of bullshit," then yes, I suppose he is fantastic at that.
    In a freewheeling talk before shareholders Tuesday, Musk said he and his Tesla team will completely rethink the factory process, hoping to bring “factors of 10 or even 100 times” in improvements in efficiency to the manner in which “you build the machines that build the machine.”

    Musk, returning repeatedly to the idea of “physics-first principles,” said he no longer uses an office at Tesla, but spends all of his time on the production line.

    That exercise has shown him methods by which production capacity could be increased exponentially, he said, by applying those principles.
    Technology entrepreneur Elon Musk is really excited about getting the first humans to land on Mars in 2025 with the view to establishing a colony, but in case you didn't realise this already, he is warning that pioneering a new planet probably won't be much fun.

    "It's dangerous and probably people will die – and they'll know that. And then they'll pave the way, and ultimately it will be very safe to go to Mars, and it will be very comfortable. But that will be many years in the future," Musk told the Washington Post in a new interview detailing how the Mission to Mars technical journey is likely to evolve.
    "There's a billion to one chance we're living in base reality," Elon Musk said tonight on stage at Recode's Code Conference, meaning that one of the most influential and powerful figures in tech thinks that it's overwhelmingly likely we're just characters living inside a simulation.
    This is the sort of grandiosity you expect from Pixar villains, not actual CEOs. Dude has clearly bought into the nerd-cult surrounding him in a major way. I wonder how long his companies will survive his personal descent into almost Trumpian crackpottedness. I mean, on the one hand, he's assembled a lot of really exceptional talent. On the other, that talent is organized in a strict dictatorship with Dear Leader at the top.

    Oh Jesus, I hadn't seen those quotes. Nah, I was more saying that he took all the money from a fantastic stroke of luck, and then reinvested it in three interrelated, high-risk, forward-thinking businesses that benefit society, and so far has executed on all of them relatively well, largely through assembling fantastic talent, most of whom are far better engineers than him.
