
[Uber]: Disrupting Livery Service (And Ethics)


Posts

  • Mortious (The Nightmare Begins, Move to New Zealand) Registered User regular
    Dedwrekka wrote: »
    This may be the most well documented pedestrian accident due to the car's sensors and cameras. It may be safe to say that we can wait and see where the fault lies.

    *looks at all the other stories about Uber co-operating with Law Enforcement*

    Move to New Zealand
    It’s not a very important country most of the time
    http://steamcommunity.com/id/mortious
  • Phyphor (Building Planet Busters, Tasting Fruit) Registered User regular
    milski wrote: »
    Phyphor wrote: »
    milski wrote: »
    tsmvengy wrote: »
    kime wrote: »
    I understand the history of Jaywalking. I didn't watch the video, but I've read about it before. It doesn't change the fact that currently, something like 80% of accidents involving pedestrians are with pedestrians that were not on a crosswalk.

    Which brings me back to my statement that Jaywalking is dangerous and you shouldn't do it.
    tsmvengy wrote: »
    Veevee wrote: »
    The woman was crossing the street outside a crosswalk when she was hit, the spokesperson said.

    This has been one of the big fears in autonomous vehicle research, and has put the program once again on hiatus.

    Why? This happens all the time and the driver is not held responsible because the pedestrian was outside of the crosswalk. From this last Friday:

    https://www.channel3000.com/news/crash-involving-pedestrian-closes-part-of-park-st-in-madison/717093601
    The woman was crossing Park Street near Hughes Place when the crash happened, officers told News 3. The pedestrian was wearing all dark clothing and was not in a crosswalk at the time.

    The driver stayed on scene and is cooperating with police, officials said. The driver will not be cited, according to a release from Madison police.

    So why would this kill autonomous vehicles? The fact it hasn't happened before should be a testament to the effectiveness of autonomous vehicles.

    Aren't driverless vehicles supposed to be bringing all kinds of safety benefits? Doesn't the use of LIDAR mean that vehicles can "see" pedestrians even if they are wearing dark clothing at night? Never mind that we shouldn't require people to wear something special just to be outside at night.

    Also, let's talk about how the police immediately go to "crossing the street outside the crosswalk" and then if you watch the video here (https://www.abc15.com/news/region-southeast-valley/tempe/tempe-police-investigating-self-driving-uber-car-involved-in-crash-overnight) you can see a smashed up bicycle 12 seconds in. Hmm.

    "More safe" does not mean "perfectly safe." Autonomous vehicles should definitely be the former! But they shouldn't be held accountable for the latter. There should be some work here on Uber's side to see why their system failed, but it is not an indictment on autonomous vehicles that they were involved in an accident.

    So far, autonomous vehicles have not even proven to be "more safe" than unimpaired human drivers. Human drivers in the US average about 1 fatality per 100 million miles, and that includes the 30% of fatalities that are drunk driving related. So they have to be way more safe than the existing average rates. Here's fatality #1 for AVs, which have not even racked up 100 million miles yet (Waymo just hit 4 million miles total in November, Uber hit 2 million in December).

    Well, with a sample of a single incident that hasn't been investigated at all, not controlling for any statistical factors, you've convinced me.

    Even with a single incident, you can predict an MTBF. If the first failure was at, generously, 20 million miles, it is extremely unlikely the MTBF is greater than 100 million miles.

    Additionally, investigation doesn't change the case at all if we're simply referring to all fatalities; regardless of whether the woman is at fault or not, we're comparing apples to apples here.

    If this were the case, we would expect a large increase in collisions of all types, with self-driving cars hitting things all over, but a study by Virginia Tech found a slight overall decrease in collision rate (for Google's tech anyway, and everybody's tech is likely to produce different rates). If the woman had survived after going to the hospital but was still hit and nothing else had changed, we would be talking about injuries, not fatalities.

    This would be assuming that all accident rates would increase or decrease in the same proportion, and not that, e.g., self-driving technology in its present state is more likely to catastrophically fail by never recognizing a pedestrian compared to other modes of failure.
    milski wrote: »
    [...]
    Additionally, investigation doesn't change the case at all if we're simply referring to all fatalities; regardless of whether the woman is at fault or not, we're comparing apples to apples here.

    Of course the investigation will change things. If it's determined that she stepped into the road closer than the stopping distance of the car, then the collision was physically impossible to avoid and thus it's not a "failure" of the autonomous system.

    Cars kill 4,500 and injure 70,000-79,000 pedestrians each year. So "apples-to-apples" statistically, for this 1 death, self-driving vehicles should have also had 15 non-fatal collisions with pedestrians.

    Human driven vehicle accident stats already include "impossible to avoid" scenarios, so no, it would not affect the stats. That is not to say such a scenario means the vehicle is at fault, but that you can't remove those from MTBF calcs.

    "impossible to avoid" scenarios are still very relevant though because at n=0 you're flawless and at n=1 you've basically failed. The overall stats are large enough that we don't need to calculate a confidence interval for them but we very much do given the comparatively tiny sample size

    And I think focusing strictly on fatalities is a mistake (since there would be no reason a self-driving car would be more likely to kill pedestrians but not injure them). If we expand it to pedestrian hospitalizations due to crashes, there were about 135k in 2015, which would make it around 1 per 20 million miles. First failure at a third of the MTBF is certainly possible.

  • Dedwrekka (Metal Hell adjacent) Registered User regular
    Mortious wrote: »
    Dedwrekka wrote: »
    This may be the most well documented pedestrian accident due to the car's sensors and cameras. It may be safe to say that we can wait and see where the fault lies.

    *looks at all the other stories about Uber co-operating with Law Enforcement*

    NTSB is getting involved; there's not a whole lot Uber can do about it. If they don't cooperate, they don't get to keep their autonomous program.

  • Jebus314 Registered User regular
    edited March 2018
    Quid wrote: »
    Astaereth wrote: »
    The difference between human drivers and autonomous vehicles is where the liability lies. When I kill someone in my Ford, Ford doesn’t bear responsibility. When I happen to be inside the vehicle when my Uber self-driving car kills someone, Uber bears responsibility.

    If you kill someone driving for another company, that company does bear the cost. It’s really not that different.

    That creates a way different dynamic for personal car sales though. Companies like UPS or taxi services make money from someone driving their cars. This income allows them to purchase the insurance and assume liability, and even still they have a bunch of added restrictions (who can drive, and when, and in what ways, etc.).

    No way is Google just going to accept liability for my personal vehicle because it is self-driven. So either I still have to pay for the insurance (which basically means I am paying for accidents even if Google was driving), or Google is going to have to find some way to ensure I can't fuck up their self-driving car (by taking control at a bad moment, or getting improper maintenance, or trying to scam them, etc.).

    Neither seems like an ideal situation.

    Jebus314 on
    "The world is a mess, and I just need to rule it" - Dr Horrible
  • Goumindong Registered User regular
    Google will accept the liability, yes. They will not have a choice unless legislation explicitly strips them of such liability.

    Additionally, it’s good that self-driving car companies accept the liability. It reduces the insurance inefficiencies when large companies can insure themselves (essentially by being large).

  • Zek Registered User regular
    Even if this scenario is very specifically a car-at-fault software bug, it was going to happen eventually. The liability question needs a definitive answer. Hopefully Uber can restrain themselves from fucking it up for everybody else somehow.

  • milski (Poyo!) Registered User regular
    Phyphor wrote: »
    [...]
    And I think focusing strictly on fatalities is a mistake (since there would be no reason a self-driving car would be more likely to kill pedestrians but not injure them). If we expand it to pedestrian hospitalizations due to crashes, there were about 135k in 2015, which would make it around 1 per 20 million miles. First failure at a third of the MTBF is certainly possible.

    I would disagree with this, actually. While I do not know how self-driving cars operate in depth, it is possible and in fact likely that they do not respond like humans do. For instance, while this is a totally spitballed theory and could be inaccurate, human caused incidents may be due to delayed reactions, while autonomous car caused incidents may be due to a failure to react at all. With delayed reactions, humans are likely to brake to some extent and somewhat mitigate the damage of an accident. However, if an incident is caused due to a total failure to react, it will happen at full speed and be more likely to result in death. So if a self-driving car, say, erroneously fails to detect humans as obstructions in certain conditions regardless of distance, they could have a relatively higher prevalence of fatal accidents without a similar increase in injurious ones.

    I ate an engineer
  • Schrodinger Registered User regular
    kime wrote: »
    Jaywalking is really dangerous; this reasonably should not slow down autonomous vehicle work at all.

    And yet, I have very little faith in reason these days :(... Plus I secretly kind of want Uber's autonomous vehicles to fail because I don't like Uber (due to stuff this thread has cataloged in depth) :P

    "Jaywalking" is a gooseshit term meant to push accountability onto pedestrians from car drivers:

    https://youtu.be/-AFn7MiJz_s

    The video has a good point in saying, "Hey, maybe we shouldn't have designed these cities the way that we designed them."

    Of course, now that the cities are already that way, there's not much you can do unless you're willing to completely rebuild them. It's like complaining about the merits of Blu-ray vs. HD-DVD. Even if you had an argument that HD-DVD was superior, it wouldn't matter, because it's already too late for that.

  • Jebus314 Registered User regular
    Goumindong wrote: »
    Google will accept the liability, yes. They will not have a choice unless legislation explicitly strips them of such liability.

    Additionally, it’s good that self-driving car companies accept the liability. It reduces the insurance inefficiencies when large companies can insure themselves (essentially by being large).

    Or they just won't make self-driving cars, and we will perpetually be stuck in the in-between of very advanced driver-assisted cars, but never full autonomy, because the costs are too high.

    Or we will switch to a distributed "personal" car system, where Google owns all the cars and you just rent them for trips from point A to point B.

    Or maybe something else entirely. But I seriously doubt we will ever see Google paying the insurance for a car that I own and maintain because they wrote the software for the driverless AI.

    "The world is a mess, and I just need to rule it" - Dr Horrible
  • spool32 (Contrary Library) Registered User regular
    Jebus314 wrote: »
    [...]
    Or maybe something else entirely. But I seriously doubt we will ever see Google paying the insurance for a car that I own and maintain because they wrote the software for the driverless AI.

    yeah we were talking about this in [chat] the other day - it's not tenable. We'll need to reclassify and create a new insurance market. If nothing else, the personal auto insurance industry is not going to just roll over while corporate fleet insurance is handled by 3 huge driverless companies and the whole thing is rolled into the price of the car. Progressive doesn't want Google to insure me any more than Google does.

  • Polaritie (Sleepy) Registered User regular
    milski wrote: »
    [...]
    I would disagree with this, actually. While I do not know how self-driving cars operate in depth, it is possible and in fact likely that they do not respond like humans do. For instance, while this is a totally spitballed theory and could be inaccurate, human caused incidents may be due to delayed reactions, while autonomous car caused incidents may be due to a failure to react at all. With delayed reactions, humans are likely to brake to some extent and somewhat mitigate the damage of an accident. However, if an incident is caused due to a total failure to react, it will happen at full speed and be more likely to result in death. So if a self-driving car, say, erroneously fails to detect humans as obstructions in certain conditions regardless of distance, they could have a relatively higher prevalence of fatal accidents without a similar increase in injurious ones.

    So... humans are totally capable of freezing up and failing to act at all, and computers are completely capable of having slow algorithms. So... your premise seems to need work?

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • schuss Registered User regular
    spool32 wrote: »
    [...]
    yeah we were talking about this in [chat] the other day - it's not tenable. We'll need to reclassify and create a new insurance market. If nothing else, the personal auto insurance industry is not going to just roll over while corporate fleet insurance is handled by 3 huge driverless companies and the whole thing is rolled into the price of the car. Progressive doesn't want Google to insure me any more than Google does.

    As someone working for an insurance company: we don't really have a choice. Fleet insurance is already a thing, and it's pretty easy through rental or leasing to make things run through the automaker or a proxy. Google will be fine with it, as they'll likely just self-insure, with reinsurance to cover the risk of a catastrophic glitch.
    Personal auto is pretty done if you look 20 years down the road.

  • syndalis (Getting Classy, On the Wall) Registered User, Loves Apple Products regular
    I also am of a mind that self driving cars will continue to become safer as more of the cars on the road become self driving.

    A "herd immunity" of sorts, where we can set up localized swarm driving, backed by road utilization, to have the vehicles take the optimal routes not just for you but for everyone on the road to get where they need to go as soon as possible.

    Cars could make way for emergency vehicles well before a siren is heard and notify the passengers as to why they are pulling over momentarily.

    There are so many benefits to where this could go that I would be super fucking annoyed if luddites halted progress industry-wide because someone (unfortunately) died during testing by a company known for operating fast and loose. The problem here is not moving the ball forward on autonomous vehicles; the problem here may not even be Uber, since expecting perfection is generally the enemy of the good, and it sounds like the person who was hit was a perfect storm of shit that no driver, autonomous or otherwise, could have avoided. But Uber is so tarnished as a company that I would honestly be okay with them taking the fall for this, if it means other actors on the stage can keep moving the ball forward on this technology.

    SW-4158-3990-6116
    Let's play Mario Kart or something...
  • milski (Poyo!) Registered User regular
    Polaritie wrote: »
    [...]
    So... humans are totally capable of freezing up and failing to act at all, and computers are completely capable of having slow algorithms. So... your premise seems to need work?

    This is a very condescending way to reply to something I admitted was spitballed and was just meant to highlight that it is not obvious self-driving cars would fail in the same fashion as humans.

    I ate an engineer
  • discrider Registered User regular
    syndalis wrote: »
    [...]
    A "herd immunity" of sorts, where we can set up localized swarm driving, backed by road utilization, to have the vehicles take the optimal routes not just for you but for everyone on the road to get where they need to go as soon as possible.
    [...]

    I'm still concerned about the 'hacker'/kid that drops an autonomous car's transponder off an overpass onto a highway.
    How is the river of traffic going to deal with the sudden appearance of a stationary 'car'?

    But yeah, Uber needs to turn over the tapes so that we can discern whether the car was at fault, or whether this was an unavoidable accident.

  • mRahmani (Detroit) Registered User regular
    discrider wrote: »
    [...]
    I'm still concerned about the 'hacker'/kid that drops an autonomous car's transponder off an overpass onto a highway.
    How is the river of traffic going to deal with the sudden appearance of a stationary 'car'?

    But yeah, Uber needs to turn over the tapes so that we can discern whether the car was at fault, or whether this was an unavoidable accident.

    Redundant systems. Right now, autonomous cars rely on a mix of camera, radar, and lidar inputs (among other things). The car won't be making decisions purely based on one thing.

    Throttle control (gas pedal), for example, uses redundant sensors today, because we can't afford to have a computer erroneously think the pedal is pushed to the floor.

  • tsmvengy Registered User regular
    Dedwrekka wrote: »
    This may be the most well documented pedestrian accident due to the car's sensors and cameras. It may be safe to say that we can wait and see where the fault lies.

    I agree, which is why I find it especially disappointing that the police STILL jump straight to putting all the onus on the victim.

  • lazegamer (The magnanimous cyberspace) Registered User regular
    edited March 2018
    discrider wrote: »
    [...]
    I'm still concerned about the 'hacker'/kid that drops an autonomous car's transponder off an overpass onto a highway.
    How is the river of traffic going to deal with the sudden appearance of a stationary 'car'?

    But yeah, Uber needs to turn over the tapes so that we can discern whether the car was at fault, or whether this was an unavoidable accident.

    Shouldn't you be equally concerned about a miscreant dropping an IED off an overpass onto a highway? Humans are fully capable of wreaking wanton damage and injury in the current situation, but it doesn't seem like something that happens frequently enough to have policy implications.

    lazegamer on
    I would download a car.
  • Javen Registered User regular
    How long have the self-driving Uber vehicles been active?

  • DevoutlyApathetic Registered User regular
    lazegamer wrote: »
    discrider wrote: »
    [...]
    I'm still concerned about the 'hacker'/kid that drops an autonomous car's transponder off an overpass onto a highway.
    How is the river of traffic going to deal with the sudden appearance of a stationary 'car'?

    But yeah, Uber needs to turn over the tapes so that we can discern whether the car was at fault, or whether this was an unavoidable accident.

    Shouldn't you be equally concerned about a miscreant dropping an IED off an overpass onto a highway? Humans are fully capable of wreaking wanton damage and injury in the current situation, but it doesn't seem like something that happens frequently enough to have policy implications.

    The issue is gonna be ubiquity. While IEDs aren't complicated, most folks do not have the skill set to make one. Removing a transponder is going to end up a bit like removing a spark plug eventually.

    Though your counterexample still works, just replace the IED with a bowling ball. For reasonably dense traffic at highway speed it might as well be.

    Nod. Get treat. PSN: Quippish
    SiliconStew Registered User regular
    edited March 2018
    discrider wrote: »
    I'm still concerned about the 'hacker'/kid that drops an autonomous car's transponder off an overpass onto a highway.
    How is the river of traffic going to deal with the sudden appearance of a stationary 'car'?

    But yeah, Uber needs to turn over the tapes so that we can discern whether the car was at fault, or whether this was an unavoidable accident.

    Assuming cars only used that one input to make driving decisions, they would come to a stop. As opposed to when someone throws a brick off an overpass and a driver freaks out, suddenly swerves into another lane, and causes a multi-car pileup.

    SiliconStew on
    Just remember that half the people you meet are below average intelligence.
    Fencingsax It is difficult to get a man to understand, when his salary depends upon his not understanding GNU Terry Pratchett Registered User regular
    DevoutlyApathetic wrote: »
    The issue is gonna be ubiquity. While IEDs aren't complicated, most folks do not have the skill set to make one. Removing a transponder is going to end up a bit like removing a spark plug eventually.

    Though your counterexample still works, just replace the IED with a bowling ball. For reasonably dense traffic at highway speed it might as well be.

    Why wouldn't they put it somewhere hard to get to?

    DevoutlyApathetic Registered User regular
    Fencingsax wrote: »
    Why wouldn't they put it somewhere hard to get to?

    Maintenance issues. Even then, hard to get to doesn't stop folks from doing all kinds of stupid shit to cars. The skill set of car maintenance is gonna include dealing with these things, and a whole lot more people do that than build IEDs for fun. Self-driving cars may be around the corner, but I'm not seeing them becoming disposable goods anywhere near as quickly.

    Also, not a radio tech, but "clearly broadcasting" and "in the middle of an engine block" seem like opposite design criteria to me.

    Nod. Get treat. PSN: Quippish
    SiliconStew Registered User regular
    DevoutlyApathetic wrote: »
    Maintenance issues. Even then, hard to get to doesn't stop folks from doing all kinds of stupid shit to cars. The skill set of car maintenance is gonna include dealing with these things, and a whole lot more people do that than build IEDs for fun. Self-driving cars may be around the corner, but I'm not seeing them becoming disposable goods anywhere near as quickly.

    Also, not a radio tech, but "clearly broadcasting" and "in the middle of an engine block" seem like opposite design criteria to me.

    You only need the antenna exposed, not the electronics.

    Just remember that half the people you meet are below average intelligence.
    tbloxham Registered User regular
    DevoutlyApathetic wrote: »
    Maintenance issues. Even then, hard to get to doesn't stop folks from doing all kinds of stupid shit to cars. The skill set of car maintenance is gonna include dealing with these things, and a whole lot more people do that than build IEDs for fun. Self-driving cars may be around the corner, but I'm not seeing them becoming disposable goods anywhere near as quickly.

    Also, not a radio tech, but "clearly broadcasting" and "in the middle of an engine block" seem like opposite design criteria to me.

    As we move towards a completely autonomous driving future, you'll find that vehicles will be designed to be VERY easy for robots, gantry cranes, and skilled techs to quickly repair and service, but those factors are not the same as the ones that make a vehicle easy for a human to repair and service alone. Perhaps we will go the other way and have simple modular vehicles where things can be popped out easily, but in that case I would expect the modules themselves to be easily replaceable but VERY hard to open and access for human repair and modification.

    Also, making the transponder actually broadcast and say "I am another transponder, properly installed in vehicle 282330-nnnnd. I am in the middle of this street, and I am stationary in the center lane. Do not crash into me. I am intending to move into the left lane. Do not use the left lane. My vision systems see an obstruction in the right lane. Do not use the right lane." when it's actually just hanging off a bridge will be at least as hard as building an IED.
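    To make that concrete: each broadcast only has to be signed over its own position and timestamp for a canned capture to go stale almost immediately. A toy sketch of the receiving side (a shared-secret HMAC stands in for the certificate-based signing a real V2V system would use, and every field name here is invented):

    import hmac, hashlib, json, time

    SECRET = b"per-vehicle key provisioned at manufacture"  # stand-in for real certs

    def sign(msg: dict) -> dict:
        # Sign over all message fields, position and timestamp included.
        payload = json.dumps(msg, sort_keys=True).encode()
        return {**msg, "sig": hmac.new(SECRET, payload, hashlib.sha256).hexdigest()}

    def accept(msg: dict, max_age_s: float = 1.0) -> bool:
        # Reject anything unsigned, tampered with, or not fresh.
        msg = dict(msg)
        sig = msg.pop("sig")
        payload = json.dumps(msg, sort_keys=True).encode()
        genuine = hmac.compare_digest(
            sig, hmac.new(SECRET, payload, hashlib.sha256).hexdigest())
        fresh = (time.time() - msg["ts"]) < max_age_s
        return genuine and fresh

    live = sign({"id": "282330-nnnnd", "lat": 33.4356, "lon": -111.9400,
                 "ts": time.time()})
    print(accept(live))      # True: properly signed and fresh

    replayed = sign({"id": "282330-nnnnd", "lat": 33.4356, "lon": -111.9400,
                     "ts": time.time() - 60})
    print(accept(replayed))  # False: valid-looking, but the timestamp is a minute stale

    So the kid on the bridge doesn't just need the hardware, they need a way to mint fresh, correctly signed messages, which is the part that's at least as hard as the IED.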

    "That is cool" - Abraham Lincoln
    spool32 Contrary Library Registered User regular
    tbloxham wrote: »
    As we move towards a completely autonomous driving future, you'll find that vehicles will be designed to be VERY easy for robots, gantry cranes, and skilled techs to quickly repair and service, but those factors are not the same as the ones that make a vehicle easy for a human to repair and service alone. Perhaps we will go the other way and have simple modular vehicles where things can be popped out easily, but in that case I would expect the modules themselves to be easily replaceable but VERY hard to open and access for human repair and modification.

    Also, making the transponder actually broadcast and say "I am another transponder, properly installed in vehicle 282330-nnnnd. I am in the middle of this street, and I am stationary in the center lane. Do not crash into me. I am intending to move into the left lane. Do not use the left lane. My vision systems see an obstruction in the right lane. Do not use the right lane." when it's actually just hanging off a bridge will be at least as hard as building an IED.

    The first time. Then you'll just download the code and upload it over Bluetooth.

    More worrying will be people hacking their cars to get themselves through traffic faster...

    redx I(x)=2(x)+1 whole numbers Registered User regular
    edited March 2018
    spool32 wrote: »
    The first time. Then you'll just download the code and upload it over Bluetooth.

    More worrying will be people hacking their cars to get themselves through traffic faster...

    That would probably invalidate your insurance and put you into punitive damages territory, so you'd be likely to end up with a significant fraction of a million dollars of debt.

    People would do it, but probably not that many.


    And most people would not risk bricking their car. Stuff like code signing can be pretty effective, and the developer has every reason to err on the side of responding aggressively to modifications of their code: run a check on startup, and simply refuse to be a car if it can tell stuff has been tampered with. Not perfect, and it can be defeated or used to DoS, but probably fairly effective.

    Then do some carrot work to make drivers want to keep their cars online, up to date, and unmodified.
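    A toy sketch of that startup check, just to make it concrete (this assumes the pyca/cryptography package; the firmware bytes and key handling are obviously simplified):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # At the factory: the manufacturer signs the drive stack image.
    manufacturer_key = Ed25519PrivateKey.generate()
    firmware = b"drive stack v1.0"          # stand-in for the real image
    signature = manufacturer_key.sign(firmware)
    pubkey = manufacturer_key.public_key()  # baked into read-only boot ROM

    # At startup: verify the image before enabling drive mode.
    def drivable(image: bytes, sig: bytes) -> bool:
        try:
            pubkey.verify(sig, image)
            return True
        except InvalidSignature:
            return False

    print(drivable(firmware, signature))               # True: car starts
    print(drivable(firmware + b" hacked", signature))  # False: not a car today

    The trick is just that the public key lives somewhere the owner can't write to, so a modified image can't re-sign itself.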

    redx on
    They moistly come out at night, moistly.
    tbloxham Registered User regular
    spool32 wrote: »
    The first time. Then you'll just download the code and upload it over Bluetooth.

    More worrying will be people hacking their cars to get themselves through traffic faster...

    But again, it's still VASTLY harder than, say, filling a box with packing peanuts and chucking it off an overpass. And much less cool visually. If you did it, I imagine the cars would initially come to a halt, but then they would start querying the system in ways it couldn't understand or respond to, and would eventually just ignore it as glitching and malfunctioning.

    "282330-nnnnd -> I am 102299-xxyyz -> I am to your right. I see no hazard in the right lane."
    "282330-nnnnd -> I am 433920-mkqsq -> I am to your right. I see no hazard in the right lane."
    "282330-nnnnd -> I am 120000-aabbg -> I am to your left. I do not see you. I do not see your turn signals."
    "282330-nnnnd -> I am 022881-wperkd -> I am to your rear. I do not see you visually. My lidar does not see your reflector systems. My radar indicates no object present"
    "282330-nnnnd -> I am Southern California Traffic Station 20 -> I do not see you using camera 20xa. Refresh your position. Your lane change request is denied. Left lane is now open."
    "282330-nnnnd -> I am Southern California Traffic Station 20 -> I do not see the obstruction you indicate using camera 20xb or 102299-xxyyz or 433920-mkqsq. Right lane open for low speed only."

    And so on, all happening within 30 seconds or so.

    Getting your car through traffic faster will be much harder. I imagine local 'do not pass, there is a hazard' decisions will be handled between the vehicles in direct communication, but prioritization of speed, lane position, etc. will be handled by the central traffic management system.
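    If you want the flavor of that arbitration, here's a toy sketch of the vote (the reporter IDs reuse the made-up ones above; the two-thirds threshold is invented):

    from dataclasses import dataclass

    @dataclass
    class Report:
        reporter: str
        corroborates: bool  # does this witness also perceive the claimed vehicle?

    def assess_claim(claim_id: str, reports: list) -> str:
        # No witnesses at all: fall back to your own sensors and slow down.
        if not reports:
            return "unverified: rely on own sensors"
        seen = sum(r.corroborates for r in reports)
        if seen / len(reports) >= 2 / 3:
            return "corroborated: treat as a real vehicle"
        return "contradicted: flag %s as malfunctioning and ignore it" % claim_id

    reports = [
        Report("102299-xxyyz", False),             # to the right, sees nothing
        Report("433920-mkqsq", False),             # to the right, sees nothing
        Report("120000-aabbg", False),             # to the left, sees nothing
        Report("022881-wperkd", False),            # behind; lidar/radar see nothing
        Report("SoCal Traffic Station 20", False), # camera 20xa sees nothing
    ]
    print(assess_claim("282330-nnnnd", reports))   # contradicted: ignore the spoof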

    "That is cool" - Abraham Lincoln
    tinwhiskers Registered User regular
    tsmvengy wrote: »
    Dedwrekka wrote: »
    This may be the most well documented pedestrian accident due to the car's sensors and cameras. It may be safe to say that we can wait and see where the fault lies.

    I agree, which is why I find it especially disappointing that the police STILL jump straight to putting all the onus on the victim.

    The police might have, you know, interviewed the safety driver and other people at the scene, like they do with every other accident that doesn't involve a self-driving car. That can be done in minutes, versus getting all the telemetry and video data off the car.

    spool32 Contrary Library Registered User regular
    redx wrote: »
    That would probably invalidate your insurance and put you into punitive damages territory, so you'd be likely to end up with a significant fraction of a million dollars of debt.

    People would do it, but probably not that many.


    And most people would not risk bricking their car. Stuff like code signing can be pretty effective, and the developer has every reason to err on the side of responding aggressively to modifications of their code: run a check on startup, and simply refuse to be a car if it can tell stuff has been tampered with. Not perfect, and it can be defeated or used to DoS, but probably fairly effective.

    Then do some carrot work to make drivers want to keep their cars online, up to date, and unmodified.

    That runs contrary to recent legislation in multiple states guaranteeing the right to modify equipment. What you're describing is the current state of affairs with John Deere, except consumers get mega fucking screwed by required software updates while having to sign a EULA for their tractor that absolves JD of liability.

    tbloxham Registered User regular
    spool32 wrote: »
    That runs contrary to recent legislation in multiple states guaranteeing the right to modify equipment. What you're describing is the current state of affairs with John Deere, except consumers get mega fucking screwed by required software updates while having to sign a EULA for their tractor that absolves JD of liability.

    However, most of that legislation simply requires that the device not be made deliberately impossible for a user to service, and states that the manufacturer has to provide the necessary manuals to repair shops and so forth. If a device (like a car) can be shown to have a reason why it is hard to repair without a pair of six-axis robotic arms and a gantry crane (which would be true for most vehicles that want fuel-efficient designs and reliable connections), then it's perfectly fine for the repair manual to say...

    "Take the vehicle and run shutdown and repair code 43b using the main terminal"
    "Place the vehicle onto your gantry crane and elevate to between 3 and 6 feet"
    "Activate your robotic engine removal system and run removal protocol 2038c"

    "That is cool" - Abraham Lincoln
    Quid Definitely not a banana Registered User regular
    You would still be free to modify your car as you desire. It would just no longer be legal to take it on public roads, like various other modified vehicles.

    Phyphor Building Planet Busters Tasting Fruit Registered User regular
    A relevant article, though referencing data from last year: https://www.nextbigfuture.com/2018/03/ubers-technically-inferior-self-driving-system-kills-an-arizona-pedestrian.html

    Apparently Uber's implementation of self-driving only goes two-thirds of a mile without requiring intervention from the driver, the worst in the industry.

    spool32 Contrary Library Registered User regular
    edited March 2018
    Quid wrote: »
    You would still be free to modify your car as you desire. It would just no longer be legal to take it on public roads, like various other modified vehicles.

    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    The ability to talk to your onboard computer should be core: you shouldn't need to call a Google technician and pay them 8x the standard service fee just to flash the BIOS on your self-driving car, nor should you have to agree to a EULA that indemnifies the manufacturer in case of an accident.

    Or, on the flip side, we delete the private auto industry and return all that cost to the consumer.

    spool32 on
    Quid Definitely not a banana Registered User regular
    spool32 wrote: »
    Quid wrote: »
    You would still be free to modify your car as you desire. It would just no longer be legal to take it on public roads, like various other modified vehicles.

    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    I'm cool with that.

    Phoenix-D Registered User regular
    That 50 cent figure isn't just depreciation.
    spool32 wrote: »
    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    The ability to talk to your onboard computer should be core: you shouldn't need to call a Google technician and pay them 8x the standard service fee just to flash the BIOS on your self-driving car, nor should you have to agree to a EULA that indemnifies the manufacturer in case of an accident.

    Or, on the flip side, we delete the private auto industry and return all that cost to the consumer.

    This is starting to feel like GST territory, but weren't you saying in chat that none of the makers would accept liability anyway?

    spool32 Contrary Library Registered User regular
    Quid wrote: »
    I'm cool with that.

    I'm super not! I'd like to be able to put an aftermarket GPS in my car without bricking the onboard computer, and I'd like to be able to change the oil without needing a trip to the garage so they can charge me $200 to attach a wire and click "approved".

    Mortious The Nightmare Begins Move to New Zealand Registered User regular
    spool32 wrote: »
    I'm super not! I'd like to be able to put an aftermarket GPS in my car without bricking the onboard computer, and I'd like to be able to change the oil without needing a trip to the garage so they can charge me $200 to attach a wire and click "approved".

    I feel there's a middle ground between filling up the wiper fluid and being able to remove and reprogram a transponder.

    Move to New Zealand
    It’s not a very important country most of the time
    http://steamcommunity.com/id/mortious
    Phyphor Building Planet Busters Tasting Fruit Registered User regular
    spool32 wrote: »
    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    The ability to talk to your onboard computer should be core: you shouldn't need to call a Google technician and pay them 8x the standard service fee just to flash the BIOS on your self-driving car, nor should you have to agree to a EULA that indemnifies the manufacturer in case of an accident.

    Or, on the flip side, we delete the private auto industry and return all that cost to the consumer.

    There is zero chance that any self-driving car will let you change any control software, especially if the manufacturer assumes any liability for crashes.

    How would you even conceptually modify it, and who would create the modifications? Changing what is usually changed now (power curves and such) either won't apply or won't help, and changing the driving algorithms is a non-starter given that you need a large company to even attempt to do it right.

    Quid Definitely not a banana Registered User regular
    spool32 wrote: »
    Quid wrote: »
    spool32 wrote: »
    Quid wrote: »
    You would still be free to modify your car to your desire. It would just no longer be legal to take on public roads like various other modified vehicles.

    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    I'm cool with that.

    I'm super not! I'd like to be able to put an aftermarket GPS in my car without bricking the onboard computer, and I'd like to be able to change the oil without needing a trip to the garage so they can charge me $200 to attach a wire and click "approved".

    I'd like for people to not modify their cars in ways that endanger others. The things you're listing as concerns are not "people hacking their car to get them through traffic faster". They still could do those. That they'd be banned from driving on public roads is fine.
