
Autonomous Vehicles: The Robot Apocalypse and You

Posts

  • Xaquin Right behind you! Registered User regular
    people drive over it because the clearance is larger than a brick?

  • asur Registered User regular
    DarkPrimus wrote: »
    ElJeffe wrote: »
    DarkPrimus wrote: »
    ElJeffe wrote: »
    DarkPrimus wrote: »
    ElJeffe wrote: »
    Yeah, they've been on the road for years now and have logged tens of millions of driver hours. I've watched a bunch of videos on how they respond to various conditions and emergency situations and it's pretty rad.

    Oh, you mean like how Tesla owners can reset their safety scores in order to get free access to the autonomous driving beta-test, or how Tesla's own safety guidelines instruct drivers to not take their hands off the wheel for more than a minute?

    Well no, that would be a rather silly thing to find rad.

    I was more commenting on the impressive technical performance of AI drivers versus human drivers, and how much better it is than the median driver.

    It's admittedly an uphill climb convincing people that driverless cars can be not just as good as, but significantly better than human drivers. Partly because everyone thinks they're an above average driver, so they tend to go "Well, that car just swerved into a pedestrian, obviously I would never swerve into a pedestrian," ignoring the thousands of drivers per year who swerve into pedestrians.

    Tesla isn't the entirety of AVs.

    I doubt that is a metric that's actually been met, and even were it so, AI drivers should not merely be better than the average human driver. The unpredictable nature of road conditions is such that we should be holding these companies accountable for every accident, the same as we would hold any human driver accountable.

    I think this raises some interesting cost benefit analysis, and some interesting ethical concerns. How much better than the average driver does an AI have to be in order to be worth implementing? Like BSoB says, if the AI is as good as the average sober, alert driver, that's already a huge step up when implemented on a wide scale.

    Liability is also an interesting question. If a Tesla-navigated car gets into an accident because the AI fucks up, I'd put that on par with a human driver making a bad (but not negligent) decision that causes an accident. Looking in his mirror at the wrong time, slow reflexes, that kind of thing. So that sort of accident would be handled by whatever our insurance regime is, just as human accidents are now.

    And if it turns out the error was something known and concealed, that could be treated as a human driver being negligent - driving drunk, texting, what have you.

    The nice thing about errors in AI is that once the error is identified and fixed, it ceases to be a problem, because now every car in the network implements the same fix. Versus with human drivers, where a driver falling asleep at the wheel doesn't prevent the next driver falling asleep at the wheel.

    Edit: also, if you have any evidence that AI isn't very reliable - e.g., an actual study showing that it's worse than human drivers - I'd be curious to see it.

    Let's not presume that fixing a problem is as easy as identifying a problem. Identifying a problem and fixing a problem are two very different things, as anyone with any experience with software development can tell you.

    And once a problem is identified... what is to be done with that software? Do we simply allow it to continue to function in its flawed state? Do we disable it nation-wide until a patch is developed? What happens if someone refuses to install the update?

    As to your edit: The burden of proof is on those making the claim that self-driving cars are already as safe as a human driver. Where are the sources that aren't directly from the corporations themselves?

    You can solve a lot of this through insurance requirements both on the manufacturer and on the vehicle. It's not in place yet, but states could very easily adapt the current framework for this.

    The current state with Tesla is the worst of all worlds. The company is basically offloading all responsibility onto the driver, because the advertised self-driving isn't self-driving but assisted driving, and the driver must maintain control even though that's basically impossible.

  • tinwhiskers Registered User regular
    Heffling wrote: »
    I think autonomous vehicles will also be held to higher standards because the vehicle manufacturers are taking on a much greater liability with self-driving programs. I think Tesla advises their drivers not to take their hands off the steering wheel not because there are studies showing that accident rates are reduced by doing so, but because if they can demonstrate that the person wasn't paying attention via steering wheel sensors, they can place all liability on the driver.

    Once it is better than the average person, every bit of better you wait for just kills extra people every year till you get there.

  • evilmrhenry Registered User regular
    Of course, the elephant in the room is that building all our infrastructure around the assumption that everyone has a car was, and still is, a terrible idea that the US is terrible at. While this might decrease the accident rate, it's a patch on a really bad design that requires everyone to spend tens of thousands of dollars on a new car to see a benefit. A Netherlands-style road network is a lot safer than what the US has, as well as being more pedestrian-friendly and creating better cities in general.

  • ElJeffe Not actually a mod. Roaming the streets, waving his gun around. Moderator, ClubPA mod
    The nice thing about autonomous cars is that they don't have to just be cars. It can be applied to public transit, as well. Or autonomous taxis. The efficiency of an entirely or mostly AV system means not everyone has to own a car to get around, and when you ARE traveling on the road, it can be done more quickly and more cleanly.

    I submitted an entry to Lego Ideas, and if 10,000 people support me, it'll be turned into an actual Lego set! If you'd like to see and support my submission, follow this link.
  • electricitylikesme Registered User regular
    In almost every way we would've been better off building rail networks than roads. We still probably should be doing this for highways - some standardized hookup to provide power and guidance.

    Which technically gets you most of the way towards "pedestrian beacons" because it's basically accepted that if you cross rail-lines unexpectedly in front of a train you'll die.

  • Honk Honk is this poster. Registered User, __BANNED USERS regular
    I have various issues with autonomous cars, though I’d very much like them to be the future tbh. Been thinking about the systems in particular. This smartphone I’m posting from was bought in July and already routinely freezes for a few seconds every now and then. The reverse-camera lens in my car fell out of its fitting after about 800 km. Parking/assist sensors are pretty much constantly clogged by dirt on all cars I’ve had unless you wash them every week (nobody does). Tesla can already barely assemble car parts, from how it looks in the papers. I’m concerned about a car that 100% relies on these extremely fallible things. For economics that work out, I don’t believe we can make systems like this that don’t randomly fail very, very often.

    PSN: Honkalot
  • Quid Definitely not a banana Registered User regular
    Honk wtf are you doing to your cars?

  • Honk Honk is this poster. Registered User, __BANNED USERS regular
    Quid wrote: »
    Honk wtf are you doing to your cars?

    Where we lived for most of the past years, the final 5 km was a gravel road - not a bad road per se, but it varied with the weather. The lens likely shook loose with that road as a large contributor.

    I have noticed the sensor clogging regardless though. Highway oil/exhaust grime easily competes with the stuff a gravel road kicks up.

  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    ElJeffe wrote: »
    The nice thing about autonomous cars is that they don't have to just be cars. It can be applied to public transit, as well. Or autonomous taxis. The efficiency of an entirely or mostly AV system means not everyone has to own a car to get around, and when you ARE traveling on the road, it can be done more quickly and more cleanly.

    Why would road travel be done more quickly?

    every person who doesn't like an acquired taste always seems to think everyone who likes it is faking it. it should be an official fallacy.

    the "no true scotch man" fallacy.
  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    Disclosure: I have a strong belief that autonomous vehicles will increase the number of vehicles on the road, increase travel times, increase the average time people spend in cars, and reduce average road speeds. Essentially, AVs = more traffic

    I still support AVs, but not because they'll make car travel any faster. They'll do the opposite.

  • Reynolds Gone Fishin' Registered User regular
    Computers can respond faster than people, so speed can be increased and following distance decreased if every vehicle is theoretically perfectly controlled. Like, you'd merge onto the freeway seamlessly every time and instantly accelerate to maximum safe speed and then stay there the entire trip until you smoothly changed lanes and exited, barring any complications.

  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    Reynolds wrote: »
    Computers can respond faster than people, so speed can be increased and following distance decreased if every vehicle is theoretically perfectly controlled. Like, you'd merge onto the freeway seamlessly every time and instantly accelerate to maximum safe speed and then stay there the entire trip until you smoothly changed lanes and exited, barring any complications.

    Reaction time is only one input into safe following distance - velocity, momentum, and the vehicle's braking performance are bigger factors.

    Most passenger cars have max braking performance around 10 m/s^2, so even an autonomous vehicle with instantaneous reaction times can't instantaneously brake. That's part of why we tell humans the "3 second rule", because it takes about 2.5 seconds after the driver has applied max braking force to bring a car from 55mph to a full stop.

    (A good article about it here: https://www.eetasia.com/finding-the-formula-behind-driving-cautiously/)
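    The figures above check out with basic kinematics. A minimal sketch (my own back-of-the-envelope numbers, not taken from the linked article), assuming constant ~10 m/s^2 deceleration and ignoring reaction time:

```python
# Back-of-the-envelope check of the braking figures above:
# a car braking from 55 mph at a constant ~10 m/s^2.
MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def stopping_profile(speed_mph: float, decel: float = 10.0):
    """Return (time_s, distance_m) to brake to a full stop,
    assuming constant deceleration and zero reaction time."""
    v = speed_mph * MPH_TO_MS      # initial speed in m/s
    t = v / decel                  # from v = a * t
    d = v ** 2 / (2 * decel)       # from v^2 = 2 * a * d
    return t, d

t, d = stopping_profile(55)
print(f"55 mph: {t:.1f} s, {d:.1f} m")  # roughly 2.5 s and ~30 m
```

    So the ~2.5 seconds of braking is physics, not reaction time - which is exactly why the "3 second rule" can't shrink much no matter how fast the computer reacts.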

  • Feral MEMETICHARIZARD interior crocodile alligator ⇔ ǝɹʇɐǝɥʇ ǝᴉʌoɯ ʇǝloɹʌǝɥɔ ɐ ǝʌᴉɹp ᴉ Registered User regular
    edited November 2021
    This isn't rigorous, but in my experience, most humans follow too closely. When I use the 3 second rule in my driving, I get road ragers passing me on the right - even if I'm using my car's adaptive cruise control to match the speed of the car in front of me.

    If my impression is correct, and most humans do follow too closely, then we should expect AVs to, on average, increase following distance rather than decrease it.

    That's not why I think AVs = more traffic, though. It's just a casual observation. I'll write up an explanation tomorrow if this thread gets more traffic (pun intended)

  • shryke Member of the Beast Registered User regular
    edited November 2021
    asur wrote: »
    You can solve a lot of this through insurance requirements both on the manufacturer and on the vehicle. It's not in place yet, but states could very easily adapt the current framework for this.

    The current state with Tesla is the worst of all worlds. The company is basically offloading all responsibility onto the driver, because the advertised self-driving isn't self-driving but assisted driving, and the driver must maintain control even though that's basically impossible.

    How pricey does that get though? If your car drives itself then the people designing the cars navigation system are the "driver" now and would I think logically be liable for most accidents that occur because of the car's driving. Which means every manufacturer is now paying something like the insurance premiums currently paid by everyone who drives their cars. That seems like an extremely expensive proposition.

  • Reynolds Gone Fishin' Registered User regular
    If no automatic car ever needs to slam on its brakes, because they never get into accidents or slow down to look at something or whatever, then you don't need a following distance that allows for that. But the seamless lane changing and constant high speed would be the biggest time gains, definitely.

  • Echo ski-bap ba-dap Moderator mod
    Quid wrote: »
    Honk wtf are you doing to your cars?

    Wouldn't be the first time some industry suddenly gets a case of "and then Scandinavian climate happened".

  • evilmrhenry Registered User regular
    Feral wrote: »
    ElJeffe wrote: »
    The nice thing about autonomous cars is that they don't have to just be cars. It can be applied to public transit, as well. Or autonomous taxis. The efficiency of an entirely or mostly AV system means not everyone has to own a car to get around, and when you ARE traveling on the road, it can be done more quickly and more cleanly.

    Why would road travel be done more quickly?

    A mostly-AV system where everyone commutes to work in a separate car wouldn't get faster. But if you have a free AV bus network, using a car becomes less necessary, and using a car for every trip even less so. That means there are fewer cars on the road, so if you do need to use your car, you won't get stuck in traffic. It also means adding routes is cheaper, so using the bus might actually be possible for more people.

    Of course, this only works if we actually do that. Tesla isn't out there trying to sell people on an automated public transit system, they're selling overpriced cars. A lot of the players in this field are car companies, which very much want you to be forced to buy a $40,000 car, and view the prospect of widespread AV buses as a direct threat to their business.

    (I'm also dubious about the cost savings from AV buses. Most proposals still require an attendant on board, who still needs to get paid, and bus drivers aren't exactly highly paid. Removing the attendant entirely would save real money, but getting people to accept an attendant-free bus would be quite difficult.)

  • HamHamJ Registered User regular
    HamHamJ wrote: »
    One thing is that the weaknesses of purely camera-based systems can be overcome with other sensors. You can have lidar or whatever to detect that there is a solid object in front of you and override to hit the brakes without needing to rely on an image recognition algorithm to recognize what it is.

    The issue is that while "just don't hit anything" sounds good in theory, in practice if your self-driving car is liable to slam on the brakes when non-obstructions are in front of it (e.g., returns that are too small to reasonably be obstructions, or transients) then you've probably created a huge hazard anyway, or alternatively a car that just can't bring itself to drive forwards.

    The problem is that LIDAR doesn't give a clean "obstruction yes/no" output - neither does RADAR, and having both technically worsens it. LIDAR in rain for example will produce noisy returns, and also treat any random object as "solid" - so you wind up having to train to reject LIDAR returns. RADAR is worse - non-metallic objects don't return at all, and RADAR returns can get all sorts of weird.

    The case of the Uber vehicle that killed a pedestrian was almost certainly the result of something like this: the frame of the bike being pushed produced a scatter of tiny LIDAR returns that were rejected, which then caused any object in that volume to be deprioritized as an expected obstacle. You can blame this on bad training or programming, but it definitely speaks to the problems of "sensor fusion" - once a sensor is into an "unreliable" range, how much has your model suffered in overall accuracy?

    I feel like needing to drive over a cardboard box or something that has gotten onto the road happens to me maybe once a year? Less? On a highway especially, those kinds of false positives will be rare enough, I think, that you can make the system overly cautious to the point where it is braking for cardboard boxes and it will still be usable. Lane assist on cruise control is already a thing, and I think it will eventually go from being a luxury add-on to a standard feature. If it stops working in even a light drizzle I think that's also fine, because you still get a lot of utility out of it the other 80% of the time, though in certain places this could be a bigger problem than others.

    A bigger reason we won't see fully automated roads any time soon is that the half-life of existing cars is multiple decades. They are not going to all get replaced overnight.
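    The noise-rejection tradeoff described in the quote above can be illustrated with a toy sketch (this is not any real AV pipeline - the cluster sizes and threshold are made-up numbers): reject lidar return clusters below a point-count threshold and you suppress rain speckle, but set the threshold too high and a sparse real obstacle gets thrown away with the noise.

```python
# Toy sketch of cluster-size thresholding, the kind of noise rejection
# discussed above. Made-up numbers, purely illustrative.
def classify(cluster_sizes, min_points):
    """Label each return cluster as obstacle or noise by point count."""
    return ["obstacle" if n >= min_points else "noise" for n in cluster_sizes]

# Hypothetical clusters: rain speckle (2 pts), a bike frame's sparse
# returns (6 pts), a car (40 pts).
clusters = [2, 6, 40]
print(classify(clusters, min_points=5))   # bike kept
print(classify(clusters, min_points=10))  # bike rejected along with the rain
```

    There is no threshold in this toy that drops the rain but keeps every sparse real object, which is the sensor-fusion headache in a nutshell.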

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • kime Queen of Blades Registered User regular
    AVs have a much faster reaction time than people (theoretically), so they don't necessarily need as much of a safe following distance as people do. We need the extra distance because a lot of road passes between seeing someone stop suddenly and even touching the brakes.
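    A rough sketch of how much of the following distance is just the reaction delay (the 1.5 s human and 0.1 s AV figures are assumptions for illustration, not measurements): total stopping distance is the ground covered during the delay plus the braking distance at constant deceleration.

```python
# Total stopping distance = reaction-delay travel + braking distance,
# assuming constant deceleration (~10 m/s^2). Assumed, not measured, numbers.
def total_stop_m(speed_ms: float, reaction_s: float, decel: float = 10.0):
    return speed_ms * reaction_s + speed_ms ** 2 / (2 * decel)

v = 30.0  # ~67 mph, in m/s
human = total_stop_m(v, reaction_s=1.5)  # typical-ish human reaction delay
av = total_stop_m(v, reaction_s=0.1)     # hypothetical near-instant AV
print(f"human: {human:.0f} m, AV: {av:.0f} m")  # human: 90 m, AV: 48 m
```

    Note both still carry the same 45 m of pure braking distance - faster reactions only shave the delay term.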

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • Orca Also known as Espressosaurus Wrex Registered User regular
    HamHamJ wrote: »
    A bigger reason we won't see fully automated roads any time soon is that the half-life of existing cars is multiple decades. They are not going to all get replaced overnight.

    Also that the tech is nowhere near ready, even in summer months.

    Waymo is the only one that has demonstrated anything remotely usable and it's still pretty limited.

    I'm personally deeply skeptical the tech is going anywhere in the near future. Liability laws need to catch up even after the tech has, and it needs to deal with the thousands of everyday interactions with human beings signalling and not signalling to each other--not a small feat. And that's before you get into adverse weather, construction accidents, etc.

    Will it arrive eventually? Yes. In the next 5 years? Uh, doubtful. 10 years? Maybe we'll see the first real commercial efforts.

    I could see efforts to start installing beacons in the road and other self-driving affordances, but then that also pushes the timeline since infrastructure like that takes a long time.

    And like was said up-thread, increasing availability is going to increase traffic. It's not going to make it faster to get anywhere. It may make it safer and more like a personal bus.

  • spool32 Contrary Library Registered User regular
    edited November 2021
    Test drove, and then put the reserve on a Tesla Model Y last week. Looks like we'll take delivery in March

    I am excited! Already love our hybrid, this should be even better. Getting the range model, grey/white, and paying the monthly for "full-autonomy" whenever we want to go a long way.

  • spool32 Contrary Library Registered User regular
    Feral wrote: »
    This isn't rigorous, but in my experience, most humans follow too closely. When I use the 3 second rule in my driving, I get road ragers passing me on the right - even if I'm using my car's adaptive cruise control to match the speed of the car in front of me.

    If my impression is correct, and most humans do follow too closely, then we should expect AVs to, on average, increase following distance rather than decrease it.

    That's not why I think AVs = more traffic, though. It's just a casual observation. I'll write up an explanation tomorrow if this thread gets more traffic (pun intended)

    I'm very curious to see why you think AVs = more traffic. I don't understand why the driver being autonomous would change much in terms of who is on the road, and AVs should increase traffic efficiency as the density increases.

  • Polaritie Sleepy Registered User regular
    spool32 wrote: »
    Test drove, and then put the reserve on a Tesla Model Y last week. Looks like we'll take delivery in March

    I am excited! Already love our hybrid, this should be even better. Getting the range model, grey/white, and paying the monthly for "full-autonomy" whenever we want to go a long way.

    See, that's what I mean about Tesla's labeling being dumb. I don't recall "full-autonomous" actually being anything of the sort.

    Steam: Polaritie
    3DS: 0473-8507-2652
    Switch: SW-5185-4991-5118
    PSN: AbEntropy
  • spool32 Contrary Library Registered User regular
    Heffling wrote: »
    I think autonomous vehicles will also be held to higher standards because the vehicle manufacturers are taking on a much greater liability with self-driving programs. I think Tesla advises their drivers not to take their hands off the steering wheel not because there are studies showing that accident rates are reduced by doing so, but because if they can demonstrate that the person wasn't paying attention via steering wheel sensors, they can place all liability on the driver.

    Once it is better than the average person, every bit of better you wait for just kills extra people every year till you get there.

    AVs being 1% better than people would have prevented about 52,000 accidents and saved 361 lives in 2019. This is unambiguously good!
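    Working backward from the quoted numbers, a quick arithmetic check (the implied baselines, ~36,100 deaths and ~5.2 million accidents, are in the right ballpark for US 2019 totals):

```python
# The arithmetic behind the "1% better" figures above, run in reverse:
# prevented = baseline * improvement, so baseline = prevented / improvement.
deaths_prevented = 361
crashes_prevented = 52_000
improvement = 0.01  # 1% better than human drivers

print(f"implied deaths baseline:  {deaths_prevented / improvement:,.0f}")
print(f"implied crash baseline: {crashes_prevented / improvement:,.0f}")
```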

    Of course, everyone wearing a mask would have saved maybe 250,000-300,000 lives in 2020, so I don't think half the population gives one single shit about saving lives. That argument is a fucking dead letter. In 10 years the main problem with AVs won't be whether they're 1% or 5% better, it's going to be alt-right trolls intentionally fucking with them on the road and killing AEV drivers to own the libs.


    I'm completely serious about this. I believe the risk of AEVs becoming political will be a larger danger to life and limb than anything the computers ever do.

  • Smrtnik job boli zub Registered User regular
    Label wrote: »
    I oppose mandatory AI driving on a couple of grounds.

    First is Civil Liberties. I do NOT want my car's position to be automatically recorded. Especially after the last federal administration. Voting rights advocates already have enough problems.

    Second is Public Emergencies. AI driving relies on more background infrastructure to be functional. Whatever development company being on point, the update process being intact, etc.

    Thirdly, economic disadvantage. I live in a rural area. Any AI-driving developed for my area is likely to be hamstrung by the low monetary value available here. There are plenty of places like this in America, from areas even more rural than this, to impoverished neighborhoods in large cities. Doing AI driving poorly may not be the potentially lifesaving tool we think of it as.



    Note, at the beginning I said mandatory AI driving. I think if left unchecked and unregulated, companies will push to make human driving either too difficult to pursue, or outright illegal. They will be pursuing their profits in the name of "public safety." It could be AI car manufacturing companies or insurance companies, it doesn't matter. Both will have financial incentives to get human drivers off the road, and lobby for legislation mandating that.

    Meaning if/when AI driving becomes functional enough to operate, without significant pushback it will become mandatory.


    P.S. The onset of AI driving provides a great opportunity to increase the driving license safety training and standards, and this should be pursued as well.

    Rural states have so much overrepresentation in the Senate that there is no way a mandatory AI driving bill passes there.

  • spool32 Contrary Library Registered User regular
    Polaritie wrote: »
    spool32 wrote: »
    Test drove, and then put the reserve on a Tesla Model Y last week. Looks like we'll take delivery in March

    I am excited! Already love our hybrid, this should be even better. Getting the range model, grey/white, and paying the monthly for "full-autonomy" whenever we want to go a long way.

    See, that's what I mean about Tesla's labeling being dumb. I don't recall "full-autonomous" actually being anything of the sort.

    I think it's officially called "Full self-driving capability" which is also not real - plenty of roads without stripes, and to be honest I chickened out letting it try to stop at a red light during the test drive so idk if that part works on city streets or not...

    The lane change assist is cool as shit though.

    tinwhiskerstinwhiskers Registered User regular
    edited November 2021
    OTR trucking is where the first AV stuff is going to take hold.

    The vehicles have high utilization, so you aren't paying for a sensor/software system that only runs 500 hours a year or w/e. With regulation changes they could also allow for more running time since the computer doesn't need rest breaks and sleep.
    They drive mostly on highways, the easiest roadway type for AVs.
    The drivers they can replace are costly, compared to cab drivers and the like, who make very little.

    tinwhiskers on
    SleepSleep Registered User regular
    I doubt we hit any kind of mandatory AVs before we hit societal collapse. Like, we'd need the used cars people are getting to come with compatible systems, and that won't be a reality for 20 or 30 years at least, till having a driving AI is kinda like having a radio or cruise control.

    DarkPrimusDarkPrimus Registered User regular
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    mrondeaumrondeau Montréal, CanadaRegistered User regular
    AV would do nothing for most of the problems with cars and car-centric development. They won't be smaller, they won't be lighter, they will be just as noisy, they will still take public space away from people, and they will still externalize the costs, inconvenience, and danger away from the users.

    The way forward is to discourage cars and encourage other means of transportation, in particular walking, through good city design and investment in rail and public transportation.

    kimekime Queen of Blades Registered User regular
    DarkPrimus wrote: »
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    The main difference with stopping is that you won't rear-end the car in front of you. You could follow very closely behind them, and as long as they have time to stop, you have time to stop. This is specific to AVs; it's very much not the case with human drivers.

    Now, that effect doesn't really come into play with something unexpected appearing in front of you. If a biker or runner suddenly crosses the street from an obscured viewpoint and you need to stop quickly, the AV would still do it better, but you're right that physics starts to really come into play then.

    But if we take a moment to assume "good" conditions on a highway? You could be going like, 100mph with 10 feet in between you and the car in front of you, and that's totally fine for an AV. A theoretical future AV that works well, not what we have now. That's kind of what I mean when I'm saying that the stopping distance calculations would change. It's unlikely it'll change to be that extreme anytime soon, but hopefully the idea here is coming across. AVs are not ready for that now, again. But they could do more than humans in that direction.
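    The reaction-time argument above can be sanity-checked with the standard stopping-distance formula. This is a back-of-the-envelope sketch; the deceleration and reaction-time numbers are illustrative assumptions, not from any real AV spec:

```python
# Rough stopping-distance sketch: total distance = reaction distance
# plus braking distance v^2 / (2a). All numbers are illustrative only.

MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mps, reaction_s, decel_mps2=7.0):
    """Distance covered from hazard appearing to full stop."""
    reaction_dist = speed_mps * reaction_s            # travelled before braking starts
    braking_dist = speed_mps ** 2 / (2 * decel_mps2)  # physics of the stop itself
    return reaction_dist + braking_dist

v = 100 * MPH_TO_MPS  # 100 mph in m/s
human = stopping_distance_m(v, reaction_s=1.5)  # typical alert human
av = stopping_distance_m(v, reaction_s=0.1)     # hypothetical AV sensor latency

print(f"human: {human:.0f} m, AV: {av:.0f} m, difference: {human - av:.0f} m")
```

    With these assumed numbers the braking term (~143 m at 100 mph) is identical for both drivers; only the reaction term shrinks. That is exactly why the gain matters when the car ahead is also braking, but not against an obstacle that is already stopped.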

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
    spool32spool32 Contrary Library Registered User regular
    edited November 2021
    mrondeau wrote: »
    AV would do nothing for most of the problems with cars and car-centric development. They won't be smaller, they won't be lighter, they will be just as noisy, they will still take public space away from people, and they will still externalize the costs, inconvenience, and danger away from the users.

    The way forward is to discourage cars and encourage other means of transportation, in particular walking, through good city design and investment in rail and public transportation.

    It's 105+ degrees almost every day, sometimes for 60-75 days in a row, from late May to September. "Good city design" that encourages walking is just not a thing for distances that take longer than a few minutes to cover at best. Most people do not live anywhere that this idea is practical or possible as critical infrastructure.

    spool32 on
    kimekime Queen of Blades Registered User regular
    spool32 wrote: »
    mrondeau wrote: »
    AV would do nothing for most of the problems with cars and car-centric development. They won't be smaller, they won't be lighter, they will be just as noisy, they will still take public space away from people, and they will still externalize the costs, inconvenience, and danger away from the users.

    The way forward is to discourage cars and encourage other means of transportation, in particular walking, through good city design and investment in rail and public transportation.

    It's 105 degrees every day, sometimes for 60-75 days in a row. "Good city design" that encourages walking is just not a thing for distances that take longer than a few minutes to cover at best. Most people do not live anywhere that this idea is practical or possible as critical infrastructure.

    Public transit has AC though, so that should be beefed up a lot.

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
    HamHamJHamHamJ Registered User regular
    kime wrote: »
    DarkPrimus wrote: »
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    The main difference with stopping is that you won't rear-end the car in front of you. You could follow very closely behind them, and as long as they have time to stop, you have time to stop. This is specific to AVs; it's very much not the case with human drivers.

    Now, that effect doesn't really come into play with something unexpected appearing in front of you. If a biker or runner suddenly crosses the street from an obscured viewpoint and you need to stop quickly, the AV would still do it better, but you're right that physics starts to really come into play then.

    But if we take a moment to assume "good" conditions on a highway? You could be going like, 100mph with 10 feet in between you and the car in front of you, and that's totally fine for an AV. A theoretical future AV that works well, not what we have now. That's kind of what I mean when I'm saying that the stopping distance calculations would change. It's unlikely it'll change to be that extreme anytime soon, but hopefully the idea here is coming across. AVs are not ready for that now, again. But they could do more than humans in that direction.

    No, that would be incredibly unsafe. You need to assume that the car in front of you may rear-end a semi that has stopped or a huge pileup or something and come to a stop more or less instantly at any time. As such you need to follow at a distance where you can stop before hitting them if that happens. This is actually worse for an AV because it can't see things like that coming as far ahead as a human can. A safely designed AV would have a greater follow distance than your average driver.
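    The disagreement here is really about which worst case you design for. A hypothetical gap calculation (same illustrative physics numbers as above, not any real AV design policy) shows how much that assumption matters:

```python
# Minimum following gap under two different worst-case assumptions.
# Illustrative numbers only; not from any real AV design.

MPH_TO_MPS = 0.44704

def gap_needed_m(speed_mps, reaction_s, decel_mps2=7.0, lead_can_brake=False):
    """Smallest gap that still lets the follower avoid a collision."""
    reaction_dist = speed_mps * reaction_s
    if lead_can_brake:
        # If the lead car also needs v^2/(2a) to stop, both braking
        # distances cancel out and only the reaction distance matters.
        return reaction_dist
    # Worst case: the lead car stops dead (e.g. hits a stopped semi), so
    # the follower needs its entire stopping distance as clearance.
    return reaction_dist + speed_mps ** 2 / (2 * decel_mps2)

v = 100 * MPH_TO_MPS  # 100 mph in m/s
print(f"lead also brakes: {gap_needed_m(v, 0.1, lead_can_brake=True):.1f} m")
print(f"lead stops dead:  {gap_needed_m(v, 0.1):.1f} m")
```

    Under the optimistic assumption, a roughly 10-foot (3 m) gap at 100 mph is in the right ballpark (~4.5 m here); under the stop-dead assumption, the same hypothetical AV needs about 147 m of clearance.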

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
    tinwhiskerstinwhiskers Registered User regular
    DarkPrimus wrote: »
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    The vehicle in front of you doesn't stop instantaneously either though. That's why you don't need to keep hundreds of feet of space between vehicles currently.

    One potential positive of AVs (or really any more sensor-heavy vehicle) would be that they could be locked out or put into limp mode if the computer senses issues with the vehicle that make it unsafe. My commute is a parade of cars with visibly underinflated tires, messed-up suspensions, etc.

    It doesn't matter if you've got the reflexes of Michael Schumacher if your shocks are dead and your tires are bald. Then again, the US could address this by doing actual vehicle inspections like some European countries do.

    HamHamJHamHamJ Registered User regular
    DarkPrimus wrote: »
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    The vehicle in front of you doesn't stop instantaneously either though. That's why you don't need to keep hundreds of feet of space between vehicles currently.

    It absolutely can. How people currently drive is not actually safe, which is why an accident on a freeway sometimes turns into a multi-car pileup, as each next driver in line can't stop in time and slams into the car ahead.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
    spool32spool32 Contrary Library Registered User regular
    kime wrote: »
    spool32 wrote: »
    mrondeau wrote: »
    AV would do nothing for most of the problems with cars and car-centric development. They won't be smaller, they won't be lighter, they will be just as noisy, they will still take public space away from people, and they will still externalize the costs, inconvenience, and danger away from the users.

    The way forward is to discourage cars and encourage other means of transportation, in particular walking, through good city design and investment in rail and public transportation.

    It's 105 degrees every day, sometimes for 60-75 days in a row. "Good city design" that encourages walking is just not a thing for distances that take longer than a few minutes to cover at best. Most people do not live anywhere that this idea is practical or possible as critical infrastructure.

    Public transit has AC though, so that should be beefed up a lot.

    If you closed all of downtown Austin, just shut down a 30x30 block grid and turned it all into air-conditioned public transit, it would not solve even half the cases in which someone needs a car in this city, or any other city. Even the cities with the best public transit are largely ones where most people who don't own cars went without because the expense outweighed the inconvenience. Also, all those cities are in the northern parts of the country, where it's not 115+F on the sidewalk outside.

    Public transit should be beefed up all over the place! "Replace car with shoe leather" is not a practical solution outside dense city cores, and even then only in places where the climate makes it manageable.

    DoodmannDoodmann Registered User regular
    HamHamJ wrote: »
    kime wrote: »
    DarkPrimus wrote: »
    There are so very many tech "innovations" that companies are trying to push out for profit without actually testing it thoroughly in real-world conditions nor considering the legal and ethical ramifications of deploying the technology to the public - or even just to private entities. They just "disrupt" with a bunch of buzzwords and empty promises that people take as gospel, and we can only pray that legislation catches up to account for this new variable, nevermind all the chaos that comes as the limits of their highly-regulated testing parameters are exposed once real-world variables start popping up.

    Faster reaction times for AI don't affect the time and distance needed for a vehicle to come to a complete stop, but those factors do increase with speed, which I feel needs stressing to those saying that automated driving could mean faster speeds and shorter follow distances between vehicles.

    The main difference with stopping is that you won't rear-end the car in front of you. You could follow very closely behind them, and as long as they have time to stop, you have time to stop. This is specific to AVs; it's very much not the case with human drivers.

    Now, that effect doesn't really come into play with something unexpected appearing in front of you. If a biker or runner suddenly crosses the street from an obscured viewpoint and you need to stop quickly, the AV would still do it better, but you're right that physics starts to really come into play then.

    But if we take a moment to assume "good" conditions on a highway? You could be going like, 100mph with 10 feet in between you and the car in front of you, and that's totally fine for an AV. A theoretical future AV that works well, not what we have now. That's kind of what I mean when I'm saying that the stopping distance calculations would change. It's unlikely it'll change to be that extreme anytime soon, but hopefully the idea here is coming across. AVs are not ready for that now, again. But they could do more than humans in that direction.

    No, that would be incredibly unsafe. You need to assume that the car in front of you may rear-end a semi that has stopped or a huge pileup or something and come to a stop more or less instantly at any time. As such you need to follow at a distance where you can stop before hitting them if that happens. This is actually worse for an AV because it can't see things like that coming as far ahead as a human can. A safely designed AV would have a greater follow distance than your average driver.

    You're assuming the AVs wouldn't be passively talking to each other. The AV in a crash would cause every AV behind it to slow, stop, or reroute simultaneously, so you don't get a pile-up. The reaction would become predictive as you got farther from the crisis point.

    Whippy wrote: »
    nope nope nope nope abort abort talk about anime
    I like to ART
    MonwynMonwyn Apathy's a tragedy, and boredom is a crime. A little bit of everything, all of the time.Registered User regular
    Feral wrote: »
    ElJeffe wrote: »
    The nice thing about autonomous cars is that they don't have to just be cars. It can be applied to public transit, as well. Or autonomous taxis. The efficiency of an entirely or mostly AV system means not everyone has to own a car to get around, and when you ARE traveling on the road, it can be done more quickly and more cleanly.

    Why would road travel be done more quickly?

    Imagine a world where everyone knows how to fucking merge
