
[Autonomous Transportation] When the cars have all the jobs, the poor will walk the earth


  • Quid Definitely not a banana Registered User regular
    Hitting up bars is a big reason. Getting to the packed train station and not having to worry if there's parking. Not wanting to drive in the city.

    I don't like driving. If I can get out of it occasionally for a small cost then I'll absolutely opt to.

  • glimmung Registered User regular
    There are two reasons for me to use a car right now: getting to an airport or train station, and shopping. By design, my job is never more than a bike ride away.

  • Gnizmo Registered User regular
    zakkiel wrote: »
    Aioua wrote: »
    zakkiel wrote: »
    I don't see this going well in terms of riders. Initially I'm sure a lot of people will hail self-driving Ubers for the novelty. But one thing self-driving cars are definitely not known for is getting you around quickly through urban areas. I assume Uber is doing this for the data and publicity.

    Huh?

    Right now self-driving cars are known for... being tested around Google HQ.

    I'm not sure what you're trying to say here. Is it that we don't actually know how the cars act in traffic?
    Quid wrote: »
    Krieghund wrote: »
    Are passengers of autonomous cars supposed to ride in the front or back? In the driver's seat in case of emergency? What if you're drunk? You can't sit in the driver's seat.

    Currently there's a driver still present.
    zakkiel wrote: »
    I don't see this going well in terms of riders. Initially I'm sure a lot of people will hail self-driving Ubers for the novelty. But one thing self-driving cars are definitely not known for is getting you around quickly through urban areas. I assume Uber is doing this for the data and publicity.

    I can't speak for others but if I want to get somewhere quickly I drive myself. I use uber for convenience rather than speed.

    That's... weird. Why do you take an Uber when you have your own car available? I can think of niche cases like barhopping, but nothing that adds up to a significant fraction of a person's travel. Anyway, doesn't matter. The choice in September won't be between driving and autonomous. It will be between autonomous and having someone else drive you.

    Try parking downtown in even a medium sized city and it becomes obvious honestly. There is a decent sized chance you could end up paying more for a spot to park the car and/or spend a half hour trying to find a spot.

  • zakkiel Registered User regular
    edited August 2016
    Gnizmo wrote: »
    Try parking downtown in even a medium sized city and it becomes obvious honestly. There is a decent sized chance you could end up paying more for a spot to park the car and/or spend a half hour trying to find a spot.

    I live in a huge city. I'm familiar with downtown parking. But also familiar with trains. Also, even here a few hours' parking is unlikely to run you more than $30. An Uber to and from downtown costs more.

    Anyway, I think the people who really don't care if their conveyance sits through three lights because it doesn't know how to force a left turn are going to be a distinct minority. Until there's a significant price advantage to automated driving--which there won't be while human drivers have to sit at the wheel--I do not see the market for Uber's experiment.

    zakkiel on
    Account not recoverable. So long.
  • mcdermott Registered User regular
    zakkiel wrote: »
    I live in a huge city. I'm familiar with downtown parking. But also familiar with trains. Also, even here a few hours' parking is unlikely to run you more than $30. An Uber to and from downtown costs more.

    Anyway, I think the people who really don't care if their conveyance sits through three lights because it doesn't know how to force a left turn are going to be a distinct minority. Until there's a significant price advantage to automated driving--which there won't be while human drivers have to sit at the wheel--I do not see the market for Uber's experiment.

    If you live in a huge city you are familiar with cabs. Uber exists for the same reason.

    I used Uber constantly when I lived in the city, and I still use it from time to time when I travel into the city. Sure, I could walk the seventeen blocks home. Or wait 35 minutes for the next bus. Or I can tap on my phone, a car shows up, and it has me home in 10.

    And a round trip Uber was generally $20 or so.

    I've paid $15 in parking before. Or circled the blocks until I missed my reservation because there's not a single open spot. Or had to park nine blocks away, so I'm walking halfway there (and back). Oh, and now I have to find parking at home too, which in my neighborhood is nearly as hard as downtown.

    Or I leave my car where it is, and call an Uber.

    There were weeks where my car didn't move. I'd actually have to move it to avoid a ticket. But if I needed to go to the burbs? Or out of town? Great to have.

  • zakkiel Registered User regular
    I mean, if your city has both terrible parking and shitty public transportation, then yes I can see you Ubering a lot more. Although from your own description it's still cheaper to pay for parking in your city.

  • Mr Khan Not Everyone WAHHH Registered User regular
    The requirement of human drivers makes it moot in the short term, since they still have to pay them; that's where the cost of livery comes from in the long run.

    Not to be anti-labor, of course, but until they actually eliminate the driver, taking a self-driving Uber versus a regular one would just be a novelty as far as price is concerned.

  • mcdermott Registered User regular
    edited August 2016
    zakkiel wrote: »
    I mean, if your city has both terrible parking and shitty public transportation, then yes I can see you Ubering a lot more. Although from your own description it's still cheaper to pay for parking in your city.

    I feel like Seattle has decent-ish public transit. Ubering down from Cap Hill to downtown still made sense at times, because it might be either a 20-minute wait for the next bus or a 15-minute walk in the rain to a slightly quicker bus...or, again, a two-minute wait for a car at my door. And I lived on Broadway, so yes, the parking at both my origin and destination was often absolutely terrible. There may be "free" parking in the evening downtown, but it's full. So it's either $7 or $10 to park at the destination downtown, or $20 for a round-trip Uber. For that extra $10, I got to not deal with trying to repark on the street in Cap Hill (in an unrestricted space) at midnight when I got home, which generally meant either fifteen minutes of block circling, or parking eight blocks away, or both. It's a bargain.

    Public transit is great, but sometimes it just doesn't work with your schedule. That's when taxis (and by extension, Uber) come in. But I'd think that anybody who's lived in a dense urban core would understand the idea of owning a car for long/suburban trips, but doing everything in your damn power to avoid moving it otherwise. I'm like 99% sure there was a Seinfeld episode about it, even.

    I mean that, or you pay the $200 a month it costs for a legit parking spot.


    Of course, this was all before they opened the light rail station in Cap Hill. It would be much, much easier now.


    EDIT: Don't get me wrong, we used public transit a lot as well, when it worked out. But we were never afraid to call a car when it didn't. It was very, very nice to have the option.

    mcdermott on
  • Mill Registered User regular
    Where I live the road infrastructure is shit. The roads are over capacity and it doesn't help that we have a bunch of clovers. To make things worse, we have a fair number of asshole drivers that make some really bad setups worse: not understanding the zipper merge or turn signals, or that by law they do have to let people merge when they are coming off of one of the clovers because the road fucking ends. Also fun getting asshats who decide the best time to change lanes is not a mile before the area where people are merging in from another road; nope, got to save like 10 miserable seconds and try to get into the lane that others are merging into without bothering to signal. So yeah, I fully understand why most people in the city would like to have good public transit, because driving here is a miserable experience.

    If the Hampton Roads area had viable public transit that could get me to work without taking two fucking hours for a trip that only takes me 20 minutes to drive, I'd happily shell out a little more money a month for it.

    Honestly, in a properly built city, the only reason to have a car is for trips outside the area or for shopping trips where you either have items that need to be chilled or will be carrying enough that you don't want to be walking around town with it.

  • Quid Definitely not a banana Registered User regular
    Self-driving cars are now in Pittsburgh and this person wrote up a review.

    Summed up in two words: Felt normal.

  • japan Registered User regular
    https://www.theguardian.com/technology/2016/sep/15/autopilot-supplier-disowns-tesla-for-pushing-the-envelope-on-safety

    Interesting.

    It fits what I suspected at the time of the fatality, which was that Tesla was using an off-the-shelf system in a way it wasn't intended for.

  • honovere Registered User regular
    Just read an article about how, while Silicon Valley companies seem to be at the forefront of autonomous driving, the bulk of patents in that area come from Japan and Germany, mostly from large traditional car companies and suppliers.

  • Dedwrekka Metal Hell adjacent Registered User regular
    NHTSA published a Federal Automated Vehicle Policy
    It's a policy piece, so there's little regulation. They briefly call out a few things that have been brought up here, but leave them up in the air and offer no answers on the subject.

    Relevant bit
    Three reasonable objectives of most vehicle operators are safety, mobility, and legality.
    In most instances, those three objectives can be achieved simultaneously and without conflict. In some cases, achievement of those objectives may come into conflict. For example, most States have a law prohibiting motor vehicles from crossing a double yellow line in the center of a roadway. When another vehicle on a two-lane road is double-parked or otherwise blocking a vehicle’s travel lane, the mobility objective (to move forward toward an intended destination) may come into conflict with safety and legality objectives (e.g., avoiding risk of crash with oncoming car and obeying a law). An HAV confronted with this conflict could resolve it in a few different ways, depending on the decision rules it has been programmed to apply, or even settings applied by a human driver or occupant.

    Similarly, a conflict within the safety objective can be created when addressing the safety of one car’s occupants versus the safety of another car’s occupants. In such situations, it may be that the safety of one person may be protected only at the cost of the safety of another person. In such a dilemma situation, the programming of the HAV will have a significant influence over the outcome for each individual involved.

    Since these decisions potentially impact not only the automated vehicle and its occupants but also surrounding road users, the resolution to these conflicts should be broadly acceptable. Thus, it is important to consider whether HAVs are required to apply particular decision rules in instances of conflicts between safety, mobility, and legality objectives. Algorithms for resolving these conflict situations should be developed transparently using input from Federal and State regulators, drivers, passengers and vulnerable road users, and taking into account the consequences of an HAV’s actions on others.

    Basically "These are problems we've seen brought up, we don't have answers, but yours should be good".

  • Aim Registered User regular
    Dedwrekka wrote: »
    Basically "These are problems we've seen brought up, we don't have answers, but yours should be good".

    Also, let us know how you plan to solve them, and expect us to tell you to change them.

  • Quid Definitely not a banana Registered User regular
    If they're bad solutions I certainly hope so.

  • electricitylikesme Registered User regular
    Yeah there is way too much...negativity about this? Like these policy guidelines are a huge step forward for autonomous vehicles, because they represent broad support for the idea and progressive steps to allow manufacturers to get products to market with a consistent legal framework.

    It's actually really great that they don't start calling out specific ideas because the field doesn't really have technology which can inform the practicality or consequences of them.

  • Aioua Ora Occidens Ora Optima Registered User regular
    Revive!

    Relevant to the current discussion: I'm of the opinion that self-driving vehicles are going to be mainly owned and operated by the manufacturers and not individuals, which bypasses the whole "should I have the right to mod this software" thing.

  • kime Queen of Blades Registered User regular
    edited March 2018
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock-on effects spreading through the system, but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point of why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value; it should be in the settings of your car's onboard computer or something, but that is not what spool was suggesting.

    I'm just suggesting you argue against spool's actual arguments there.

    I'll offer a competing and more likely scenario: You buy a self driving car and it is fantastic and you love it and maintain it spotlessly for a decade. Now your car is woefully out of date, the manufacturer hasn't updated the control software in 5 years and you want it to conform to the latest and safest driving methodologies. Happily there is a lovely package of DD-WRT open source car driving whatever software that fully conforms with all DOT regulations and is in fact provably safer than the OEM software on the car.

    Should you be able to install it?

    I am trying to argue against spool's actual arguments. He (and you) is saying that you should be able to install whatever software you want on the car. Your suggestions for the reasons behind that are nice. But the point is that if you open up the software, then you open up the software. You can just as easily go in the other direction. There are repercussions to what you are suggesting, which is why it's a dangerous idea. Not necessarily impossible, but you have to accept what you are actually arguing for.

    Let's take your example. What ensures that the software you install conforms to all DOT regulations? Something has to enforce that, otherwise I could just as easily install "SPEED-THROUGH-EVERYONE-AND-GET-ME-WHERE-I-WANT-TO-GO-ASAP" software on my own vehicle. And if something is enforcing that it complies with regulations, then it's not fully customizable.

    Which I'm thinking is a good idea, but it's not what you seem to be arguing.

    Possible I am misunderstanding :)
    spool32 wrote: »
    Not users. Owners.

    And yes, this is exactly the problem, because you have this situation where:

    - If I can't change it, I don't own it
    - If I don't own it, I'm not liable for how it behaves
    - If I'm not liable, the maker is
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    - If I do own it, I can change it
    - If I can change it, I can change it illegally
    - If you stop me changing it, you're liable for how it operates
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."
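
If the compromise is "owners can tune behavior, but only inside a regulator-approved envelope," the mechanics are simple to sketch. This is a hedged illustration with made-up numbers, not anyone's actual limits:

```python
# Hypothetical sketch: an owner-facing setting that can only move within
# a type-approved range, so "more cautious" is allowed but "less cautious
# than the certified minimum" is not. The numbers are illustrative only.

APPROVED_MIN_FOLLOWING_GAP_S = 1.5   # assumed regulatory floor
APPROVED_MAX_FOLLOWING_GAP_S = 4.0   # assumed ceiling so the car still merges sanely

def set_following_gap(requested_s: float) -> float:
    """Clamp the owner's requested following gap to the approved envelope."""
    return min(max(requested_s, APPROVED_MIN_FOLLOWING_GAP_S),
               APPROVED_MAX_FOLLOWING_GAP_S)

print(set_following_gap(2.5))  # honored as requested: 2.5
print(set_following_gap(0.4))  # refused: clamped up to 1.5
```

That preserves the "my back is shit, brake earlier" case while making the SPEED-THROUGH-EVERYONE firmware a tampering problem rather than a settings-menu option.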

    kime on
  • tbloxham Registered User regular
    Aioua wrote: »
    Revive!

    Relevant to the current discussion: I'm of the opinion that self-driving vehicles are going to be mainly owned and operated by the manufacturers and not individuals, which bypasses the whole "should I have the right to mod this software" thing.

    I think they will be owned by the individuals, but very likely operated like a flock of units rather than every unit driving to its best individual advantage. So going on a freeway will be more like taking public transit. You'll enter your destination, and the car will open comms with the central system and schedule your route, while also updating all the other routes for optimum fuel efficiency and time. As such, and to make this system work, the auto drive systems will be heavily standardized and nearly impossible to modify. Personal repair (replacing broken modules etc) will be possible, but modification will likely make you lose the network license and be forced to drive manually.
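
For a sense of what that "flock" model could look like at the protocol level, here is a deliberately toy sketch: the car sends its origin and destination to a central coordinator, which hands back a route and a departure slot. Every name and field here is invented for illustration; no real system is being described.

```python
# Toy sketch of centralized "flock" scheduling: a vehicle requests a route,
# and the coordinator staggers departures to smooth freeway demand.
# Entirely hypothetical; not a real protocol or product.
from dataclasses import dataclass

@dataclass
class RouteRequest:
    vehicle_id: str
    origin: str
    destination: str

@dataclass
class RoutePlan:
    waypoints: list[str]
    depart_in_s: int  # coordinator delays departures to spread load

class Coordinator:
    def __init__(self) -> None:
        self.active_requests = 0

    def schedule(self, req: RouteRequest) -> RoutePlan:
        self.active_requests += 1
        # Stub routing: a real coordinator would re-optimize every active route
        # for fuel efficiency and travel time, as described above.
        return RoutePlan(
            waypoints=[req.origin, "freeway-entrance-7", req.destination],
            depart_in_s=30 * (self.active_requests - 1),
        )

coordinator = Coordinator()
plan = coordinator.schedule(RouteRequest("car-123", "home", "downtown"))
print(plan.waypoints, plan.depart_in_s)
```

Under that model, "modify the software and lose your network license" is just the coordinator declining to schedule a vehicle whose control stack it can no longer vouch for.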

    "That is cool" - Abraham Lincoln
  • kime Queen of Blades Registered User regular
    tbloxham wrote: »
    I think they will be owned by the individuals, but very likely operated like a flock of units rather than every unit driving to its best individual advantage. So going on a freeway will be more like taking public transit. You'll enter your destination, and the car will open comms with the central system and schedule your route, while also updating all the other routes for optimum fuel efficiency and time. As such, and to make this system work, the auto drive systems will be heavily standardized and nearly impossible to modify. Personal repair (replacing broken modules etc) will be possible, but modification will likely make you lose the network license and be forced to drive manually.

    There's a good amount of research going on about how to do this in a way that doesn't require a centralized system. Or at least, there was a few years ago; not sure about now. So it's not an inescapable fact that it'll have to be centralized to get those benefits.

    I think the conclusion of "you can't modify the critical central software" is still valid either way.

  • Mortious The Nightmare Begins Move to New Zealand Registered User regular
    Cont'd from the Uber thread
    spool32 wrote: »
    spool32 wrote: »
    Phyphor wrote: »
    spool32 wrote: »
    Quid wrote: »
    spool32 wrote: »
    Quid wrote: »
    spool32 wrote: »
    Quid wrote: »
    You would still be free to modify your car to your desire. It would just no longer be legal to take it on public roads, like various other modified vehicles.

    This is getting a little far afield, but "you can modify your car but you can't drive it on a street anymore" is equivalent to "you can't modify your car". Have a look at efforts to secure The Right to Repair for more info.

    I'm cool with that.

    I'm super not! I'd like to be able to put an aftermarket GPS in my car without bricking the onboard computer, and I'd like to be able to change the oil without needing a trip to the garage so they can charge me $200 to attach a wire and click "approved".

    I'd like for people to not modify their cars in ways that endanger others. The things you're listing as concerns are not "people hacking their car to get them through traffic faster". They still could. That they'd be banned from driving on public roads is fine.

    So I should have access to the onboard computer for good things, but I should also not have access to it in case I do bad things. What we're talking about here is the ability to do something vs. the legality of doing it.

    Maybe we're not disagreeing? It's just that when you add software, you add hacking. If you lock down the equipment to prevent hacking - when the hacker has physical access to the equipment - you severely reduce the ability of benign users to make desirable and legal changes.

    What desirable and legal changes to self-driving software can you possibly make?

    Maybe you want a little more stopping distance programmed in because your back is shit and the car brakes too hard for you?

    Maybe you bought cutting-edge LIDAR in the 4th year of owning the car, but the software doesn't support it without installing a patch validated by the manufacturer, which you can't install without visiting the dealer even though it's just Bluetooth.

    I could go on and on! Maybe you want the new HUD but the software won't display at a resolution the OLED manufacturer says the display supports. Maybe a stick of RAM failed and you want to replace it yourself. Maybe you cancelled your Verizon HUM driving monitor subscription but you own the hardware now, still want to get info from it, and you know it's sending out signals but can't read them anymore. Maybe you put snow tires on and can't get the damned thing to recognize them without paying the dealer $250 for a service call where he just logs in remotely and ticks the "snow tire" box.

    I could go on. There's a hundred legit reasons to modify the software in a car.

    As a software engineer, those are weak reasons to allow altering of the onboard electronics. The braking distance may be a config parameter at best, but really that should be based on the conditions of the road anyway and not the comfort of the passenger, once various base safety levels are met. I absolutely DO NOT want people to change out RAM on their car like it's a PC. Hell, the RAM won't even be able to be replaced like a PC; it is almost always embedded on the board in one way or another.
    Autonomous will trend towards aerospace design rather than PC design. Hell, it is part of the reason why ISO 26262 exists.

    Fundamental difference: I own the car. The desires of a corporate software engineer to control what I do with my property are, to put it mildly, irrelevant.

    Any situation which puts me in the position of being in a personal mass transit vehicle had better come with a similar cost, i.e. a couple of bucks a day. If I'm paying $50,000 for this car the only thing I should have to do is follow the law, and it needs to be possible for me to do that through my own ability.

    And then you can drive it on the roads that you own, or you have permission to drive your modified vehicle on. Not public roads though.

    Lots of rights get curbed for the greater societal benefit. I can see self driving cars cut down on road deaths and congestion and a bunch of other things that'll improve QoL.

    How many people actually do things to their cars like that now? With my previous cars I changed the oil and cleaned the spark plugs etc., but with my new electric car there's just a box where the engine would be.

    Even if I wanted to mess with it, I'll just end up breaking it.

  • tbloxham Registered User regular
    Mortious wrote: »
    And then you can drive it on the roads that you own, or you have permission to drive your modified vehicle on. Not public roads though.

    Lots of rights get curbed for the greater societal benefit. I can see self driving cars cut down on road deaths and congestion and a bunch of other things that'll improve QoL.

    How many people actually do things to their cars like that now? With my previous cars I changed the oil and cleaned the spark plugs etc., but with my new electric car there's just a box where the engine would be.

    Even if I wanted to mess with it, I'll just end up breaking it.

    And all that right to repair laws mean is that the company can't deliberately install limits on what you could yourself easily do. Like, they can't have some sensor which stops you inflating your own tires. And they have to provide third party repair shops with the manuals, and sell the requisite parts, for them to make repairs.

    So, when you are in your Twitter Monocar in 2031 you won't be able to do anything but change the wiper fluid at home, but you can take it down to Crazy Zuckerberg's Discount Car Service and Facebook Headquarters for a replacement LIDAR system.

    "That is cool" - Abraham Lincoln
  • spool32 Contrary Library Registered User regular
    kime wrote: »
    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."

    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

  • Phyphor Building Planet Busters Tasting Fruit Registered User regular
    kime wrote: »
    I'll offer a competing and more likely scenario: You buy a self driving car and it is fantastic and you love it and maintain it spotlessly for a decade. Now your car is woefully out of date, the manufacturer hasn't updated the control software in 5 years and you want it to conform to the latest and safest driving methodologies. Happily there is a lovely package of DD-WRT open source car driving whatever software that fully conforms with all DOT regulations and is in fact provably safer than the OEM software on the car.

    Should you be able to install it?

    The software won't exist, and it won't be provably safer. It's a problem of money. Nobody will insure you if you run custom self-driving software (and you'll probably be considered at fault for anything that happens), and nobody will insure you for running open source software either, largely because it's both unproven and possible to modify. To even get off the ground you need lots of specialized people working for years just to write a first version. To write a version for a car without a steering wheel to fall back on, well, it's nearly 10 years in and we're not 100% there yet. Without people running the software in the real world, the only way to do any testing is in simulation, which the insurance companies will still just shrug at, and the few people willing to self-insure and accept full liability for everything won't be able to drive enough miles by themselves to matter.

    Now, the manufacturer being mandated to provide control updates for the working lifetime of the car, that's a thing that may happen.
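
The "provably safer" part has a well-known statistical wrinkle: demonstrating a very low failure rate from driving alone takes an enormous number of miles. A rough back-of-the-envelope, assuming failures are independent rare events and using the rule of three for a 95% upper confidence bound with zero observed failures (the target rate below is an illustrative assumption, not anyone's published figure):

```python
# Rule of three: with zero failures in N independent trials, the 95%
# upper confidence bound on the failure rate is roughly 3 / N.
# So to support "at most one fatal failure per 100 million miles"
# you need on the order of 300 million failure-free miles.

target_rate = 1 / 100_000_000      # assumed: one fatality per 100M miles
miles_needed = 3 / target_rate     # rule-of-three bound
print(f"{miles_needed:,.0f} failure-free miles needed")  # 300,000,000
```

Which is why a hobbyist project, or simulation on its own, has a hard time making a "provably safer" claim that an insurer would accept.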

  • tsmvengy Registered User regular
    Honestly if AVs are not owned and operated in fleets and as shared vehicles it's going to be a complete and total disaster for cities. We have horrible traffic now when each car is carrying, on average, 1.3 people. What happens when we have vehicles with the ability to drive around empty?

    https://youtu.be/DeUE4kHRpEk

    Robin Chase is the former CEO of Zipcar

  • AngelHedgie Registered User regular
    There was an attempt to create an "open source" autonomous vehicle system about a year or so back. It completely collapsed when the DoT demanded that it be tested and regulated like any other system.

  • lunchbox12682 Minnesota Registered User regular
    As I mentioned with ISO 26262 and aerospace in the Uber thread, I don't doubt for a second there will be an FAA equivalent (maybe the FTC or DOT or whoever) that mandates the necessary standards for AVs. Maybe this is only for Federal highways, but it will either trickle down to states or the states will just create similar standards that spread, like California with emissions standards.
    If you want your vehicle on the road, then it must be street legal.
    Regarding obsolescence, which I think is a valid point, I would guess part of the laws will push manufacturers to maintain software for X number of years. Of course, I think there will be dedicated lanes for autonomous vehicles as adoption builds up over the decades.

  • shryke Member of the Beast Registered User regular
    edited March 2018
    Her point about zombie cars is a pretty good one, even if I don't really agree with a lot of the rest of it.

    If your car can drive on its own, why pay for parking ever?

    shryke on
  • mRahmani Detroit Registered User regular
    Mortious wrote: »
    And then you can drive it on the roads that you own, or you have permission to drive your modified vehicle on. Not public roads though.

    Lots of rights get curbed for the greater societal benefit. I can see self driving cars cut down on road deaths and congestion and a bunch of other things that'll improve QoL.

    How many people actually do things to their cars like that now? With my previous cars I changed the oil and cleaned the spark plugs etc., but with my new electric car there's just a box where the engine would be.

    Even if I wanted to mess with it, I'll just end up breaking it.

    People modify cars and flash ECMs for performance gains all the time. HPTuners, for example, is one of many products in that area. These tunes typically adversely affect emissions and fuel economy to get the increased performance, but short of mandated tests there's not a whole lot that can be done to enforce compliance. Even in places that have annual inspections, people typically just flash back to the factory calibration to get their sticker and then go right back to their performance flash.
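
One way the cat and mouse could play out differently with autonomous stacks is cryptographic enforcement: the control computer can refuse to enable self-driving features for any image that is not signed by an approved key, a check an annual sticker inspection can't replicate. A hypothetical sketch of that idea (using the real `cryptography` library, but not describing how any actual ECM works):

```python
# Hypothetical sketch: only enable autonomous mode if the flashed firmware
# image verifies against an approved signing key. Illustration only.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

approved_key = Ed25519PrivateKey.generate()   # stands in for the OEM/regulator key
public_key = approved_key.public_key()        # what the vehicle would ship with

firmware = b"factory calibration v1.2"
signature = approved_key.sign(firmware)

def autonomous_mode_allowed(image: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, image)         # raises if the image was reflashed
        return True
    except InvalidSignature:
        return False

print(autonomous_mode_allowed(firmware, signature))                   # True
print(autonomous_mode_allowed(b"performance tune v9000", signature))  # False
```

That doesn't stop anyone from flashing whatever they like; it just turns "run it on public roads in autonomous mode" into a question of who holds the signing keys, which loops back to the ownership argument above.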

  • tbloxham Registered User regular
    spool32 wrote: »
    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    Those things are all governed by the current laws though. And they don't say that you can just do whatever you want. They just say that the company can't arbitrarily stop you from doing things which are easy (your snow tire sensor example), and they can't stop you from making repairs at capable third-party repair shops (or yourself, if you can do it yourself, which is incredibly unlikely, because humans and robots are good at repairing things in different ways).

    So to your examples I'd say

    Snow tire sensor -> Right to repair. Need to allow consumers and third parties to respond and toggle.

    Replace broken camera -> Right to repair. Need to allow consumers and third parties to fix providing they follow the manuals.

    Replace camera with better one (while manufacturer supports the vehicle) -> If the manufacturer offers the camera replacement, then right to repair laws mean they need to allow consumers and third parties to modify providing they follow the manuals and use the right parts (which they have to sell at a fair price)

    Replace camera with better one (5 years after manufacturer stopped supporting the vehicle) -> Here's a big grey area right now. You can't just do whatever you want. The government would likely have to offer some kind of inspection system so you could use 'new' parts.

    Buy it secondhand -> A self driving vehicle is a driver. As such, when you buy the vehicle, the 'driving license' comes with it. So your contract is effectively with the government, not the manufacturer. And the government is perfectly allowed to force you to sign contracts to drive. And that contract would say, "You must repair and maintain your automatic driving system according to standard X."

    "That is cool" - Abraham Lincoln
  • kime Queen of Blades Registered User regular
    tsmvengy wrote: »
    Honestly if AVs are not owned and operated in fleets and as shared vehicles it's going to be a complete and total disaster for cities. We have horrible traffic now when each car is carrying, on average, 1.3 people. What happens when we have vehicles with the ability to drive around empty?

    https://youtu.be/DeUE4kHRpEk

    Robin Chase is the former CEO of Zipcar

    Maybe nothing. I didn't watch the video, not able to right now, but I could imagine a world with more parking available, more efficient driving, and 1 car being shared among multiple people (in a shared vehicle/fleet situation) improving city traffic pretty easily. It's not like the fleet of cars will just be continuously driving around in addition to all the cars that would otherwise be there with people in them.

    tbh though, I would personally accept worse traffic and longer travel times if I could read/sleep during transit (I already do when I can commute via bus, which I prefer). I hope that's not the tradeoff we have to make, though.

  • Mortious The Nightmare Begins Move to New Zealand Registered User regular
    spool32 wrote: »
    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    - If I can't change it, I don't own it
    - If I don't own it, I'm not liable for how it behaves
    - If I'm not liable, the maker is
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    - If I do own it, I can change it
    - If I can change it, I can change it illegally
    - If you stop me changing it, you're liable for how it operates
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."

    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    I agree with the snow tire example, but not the camera one.

    I'm fine with requiring that limitations on the owner's ability to modify their vehicle be justified, but that doesn't preclude such limitations.

    Move to New Zealand
    It’s not a very important country most of the time
    http://steamcommunity.com/id/mortious
  • Options
    kimekime Queen of Blades Registered User regular
    Mortious wrote: »
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    I'm just suggesting you argue against spool's actual arguments there.

    I'll offer a competing and more likely scenario: You buy a self driving car and it is fantastic and you love it and maintain it spotlessly for a decade. Now your car is woefully out of date, the manufacturer hasn't updated the control software in 5 years and you want it to conform to the latest and safest driving methodologies. Happily there is a lovely package of DD-WRT open source car driving whatever software that fully conforms with all DOT regulations and is in fact provably safer than the OEM software on the car.

    Should you be able to install it?

    I am trying to argue against spool's actual arguments. He (and you) is saying that you should be able to install whatever software you want on the car. Your suggestions for the reasons behind that are nice. But the point is that if you open up the software, then you open up the software. You can just as easily go in the other direction. There are repercussions to what you are suggesting, which is why it's a dangerous idea. Not necessarily impossible, but you have to accept what you are actually arguing for.

    Let's take your example. What ensures that the software you install conforms to all DOT regulations? Something has to enforce that, otherwise I could just as easily install "SPEED-THROUGH-EVERYONE-AND-GET-ME-WHERE-I-WANT-TO-GO-ASAP" software on my own vehicle. And if something is enforcing that it complies with regulations, then it's not fully customizable.

    Which I'm thinking is a good idea, but it's not what you seem to be arguing.

    Possible I am misunderstanding :)
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    Not users. Owners.

    And yes, this is exactly the problem, because you have this situation where:

    - If I can't change it, I don't own it
    - If I don't own it, I'm not liable for how it behaves
    - If I'm not liable, the maker is
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    - If I do own it, I can change it
    - If I can change it, I can change it illegally
    - If you stop me changing it, you're liable for how it operates
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."

    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    I agree with the snow tire example, but not the camera one.

    I'm fine with limitations on the owner's ability to modify their vehicle needing to be justified, but that doesn't preclude those limitations.

    A camera could be fine, if the replacement is done by a professional and is an approved model for your vehicle. Which, yes, means you don't get 100% control over something you "own," but I think we can pretty easily get to a happy medium.

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • Options
    MortiousMortious The Nightmare Begins Move to New Zealand Registered User regular
    kime wrote: »
    Mortious wrote: »
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    I'm just suggesting you argue against spool's actual arguments there.

    I'll offer a competing and more likely scenario: You buy a self driving car and it is fantastic and you love it and maintain it spotlessly for a decade. Now your car is woefully out of date, the manufacturer hasn't updated the control software in 5 years and you want it to conform to the latest and safest driving methodologies. Happily there is a lovely package of DD-WRT open source car driving whatever software that fully conforms with all DOT regulations and is in fact provably safer than the OEM software on the car.

    Should you be able to install it?

    I am trying to argue against spool's actual arguments. He (and you) is saying that you should be able to install whatever software you want on the car. Your suggestions for the reasons behind that are nice. But the point is that if you open up the software, then you open up the software. You can just as easily go in the other direction. There are repercussions to what you are suggesting, which is why it's a dangerous idea. Not necessarily impossible, but you have to accept what you are actually arguing for.

    Let's take your example. What ensures that the software you install conforms to all DOT regulations? Something has to enforce that, otherwise I could just as easily install "SPEED-THROUGH-EVERYONE-AND-GET-ME-WHERE-I-WANT-TO-GO-ASAP" software on my own vehicle. And if something is enforcing that it complies with regulations, then it's not fully customizable.

    Which I'm thinking is a good idea, but it's not what you seem to be arguing.

    Possible I am misunderstanding :)
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    Not users. Owners.

    And yes, this is exactly the problem, because you have this situation where:

    - If I can't change it, I don't own it
    - If I don't own it, I'm not liable for how it behaves
    - If I'm not liable, the maker is
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    - If I do own it, I can change it
    - If I can change it, I can change it illegally
    - If you stop me changing it, you're liable for how it operates
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."

    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    I agree with the snow tire example, but not the camera one.

    I'm fine with limitations on the owner's ability to modify their vehicle needing to be justified, but that doesn't preclude those limitations.

    A camera could be fine, if it's done by a professional and is an approved model for your vehicle. Which, yes, means you don't get 100% control over something you "own," but I think we can pretty easily get a happy medium.

    His example was for any owner to be able to do it at home, and without having to get the vehicle re-approved for road use.

    Move to New Zealand
    It’s not a very important country most of the time
    http://steamcommunity.com/id/mortious
  • Options
    AiouaAioua Ora Occidens Ora Optima Registered User regular
    I imagine a future where self-driving cars aren't just a service, but able to fulfill multiple service roles.

    Like, the car itself is basically a low-to-the-ground flatbed, and it has different cabins or attachments for different jobs.

    At the low end you just hail a car and it rolls up with a company-provided passenger cabin, takes you where you wanna go, and maybe you share rides with a stranger to make it even cheaper. The car can work like that all day long.

    The natural objections to this come from the use cases a private car covers today: you want to store things in your car, you want a nice, clean interior, etc. So your middle-class types could buy their own cabin that attaches to the car. It sits in your driveway or the parking lot at work while the car itself stays out on the road being useful. Maybe the car heads back to the hub to pick up a delivery-van module and spends the working hours working instead of idling in a parking lot.

    If the cars spend the majority of their time doing actual work instead of being parked, the cost per ride should be fairly low, which I think would drive personal ownership out. If you only need to pay for 1/10th of a car, it's gonna be hard for people selling you the whole car to compete.
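
    Rough arithmetic on the "1/10th of a car" point, just to show the shape of the comparison. All the numbers below are assumptions for illustration, not fleet data:

        # Illustrative assumptions only.
        CAR_TOTAL_COST = 40_000.0        # purchase plus lifetime fixed costs (insurance, parking, etc.)
        LIFETIME_TRIPS_PRIVATE = 5_000   # a privately owned car: a couple of trips a day for years
        LIFETIME_TRIPS_FLEET = 50_000    # a fleet car working most of the day instead of sitting parked

        cost_per_trip_private = CAR_TOTAL_COST / LIFETIME_TRIPS_PRIVATE   # ~$8.00 of car cost per trip
        cost_per_trip_fleet = CAR_TOTAL_COST / LIFETIME_TRIPS_FLEET       # ~$0.80 of car cost per trip

        print(f"private: ${cost_per_trip_private:.2f}/trip, fleet: ${cost_per_trip_fleet:.2f}/trip")

    In reality wear scales with miles rather than with age, so the saving mostly comes from spreading the fixed costs (purchase, insurance, parking) over far more trips; the "1/10th" figure is about utilization, not magic.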

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Options
    AngelHedgieAngelHedgie Registered User regular
    tsmvengy wrote: »
    Honestly if AVs are not owned and operated in fleets and as shared vehicles it's going to be a complete and total disaster for cities. We have horrible traffic now when each car is carrying, on average, 1.3 people. What happens when we have vehicles with the ability to drive around empty?

    https://youtu.be/DeUE4kHRpEk

    Robin Chase is the former CEO of Zipcar

    Hey, it's the former CEO of a company which is pushing to make private ownership of urban autonomous vehicles illegal.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Options
    kimekime Queen of Blades Registered User regular
    Mortious wrote: »
    kime wrote: »
    Mortious wrote: »
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    I'm just suggesting you argue against spool's actual arguments there.

    I'll offer a competing and more likely scenario: You buy a self driving car and it is fantastic and you love it and maintain it spotlessly for a decade. Now your car is woefully out of date, the manufacturer hasn't updated the control software in 5 years and you want it to conform to the latest and safest driving methodologies. Happily there is a lovely package of DD-WRT open source car driving whatever software that fully conforms with all DOT regulations and is in fact provably safer than the OEM software on the car.

    Should you be able to install it?

    I am trying to argue against spool's actual arguments. He (and you) is saying that you should be able to install whatever software you want on the car. Your suggestions for the reasons behind that are nice. But the point is that if you open up the software, then you open up the software. You can just as easily go in the other direction. There are repercussions to what you are suggesting, which is why it's a dangerous idea. Not necessarily impossible, but you have to accept what you are actually arguing for.

    Let's take your example. What ensures that the software you install conforms to all DOT regulations? Something has to enforce that, otherwise I could just as easily install "SPEED-THROUGH-EVERYONE-AND-GET-ME-WHERE-I-WANT-TO-GO-ASAP" software on my own vehicle. And if something is enforcing that it complies with regulations, then it's not fully customizable.

    Which I'm thinking is a good idea, but it's not what you seem to be arguing.

    Possible I am misunderstanding :)
    spool32 wrote: »
    kime wrote: »
    kime wrote: »
    Changing the stopping distance seems like one of the most important things that you don't let people change. That's probably really not the best example to support your point :(. As in, it's a really bad example :P

    You don't let people remove the safety features that autonomous vehicles would get you.

    Spool's example was to make it more cautious, not less.

    It occurs to me that differing values could have knock on effects spreading through the system but the infrastructure for cars to talk to cars about their settings is another issue entirely.

    If you can hack the code to make it more cautious, you can also make it less cautious. That's the whole point for why people are saying these things shouldn't be editable by users.

    If you are suggesting that people could only change the value in a certain range, well then sure! That's just a configurable value, it should be in the settings of your cars onboard computer or something, but that is not what spool was suggesting.

    Not users. Owners.

    And yes, this is exactly the problem, because you have this situation where:

    - If I can't change it, I don't own it
    - If I don't own it, I'm not liable for how it behaves
    - If I'm not liable, the maker is
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    - If I do own it, I can change it
    - If I can change it, I can change it illegally
    - If you stop me changing it, you're liable for how it operates
    - If corporations don't want the liability, they won't make it

    Therefore, no self-driving cars.

    I agree there is a liability problem here. Dunno the best solution for that.

    I think that letting people do what they want with their autonomous vehicles is a bad idea. Like, "do what you want, but comply with the laws and regulations, which limit what you can do to the software/hardware that keeps things safe."

    There has to be some limit on restriction, or we end up in a situation where, like I've described, you have to pay the Ford dealer $250 to log into your car and activate snow tire mode, and you can't replace your cameras with better ones halfway through the car's life.

    Or like, buy it secondhand without signing the software license agreement.

    Or dozens of other things!

    I agree with the snow tire example, but not the camera one.

    I'm fine with limitations on the owner's ability to modify their vehicle needing to be justified, but that doesn't preclude those limitations.

    A camera could be fine, if it's done by a professional and is an approved model for your vehicle. Which, yes, means you don't get 100% control over something you "own," but I think we can pretty easily get a happy medium.

    His example was for any owner to be able to do it at home, and without having it to be re-approved for road use.

    Yeah, some of the examples spool has given go way too far imo. But on the flip side, there's no safety reason you can't have some customizability with reasonable restrictions.
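
    "Customizability with reasonable restrictions" can be as simple as a tunable value that gets clamped to a regulator-approved range, so an owner can make the car more cautious but can't push it below a safety floor. A minimal sketch, with made-up numbers and names (the 1.5-3.0 second range and set_following_gap are assumptions for illustration, not anything from a real vehicle):

        # Regulator-approved bounds for the following-gap setting, in seconds.
        # Both numbers are made up for illustration.
        MIN_FOLLOWING_GAP_S = 1.5   # safety floor: the owner can't go below this
        MAX_FOLLOWING_GAP_S = 3.0   # upper bound so one car doesn't snarl traffic behind it

        def set_following_gap(requested_gap_s: float) -> float:
            """Accept an owner preference, but clamp it to the approved range."""
            return min(max(requested_gap_s, MIN_FOLLOWING_GAP_S), MAX_FOLLOWING_GAP_S)

        # Owner asks for an aggressive 0.5 s gap; the car applies the 1.5 s floor instead.
        assert set_following_gap(0.5) == MIN_FOLLOWING_GAP_S
        # Owner asks for a cautious 2.5 s gap; that's inside the range, so it's honored.
        assert set_following_gap(2.5) == 2.5

    That's the difference between "editable by owners" and open season on the control software: the knob is real, but the range it can take is part of what gets certified.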

    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • Options
    AiouaAioua Ora Occidens Ora Optima Registered User regular
    that video is the first I've heard of the zombie car problem*, which is interesting to consider even if the source might have its own agenda

    *(for those unable to watch the video: if AVs are mostly owned by individuals, there's an incentive for the AVs to spend much of their time driving around with no occupant at all, to avoid parking costs. Why pay for parking downtown when your car can drive back home and then pick you up after work? Even worse, spending an hour at the store? Just have the car circle the block. Etc.)

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • Options
    tbloxhamtbloxham Registered User regular
    shryke wrote: »
    Her point about zombie cars is a pretty good one, even if I'm not really agreeing with lots of the rest of it.

    If your car can drive on its own, why pay for parking ever?

    Her point about zombie cars is stupid. Even driving an electric car around and around is expensive compared to parking it. Parking, even in an expensive city, is ~$4 an hour at a meter, maybe $6 in a lot. A car circling the block costs a lot more than that. Driving the car home and then having it come back is also stupid, because people are AWFUL at planning their time and will forget to summon their car. The whole reason they're in a car in the first place is that they hate planning their time and love being spontaneous. If they liked planning, they'd be on a bus.
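
    For anyone who wants to sanity-check that, the comparison boils down to the parking rate per hour versus empty miles driven per hour times the all-in cost of an empty mile. A quick sketch with assumed numbers (the 10 mph crawl and the $5/hour rate are guesses for illustration; plug in your own):

        # Back-of-envelope: pay for parking, or circle the block empty?
        # Every number here is an assumption for illustration.
        parking_cost_per_hour = 5.00    # rough downtown rate, per the $4-6/hr figures above
        circling_speed_mph = 10.0       # crawling around dense downtown blocks

        # All-in cost per empty mile (energy + wear + depreciation) at which the two options tie:
        break_even_cost_per_mile = parking_cost_per_hour / circling_speed_mph
        print(f"Circling beats parking only if an empty mile costs under ${break_even_cost_per_mile:.2f}")
        # -> $0.50/mile with these inputs; how lopsided the comparison is depends almost
        #    entirely on what you think an empty urban mile actually costs.

    And that's before counting the congestion a street full of circling empty cars would create, which is the real zombie-car worry.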

    What autonomous cars will lead to is ULTRA high density parking structures, where every vehicle can move at the same time to let out vehicles that would otherwise be trapped inside. Or possibly the use of slightly more remote parking lots, say a mile from your desired destination, where the vehicles go and can be back to you in 5 minutes.

    Her point about car sharing IS valuable, though it's far more inevitable than she thinks. Car sharing is much cheaper than ownership and will appeal to many people (the fraction who like planning their trips in advance). So the number of trips will fall (making parking even cheaper, and reducing zombie cars still further). We don't need special legislation to make that happen.

    "That is cool" - Abraham Lincoln
  • Options
    Fleur de AlysFleur de Alys Biohacker Registered User regular
    Don't we have a lot of precedent for this already? Like, there's a zillion things you can legally do to modify your vehicle, and twice a zillion things that would make it no longer street-legal. You can upgrade your intake, but you can't put flashing colored lights under the bottom (in most places). You can tint your windows, but you can't tint your windshield.

    Technology is more complicated, sure. It'll take longer to get all the regulations right, and we probably need some serious experts helping put together the list of cans and can'ts, along with frequent revisions as tech adapts and problems are uncovered.

    But it seems manageable? As always, some rules and regulations will greatly anger certain people (and occasionally those people will even be right in their complaints), and we'll miss some others that will probably wind up getting someone hurt or killed. But that's just the way of these things, really.

    Triptycho: A card-and-dice tabletop indie RPG currently in development and playtesting