
Like a centipede waiting for the other shoe to drop in [The Economy] thread


Posts

  • Commander Zoom Registered User regular
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

  • moniker Registered User regular
    Commander Zoom wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

  • AngelHedgie Registered User regular
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

  • shryke Member of the Beast Registered User regular
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

    That makes it sound like an outside force. The Federalist Society was part of the legal community from the start. They were founded by conservative members of the legal community to, like all these right-wing organizations, organize and coordinate their right-wing strategy.

  • Goumindong Registered User regular
    edited November 7
    Yes and no. It's been a common critique of free market capitalism and there have been models that suggest it's so (hell, I had even built a simple one), but it's very much not "accepted"*. I don't think this is going to move the needle.

    *A lot of this is just the power of the free-market school of thought. Many economists probably have an inkling but haven't formally examined it, if that makes sense, and many will reject it out of hand due to dogma.

    Edit: This was in response to Doodmann

  • Butters A glass of some milks Registered User regular
    shryke wrote: »
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

    That makes it sound like an outside force. The Federalist Society was part of the legal community from the start. They were founded by conservative members of the legal community to, like all these right-wing organizations, organize and coordinate their right-wing strategy.

    It was founded by conservative lawyers, but they had essentially zero influence until billionaire political donors started up and funded "law and economics" programs and used the Federalist Society as their networking engine. They had almost no influence in the American legal community before they started getting Koch money.

  • schuss Registered User regular
    Yep, capital naturally accumulates unless you confiscate it at death. This is because those with more capital can dictate terms on any number of things.
    Also, anyone decrying slowing growth should be highly interested in wealth redistribution, since rich people are terrible at spending money compared to people with lower incomes.

  • AngelHedgie Registered User regular
    Today in the sexist algorithms running our economy - turns out that if you're female, Apple thinks you're less trustworthy with credit:
    Apple’s new credit card is being investigated by financial regulators after customers complained that the card’s lending algorithms discriminated against women.

    The incident began when software developer David Heinemeier Hansson complained on Twitter that the credit line offered by his Apple Card was 20 times higher than that offered to his spouse, even though the two file joint tax returns and he has the worse credit score.

    “The @AppleCard is such a fucking sexist program,” wrote Hansson. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”

    And the response from support about "but it's what THE ALGORITHM has decreed" just had me shaking my head. NYS is investigating, pointing out that sexist algorithms are against New York state law.

  • MrMonroe Registered User regular
    Huh. I uh, would have tended to think it would have had the opposite result.

  • Tastyfish Registered User regular
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    Perhaps the additional financial burdens that come from more often being the one with the kids after a divorce?

  • moniker Registered User regular
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.
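    To put a rough number on that compounding effect, here's a toy sketch; the salaries, the 10% savings rate, the 5% return, and the 40 years are all made-up illustrative figures, not anything from the article:

        def nest_egg(salary, savings_rate=0.10, years=40, annual_return=0.05):
            # Same savings behaviour, same return; only the salary differs.
            balance = 0.0
            for _ in range(years):
                balance = (balance + salary * savings_rate) * (1 + annual_return)
            return balance

        his = nest_egg(60_000)
        hers = nest_egg(48_000)  # identical job, paid 80 cents on the dollar
        print(round(his), round(hers), round(his - hers))
        # The ratio stays about 0.8 the whole way, but after 40 years the dollar gap is six figures.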

  • Aioua Ora Occidens Ora Optima Registered User regular
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

  • daveNYC Why universe hate Waspinator? Registered User regular
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.
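    Something like this toy sketch is all I mean; the rules, thresholds, and numbers here are invented for illustration, not Goldman's actual criteria:

        import logging

        logging.basicConfig(level=logging.DEBUG)
        log = logging.getLogger("underwriting")

        def decide_limit(income, credit_score, existing_debt):
            # Hypothetical rules -- the point is that each step of the decision gets logged.
            log.debug("inputs: income=%s score=%s debt=%s", income, credit_score, existing_debt)
            if credit_score < 620:
                log.debug("declined: score below 620")
                return 0
            limit = max(0.2 * (income - existing_debt), 0)
            log.debug("limit = 0.2 * (income - debt) = %s", limit)
            return limit

        decide_limit(income=90_000, credit_score=640, existing_debt=30_000)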

  • durandal4532 Registered User regular
    edited November 11
    I never assume any algorithm is even mildly complicated until I see evidence. So many times the actual practical decision-making amounts to a big IF ELSE statement, or a neural network that's been so improperly designed and poorly trained that it's basically just a way to hide an IF ELSE statement behind some layers of deniability.
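    Roughly this kind of thing, as a made-up sketch (not anyone's real credit code) -- the "model" and the plain if/else below behave identically:

        # A "risk model" with a fancy wrapper...
        class CreditRiskModel:
            def __init__(self):
                self.weights = {"credit_score": 1.0}  # only one feature actually matters
                self.threshold = 660

            def predict(self, applicant):
                score = sum(w * applicant[k] for k, w in self.weights.items())
                return "approve" if score >= self.threshold else "decline"

        # ...that is indistinguishable from a one-line IF ELSE:
        def predict(applicant):
            return "approve" if applicant["credit_score"] >= 660 else "decline"

        applicant = {"credit_score": 700, "income": 48_000}
        assert CreditRiskModel().predict(applicant) == predict(applicant)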

  • Aioua Ora Occidens Ora Optima Registered User regular
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Unless it's a poorly trained neural net.

  • mrondeau Montréal, Canada Registered User regular
    Aioua wrote: »
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Unless it's a poorly trained neural net.

    Or a well trained neural net.
    The problem is most probably their training data, which is likely to be biased. That's an easy signal to pick up, so the model can be expected to learn that first.

    That's why you should test your model for fairness. Even if it means admitting that biases exist.
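    And the test doesn't have to be fancy. A minimal sketch, assuming you have the model's decisions and the group labels on hand (the data here is invented):

        import numpy as np

        def demographic_parity_gap(decisions, group):
            """decisions: 1 = approved, 0 = declined; group: protected-attribute labels."""
            decisions, group = np.asarray(decisions), np.asarray(group)
            rates = {g: decisions[group == g].mean() for g in np.unique(group)}
            return rates, max(rates.values()) - min(rates.values())

        rates, gap = demographic_parity_gap(
            decisions=[1, 1, 0, 1, 0, 0, 1, 0],
            group=["m", "m", "m", "m", "f", "f", "f", "f"],
        )
        print(rates, gap)  # a gap above whatever tolerance you set should block the release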

  • Heffling No Pic Ever Registered User regular
    You would think a basic signoff of the model would include checking against a standard template, and variations of the template with one variable changed at a time.
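    Something like this sketch; the template fields are invented and the stand-in scoring function would be a call to the real model:

        def score_application(app):
            # Stand-in for the deployed model, just so the sketch runs.
            return round(0.2 * app["income"])

        template = {"income": 80_000, "credit_score": 700, "gender": "m", "zip": "10001"}
        variations = {"gender": "f", "zip": "10451", "income": 64_000}

        baseline = score_application(template)
        for field, value in variations.items():
            variant = dict(template, **{field: value})
            delta = score_application(variant) - baseline
            print(f"{field}: {template[field]!r} -> {value!r}, limit changes by {delta:+}")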

  • Lord_Asmodeus goeticSobriquet: Here is your magical cryptic riddle-tumour: I AM A TIME MACHINE Registered User regular
    So many tech types forget that a machine is only as logical as the people programming it allow it to be. It never seems to occur to them that their own biases, conscious or otherwise, might get unintentionally mixed in with the assumptions of a non-intelligent thinking machine just because of how they made it (and that's assuming the bias is unintentional).

  • Commander Zoom Registered User regular
    On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

  • HamHamJ Registered User regular
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.
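    One partial check, sketched here with synthetic data and invented feature names (the real version would run on the model's actual training set): see whether the "neutral" features can predict the attribute you withheld.

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 2_000
        gender = rng.integers(0, 2, n)                               # the withheld attribute
        income = 50_000 + 10_000 * gender + rng.normal(0, 8_000, n)  # a correlated proxy
        purchase_mix = rng.integers(0, 5, n) + 2 * gender            # another proxy
        X = np.column_stack([income, purchase_mix])

        acc = cross_val_score(DecisionTreeClassifier(max_depth=3), X, gender, cv=5).mean()
        print(f"gender recoverable from 'neutral' features ~{acc:.0%} of the time")
        # Well above 50% means second order indicators are still carrying gender into the model.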

  • NotYou Registered User regular
    Isn't Apple allowed to do this? I thought car insurance was more expensive for men than women as well?

  • DevoutlyApathetic Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    If they're neural net based they aren't written at all. Which is the huge problem with so many of these things. They aren't an intentional act by a person or even a program. They're a result of "breeding" AIs that write themselves.

  • SiliconStew Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    As moniker pointed out, the algorithm used could be perfectly fair, e.g., credit received is always directly proportional to income with no other considerations, and it would still result in gender-biased outcomes because a gender-biased situation already exists in the input data, e.g., women get paid less than men for the same job. And this "fair" algorithm just creates a positive feedback loop that works to increase economic gender gaps. The solution is actually the exact opposite: to get unbiased outcomes from biased data, the algorithm would need to know gender to attempt to correct for that input bias.
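    Tiny worked example of that feedback loop, with invented numbers:

        # "Fair" rule: credit limit is 20% of income, nothing else considered.
        pay = {"him": 60_000, "her": 48_000}  # same job, the pay gap is already in the inputs
        limit = {who: 0.20 * income for who, income in pay.items()}
        print(limit)  # {'him': 12000.0, 'her': 9600.0} -- the gap passes straight through
        # Any correction term would have to take gender as an input to know what to correct.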

  • Jragghen Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    As moniker pointed out, the algorithm used could be perfectly fair, e.g., credit received is always directly proportional to income with no other considerations, and it would still result in gender-biased outcomes because a gender-biased situation already exists in the input data, e.g., women get paid less than men for the same job. And this "fair" algorithm just creates a positive feedback loop that works to increase economic gender gaps. The solution is actually the exact opposite: to get unbiased outcomes from biased data, the algorithm would need to know gender to attempt to correct for that input bias.

    "It's not enough to be not-racist in a racist world, one must be anti-racist."

  • Oghulk biggest externality low-energy economist Registered User regular
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Since it is Goldman I'm curious how this investigation/eventual lawsuit will play out. If GS is using Apple's proprietary algorithms then who is liable in this case?

  • Blackhawk1313 Registered User regular
    Again, I think it bears repeating: it's not Apple doing this, the algorithm and the card are managed by Goldman Sachs.

  • I Zimbra Registered User regular
    Blackhawk1313 wrote: »
    Again, I think it bears repeating: it's not Apple doing this, the algorithm and the card are managed by Goldman Sachs.

    Apple approved it, put their name on it, and are marketing it. They can absolutely eat shit alongside GS for this.

  • Butters A glass of some milks Registered User regular
    NotYou wrote: »
    Isn't Apple allowed to do this? I thought car insurance was more expensive for men than women as well?

    There are laws against this in some states.

    https://www.finder.com/car-insurance-rates-by-gender

  • Captain Inertia Registered User regular
    I, uh.....

    The only algorithms used for consumer credit are looking at database tables. For unsecured lending like a credit card, it's a two-dimensional decision (credit score and income). "Algorithm," while technically correct, is doing a LOT of lifting in this article...

    I would bet a trillion dollars that Apple fucked up with info collection in their application and didn’t pass the correct income to GS because they don’t treat community property correctly
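    For anyone picturing something exotic, the kind of decision I'm describing is basically this; every band and number here is invented:

        LIMIT_TABLE = {                    # (score band, income band) -> credit limit
            ("subprime", "low"): 0,
            ("subprime", "high"): 500,
            ("prime", "low"): 2_000,
            ("prime", "high"): 10_000,
        }

        def credit_limit(credit_score, income):
            score_band = "prime" if credit_score >= 660 else "subprime"
            income_band = "high" if income >= 50_000 else "low"
            return LIMIT_TABLE[(score_band, income_band)]

        print(credit_limit(710, 90_000))   # 10000 -- the "algorithm" is a table lookup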

  • Heffling No Pic Ever Registered User regular
    Blackhawk1313 wrote: »
    Again, I think it bears repeating: it's not Apple doing this, the algorithm and the card are managed by Goldman Sachs.

    I am fully capable of hating both the vampire squid and the forbidden fruit.
    Goldman Sachs was referred to as "the vampire squid" in Rolling Stone magazine: vampire because it sucks wealth out of everything while contributing little or nothing, squid because it has many tentacles that reach into all facets of humanity.

    Apple being the forbidden fruit is from a biblical story.

  • Goumindong Registered User regular
    edited November 11
    Lord_Asmodeus wrote: »
    So many tech types forget that a machine is only as logical as the people programming it allow it to be. It never seems to occur to them that their own biases, conscious or otherwise, might get unintentionally mixed in with the assumptions of a non-intelligent thinking machine just because of how they made it (and that's assuming the bias is unintentional).

    That isn't what it is.

    Fundamentally AI is just a big correlation engine. This is fine in some cases because there are times when, even with a weak confidence interval/P-value, you would want to include that result. But it also means that you're going to have huge numbers of false positives, spurious results, and reverse causation issues. Adding data to a model doesn't resolve false positive, spurious correlation, and reverse causation results but rather makes it more likely that you're going to run into them*.

    So a black box fed a bunch of data is highly likely to find things like "female == worse credit" if such a spurious correlation exists (and such a spurious correlation likely does exist).

    *Let's say we have a simple model y = Bx where B is a 1 by K vector, x is a K by N matrix, and y is a 1 by N vector. This is equivalent to a simple linear regression model. Each value in B corresponds to the multivariate regression result for the data in the corresponding row of x. So the first value of B corresponds to the intercept, because the first row of x is a row of 1's. And if the second row were "m = 0, f = 1" then the second value of B would correspond to the effect of being female. And if the third row were a row of random 0's and 1's, then the value of B relative to it should be 0.

    So if we add more data N then what we get is lower p-values, such that if we repeated this many times we would get fewer false positives. Which is to say that the value of B relating to the random data would be LESS likely to say "oh yeah, this random row totally has an effect". What AI is largely doing (this includes black box algorithms) is slamming a BUNCH of data in so that K gets big and N gets big, then doing this all over with all the data in a bunch of different models and seeing what pops out.

    This should produce the best estimate of the available data for the effect you're looking for, AND the effect of false positives should be negated by the fact that they don't actually have an effect in aggregate. But it's not good for actually answering the question of whether or not a certain thing really does have an effect, because you're counting on the largeness of the operator space to essentially negate the errors the AI will have.

    In a business sense this works. Imagine you're hunting for unfair coins. You collect a bunch of data and see that coin X has a .001% chance of being unfair while coin Y has a .000001% chance. Well, if you have enough coins then betting on the ones that have a high chance of being unfair will make you money even if the individual risk of each coin is too low to make the bet in isolation. If p-unfair is >50% you should bet on the coin, given that you're going to be betting on thousands of coins and E(sum(fair*bet)) = 0 anyway. Alternately, imagine a business running likelihoods for success of a particular action. If there are three actions on the table, getting a 5% p-value on any one of them might not happen, so you choose the one that is the best regardless of the error bars... because it's still the most likely to succeed.

    Edit: but of course it will fail, and fail more often, at an individual/customer level where certain things specifically should not be taken into account.
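    And if you want to see the footnote concretely, a quick simulation sketch (pure noise, nothing to do with any real credit data):

        import numpy as np

        rng = np.random.default_rng(1)
        n, k = 500, 200                     # 500 observations, 200 junk features
        X = rng.normal(size=(n, k))
        y = rng.normal(size=n)              # the outcome is pure noise: no true effects

        r = np.array([np.corrcoef(X[:, j], y)[0, 1] for j in range(k)])
        cutoff = 2 / np.sqrt(n)             # rough |correlation| bar for "significant at ~5%"
        print(f"{(np.abs(r) > cutoff).sum()} of {k} junk features look 'significant'")
        # Expect roughly 5% of them (about 10 here) to clear the bar with zero real signal.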

  • Viskod Registered User regular
    AngelHedgie wrote: »
    Today in the sexist algorithms running our economy - turns out that if you're female, Apple thinks you're less trustworthy with credit:
    Apple’s new credit card is being investigated by financial regulators after customers complained that the card’s lending algorithms discriminated against women.

    The incident began when software developer David Heinemeier Hansson complained on Twitter that the credit line offered by his Apple Card was 20 times higher than that offered to his spouse, even though the two file joint tax returns and he has the worse credit score.

    “The @AppleCard is such a fucking sexist program,” wrote Hansson. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”

    And the response from support about "but it's what THE ALGORITHM has decreed" just had me shaking my head. NYS is investigating, pointing out that sexist algorithms are against New York state law.

    Pretty much once a day, if not multiple times a day, some dipshit walks into our office to pay our bookkeeper for an invoice and says something to the effect of "Them women will sure take that money..."

  • AngelHedgie Registered User regular
    Okay, there's evil, and then there's the unholy abomination payday lenders are trying to force through in Arizona:
    A proposed constitutional amendment that looks likely to hit the ballot there next year would limit future increases to the minimum wage, may claw back scheduled increases already set to take effect, and eliminate a week of paid sick leave. One of the payday lending industry’s leading trade associations has bankrolled the measure, making plain the connection between a lack of income and the spread of predatory financial services. “It’s pretty incredible,” says Rodd McLeod, who works with Arizonans for Fair Lending, which is fighting the proposed ballot measure. “We need people to be poor in order to continue to make money.”

    The ballot measure is actually a response to consumer advocates’ effort to eliminate high-dollar loans in Arizona. In 2008, the state soundly rejected payday lending; as an industry-backed ballot measure, Proposition 200, would have allowed those types of low-dollar, short-term, easy-to-roll-over loans, and it was defeated by a 60-40 popular vote. But payday lenders found an outlet nonetheless: About half of them switched their business model to auto title loans. These are similarly low-dollar loans that use as collateral a borrower’s car title. Typically, these loans run for two-to-four weeks, and the annual percentage rate (APR) can be as high as 204 percent in Arizona.

    According to figures from Arizonans for Fair Lending, one in three state borrowers end up extending their auto title loan, creating a cycle of debt. One in five wind up having their vehicle repossessed. Title loan borrowers spend $254 million per year in interest, an analysis from the Center for Responsible Lending found.

    After years of work, Arizonans for Fair Lending filed a ballot measure for the November 2020 election that would restrict car title loans in the state, reducing the permitted APR from 204 percent to 36 percent, making it equal to the maximum interest rate for other consumer loans in the state. “Usury is always wrong,” said Stephany Brown, president of the Society of St. Vincent de Paul in Tucson, in a statement after the announcement of the ballot measure.

    The lenders then struck back, and then some. Their initiative, a proposed constitutional amendment known as the “Arizona Economic Freedom Act,” is intended to “prohibit the government from dictating price terms in transactions between private persons.” In the lending realm, that means that the state government could not set any limits on interest rates for financial services—not at 36 percent, not at 204 percent. If it passed, it would override the Arizonans for Fair Lending ballot measure, because it would be written into the constitution. Payday loans would still be banned, but auto title and other lenders would be permitted to run wild, with no limits on their interest rates.

    However, the initiative goes well beyond that.

    Tax and utility rate setting would remain untouched. But any regulation of ATM fees, or late fees on various transactions, would be eliminated. And since the employment contract is also a contract between private persons, the Economic Freedom Act would also rescind mandates put into law governing that process. That broad directive would eliminate minimum wages in the state entirely. However, language in the initiative would retain any minimum wage “if in effect as of December 31, 2019.”

    That in itself could become controversial. Currently, thanks to the passage of Proposition 206 in 2016, Arizona’s minimum wage is scheduled to rise. Right now it stands at $11.00 an hour, and on January 1, 2020, it is supposed to go to $12.00, with an index for inflation thereafter. The Economic Freedom Act won’t be voted on until November 2020, but if it passes, the backers could potentially seek to claw the minimum wage back to $11.00 and freeze it there. The state Supreme Court experienced a shift to the right in 2016 when two extra justices were seated in a court-packing scheme. So the likelihood of a rollback in the minimum wage, if the initiative passes, is very possible.

    That is just vile.

  • Orca Registered User regular
    That is not just vile, that is straight out evil.

  • Incenjucar Audio Game Developer Seattle, WA Registered User regular
    edited November 12
    Orca wrote: »
    That is not just vile, that is straight out evil.

    True patriotism.

    Not at all surprising. Lobbying for exploitation is pretty standard.

  • daveNYC Why universe hate Waspinator? Registered User regular
    Captain Inertia wrote: »
    I, uh.....

    The only algorithms used for consumer credit are looking at database tables. For unsecured lending like a credit card, it's a two-dimensional decision (credit score and income). "Algorithm," while technically correct, is doing a LOT of lifting in this article...

    I would bet a trillion dollars that Apple fucked up with info collection in their application and didn’t pass the correct income to GS because they don’t treat community property correctly

    Possibly, but it's even odds that GS up and decided to redevelop the wheel (now with the correct number of sides!) and put together some new hotness that isn't working properly.

  • Oghulk biggest externality low-energy economist Registered User regular
    Kinda gotta give them props for not only going all-out on limiting the government's ability to regulate, but also putting it in blunt terms that they need people to be poor for their business model to work. Pretty amazing to see.

  • Couscous Registered User regular
    Our great Farmers will recieve another major round of “cash,” compliments of China Tariffs, prior to Thanksgiving. The smaller farms and farmers will be big beneficiaries. In the meantime, and as you may have noticed, China is starting to buy big again. Japan deal DONE. Enjoy!
    At what point does this just become bribery?

  • Fencingsax It is difficult to get a man to understand, when his salary depends upon his not understanding GNU Terry Pratchett Registered User regular
    Couscous wrote: »
    Our great Farmers will recieve another major round of “cash,” compliments of China Tariffs, prior to Thanksgiving. The smaller farms and farmers will be big beneficiaries. In the meantime, and as you may have noticed, China is starting to buy big again. Japan deal DONE. Enjoy!
    At what point does this just become bribery?

    Well, that isn't how tariffs work, so never.
