
Like a centipede waiting for the other shoe to drop in [The Economy] thread


Posts

  • Commander Zoom Registered User regular
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

  • moniker Registered User regular
    Commander Zoom wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

  • AngelHedgie Registered User regular
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • shryke Member of the Beast Registered User regular
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

    That makes it sound like an outside force. The Federalist Society was part of the legal community from the start. They were founded by conservative members of the legal community to, like all these right-wing organizations, organize and coordinate their right-wing strategy.

  • Goumindong Registered User regular
    edited November 2019
    yes and no. It's been a common critique of free market capitalism and there have been models that suggest it's so (hell, I had even built a simple one) but it's very much not "accepted"*. I don't think this is going to move the needle.

    *A lot of this is just the power of the school of thought of free market capitalism. Many economists probably have an inkling but haven't formally examined it, if that makes sense, and many will reject it out of hand due to dogma.

    Edit: This was in response to dodman

  • Butters A glass of some milks Registered User regular
    shryke wrote: »
    moniker wrote: »
    The problem has been recognized for a long time, e.g. Standard Oil, AT&T/Bell, et al. We have anti-trust laws on the books. We've just... stopped enforcing them. :P

    No, we just decided that Robert Bork's (yes, that Robert Bork) interpretation of them was right for some insane reason.

    That insane reason being the infiltration of the Federalist Society into the legal community.

    That makes it sound like an outside force. The Federalist Society was part of the legal community from the start. They were founded by conservative members of the legal community to, like all these right-wing organizations, organize and coordinate their right-wing strategy.

    It was founded by conservative lawyers but they had zero influence until billionaire political donors started up and funded "law and economics" programs and used the Federalist Society as their networking engine. They had almost zero influence in the American Legal community before they started getting Koch money.

    PSN: idontworkhere582 | CFN: idontworkhere | Steam: lordbutters | Amazon Wishlist
  • schuss Registered User regular
    Yep, capital naturally accumulates unless you confiscate on death. This is because those with more capital can dictate terms on any number of things.
    Also, anyone decrying growth slowing should be highly interested in wealth redistribution as rich people are terrible at spending money compared to lower incomes.

  • AngelHedgie Registered User regular
    Today in the sexist algorithms running our economy - turns out that if you're female, Apple thinks you're less trustworthy with credit:
    Apple’s new credit card is being investigated by financial regulators after customers complained that the card’s lending algorithms discriminated against women.

    The incident began when software developer David Heinemeier Hansson complained on Twitter that the credit line offered by his Apple Card was 20 times higher than that offered to his spouse, even though the two file joint tax returns and he has the worse credit score.

    “The @AppleCard is such a fucking sexist program,” wrote Hansson. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”

    And the response from support about "but it's what THE ALGORITHM has decreed" just had me shaking my head. NYS is investigating, pointing out that sexist algorithms are against New York state law.

  • MrMonroe passed out on the floor now Registered User regular
    Huh. I uh, would have tended to think it would have had the opposite result.

  • Tastyfish Registered User regular
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    Perhaps the additional financial burdens that come from more often being the one with the kids after a divorce?

  • moniker Registered User regular
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

  • Aioua Ora Occidens Ora Optima Registered User regular
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    life's a game that you're bound to lose / like using a hammer to pound in screws
    fuck up once and you break your thumb / if you're happy at all then you're god damn dumb
    that's right we're on a fucked up cruise / God is dead but at least we have booze
    bad things happen, no one knows why / the sun burns out and everyone dies
  • daveNYC Why universe hate Waspinator? Registered User regular
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Shut up, Mr. Burton! You were not brought upon this world to get it!
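What daveNYC describes (turning up the logging and rerunning one application to see exactly how the decision was reached) might look like this toy sketch of a rule-based scorer. The rules and numbers are invented for illustration, not Goldman's actual criteria:

```python
import logging

logging.basicConfig(level=logging.DEBUG, format="%(levelname)s %(message)s")
log = logging.getLogger("credit")

def credit_limit(income: float, credit_score: int) -> float:
    """Toy rule-based limit: every input and branch is logged, so
    rerunning one application at DEBUG level shows exactly how the
    number was reached. (Hypothetical rules.)"""
    log.debug("inputs: income=%s credit_score=%s", income, credit_score)
    base = income * 0.15
    log.debug("base limit = 15%% of income = %s", base)
    if credit_score < 600:
        base *= 0.25
        log.debug("score < 600, limit cut to %s", base)
    elif credit_score > 750:
        base *= 1.5
        log.debug("score > 750, limit raised to %s", base)
    return round(base, 2)

limit = credit_limit(90_000, 780)
```

With an explicit rule table like this, the audit really is just reading the log; the harder case is when the "rules" are learned weights.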
  • durandal4532 Registered User regular
    edited November 2019
    I never assume any algorithm is even mildly complicated until I see evidence. So many times the actual practical decision-making amounts to a big IF ELSE statement, or a neural network that's been so improperly designed and poorly trained that it's basically just a way to hide an IF ELSE statement behind some layers of deniability.

    Take a moment to donate what you can to Critical Resistance and Black Lives Matter.
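durandal4532's point, that a badly built net can amount to an IF ELSE statement behind layers of deniability, can be shown with a one-neuron toy (weights and thresholds invented):

```python
# A one-neuron "network" with a step activation is an IF/ELSE in
# disguise: the weight and bias together encode a single threshold.
def neuron(income: float, w: float = 1.0, b: float = -50_000.0) -> int:
    return 1 if w * income + b > 0 else 0   # fires iff income > 50k

def if_else(income: float) -> int:
    # the same decision, written plainly
    return 1 if income > 50_000 else 0

agree = all(neuron(x) == if_else(x) for x in (10_000, 50_000, 90_000))
```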
  • Aioua Ora Occidens Ora Optima Registered User regular
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Unless it's a poorly trained neural net.

  • mrondeau Montréal, Canada Registered User regular
    Aioua wrote: »
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Unless it's a poorly trained neural net.

    Or a well trained neural net.
    The problem is most probably their training data, which is likely to be biased. That's an easy signal to pick up, so the model can be expected to learn that first.

    That's why you should test your model for fairness. Even if it means admitting that biases exist.
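A minimal version of the fairness test mrondeau suggests is comparing approval rates across groups (demographic parity). This sketch uses made-up decisions:

```python
def approval_rates(decisions):
    """decisions: list of (group, approved) pairs.
    Returns per-group approval rate -- a crude demographic-parity check."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

# toy decision log
sample = [("m", True), ("m", True), ("m", False),
          ("f", True), ("f", False), ("f", False)]
rates = approval_rates(sample)
gap = abs(rates["m"] - rates["f"])   # flag the model if this gap is large
```

Real fairness audits use more than one metric (parity, equalized odds, calibration), but even this much would have caught a model that just penalizes the F column.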

  • Heffling No Pic Ever Registered User regular
    You would think a basic signoff of the model would include checking against a standard template, and variations of the template with one variable changed at a time.
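Heffling's sign-off idea, a standard template with one variable changed at a time, sketches out like this; the scoring model and its planted bias are hypothetical:

```python
def score(applicant: dict) -> float:
    # stand-in for the model under test; a fair model would ignore gender
    s = applicant["income"] * 0.1 + applicant["credit_score"]
    if applicant["gender"] == "f":   # deliberately planted bias, for the demo
        s *= 0.5
    return s

template = {"income": 80_000, "credit_score": 700, "gender": "m"}

def counterfactual_gap(model, template, field, alternative):
    """Sign-off check: change exactly one field, see if the score moves."""
    variant = dict(template, **{field: alternative})
    return model(template) - model(variant)

gap = counterfactual_gap(score, template, "gender", "f")   # nonzero = biased
```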

  • Lord_Asmodeus goeticSobriquet: Here is your magical cryptic riddle-tumour: I AM A TIME MACHINE Registered User regular
    So many tech types forget that a machine is only as logical as the people who program it allow it to be. It never seems to occur to them that their own biases, conscious or otherwise, might get unintentionally mixed in with the assumptions of a non-intelligent thinking machine just because of how they made it (and that's assuming the bias is unintentional)

    Capital is only the fruit of labor, and could never have existed if Labor had not first existed. Labor is superior to capital, and deserves much the higher consideration. - Lincoln
  • Commander Zoom Registered User regular
    On two occasions I have been asked, 'Pray, Mr. Babbage, if you put into the machine wrong figures, will the right answers come out?' I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a question.

  • HamHamJ Registered User regular
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
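One crude way to hunt for the second-order indicators HamHamJ mentions is to correlate each remaining feature with the held-out protected attribute. Toy data below; in the example, income is deliberately constructed as a proxy:

```python
def corr(xs, ys):
    """Pearson correlation, pure Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

gender = [0, 0, 0, 1, 1, 1]   # protected attribute, withheld from the model
features = {
    "income": [90, 85, 95, 60, 65, 55],   # tracks gender: a proxy
    "age":    [30, 55, 40, 35, 50, 45],   # roughly independent
}
leaky = {name: corr(vals, gender) for name, vals in features.items()}
```

A feature with |correlation| near 1 lets the model reconstruct gender even after the gender column is dropped, which is exactly the blind-testing failure mode described above.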
  • NotYou Registered User regular
    Isn't Apple allowed to do this? I thought car insurance was more expensive for men than women as well?

  • DevoutlyApathetic Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    If they're neural net based they aren't written at all. Which is the huge problem with so many of these things. They aren't an intentional act by a person or even a program. They're a result of "breeding" AIs that write themselves.

    Nod. Get treat. PSN: Quippish
  • SiliconStew Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    As moniker pointed out, the algorithm used could be perfectly fair, e.g., credit received is always directly proportional to income with no other considerations, and it would still result in gender-biased outcomes because a gender-biased situation already exists in the input data, e.g., women get paid less than men for the same job. And this "fair" algorithm just creates a positive feedback loop that works to increase economic gender gaps. The solution is actually the exact opposite: to get unbiased outcomes from biased data, the algorithm would need to know gender to attempt to correct for that input bias.

    Just remember that half the people you meet are below average intelligence.
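SiliconStew's feedback loop can be simulated with toy numbers; the feedback coefficient is invented purely for illustration:

```python
# A "fair" rule (credit strictly proportional to income) still widens a
# pre-existing pay gap in absolute terms when access to credit feeds
# back into earnings. All numbers are made up.
def simulate(income, years, ratio=0.2, feedback=0.05):
    for _ in range(years):
        credit = ratio * income          # unbiased rule, biased input
        income += feedback * credit      # credit access boosts earnings
    return income

him = simulate(100.0, 20)
her = simulate(82.0, 20)                 # ~18% initial pay gap
gap_before = 100.0 - 82.0
gap_after = him - her                    # the dollar gap has grown
```

Because the rule is linear, the *ratio* between the two incomes never changes, but the absolute gap compounds every year, which is the feedback loop in question.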
  • Jragghen Registered User regular
    HamHamJ wrote: »
    I think the problem might be that algorithms are not written explicitly to obey the law. Like, you have to not just not feed it gender data but also block it from using second order indicators to determine gender. Just doing blind testing might not even be enough to catch it because you would need to understand all those indicators.

    SiliconStew wrote: »
    As moniker pointed out, the algorithm used could be perfectly fair, e.g., credit received is always directly proportional to income with no other considerations, and it would still result in gender-biased outcomes because a gender-biased situation already exists in the input data, e.g., women get paid less than men for the same job. And this "fair" algorithm just creates a positive feedback loop that works to increase economic gender gaps. The solution is actually the exact opposite: to get unbiased outcomes from biased data, the algorithm would need to know gender to attempt to correct for that input bias.

    "It's not enough to be not-racist in a racist world, one must be anti-racist."

  • Oghulk Tinychat Janitor Tinychat Registered User regular
    daveNYC wrote: »
    Aioua wrote: »
    moniker wrote: »
    MrMonroe wrote: »
    Huh. I uh, would have tended to think it would have had the opposite result.

    A lot of gender neutral financial things basically screw women because it ultimately boils down to how much discretionary income you have, and due to women earning less in wages they always have worse appearing finances even if they behave more responsibly thanks to that gap. (The biggest and most obvious being retirement savings, since it's basically just the gender pay gap multiplied by compounding interest) The joint filing makes it seem weird, but I could see Apple somehow managing to look at W-2 information or equivalent somehow to disregard it.

    I mean, depending on how blackbox of an algorithm they're using it might not even be noticing the joint finances... It's just decided having an F next to your name means you get squat.

    Blackbox in this instance just means no user input. It doesn't mean that its criteria are a complete mystery. Turn up the logging level and rerun her application and you'll be able to see exactly how it came to that decision.

    And technically it's Goldman that is running the credit issuance side of things.

    Since it is Goldman I'm curious how this investigation/eventual lawsuit will play out. If GS is using Apple's proprietary algorithms then who is liable in this case?

  • Blackhawk1313 Demon Hunter for Hire Time Rift Registered User regular
    Again, I think it bears repeating, it’s not Apple doing this, the algorithm and the card is managed by Goldman Sachs.

  • I Zimbra Worst song, played on ugliest guitar Registered User regular
    Blackhawk1313 wrote: »
    Again, I think it bears repeating, it’s not Apple doing this, the algorithm and the card is managed by Goldman Sachs.

    Apple approved it, put their name on it, and are marketing it. They can absolutely eat shit alongside GS for this.

  • Butters A glass of some milks Registered User regular
    NotYou wrote: »
    Isn't Apple allowed to do this? I thought car insurance was more expensive for men than women as well?

    There are laws against this in some states.

    https://www.finder.com/car-insurance-rates-by-gender

  • Captain Inertia Registered User regular
    I, uh.....

    The only algorithms used for consumer credit are looking at database tables- for unsecured lending like a credit card, it’s a 2-dimension decision (credit score and income). “Algorithm,” while technically correct, is doing a LOT of lifting in this article.......

    I would bet a trillion dollars that Apple fucked up with info collection in their application and didn’t pass the correct income to GS because they don’t treat community property correctly

  • Heffling No Pic Ever Registered User regular
    Blackhawk1313 wrote: »
    Again, I think it bears repeating, it’s not Apple doing this, the algorithm and the card is managed by Goldman Sachs.

    I am fully capable of hating both the vampire squid and the forbidden fruit.
    Goldman Sachs was referred to as the vampire squid by Rolling Stone magazine. Vampire because it sucks wealth out of everything while contributing little or nothing. Squid, because it has many tentacles that reach into all facets of humanity.

    Apple being the forbidden fruit is from a biblical story.

  • Goumindong Registered User regular
    edited November 2019
    Lord_Asmodeus wrote: »
    So many tech types forget that a machine is only as logical as the people who program it allow it to be. It never seems to occur to them that their own biases, conscious or otherwise, might get unintentionally mixed in with the assumptions of a non-intelligent thinking machine just because of how they made it (and that's assuming the bias is unintentional)

    That isn't what it is.

    Fundamentally AI is just a big correlation engine. This is fine in some cases because there are times when, even with a weak confidence interval/P-value, you would want to include that result. But it also means that you're going to have huge numbers of false positives, spurious results, and reverse causation issues. Adding data to a model doesn't resolve false positive, spurious correlation, and reverse causation results but rather makes it more likely that you're going to run into them*.

    So a black box fed a bunch of data is highly likely to find things like "female == worse credit" if such a spurious correlation exists (and such a spurious correlation likely does exist).

    *Let's say we have a simple model y=Bx where B is a 1 by K vector, x is a K by N matrix, and y is a 1 by N vector. This is equivalent to a simple linear regression model. Each value in B corresponds to the result of the multivariate regression for the data associated with that row in x. So the first value of B corresponds to the intercept, because the first row of x is a row of 1's. And if the second row were "m=0, 1 = f" then the second value of B would correspond to the effect of being female. And if the third row were a row of random 0's and 1's then the value of B relative to it should be 0.

    So if we add more data N then what we get is lower p-values, such that if we reran this many times we would get fewer false positives. Which is to say that the value of B relating to the random data would be LESS likely to say "oh yeah, this random row totally has an effect". What AI is largely doing (this includes black box algorithms) is slamming a BUNCH of data in so that K and N both get big, and then doing this over and over with all the data in a bunch of different models and seeing what pops out.

    This should produce the best estimate of the available data for the effect you're looking for. AND the effect of false positives should be negated by the fact that they don't actually have an effect in aggregate. But it's not good for actually answering the question of whether or not a certain thing really does have an effect. Because you're counting on the largeness of the operator space to essentially negate the errors the AI will have.

    In a business sense this works. Imagine you're hunting for unfair coins. And so you gather a bunch of data and see that coin X has a .001% chance of being unfair while coin Y has a .000001% chance. Well, if you have enough coins then betting on the ones that have a high chance of being unfair will make you money even if the individual risk of each coin is too low to make the bet in isolation. If p-unfair is >50% you should bet on the coin, given that you're going to be betting on thousands of coins and E(sum(fair*bet))=0 anyway. Alternately, imagine a business running likelihoods for success of a particular action. If there are three actions on the table, getting a 5% p-value on any one of them might not happen, so you choose the one that is the best regardless of the error bars... because it's still the most likely success

    edit: but of course it will fail and fail more often at an individual/customer level where certain things specifically should not be taken into account
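Goumindong's footnote, that a pure-noise regressor's estimated effect shrinks toward zero as N grows, can be checked with a quick pure-Python simulation (all data random by construction):

```python
import random

random.seed(0)

def noise_slope(n):
    """OLS slope of y on a pure-noise regressor x. The true effect is 0,
    so any nonzero estimate is a spurious correlation."""
    x = [random.random() for _ in range(n)]
    y = [random.random() for _ in range(n)]   # y has nothing to do with x
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    return num / den

def avg_abs_slope(n, trials=200):
    return sum(abs(noise_slope(n)) for _ in range(trials)) / trials

small_n = avg_abs_slope(50)     # spurious effects look big on small samples...
big_n = avg_abs_slope(5000)     # ...and shrink as N grows
```

The spurious slope scales roughly as 1/sqrt(N), which is the "more data means fewer false positives" claim in miniature; it says nothing about biased (rather than random) inputs, which is the Apple Card problem.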

  • Viskod Registered User regular
    AngelHedgie wrote: »
    Today in the sexist algorithms running our economy - turns out that if you're female, Apple thinks you're less trustworthy with credit:
    Apple’s new credit card is being investigated by financial regulators after customers complained that the card’s lending algorithms discriminated against women.

    The incident began when software developer David Heinemeier Hansson complained on Twitter that the credit line offered by his Apple Card was 20 times higher than that offered to his spouse, even though the two file joint tax returns and he has the worse credit score.

    “The @AppleCard is such a fucking sexist program,” wrote Hansson. “My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does.”

    And the response from support about "but it's what THE ALGORITHM has decreed" just had me shaking my head. NYS is investigating, pointing out that sexist algorithms are against New York state law.

    Pretty much once per day, if not multiple times a day, some dipshit walks into our office to pay our bookkeeper for an invoice and says something equivalent to "Them women will sure take that money.."

  • AngelHedgie Registered User regular
    Okay, there's evil, and then there's the unholy abomination payday lenders are trying to force through in Arizona:
    A proposed constitutional amendment that looks likely to hit the ballot there next year would limit future increases to the minimum wage, may claw back scheduled increases already set to take effect, and eliminate a week of paid sick leave. One of the payday lending industry’s leading trade associations has bankrolled the measure, making plain the connection between a lack of income and the spread of predatory financial services. “It’s pretty incredible,” says Rodd McLeod, who works with Arizonans for Fair Lending, which is fighting the proposed ballot measure. “We need people to be poor in order to continue to make money.”

    The ballot measure is actually a response to consumer advocates’ effort to eliminate high-dollar loans in Arizona. In 2008, the state soundly rejected payday lending; as an industry-backed ballot measure, Proposition 200, would have allowed those types of low-dollar, short-term, easy-to-roll-over loans, and it was defeated by a 60-40 popular vote. But payday lenders found an outlet nonetheless: About half of them switched their business model to auto title loans. These are similarly low-dollar loans that use as collateral a borrower’s car title. Typically, these loans run for two-to-four weeks, and the annual percentage rate (APR) can be as high as 204 percent in Arizona.

    According to figures from Arizonans for Fair Lending, one in three state borrowers end up extending their auto title loan, creating a cycle of debt. One in five wind up having their vehicle repossessed. Title loan borrowers spend $254 million per year in interest, an analysis from the Center for Responsible Lending found.

    After years of work, Arizonans for Fair Lending filed a ballot measure for the November 2020 election that would restrict car title loans in the state, reducing the permitted APR from 204 percent to 36 percent, making it equal to the maximum interest rate for other consumer loans in the state. “Usury is always wrong,” said Stephany Brown, president of the Society of St. Vincent de Paul in Tucson, in a statement after the announcement of the ballot measure.

    The lenders then struck back, and then some. Their initiative, a proposed constitutional amendment known as the “Arizona Economic Freedom Act,” is intended to “prohibit the government from dictating price terms in transactions between private persons.” In the lending realm, that means that the state government could not set any limits on interest rates for financial services—not at 36 percent, not at 204 percent. If it passed, it would override the Arizonans for Fair Lending ballot measure, because it would be written into the constitution. Payday loans would still be banned, but auto title and other lenders would be permitted to run wild, with no limits on their interest rates.

    However, the initiative goes well beyond that.

    Tax and utility rate setting would remain untouched. But any regulation of ATM fees, or late fees on various transactions, would be eliminated. And since the employment contract is also a contract between private persons, the Economic Freedom Act would also rescind mandates put into law governing that process. That broad directive would eliminate minimum wages in the state entirely. However, language in the initiative would retain any minimum wage “if in effect as of December 31, 2019.”

    That in itself could become controversial. Currently, thanks to the passage of Proposition 206 in 2016, Arizona’s minimum wage is scheduled to rise. Right now it stands at $11.00 an hour, and on January 1, 2020, it is supposed to go to $12.00, with an index for inflation thereafter. The Economic Freedom Act won’t be voted on until November 2020, but if it passes, the backers could potentially seek to claw the minimum wage back to $11.00 and freeze it there. The state Supreme Court experienced a shift to the right in 2016 when two extra justices were seated in a court-packing scheme. So the likelihood of a rollback in the minimum wage, if the initiative passes, is very possible.

    That is just vile.

  • Orca Also known as Espressosaurus Wrex Registered User regular
    That is not just vile, that is straight out evil.

  • Incenjucar VChatter Seattle, WA Registered User regular
    edited November 2019
    Orca wrote: »
    That is not just vile, that is straight out evil.

    True patriotism.

    Not at all surprising. Lobbying for exploitation is pretty standard.

  • daveNYC Why universe hate Waspinator? Registered User regular
    Captain Inertia wrote: »
    I, uh.....

    The only algorithms used for consumer credit are looking at database tables- for unsecured lending like a credit card, it’s a 2-dimension decision (credit score and income). “Algorithm,” while technically correct, is doing a LOT of lifting in this article.......

    I would bet a trillion dollars that Apple fucked up with info collection in their application and didn’t pass the correct income to GS because they don’t treat community property correctly

    Possibly, but it's even odds that GS up and decided to redevelop the wheel (now with the correct number of sides!) and put together some new hotness that isn't working properly.

  • Oghulk Tinychat Janitor Tinychat Registered User regular
    Kinda gotta give props for not only going all-out on limiting government ability to regulate but also putting in blunt terms that they need people to be poor for their business model. Pretty amazing to see.

  • Couscous Registered User regular
    Our great Farmers will recieve another major round of “cash,” compliments of China Tariffs, prior to Thanksgiving. The smaller farms and farmers will be big beneficiaries. In the meantime, and as you may have noticed, China is starting to buy big again. Japan deal DONE. Enjoy!
    At what point does this just become bribery?

  • Fencingsax It is difficult to get a man to understand, when his salary depends upon his not understanding GNU Terry Pratchett Registered User regular
    Couscous wrote: »
    Our great Farmers will recieve another major round of “cash,” compliments of China Tariffs, prior to Thanksgiving. The smaller farms and farmers will be big beneficiaries. In the meantime, and as you may have noticed, China is starting to buy big again. Japan deal DONE. Enjoy!
    At what point does this just become bribery?

    Well, that isn't how tariffs work, so never.
