
Effective Altruism, longtermism, and charity

This seemed like a tangent worth exploring in its own thread.

If you're not familiar with it, Effective Altruism (EA) is a school of thought that's getting lots of attention these days. SBF of the FTX exchange and Elon Musk are both big proponents of it.

On its face EA seems fine: it's focused on using charity money and work in the most effective ways possible, relying on research and facts rather than emotions. Of course, this attracts techbros who think they're smarter than everyone else, so they can justify not paying taxes because it's an "inefficient" use of their money and only they know how best to spend it.

It also hits some significant logical problems when you scratch the surface. Peter Singer, one of EA's founders, says that all suffering and need should be considered equal and charity applied at a global scale. So helping your neighbor is selfish, because the money you spent there could help more people somewhere else in the world. This completely ignores the logistical differences between helping someone in front of you and someone on the other side of the planet.

Perhaps the most distasteful branch of EA is longtermism: the notion that regular charities' needs are eclipsed by existential threats to humanity. Why fight easy-to-cure childhood illness when global warming will kill us all anyway? Obviously it's a waste of money, right? William MacAskill's book What We Owe the Future lays out arguments like this. That's where billionaires like Elon Musk find justifications for all kinds of shit: he's going to save us all by going to Mars, so any bad he does in the process is negligible.

Anyway, that's my opening spiel; I don't have time to write out a longer essay on the topic. But it's been coming up, so I figured it might be worth having its own thread.


Posts

  • nexuscrawler Registered User regular
    I get a strong objectivist whiff off of everything to do with this philosophy. The appeal to the cold, calculating superman to solve all our problems is too strong.

  • DarkPrimus Registered User regular
    Nobody who is worth billions should be taken seriously when they want to talk about ethics.

    The ethical thing would have been to never have accrued such vast sums of wealth in the first place!

  • Antinumeric Registered User regular
    Long-termism has caused the EA movement to make some questionable grants.

    From their own page on disbursements and justifications from their "long-term" fund:
    ($28,000)

    Giving copies of Harry Potter and the Methods of Rationality to the winners of EGMO 2019 and IMO 2020.
    ...
    What effects does reading HPMOR have on people?

    My models of the effects of HPMOR stem from my empirical observations and my inside view on rationality training.

    Empirically, a substantial number of top people in our community have (a) entered due to reading and feeling a deep connection to HPMOR and (b) attributed their approach to working on the long term future in substantial part to the insights they learned from reading HPMOR.
    ($20,000)

    Working to prevent burnout and boost productivity within the EA and X-risk communities

    From the application:

    (1) After 2 years as a CFAR instructor/researcher, I’m currently in a 6-12 month phase of reorienting around my goals and plans. I’m requesting a grant to spend the coming year thinking about rationality and testing new projects.

    (2) I want to help individuals and orgs in the x-risk community orient towards and achieve their goals.

    (A) I want to train the skill of dependability, in myself and others.
    (B) Thinking clearly about AI risk
    (C) Burnout
    (3) Some possible measurable outputs / artifacts:
    • A more effective version of myself (notable changes = gaining the ability to ride a bike / drive a car / exercise—a PTSD-related disability, ability to finish projects to completion, others noticing stark changes in me)
    CFAR ($150,000)

    [Edit: It seems relevant to mention that LessWrong is currently receiving operational support from CFAR, in a way that makes me technically an employee of CFAR (similar to how ACE and 80K were/are part of CEA for a long time)

    So here we have a grant to print copies of a "rationalist" fanfic; a grant to help someone recovering from burnout at a "longtermism" org learn to ride a bike; and a grant to that same org, despite the disbursement approver admitting he is technically an employee of that organisation.

    This is from a fund that is meant to be investing in things critical to the long-term future of humanity. EA has been completely taken over by these people.

  • ronya Arrrrrf. the ivory tower's basement Registered User regular
    continually hijacked by peter singer gedankenexperiments is about the flavour of it

  • Captain Inertia Central Ohio Registered User regular
    Longtermism is even more insidious than that

    It’s “it’s less effective to spend our wealth ameliorating despair for the billions alive now than spending it to ensure that the hundreds of billions of future humans over the coming several millennia suffer less”

    Which coincidentally means investing in their tech companies, which are building intergalactic societies or whatever it was Musk said he was doing that justified spending less time at Tesla during his testimony yesterday

    Using this frame, not even climate change is worth addressing, for example, because that might only affect 10 billion people vs. the potential future trillions who will be exploring planets where the Earth's climate means nothing

  • Antinumeric Registered User regular
    It seems curious to me that all of the "most effective" things to do for the long-term future of humanity seem to just happen to be things these people are already interested in. Guess mosquito nets and river blindness treatment just aren't cool enough anymore.

  • Captain Inertia Central Ohio Registered User regular
    Essays about this shit are credulously published in widely read and somehow still-respected publications

  • nexuscrawler Registered User regular
    Antinumeric wrote: »
    It seems curious to me that all of the "most effective" things to do for the long-term future of humanity seem to just happen to be things these people are already interested in. Guess mosquito nets and river blindness treatment just aren't cool enough anymore.

    You see, but that can't be it, because they're hyper-rational supermen who know best!

  • Captain Inertia Central Ohio Registered User regular
    DarkPrimus wrote: »
    Nobody who is worth billions should be taken seriously when they want to talk about ethics.

    The ethical thing would have been to never have accrued such vast sums of wealth in the first place!

    Never do they consider “what if no billionaires” as effective altruism

    You start with a decent-sounding premise ("do the most good") and then just bullshit your way into doing what you want to do, pointing to your premise to shut down criticism

  • Antinumeric Registered User regular
    Fundamentally, longtermism is built on a logical fallacy: lives in the future are held to have moral value equivalent to lives today, and since there is an unknown (but probably greater) number of lives in the future, anything that could cause all of them to cease to exist must be stopped, and anything that stops that must have effectively infinite moral value. (A toy version of that math is sketched after the list.)

    Will climate change kill all of humanity? No? Then there is no value in stopping it.
    Will nuclear war kill all of humanity? No? Then there is no value in stopping it.
    Will the creation of a general AI that can improve itself potentially wipe out humanity? Maybe? Then all efforts to improve the safety of AI research have infinite value.
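
    That move looks something like this toy Python sketch (all numbers invented for illustration; the point is the shape of the math, not the inputs):

        # Toy sketch of the longtermist expected-value move (invented numbers).
        future_lives = 10 ** 15                # speculative future population
        p_climate_extinction = 0.0             # "won't kill *all* of humanity"
        p_agi_extinction = 1e-6                # any nonzero guess will do

        # Expected future lives saved by fully preventing each risk:
        ev_climate = p_climate_extinction * future_lives   # 0.0
        ev_agi = p_agi_extinction * future_lives           # 1e9

        # Once the future dominates the ledger, any nonzero extinction
        # probability beats anything that "merely" helps the living.
        print(ev_climate, ev_agi)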

    It's just broken, stupid ethics that gives these people the warm fuzzies while they worship sociopathic billionaires.

  • override367 ALL minions Registered User regular
    This seemed like a tangent worth exploring in its own thread.

    If you're not familiar with it, Effective Altruism (EA) is a school of thought that's getting lots of attention these days. SBF of the FTX exchange and Elon Musk are both big proponents of it.

    On its face EA seems fine: it's focused on using charity money and work in the most effective ways possible, relying on research and facts rather than emotions. Of course, this attracts techbros who think they're smarter than everyone else, so they can justify not paying taxes because it's an "inefficient" use of their money and only they know how best to spend it.

    It also hits some significant logical problems when you scratch the surface. Peter Singer, one of EA's founders, says that all suffering and need should be considered equal and charity applied at a global scale. So helping your neighbor is selfish, because the money you spent there could help more people somewhere else in the world. This completely ignores the logistical differences between helping someone in front of you and someone on the other side of the planet.

    Perhaps the most distasteful branch of EA is longtermism: the notion that regular charities' needs are eclipsed by existential threats to humanity. Why fight easy-to-cure childhood illness when global warming will kill us all anyway? Obviously it's a waste of money, right? William MacAskill's book What We Owe the Future lays out arguments like this. That's where billionaires like Elon Musk find justifications for all kinds of shit: he's going to save us all by going to Mars, so any bad he does in the process is negligible.

    Anyway, that's my opening spiel; I don't have time to write out a longer essay on the topic. But it's been coming up, so I figured it might be worth having its own thread.

    See, I get this part when we're talking about billionaires, because collectively they have enough money and influence to fix this

    Thing is, they fuckin' aren't

    If Tesla were a nonprofit I might spend more than a quarter-second of thought before realizing how horseshit this is. If Musk had leveraged himself to try to worm his way into fossil fuel companies so that he could steer those colossi to a brighter future for humanity, likewise

  • MrMister Jesus dying on the cross in pain? Morally better than us. One has to go "all in". Registered User regular
    I won’t have time to write anything substantial about this for a while—probably not until next week—but I just want to register that most of the common critiques of EA you see in left-wing thinkpieces, including some being recirculated here, just have no basis in the reality of what EAs think or do.

    Anyone working in EA spaces or having EA colleagues knows that longtermism has gotten more attention recently—and I do find that regrettable in many ways—but an enormous amount of the work is still about mosquito nets and clean-burning cook stoves. Even the speculative investments in hard to quantify risks of very bad outcomes are not clearly wrong. Part of the reason we were able to develop COVID vaccines so quickly is because the Wellcome Trust funded a bunch of speculative research into MERS on the theory that some coronavirus was likely to cause a pandemic sooner or later. Thank god they did!! And EA now is moving hard into pandemic preparedness as an underfunded high impact area.

    Like, what are EAs about? Well, they’re not a monolith so different people have different interests. But the friend/former colleague I have who is most active in EA is an academic working on cost-effectiveness analysis and global health priority setting; her last works in progress talk was about how funders uniformly prioritizing more certain interventions over less certain ones structurally exacerbates health inequalities, and therefore funders need to develop a more context-sensitive approach to managing uncertainty. And I go from sitting in a talk like that to reading, say, Corey Robin doing his masturbatory Jacobin keyboard commando routine about how EA lol tech bros lol, and it’s like, man, this fucking sucks.

  • Captain Inertia Central Ohio Registered User regular
    I’m glad MM is here

    I came to discover EA through the longtermism critique, then because I've been forced to learn about SBF, and then again yesterday with Musk's testimony

  • FANTOMAS Flan Argentavis Registered User regular
    Isn't it just a way to avoid paying taxes?

  • Captain Inertia Central Ohio Registered User regular
    Seems the most notorious people claiming adherence to the philosophy use it to justify tax avoidance, yes

  • Doodmann Registered User regular
    Isn't this just a retread of Carnegie's Gospel of Wealth?

  • Incenjucar VChatter Seattle, WA Registered User regular
    Sounds like Objectivism 2.0: using academic language to justify being a bane on society.

  • Shivahn Unaware of her barrel shifter privilege Western coastal temptress Registered User, Moderator mod
    It also hits some significant logical problems when you scratch the surface. Peter Singer, one of EA's founders, says that all suffering and need should be considered equal and charity applied at a global scale. So helping your neighbor is selfish, because the money you spent there could help more people somewhere else in the world. This completely ignores the logistical differences between helping someone in front of you and someone on the other side of the planet.

    I don't think this (the idea that your money or time could be spent better elsewhere) is wrong, though. Like, it's easier to help your neighbor, generally, but the degree of help and the cost of help aren't going to be the same or anywhere close to proportional. My donation to basically anything that's going to save American children would save more lives if it went to mosquito nets, and my volunteering to do almost anything, while probably laudable, is not going to ameliorate nearly the suffering that some trivial donation elsewhere will. Some of the fruit out there is so, so low-hanging. It's logistically easier to help Dan down the street, but it's going to be hard to have the same impact that a few sub-dollar mosquito nets can have, because so many children are dying from such easily preventable causes.

    (maybe soon we'll be able to spend that money on mosquito nets to save American children though, as tropical climates and mosquitos head further north..)

  • shryke Member of the Beast Registered User regular
    MrMister wrote: »
    I won’t have time to write anything substantial about this for a while—probably not until next week—but I just want to register that most of the common critiques of EA you see in left-wing thinkpieces, including some being recirculated here, just have no basis in the reality of what EAs think or do.

    Anyone working in EA spaces or having EA colleagues knows that longtermism has gotten more attention recently—and I do find that regrettable in many ways—but an enormous amount of the work is still about mosquito nets and clean-burning cook stoves. Even the speculative investments in hard to quantify risks of very bad outcomes are not clearly wrong. Part of the reason we were able to develop COVID vaccines so quickly is because the Wellcome Trust funded a bunch of speculative research into MERS on the theory that some coronavirus was likely to cause a pandemic sooner or later. Thank god they did!! And EA now is moving hard into pandemic preparedness as an underfunded high impact area.

    Like, what are EAs about? Well, they’re not a monolith so different people have different interests. But the friend/former colleague I have who is most active in EA is an academic working on cost-effectiveness analysis and global health priority setting; her last works in progress talk was about how funders uniformly prioritizing more certain interventions over less certain ones structurally exacerbates health inequalities, and therefore funders need to develop a more context-sensitive approach to managing uncertainty. And I go from sitting in a talk like that to reading, say, Corey Robin doing his masturbatory Jacobin keyboard commando routine about how EA lol tech bros lol, and it’s like, man, this fucking sucks.

    To jump off this point, I was saying in the thread that spawned this one that EA as a concept is good, but EA as it's known colloquially (insofar as anything about this niche-ass discussion can be described as colloquial) is becoming associated with a specific flavour of tech bro "we're gonna colonize the stars and fear Roko's basilisk" silliness. In part because that's all anyone writing about EA ever talks about. The rest of it gets no coverage by almost anyone.

  • Incenjucar VChatter Seattle, WA Registered User regular
    Seems to be going along the same path as evo psych: something that is hypothetically useful being turned into a way for terrible people to justify their bullshit.

  • nexuscrawler Registered User regular
    I think the billionaire douches have co-opted something that has some merit.

    I'm not opposed to the notion of more targeted charity work and research based on it. It's easy to point to examples of well-meaning but pointless charity gestures. Look no further than the floods of people sending teddy bears after disasters, when dollars given to any charity on the ground would be better spent.

  • moniker Registered User regular
    Eh, I feel like there are levels. As someone who volunteers weekly at a food pantry, being effectively altruistic would mean giving them money rather than your two-year-old can of pumpkin pie filling. One actually manages to help people; the other does nothing but make you feel like you're helping without actually doing anything for people experiencing food insecurity. Same with volunteering: if you can show up on a predictable schedule (with caveats about life happening), it will be more meaningful than having your whole book club / football team / group of folks show up once in December.

    The existence of sites like Charity Navigator, to make sure that nonprofits are actually doing good rather than just assuming it, is great, because there is definitely a place for data-based skepticism in charity. Also, process efficiencies and basic management improvements. "What's the harm?" is a real concern that should be considered when making donations or providing some form of intervention. It's also beneficial to reduce the risk of giving money to a grift that spends all its donations on director pay and turning their logo pink or rainbow once a year. Spending money to model terraforming a planet without a magnetosphere isn't altruism or charity.

  • Captain Inertia Central Ohio Registered User regular
    Doodmann wrote: »
    Isn't this just a retread of Carnegie's Gospel of Wealth?

    For the douches that have made this topical, yes

  • tinwhiskers Registered User regular
    So my introduction to EA was via discussion about it in a college ethics class, particularly some of Peter Singer's writings - I believe it was Famine, Affluence, and Morality.


    Singer's The Life You Can Save is available for free as a PDF and as an audiobook narrated by Kristen Bell!


    It's not a terribly long book and I would propose most people maybe give it a quick skim.



    I'm not sure how intertwined longtermism and EA actually are in practice. But given the human tendency to be incredibly short-term focused, a philosophy organized around the idea that the most important charitable work we can do is for the longer-term future doesn't strike me as something inherently bad or wrong. "A society grows great when old men plant trees whose shade they know they shall never sit in," and all that.

  • Incenjucar VChatter Seattle, WA Registered User regular
    Unfortunately any time money is flowing in a direction the grifters are going to try to insert themselves into it as firmly as possible. Charities in general are a minefield of corruption, PR stunts, and decadent schmoozing, which is incredibly frustrating for people who actually want to help. The folks at the top LOVE to insert themselves into these, especially if they can use it to justify remaining wealthy in the eyes of voters.

  • DouglasDanger Pennsylvania Registered User regular
    Billionaires shouldn't exist, that's all there is to it.

  • amateurhour One day I'll be professionalhour The woods somewhere in Tennessee Registered User regular
    I'm of the opposite stance, wherein I do as much as I can locally to help out, because it's kind of a moot point for me to think globally (aside from the environment) due to my geographic location and finances.

    I guess for that reason, it's always kind of gone in one ear and out the other when a billionaire talks about "I'm giving away (x) dollars (with (y) stipulations attached)."

    Maybe I'm missing something though, but even at scale, it seems like the rich guy who says "I'm not giving that person money, they're just going to use it on (z)."

    All of this is new to me (Effective Altruism) so I prolly am wrong : )

  • Incenjucar VChatter Seattle, WA Registered User regular
    Locally and globally both have their benefits. Helping local people makes outcomes easier to predict, and it can create a larger community able to effect larger and larger change over time, as a population of altruistic people flourishes in an area that can't suddenly be wrecked by a local warlord or disaster.

  • tinwhiskers Registered User regular
    amateurhour wrote: »
    I'm of the opposite stance, wherein I do as much as I can locally to help out, because it's kind of a moot point for me to think globally (aside from the environment) due to my geographic location and finances.

    I guess for that reason, it's always kind of gone in one ear and out the other when a billionaire talks about "I'm giving away (x) dollars (with (y) stipulations attached)."

    Maybe I'm missing something though, but even at scale, it seems like the rich guy who says "I'm not giving that person money, they're just going to use it on (z)."

    All of this is new to me (Effective Altruism) so I prolly am wrong : )

    EA is a bit of a downer and is pretty coldly utilitarian.

    But as an example, say your neighbor's kid needs glasses and they can't afford them. And you have $300 you can spare.

    You could either:

    A) Buy a new outdoor toy
    B) Pay for little Cletus's glasses
    C) Donate to a charity that treats river blindness at $30/case

    What should you do with your $300?

    Estimates on this stuff are obviously not exact, but the lowest cost to save a life is probably mosquito nets to prevent malaria, at roughly $3,400 per life or $100 per Disability-Adjusted Life Year (DALY) averted.

    So if you see some local charity trying to raise, say, $300k for library upgrades, that money could go and save 100 lives instead of providing more books or AV materials or w/e. But people have a bias towards local things, and also towards things that are more specific and less statistical. Those 100 lives are not specific. It's: we will give 100,000 people mosquito nets, and so 2,000 of them who would have gotten malaria in the next 3 years won't (though maybe 2,000 still will), and of that 2,000, 100 would have died, so now the death toll is 100 instead of the 200 it would have been. You can't point to a specific person and go "we saved Malia's life." Maybe you did, but maybe it was someone a village over and she would have been fine regardless, or maybe only gotten slightly sick.
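
    A minimal sketch of that back-of-the-envelope math in Python (the dollar figures are the rough estimates from this post, not real charity data):

        # Back-of-the-envelope cost-effectiveness arithmetic (rough estimates above).
        COST_PER_LIFE_SAVED = 3_400        # $ per life, malaria nets (rough estimate)
        COST_PER_DALY = 100                # $ per Disability-Adjusted Life Year averted
        COST_PER_RIVER_BLINDNESS_CASE = 30

        spare_cash = 300                   # the "little Cletus's glasses" budget
        library_campaign = 300_000         # the hypothetical local fundraiser

        print(spare_cash // COST_PER_RIVER_BLINDNESS_CASE)   # 10 cases treated
        print(library_campaign / COST_PER_LIFE_SAVED)        # ~88, i.e. "about 100 lives"
        print(library_campaign / COST_PER_DALY)              # ~3,000 DALYs averted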

  • redx I(x)=2(x)+1 whole numbers Registered User regular
    tinwhiskers wrote: »
    I'm of the opposite stance, wherein I do as much as I can locally to help out, because it's kind of a moot point for me to think globally (aside from the environment) due to my geographic location and finances.

    I guess for that reason, it's always kind of gone in one ear and out the other when a billionaire talks about "I'm giving away (x) dollars (with (y) stipulations attached)."

    Maybe I'm missing something though, but even at scale, it seems like the rich guy who says "I'm not giving that person money, they're just going to use it on (z)."

    All of this is new to me (Effective Altruism) so I prolly am wrong : )

    EA is a bit of a downer and is pretty coldly utilitarian.

    But as an example, say your neighbor's kid needs glasses and they can't afford them. And you have $300 you can spare.

    You could either:

    A) Buy a new outdoor toy
    B) Pay for little Cletus's glasses
    C) Donate to a charity that treats river blindness at $30/case

    What should you do with your $300?

    Estimates on this stuff are obviously not exact, but the lowest cost to save a life is probably mosquito nets to prevent malaria, at roughly $3,400 per life or $100 per Disability-Adjusted Life Year (DALY) averted.

    So if you see some local charity trying to raise, say, $300k for library upgrades, that money could go and save 100 lives instead of providing more books or AV materials or w/e.

    If I spend $300 on a toy it will allow me to play harder. That will enable me to work harder, and over the long term increase the amount of money I can spend on helping people by more than $300.

    This is the best solution.

  • tbloxham Registered User regular
    If you have a reasonable amount of money for a rich person, and then choose to use some of it to give away books you like, then no worries. You do you.

    If you are a person with a reasonable amount of money who chooses to focus on long-term thinking in your charitable giving, then again, good idea. You probably need to balance that with urgent needs, but sick kids bring in more dollars than issues with Antarctic plankton blooms or whatnot; I very much doubt we are anywhere close to a sufficient fraction of charitable action being spent on the future.

    As ever, the issue is the wealth of the 0.01% making everything broken. If you have $50 million, you are incredibly wealthy, but not so wealthy that your existence warps any good works you attempt into evil. You're on the borderline, and might be evil if you aren't consciously being a good person. But if you have a billion and approach one of these activities, then your very existence corrupts them. It becomes nearly impossible for you to do long-term good due to the greed of other people. Long-term thinking doesn't deliver $1,000 of mosquito nets in exchange for a $1,050 donation. It's like, a new paper on climate equity in exchange for a $1,000 donation. This makes billionaires uniquely poorly situated to do this work, because writing a new paper is something that might cost $1,000, or $10,000, or $1 million. And if some idiot billionaire is here, why not say it costs a million?

    So, if you are a billionaire, and MUST remain a billionaire, you need to focus on short-term, "itemized" giving. Mosquito nets, food, small cash grants to local communities to build wells or plant trees, and so on. Convert your desirable liquid cash into boring workaday stuff and have that be all you offer. People will take what they need, and that's that.

    If you wish to engage in long-term altruism as a billionaire, step one is to become merely rich. Give away 95% of your money today, to the government, and pledge to self-tax at a rate of 95% hereafter on money which would take you above $50 million in assets. Then you can engage in long-term thinking with the rest, if you so choose.

    If you are NOT a billionaire, and wish to engage with a climate charity which focuses on long-term solutions (say, their plan is to buy federal oil and gas grants and refuse to drill there, thus raising the price of fossil fuels), then go for it like you would with any charity.

    "That is cool" - Abraham Lincoln
  • nexuscrawler Registered User regular
    Even these cold utilitarian solutions require people on the ground who want to do good. The top-down view of EA kinda forgets that part.

  • Phoenix-D Registered User regular
    redx wrote: »
    I'm of the opposite stance, wherein I do as much as I can locally to help out, because it's kind of a moot point for me to think globally (aside from the environment) due to my geographic location and finances.

    I guess for that reason, it's always kind of gone in one ear and out the other when a billionaire talks about "I'm giving away (x) dollars (with (y) stipulations attached)."

    Maybe I'm missing something though, but even at scale, it seems like the rich guy who says "I'm not giving that person money, they're just going to use it on (z)."

    All of this is new to me (Effective Altruism) so I prolly am wrong : )

    EA is a bit of a downer and is pretty coldly utilitarian.

    But as an example, say your neighbor's kid needs glasses and they can't afford them. And you have $300 you can spare.

    You could either:

    A) Buy a new outdoor toy
    B) Pay for little Cletus's glasses
    C) Donate to a charity that treats river blindness at $30/case

    What should you do with your $300?

    Estimates on this stuff are obviously not exact, but the lowest cost to save a life is probably mosquito nets to prevent malaria, at roughly $3,400 per life or $100 per Disability-Adjusted Life Year (DALY) averted.

    So if you see some local charity trying to raise, say, $300k for library upgrades, that money could go and save 100 lives instead of providing more books or AV materials or w/e.

    If I spend $300 on a toy it will allow me to play harder. That will enable me to work harder, and over the long term increase the amount of money I can spend on helping people by more than $300.

    This is the best solution.

    The stupid part is that this is, to a limited extent, an actual problem: spend wealth on fixing the problem, or spend wealth to generate MORE wealth to fix the problem more.

    At the scale of Musk and co. that's not relevant. And of course SBF gave the game away, admitting he literally does it as a front. Which a bunch of the other billionaires almost assuredly do too.

  • redx I(x)=2(x)+1 whole numbers Registered User regular
    Phoenix-D wrote: »
    redx wrote: »
    I'm of the opposite stance, wherein I do as much as I can locally to help out, because it's kind of a moot point for me to think globally (aside from the environment) due to my geographic location and finances.

    I guess for that reason, it's always kind of gone in one ear and out the other when a billionaire talks about "I'm giving away (x) dollars (with (y) stipulations attached)."

    Maybe I'm missing something though, but even at scale, it seems like the rich guy who says "I'm not giving that person money, they're just going to use it on (z)."

    All of this is new to me (Effective Altruism) so I prolly am wrong : )

    EA is a bit of a downer and is pretty coldly utilitarian.

    But as an example, say your neighbor's kid needs glasses and they can't afford them. And you have $300 you can spare.

    You could either:

    A) Buy a new outdoor toy
    B) Pay for little Cletus's glasses
    C) Donate to a charity that treats river blindness at $30/case

    What should you do with your $300?

    Estimates on this stuff are obviously not exact, but the lowest cost to save a life is probably mosquito nets to prevent malaria, at roughly $3,400 per life or $100 per Disability-Adjusted Life Year (DALY) averted.

    So if you see some local charity trying to raise, say, $300k for library upgrades, that money could go and save 100 lives instead of providing more books or AV materials or w/e.

    If I spend $300 on a toy it will allow me to play harder. That will enable me to work harder, and over the long term increase the amount of money I can spend on helping people by more than $300.

    This is the best solution.

    The stupid part is that this is, to a limited extent, an actual problem: spend wealth on fixing the problem, or spend wealth to generate MORE wealth to fix the problem more.

    At the scale of Musk and co. that's not relevant. And of course SBF gave the game away, admitting he literally does it as a front. Which a bunch of the other billionaires almost assuredly do too.

    Like, it's fine when it is being used by people with an understanding of the issues, access to accurate information, the ability to make correct judgements, and intellectual honesty.


    It's popular with a whole bunch of tech-bros, and with a bunch of finance billionaires.

  • tinwhiskers Registered User regular
    I think it will be interesting to see if any additional billionaires follow the Patagonia example, with Purpose Trusts.
    Yvon Chouinard, the founder of clothing maker Patagonia, has transferred the voting stock (representing 2% of the value of the company) of this $3 billion company to a purpose trust. And, the non-voting stock (representing 98% of the value of the company), that receives the annual profits of the company worth $100 million, transferred to a private foundation. This structure was decided upon as the best way of meeting Chouinard’s objectives, which includes worker well-being and climate action, and after his children decided that they did not want to own the company.


    Though I'm not sure if something like this is more effective than just selling the company/shares and donating the cash, or setting up a foundation with it.
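
    For a rough sense of the split being described (using only the figures quoted above):

        # Rough arithmetic on the Patagonia purpose-trust structure (figures from the quote).
        company_value = 3_000_000_000    # ~$3B company
        annual_profit = 100_000_000      # ~$100M/yr in profits

        voting_stake = 0.02 * company_value      # $60M of voting stock -> purpose trust
        nonvoting_stake = 0.98 * company_value   # $2.94B of non-voting stock -> foundation

        print(voting_stake, nonvoting_stake, annual_profit)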

  • Paladin Registered User regular
    I've always wondered what would happen if you became a trillionaire and just burned your money.

    If every billionaire liquidated all their funds and gave everyone thousands of dollars, what would happen? What if they just torched it all? What's the difference?

  • rahkeesh2000 Registered User regular
    It also hits some significant logical problems when you scratch the surface. Peter Singer, one of EA's founders, says that all suffering and need should be considered equal and charity applied at a global scale. So helping your neighbor is selfish, because the money you spent there could help more people somewhere else in the world. This completely ignores the logistical differences between helping someone in front of you and someone on the other side of the planet.

    Seems like a gross misrepresentation. His point has always been that a bunch of Starbucks coffees to you is someone else's life via a vaccine they can't otherwise afford. Even taking into account logistical differences, the disparity in quality of life for the same money can be staggering. He believes (maybe wrongly) that people ignore contributing to the relatively easy fix for this situation only because it's not local to them.

    In terms of charity, he promotes really sure-fire cheap causes with big upside that only exist due to basic neglect, because those are easy utilitarian calculations - usually addressing very curable diseases where there just isn't the money or care to do the curing. As long as you focus on such extreme scenarios, the downsides of utilitarianism don't come to the fore so much. And there are enough areas of extreme need that all but the richest don't have to look into murkier scenarios to spend their cash.

    This is also pretty far disconnected from the billionaire assholes dodging taxes. Singer donated like a fourth of his shitty teacher salary to charity; by the same standard those rich guys should be giving at least 95%.

  • nexuscrawler Registered User regular
    That's kind of a super-oversimplification though. You can say a vaccine costs blank and your coffee costs blank, but that's meaningless. Once you're doing anything at scale you need supporting people, infrastructure, and the associated costs.

  • moniker Registered User regular
    If only there were some way to coercively raise funds from the rich on a percentage basis to redistribute them to a more beneficial purpose...
