
[Cambridge Analytica], [Facebook], and Data Security.


Posts

  • Paladin Registered User regular
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    It's a pretty low stakes argument, because how does accepting vs rejecting the status quo change your life? What can you do that's not purely performative?

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • Nyysjan Finland Registered User regular
    Paladin wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    It's a pretty low stakes argument, because how does accepting vs rejecting the status quo change your life? What can you do that's not purely performative?
    Me, personally? I can argue here on the internets in hopes that maybe I can convince someone else, who might convince someone else, who can convince someone else, and so on, until maybe someone who can actually change things eventually gets convinced.

    And, really, what here is not going to be utterly low stakes? If our not being able to make meaningful changes to things is somehow a strike against an argument, we might as well shut down the forums.
    Like, this feels very close to accusing people of just virtue signalling, which is usually not a sign of good discourse.

  • Paladin Registered User regular
    Nyysjan wrote: »
    Paladin wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    It's a pretty low stakes argument, because how does accepting vs rejecting the status quo change your life? What can you do that's not purely performative?
    Me, personally? I can argue here on the internets in hopes that maybe I can convince someone else, who might convince someone else, who can convince someone else, and so on, until maybe someone who can actually change things eventually gets convinced.

    And, really, what here is not going to be utterly low stakes? If our not being able to make meaningful changes to things is somehow a strike against an argument, we might as well shut down the forums.
    Like, this feels very close to accusing people of just virtue signalling, which is usually not a sign of good discourse.

    I don't care at all about virtue signaling unless it's shutting down avenues of discussion with a "how dare you" factor. Things got a little heated because someone was being jokey and unclear in their intention. I don't feel like we need to severely quash that kind of behavior unless we're doing something real intensive that requires everybody to be 100% on board and on task. Since we don't have a high stakes deliverable like writing templates for congress or researching ways to support improvement of social media, I feel like we don't have to strictly enforce a particular tone of discussion.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • daveNYC Why universe hate Waspinator? Registered User regular
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    The point was that Facebook tends to suck when it comes to critical thinking, so they’re as likely to ban The Onion or something. But you do you.

    Shut up, Mr. Burton! You were not brought upon this world to get it!
  • shryke Member of the Beast Registered User regular
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    The point was that Facebook tends to suck when it comes to critical thinking, so they’re as likely to ban The Onion or something. But you do you.

    How do they "tend to suck" when they never even really try?

    Like, this Pelosi thing is the perfect example of how they aren't really trying to do anything but avoid getting yelled at by Donald Trump.

  • daveNYC Why universe hate Waspinator? Registered User regular
    shryke wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Nyysjan wrote: »
    daveNYC wrote: »
    Played straight or sarcastic it’s still a good point. This is Facebook we’re talking about, their history of judgement calls isn’t great.

    Therefore we should not ask them to even try and just accept free speech absolutism that pretty much always favors fascists and liars?

    Pretty sure that’s exactly what I said you goose.
    Well, I'm going to disagree.
    Just accepting the status quo is not the answer.

    The point was that Facebook tends to suck when it comes to critical thinking, so they’re as likely to ban The Onion or something. But you do you.

    How do they "tend to suck" when they never even really try?

    Like, this Pelosi thing is the perfect example of how they aren't really trying to do anything but avoid getting yelled at by Donald Trump.

    They seem to suck at everything else that they do, so I’m just assuming it’s a core company value at this point.

    If Apple were doing something like this I wouldn’t be happy because I’m generally a free speech absolutist type and the internet is an overlap of a government communication infrastructure that’s been handed over to private companies to do what they wish with it. Within that space, and especially with social media which seems to be a natural monopoly type situation, yeah, some first amendment protections need to be in play. However, Apple generally seems somewhat competent sometimes on some issues. I might not like what they’re doing, but at least they wouldn’t be crapping on everything and everyone. Maybe.

    Facebook though? No-talent ass-clowns doesn’t even begin to cover them.

    Shut up, Mr. Burton! You were not brought upon this world to get it!
  • CelestialBadger Registered User regular
    Apple doesn’t run any social media. That’s why they don’t run into free speech problems.

  • daveNYC Why universe hate Waspinator? Registered User regular
    Apple doesn’t run any social media. That’s why they don’t run into free speech problems.

    Well... yeah, I know. Hypothetical situation and whatnot. Point being that Facebook is generally bad at everything they do, so their policy will probably end up being a shitshow.

    Shut up, Mr. Burton! You were not brought upon this world to get it!
  • CelestialBadger Registered User regular
    edited June 2019
    Exactly 100% of the big social media platforms are running into issues where free speech breeds Nazis. It seems to be a hard nut to crack. Only small hand-moderated forums like this one seem to be able to get rid of the Nazis.

    The only thing like social media that Apple does is the App Store approval process - which is hand-moderated.

  • HamHamJ Registered User regular
    HamHamJ wrote: »
    I'm sure Facebook will make the determination between defamation and satire in a fair and objective manner.

    So tell me - should Facebook not remove blatantly defamatory material? Because that is actually what is going on at the moment - they are refusing to remove a clearly faked video of Pelosi that was posted with the intent to defame her.

    Also, if your satire reads as defamation, then it's shitty satire.

    I've taken a while to think about this and I think that the decision that something is defamatory should be made in a court of law and not by an anonymous Facebook moderator being paid minimum wage. And then the court can instruct Facebook to take it down.

    I don't think there is a clear bright line between this and Daily Show interviews that make politicians look like idiots or all the Bush memes or that thing with Trump staring into the solar eclipse.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • CelestialBadger Registered User regular
    edited June 2019
    4chan can make 100,000 defamatory memes in the time it takes for a defamation court case to wind its way through the legal system.

    Here’s an idea: how about the courts should be able to force Facebook to issue retractions, like newspapers have to do. Every person whose feed displayed a meme the courts have struck down as defamatory gets a retraction pinned to the top of their feed, one that will not disappear until manually dismissed.

    This would quickly become annoying enough for social media to have to do something about it or lose users.
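
    A minimal sketch of that mechanism, purely to illustrate the idea (every name here is hypothetical, not any real platform's API):

        # Hypothetical sketch: court-ordered retractions pinned to the top
        # of a feed until the user manually dismisses them.
        from dataclasses import dataclass, field

        @dataclass
        class Retraction:
            meme_id: str            # the item the court struck down
            text: str               # the court-mandated retraction text
            dismissed: bool = False

        @dataclass
        class Feed:
            posts: list = field(default_factory=list)
            retractions: list = field(default_factory=list)

            def render(self) -> list:
                # Undismissed retractions always sort above ordinary posts.
                pinned = [f"RETRACTION: {r.text}"
                          for r in self.retractions if not r.dismissed]
                return pinned + self.posts

            def dismiss(self, meme_id: str) -> None:
                for r in self.retractions:
                    if r.meme_id == meme_id:
                        r.dismissed = True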

  • Paladin Registered User regular
    I'm in favor of third party oversight with decisions challenged in court. Having the court rule in every case is impractical.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • AngelHedgie Registered User regular
    HamHamJ wrote: »
    HamHamJ wrote: »
    I'm sure Facebook will make the determination between defamation and satire in a fair and objective manner.

    So tell me - should Facebook not remove blatantly defamatory material? Because that is actually what is going on at the moment - they are refusing to remove a clearly faked video of Pelosi that was posted with the intent to defame her.

    Also, if your satire reads as defamation, then it's shitty satire.

    I've taken a while to think about this and I think that the decision that something is defamatory should be made in a court of law and not by an anonymous Facebook moderator being paid minimum wage. And then the court can instruct Facebook to take it down.

    I don't think there is a clear bright line between this and Daily Show interviews that make politicians look like idiots or all the Bush memes or that thing with Trump staring into the solar eclipse.

    There's a very bright line - falsification. The Pelosi video was intentionally edited to make her appear to be drunk/ill. That is a vast difference from what TDS does when they hand a politician enough rope to hang themselves with, and is the reason Stephen Colbert broke kayfabe to chew out Julian Assange over the Collateral Murder video.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • shryke Member of the Beast Registered User regular
    Exactly 100% of the big social media platforms are running into issues where free speech breeds Nazis. It seems to be a hard nut to crack. Only small hand-moderated forums like this one seem to be able to get rid of the Nazis.

    The only thing like social media that Apple does is the App Store approval process - which is hand-moderated.

    It's not nearly as hard as you'd think. To get rid of every nazi? Yeah, sure, that's basically impossible. To massively tamp down on them? Not that hard overall.

    The problem is you need to be willing to take a stand against nazis. We saw this exact thing recently pop up about twitter:

    [embedded tweet not preserved]

    Like, it's not hard to, say, ban that video about Pelosi. They know it's fucking fake. Making the determination was not the problem. It was what to do about it that they failed on.
    Facebook has said it will only downgrade its visibility in users’ newsfeeds and attach a link to a third-party factchecking site pointing out that the clip is misleading.
    https://www.theguardian.com/technology/2019/may/24/facebook-leaves-fake-nancy-pelosi-video-on-site
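
    For contrast with outright removal, here is a toy sketch of what "downgrade its visibility" amounts to; the penalty factor and field names are made up, not Facebook's actual system:

        # Toy sketch: downranking keeps a flagged item on the site but
        # shrinks its feed-ranking score and attaches a fact-check link.
        # The 0.2 factor and all field names are hypothetical.
        DOWNRANK_FACTOR = 0.2

        def rank_score(base_score: float, flagged_as_false: bool) -> float:
            # A flagged item stays up; it just scores lower in feeds.
            return base_score * (DOWNRANK_FACTOR if flagged_as_false else 1.0)

        def attach_factcheck(item: dict, factcheck_url: str) -> dict:
            # The item also gets a link to a third-party fact check.
            if item.get("flagged_as_false"):
                item["factcheck_url"] = factcheck_url
            return item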

  • CelestialBadger Registered User regular
    Of course it’s notable that in banning white supremacists, the PA forums have also become a very unfriendly place for regular Republicans too... draw your conclusions from that...

  • DarkPrimus Registered User regular
    Of course it’s notable that in banning white supremacists, the PA forums have also become a very unfriendly place for regular Republicans too... draw your conclusions from that...

    It's difficult to defend the Republican party when its elected members defend white supremacists.

  • schuss Registered User regular
    DarkPrimus wrote: »
    Of course it’s notable that in banning white supremacists, the PA forums have also become a very unfriendly place for regular Republicans too... draw your conclusions from that...

    It's difficult to defend the Republican party when its elected members defend white supremacists.

    Yep, until the party actually does something to combat the incompetent racists in charge, they're part and parcel of what's going on.

  • Mill Registered User regular
    Pretty sure the courts will find that video to qualify as intentional defamation. I just did a pass through their ToS; unfortunately, we can't hit them for being inconsistent on ToS enforcement, because they have fuck all in there about telling people to fuck off with intentionally false shit designed to defame someone. I'm sure a good judge will tell FB to fuck off there, and that yes, they need to be acting against people who use their platform as a staging point to defame others. The closest thing I see on containing BS information is that they'll downgrade fake news, and they seem to act like it's the same thing as satire (pretty sure The Onion and most other honest satire sites will let you know they are satire, whereas fake news tries to hide the fact that it's BS).

  • AngelHedgie Registered User regular
    Facebook lawyer says the quiet parts out loud to try to dismiss Cambridge Analytica lawsuit:
    A lawyer for Facebook argued on Wednesday that its users had no expectation of privacy when using the social network, pushing for a judge to throw out a class-action lawsuit related to the Cambridge Analytica scandal.

    “There is no invasion of privacy at all, because there is no privacy,” on Facebook or any other social media site, company attorney Orin Snyder told U.S. District Judge Vince Chhabria.

    That may come as a surprise to those who have followed CEO Mark Zuckerberg’s pivot to privacy in recent months. He even wrote a manifesto for a “privacy-focused vision” of social media in March, saying he believes the future of communication lies in private, encrypted services.

    But his lawyer’s line of reasoning in court echoes what the company and Zuckerberg previously said both publicly and privately in past years — and explains how the company built an online advertising business that is now rivaled only by Google.

    So, unsurprisingly, the "privacy" pivot is solely CYA. Zuckerberg needs to go.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Paladin Registered User regular
    Tall order considering he is CEO and chairman of the board and majority stakeholder of his own company and wrote pretty much the whole book on how to remove threats to your own power over a company. He just survived a vote to step down two days ago, mainly because he controls 60% of the votes.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • AngelHedgie Registered User regular
    Paladin wrote: »
    Tall order considering he is CEO and chairman of the board and majority stakeholder of his own company and wrote pretty much the whole book on how to remove threats to your own power over a company. He just survived a vote to step down two days ago, mainly because he controls 60% of the votes.

    Yes, that's why he needs to go. His entire life has been spent making sure he would never be accountable to anyone, with unsurprising results.

    Also, the Daily Beast found the person behind pushing the fake Pelosi video - unsurprisingly, he's a fake news peddler with a history of domestic abuse:
    On May 22, a Donald Trump superfan and occasional sports blogger from the Bronx named Shawn Brooks posted a video clip of Nancy Pelosi on his personal Facebook page. The clip showed Pelosi at her most excitable, stammering during a press conference as she voiced frustration over an abortive infrastructure meeting with the president. Brooks’ commentary on the video was succinct: “Is Pelosi drunk?”

    Thirteen minutes later, a Facebook official told The Daily Beast, Brooks posted a very different Pelosi video to a Facebook page called Politics WatchDog—one of a series of hyperpartisan news operations Brooks runs (with help, he claims). This clip had been altered to slow Pelosi down without lowering the pitch of her voice. The effect was to make it sound as though the Speaker of the House was slurring her words drunkenly while criticizing Donald Trump.

    Fifteen minutes after that, the same doctored video appeared on a second Facebook page Brooks manages, AllNews 24/7. This clip was identical to the Politics WatchDog video in every way, except that it didn’t carry the Politics WatchDog branding that was superimposed over the earlier video. Whoever posted it had access to the director’s cut. On both pages the clip was accompanied by the exact same dispassionate, newsy prose: “House Speaker Nancy Pelosi on President Trump walking out infrastructure meeting: ‘It was very, very, very strange.’”

    The video was an instant social media smash, surging through the internet’s well-worn ley lines of credulity and venom. It was shared more than 60,000 times on Facebook and accumulated 4 million page views from links. “Drunk as a skunk,” mused actor turned alt-right curmudgeon James Woods, whose tweet of the video scored 17,000 retweets and 55,000 likes. “What is wrong with Nancy Pelosi?”, wrote Rudy Giuliani, the president’s personal lawyer, in a tweet linking to the AllNews 24/7 post. “Her speech pattern is bizarre.”

    Brooks, a 34-year-old day laborer currently on probation after pleading guilty to domestic battery, claims that his “drunk” commentary on an unaltered Pelosi video had no connection to the now-infamous fake clip that premiered less than 15 minutes later. “I wasn’t the individual who created that Pelosi video,” he insisted in a telephone interview.
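
    A technical aside on "altered to slow Pelosi down without lowering the pitch of her voice": simply playing a clip at reduced speed would also drop the pitch, so an edit like this takes a pitch-preserving time stretch. A minimal sketch with the librosa audio library, using a hypothetical input file:

        # Pitch-preserving slowdown of the kind described above.
        # "speech.wav" is a hypothetical stand-in, not the actual clip.
        import librosa
        import soundfile as sf

        y, sr = librosa.load("speech.wav", sr=None)          # native sample rate
        slowed = librosa.effects.time_stretch(y, rate=0.75)  # rate < 1 slows playback
        sf.write("speech_slowed.wav", slowed, sr)            # pitch stays the same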

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Descendant X Skyrim is my god now. Outpost 31 Registered User regular
    Facebook lawyer says the quiet parts out loud to try to dismiss Cambridge Analytica lawsuit:
    A lawyer for Facebook argued on Wednesday that its users had no expectation of privacy when using the social network, pushing for a judge to throw out a class-action lawsuit related to the Cambridge Analytica scandal.

    “There is no invasion of privacy at all, because there is no privacy,” on Facebook or any other social media site, company attorney Orin Snyder told U.S. District Judge Vince Chhabria.

    That may come as a surprise to those who have followed CEO Mark Zuckerberg’s pivot to privacy in recent months. He even wrote a manifesto for a “privacy-focused vision” of social media in March, saying he believes the future of communication lies in private, encrypted services.

    But his lawyer’s line of reasoning in court echoes what the company and Zuckerberg previously said both publicly and privately in past years — and explains how the company built an online advertising business that is now rivaled only by Google.

    So, unsurprisingly, the "privacy" pivot is solely CYA. Zuckerberg needs to go.

    That's not going nearly far enough. Facebook itself needs to go.

    Garry: I know you gentlemen have been through a lot, but when you find the time I'd rather not spend the rest of the winter TIED TO THIS FUCKING COUCH!
  • AngelHedgie Registered User regular
    Oh hey, today we have Google, a Russian troll farm, and one hell of an unethical experiment:
    FOR MORE THAN two years, the notion of social media disinformation campaigns has conjured up images of Russia's Internet Research Agency, an entire company housed on multiple floors of a corporate building in St. Petersburg, concocting propaganda at the Kremlin's bidding. But a targeted troll campaign today can come much cheaper—as little as $250, says Andrew Gully, a research manager at Alphabet subsidiary Jigsaw. He knows because that's the price Jigsaw paid for one last year.

    As part of research into state-sponsored disinformation that it undertook in the spring of 2018, Jigsaw set out to test just how easily and cheaply social media disinformation campaigns, or "influence operations," could be bought in the shadier corners of the Russian-speaking web. In March 2018, after negotiating with several underground disinformation vendors, Jigsaw analysts went so far as to hire one to carry out an actual disinformation operation, assigning the paid troll service to attack a political activism website Jigsaw had itself created as a target.
    Two weeks later, SEOTweet reported back to Jigsaw that it had posted 730 Russian-language tweets attacking the anti-Stalin site from 25 different Twitter accounts, as well as 100 posts to forums and blog comment sections of seemingly random sites, from regional news sites to automotive and arts-and-crafts forums. Jigsaw says a significant number of the tweets and comments appeared to be original posts written by humans, rather than simple copy-paste bots. "These aren't large numbers, and that’s intentional," says Jigsaw's Gully. "We weren’t trying to create a worldwide disinformation campaign about this. We just wanted to see if threat actors could provide a proof of concept."

    Wow. This is just disturbing. "Hey, for research, let's fund a criminal to attack a fake target we set up through real channels affecting real people!"

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • AngelHedgie Registered User regular
    And Pelosi is calling out Facebook on their gooseshit:
    The Speaker of the House publicly castigated Facebook on Wednesday for both enabling Russian interference in the 2016 presidential election and its refusal to take down a video of the congresswoman that was edited to make it appear she was drunk.

    Pelosi didn't even come close to pulling her punches, arguing in an interview with KQED that the Mark Zuckerberg-helmed behemoth wasn't just an innocent victim of a foreign campaign to spread disinformation — rather, it happily played along.

    "I think they have proven that they were willing enablers of the Russian interference in our election," Pelosi said, according to Scott Shafer, a senior editor at KQED. "They're lying to the public."

    She also pointed out that if the video was about Zuckerberg, the response would be very different:


    "They are lying to the public, I wonder what they would do if Mark Zuckerberg was portrayed -- slowed down, appeared drunk - I don't even drink ... but if it was one of their own, would it be their policy, or is it just a woman?"

    Marisa Lagos is a reporter for KQED.

    In follow-up, someone has decided to test Pelosi's theory:
    This week, the world saw a deepfake video of Mark Zuckerberg on Instagram. The video resurrected the age-old internet joke: How can you tell the difference between Zuckerberg and a piece of software? And the video has finally given us a definitive answer: The software is more likable.

    ...The video was a clear challenge to Instagram (which is owned by Facebook) to see if they would take it down and be exposed as hypocrites, since they refused to do so with the clearly doctored Drunk Nancy Pelosi video from this past May. But for now, Facebook is sticking to its guns of not caring if "information you post on Facebook must be true," much to the relief of the site's biggest demographic, which we're guessing is anti-vaxxer dog moms nowadays.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • AngelHedgie Registered User regular
    The Verge has an exposé of Facebook's content moderation farms, where contractors work for a pittance, subjected to the worst of humanity - a place where management refused to tell the employees that one of their own died there.

    I'll post the key findings in a spoiler, but first, here's the warning at the top of the story:
    Content warning: This story contains descriptions of violent acts against people and animals, accounts of sexual harassment and post-traumatic stress disorder, and other potentially disturbing content.
    KEY FINDINGS
    • Facebook’s content moderation site in Tampa, FL, which is operated by the professional services firm Cognizant, is its lowest-performing site in North America. It has never consistently enforced Facebook’s policies with 98 percent accuracy, as stipulated in Cognizant’s contract.
    • For the first time, three former Facebook moderators in North America are breaking their nondisclosure agreements and going on the record to discuss working conditions on the site.
    • A Facebook content moderator working for Cognizant in Tampa had a heart attack at his desk and died last year. Senior management initially discouraged employees from discussing the incident, for fear it would hurt productivity.
    • Tampa workers have filed two sexual harassment cases against coworkers since April. They are now before the US Equal Employment Opportunity Commission.
    • Facilities at the Tampa site are often filthy, with workers reporting that the office’s only bathroom has repeatedly been found smeared with feces and menstrual blood.
    • Workers have also found pubic hair and fingernails at their desks, along with other bodily waste.
    • Verbal and physical fights at the office are common. So are reports of theft.
    • The Phoenix site has been dealing with an infestation of bed bugs for the past three months.
    • Facebook says it will conduct an audit of its partner sites and make other changes to promote the well-being of its contractors. It said it would consider making more moderators full-time employees in the future, and hopes to someday provide counseling for moderators after they leave.

    Facebook delenda est.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Descendant X Skyrim is my god now. Outpost 31 Registered User regular
    edited June 2019
    Everyone interested in this thread should go have a gander at Neal Stephenson's new book Fall; or, Dodge in Hell. I've not finished it yet and I won't spoil anything, but it has some very interesting ideas on what might have to happen in order for things to get better. Of course, the thing that might have to happen makes things worse, but that's just Neal Stephenson for you.

    Edit: Just had a read of the article @AngelHedgie posted. If you need to employ that many people to moderate content and they are getting fucking PTSD from it, your platform is the problem. Facebook needs to be shut down.

    Garry: I know you gentlemen have been through a lot, but when you find the time I'd rather not spend the rest of the winter TIED TO THIS FUCKING COUCH!
  • Inquisitor77 2 x Penny Arcade Fight Club Champion A fixed point in space and time Registered User regular
    edited June 2019
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

  • FANTOMAS Flan Argentavis Registered User regular
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I don't think it's a sustainability issue; from Hedgie's article it wouldn't seem that Facebook's budget is primarily spent on moderators.

    Yes, with a quick verbal "boom." You take a man's peko, you deny him his dab, all that is left is to rise up and tear down the walls of Jericho with a ".....not!" -TexiKen
  • Paladin Registered User regular
    edited June 2019
    FANTOMAS wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I don't think it's a sustainability issue; from Hedgie's article it wouldn't seem that Facebook's budget is primarily spent on moderators.

    Edit: ah oops misread

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • CelestialBadger Registered User regular
    AI moderation is the big dream for social media companies. That way you don’t need to pay for or traumatize humans.

  • Captain Inertia Registered User regular
    I think the greater point is not the dollars or business model impact but whether such platforms should exist if this is the toll that moderating them takes on humans...

  • Paladin Registered User regular
    I think the greater point is not the dollars or business model impact but whether such platforms should exist if this is the toll that moderating them takes on humans...

    There is a laundry list of jobs much more hellish than this that we should consider if this is going to be our stand. Phasing out the worst jobs in the world is not something humanity has yet set its sights on.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • HamHamJ Registered User regular
    AI moderation is the big dream for social media companies. That way you don’t need to pay for or traumatize humans.

    So what you're saying is that Skynet is going to be a Facebook moderation AI that decides hacking the nuclear launch system is better than putting up with humans any longer.

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • Forar #432 Toronto, Ontario, Canada Registered User regular
    If this isn’t in the reboot, it damned well should be.

    First they came for the Muslims, and we said NOT TODAY, MOTHERFUCKER!
  • Captain Inertia Registered User regular
    edited June 2019
    Paladin wrote: »
    I think the greater point is not the dollars or business model impact but whether such platforms should exist if this is the toll that moderating them takes on humans...

    There is a laundry list of jobs much more hellish than this that we should consider if this is going to be our stand. Phasing out the worst jobs in the world is not something humanity has yet set its sights on.

    Quite a big difference between the necessity of crime scene cleanup and the necessity of making it easy for edgelords to post videos of said crime happening

  • Monwyn Apathy's a tragedy, and boredom is a crime. A little bit of everything, all of the time. Registered User regular
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.

  • evilmrhenry Registered User regular
    Monwyn wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.

    Or the Something Awful Forum's joining fee of $10. The real problem is that none of the major social media companies have realized that there's content that they don't want on their platform, so they don't want to put any barriers in place.

  • HamHamJ Registered User regular
    Monwyn wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.
    Monwyn wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.

    That's not nominal. I think that would literally collapse Youtube as a business model.
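
    Back-of-envelope, assuming the roughly 500 hours of video uploaded to YouTube every minute that Google was citing around 2019:

        # What a $5-per-15-minutes upload fee implies at YouTube scale.
        # The ~500 hours/minute upload rate is Google's oft-cited 2019 figure.
        HOURS_UPLOADED_PER_MINUTE = 500
        FEE_PER_15_MIN = 5.00

        fee_per_hour = FEE_PER_15_MIN * 4                       # $20 per uploaded hour
        per_minute = HOURS_UPLOADED_PER_MINUTE * fee_per_hour   # $10,000 every minute
        per_year = per_minute * 60 * 24 * 365                   # ~$5.3 billion a year
        print(f"${per_minute:,.0f} per minute, ${per_year / 1e9:.1f}B per year")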

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
  • Captain Inertia Registered User regular
    HamHamJ wrote: »
    Monwyn wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.
    Monwyn wrote: »
    Yeah human content moderation has actually been a growing issue for a while. People clamor for "curated" content but then don't realize that someone will have to watch all the filth and dreck in order to determine if it needs to be removed. Wired had an article about it a while ago: https://www.wired.com/2014/10/content-moderation/

    I think if open platforms like Youtube and Facebook actually had to properly moderate their content, and then had to provide the physical and mental health support actually commensurate with those jobs, there is an open question as to whether their business models would be sustainable. The fact of the matter is that a place where you can upload literally anything and show it to literally everyone in the world requires a level of moderation that is likely impossible to support if you were to apply, say, American public television broadcast standards to them. Facebook would get shut down immediately based on the fines alone.

    I've long thought that YouTube (at the very least) should require a nominal fee per, say, fifteen minutes uploaded. $5 or something. It's not much of a barrier to entry, but it's enough to make people have to actually think before tossing shit up.

    Implicit in the idea of a Marketplace of Ideas is that there are costs of doing business. That's... not really the case with the internet, anymore.

    That's not nominal. I think that would literally collapse Youtube as a business model.

    Probably

    I don’t know if I hold the big, irresponsible social media and tech companies in any less contempt than I do AEP for frying the planet though. And I want AEP to pay the costs of the carbon they put in the air even if it collapses their business model.

    There are definite costs to society from Facebook’s and YouTube’s and Twitter’s shit

  • DarkPrimus Registered User regular
    "But [X] would collapse if we forced them to change their business model" is sometimes not so much a defense against forcing a company to change as it is revealing that perhaps a company's current business model is bad, actually.
