
[Freedom Of Speech]: More Than The First Amendment


Posts

  • DarkPrimus Registered User regular
    edited July 26
    Paladin wrote: »
    DarkPrimus wrote: »
    If you want to argue that there is merit in white supremacy, go ahead and nail yourself to that cross. I've made my position very clear.

    I doubt that both of us have made anything very clear. I am not sure how the concepts we discussed apply in terms of legal precedent, current events, constitutionality, or philosophy of free expression, and I believe this is because we may have differing ideas of a very abstract concept: merit. I'm willing to discuss it if you are, but if you feel you cannot discuss this calmly with me, then I respect your self awareness and will drop the issue.

    I have provided my opinion, which was conveniently worded so as to be applicable to multiple meanings of the word "merit." If you were intending a specific definition of merit that you feel is not properly addressed by what I have already said, then please say so. Otherwise, I don't see an actual discussion occurring; I see you layering semantic question upon semantic question like an onion being peeled in reverse, trying to force us into a rhetorical corner without taking any definitive stance yourself.

    DarkPrimus on
    Gamertag: PrimusD | Rock Band DLC | GW:OttW - arrcd | WLD - Thortar
  • Paladin Registered User regular
    edited July 27
    Julius wrote: »
    Paladin wrote: »
    DarkPrimus wrote: »
    White supremacy has merit?

    Depends on how you define merit. Is there merit in knowing what it is, how it is expressed, who believes some or all of it, what causes them to believe it, what causes them to act on it, etc?

    Paladin, the problem here is likely that unless you define merit really weirdly, the very obvious answer here is "No." There is no intrinsic merit in knowledge. The definition of merit as value/worth is about moral value, not value as usefulness. There may be value to knowing what white supremacy is, etc., in order to better understand the world, but no merit.

    White supremacy has no merit. It is entirely worthless, there is no good to it.

    Thanks for clarifying. I brought up the topic, after reading an article about campus speech codes (spurred by Phillishere's link to the story about the professor), in order to better understand why such a dividing line is placed between on-campus and off-campus speech.

    In that article, the statement that "The First Amendment offers absolute protection of speech only when its purpose is to advance worthy societal and political objectives" was interesting to me, and I felt it could be encapsulated into a definition of merit that, while weird and counterintuitive, could help explain the philosophy of why some amoral speech is protected by law and why other amoral speech is not. This amorality is also reflected in the legal definition of merit, which is the inherent right and wrong of a case, absent of any emotional or technical bias. Does this also mean absence of any moral bias?

    The only reason I go into this is because the free expression of repugnant and hateful ideas has some kind of value before the law. We have defended the expression of these ideas because of this. If merit is defined as morality, then we defend the expression of meritless ideas. That kind of feels like a wrong statement to make, and the more I talk about it, the more I agree that the concept of merit is a red herring. It is not evident in the language of our laws or the decisions our courts make.* Therefore, I now contend that whether an idea has merit (aka morality) is nonessential to whether it should be allowed to be expressed. Other attributes govern its status as protected or unprotected speech.

    * I've been reading an article on the freedom of expression in the decisionmaking process. In AFL v. Swing, the court stated that “the group in power at any moment may not impose penal sanctions on peaceful and truthful discussion of matters of public interest merely on a showing that others may thereby be persuaded to take action inconsistent with its interests. Abridgment of the liberty of such discussion can be justified only where the clear danger of substantive evils arises under circumstances affording no opportunity to test the merits of ideas by competition for acceptance in the market of public opinion.” This suggests that the merit of ideas is secondary to the decision to allow any instance of speech.


    Therefore, in summary:
    1. merit = morality: White supremacy is inherently amoral, and therefore without merit.
    2. merit = truth unbiased by emotion or technicality: White supremacy is inherently biased, its value through expression is indirect and technical, and it is therefore without merit.
    3. merit = value: there is value in all knowledge that incorporates reasoning or fact, and communication has inherent value. There is also value in any communication of an idea that does not guarantee a violation of the rights of others. White supremacy is also a matter of public interest. Therefore, the merit of the ideas of any individual that believes in white supremacy can be evaluated on a case-by-case basis.
    - In addition, while merit may have real implications in extrajudicial decisionmaking, it is not a deciding factor in laws regarding freedom of expression.

    As for this third definition: if you do not believe it qualifies as a definition of merit, and you only accept one or more of the previous ones, then I personally agree that the idea of white supremacy has no merit. Julius, since you have said as much, I must agree with your logic, given the assumptions. DarkPrimus, I hope that this helps clarify things for you so we may reach some sort of consensus.

    Paladin on
    Marty: The future, it's where you're going?
    Doc: That's right, twenty-five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • Jebus314 Registered User regular
    Jebus314 wrote: »
    The Sauce wrote: »
    Phoenix-D wrote: »
    Jebus314 wrote: »
    Jebus314 wrote: »
    Nova_C wrote: »
    Free speech protections are more than just what you say, though.

    Like, do I not have the right to refuse to fund speech I disagree with?

    So donors pulling their funding from a school that has a teacher openly saying non-whites are less capable than whites, or that only whites should be allowed to immigrate is an example of those donors exercising their own rights.

    And does the rest of the faculty have any speech or association rights of their own?

    Or is this a case where people that do not want to be associated with that speech must be the ones to leave?

    I honestly don't think anyone here wants to promote white supremacy, but the end result of saying that white supremacists should be free from any societal consequence for promoting hatred and violence against non-whites is that white supremacy is the only perspective that will be allowed to be promoted.

    I think this ignores all of the times social pressure has been used in grossly unfair and unjust ways. Are donors pulling funding because the school is now accepting minorities also just exercising freedom of speech/association? Does the faculty also have the right to not "associate with"/teach minorities if they so desire?

    I'm not saying there should be no consequences for anything. For example, I do believe that acting in a racist manner in a classroom (grading minorities differently, for example) should definitely be fireable. But I do believe that making it easier to fire people for their beliefs, when they are completely professional at work, is not a great outcome, even if it can at times be used for good.

    There's a saying I've heard in regards to the argument that you can have a "professional bigot" - "there is no Dr. Jekyll and Mr. Hyde." What that means is that it is ridiculous to argue that people can easily compartmentalize their beliefs - someone who openly lies about the performance of black students (like Wax) cannot be trusted to treat black students fairly (hence why in part she was pulled from teaching mandatory 1L courses.)

    As for your two questions: Yes, donors can legally pull their funding over minorities - and we can then hold them accountable for doing that by pointing out what they are doing and making society at large aware of their bigotry. No, teachers do not have the right to not teach minorities if they wish - but this is because we have specific laws stating this.

    If the professional bigot cannot exist, then it should be easy to prove their bigotry inside the classroom, without relying on statements they made elsewhere. If it is not easy, then maybe that says something about the assumption that the professional bigot cannot exist.

    The point of those examples was to counter the argument that teachers/donors/faculty should be allowed to exercise their freedoms by forcing a school to fire a racist. But as my examples attempt to point out, we already limit the freedoms of teachers/faculty (maybe not donors; that was a bad example) whilst on campus or dealing with school matters. The racist teacher, on the other hand, is not on campus, or dealing with school matters.

    And again I would point out that social pressure is quite often used for ill. How often does the racist get fired rather than the lone liberal, or environmentalist, or some other "outsider" for whatever region you happen to be in? I don't think that employment is the right area to target for social justice, because the risks for poor outcomes are too high.

    You make the mistaken assumption that not doing so will prevent the bad outcomes you're worrying about.

    Also the bigot doesn't have to say anything in the classroom to have a negative effect. There's any number of other things, from being harsher with scoring to "just happening" to never be available to assist students they're biased against.
    Yes! And knowing the professor is a bigot is enough of a negative effect. I'm not intending to return to university any time soon, but if I did start taking courses again, and I was in a class where the professor had a history of being bluntly transphobic outside the classroom? I'd spend every moment in that class somewhere between uncomfortable and terrified. If it's an optional class I'd get the hell out, even if the material was something I was otherwise very interested in and intended to integrate into my career pursuits afterward. Either situation does actual and obvious harm.

    I'd feel those ways about a professor even if I weren't in their specific targeted demographic, because who knows who else they hate and what it might result in.

    I don't want to listen to people like that, much less try to learn anything from them, and it's unreasonable and frankly hostile to expect students to work around such people when trying to navigate their curriculum.

    That’s just in the classroom.

    Professors have a mentoring role beyond teaching. Would a student have the same comfort seeking a professor for assistance during office hours, asking for a reference, picking them for a thesis committee, etc. if that professor was publicly bigoted against the student?

    The question is what we should consider a reasonable fear. There are lots of teachers whom students fear and don't approach for completely benign reasons. They seem mean. They seem really smart and you don't want to look stupid. They seem really busy and you don't want to anger them by disrupting them. Etc. Fear is not in and of itself enough cause to terminate the professor.

    Which isn't to say feelings never matter. For example, threats/intimidation would likely change the calculus so that we would say their fear is well founded and significant enough to warrant action. But this is where the intent of the professor is important. In Wax's case, stating that America would be better off with fewer minority immigrants coming in is definitely racist and stupid, but I seriously doubt it could in any way be construed as a threat. Thus I don't believe a minority student would have cause to say their fear (that something bad would happen to them during office hours) is well founded enough to warrant action.

    The issue is not "oh, her comments threaten students", it's that people don't work the way you seem to think they do. Bigotry isn't something that gets compartmentalized by most people, which is why students of the group the professor is targeting are going to view them as unsafe and likely to harm them in the classroom.

    I still maintain that you have to decide what is and is not a reasonable fear. The examples I gave are real fears too. People feeling those things might legitimately feel unsafe. But we don't accommodate them because their fears are either unfounded or not significant enough. The same can and should be said about "racism."

    I put it in quotes only to drive home the point that there is a spectrum here. If we are talking about a klan leader who has participated in cross burnings, intimidation/threats, and possibly violence against minorities, even if those all happened off campus, I would say the fear of harm is significant and well founded for minorities taking a class with them or going to their office hours.

    If we are talking about someone who made a single racist joke at a party one time (off campus and in their own residence), then I would say that someone's fear that there would be some kind of harm during office hours is unfounded and not significant. That the act is "racist" does not mean it automatically becomes significant enough to warrant removal, and I would say speech in general rarely rises to such a level (although clearly it can).

    "The world is a mess, and I just need to rule it" - Dr Horrible
  • DarkPrimus Registered User regular
    edited July 27
    Yes, that clarifies things nicely, thank you Paladin. I essentially agree with your summary. Where we diverge is about whether or not "repugnant and hateful" speech is worthy of defense.

    DarkPrimus on
  • Jebus314 Registered User regular
    edited July 27
    Julius wrote: »

    I don't think things have to rise to the level of threats to be actionable. I think some form of action is warranted so long as a student can reasonably believe they will not be treated fairly or taken seriously/accepted. e.g. a professor openly and frequently stating minorities are intellectually inferior or transgender persons are mentally ill makes it reasonable for students to believe they will not be treated fairly, so it seems reasonable that the professor should be dismissed from at least some teaching duties.

    I dunno about Wax, but I would point out that she also previously claimed, without any data to back it up, that black students rarely graduated in the top half of the class. I think it is reasonable to take previous statements into account in considering the reasonableness of fear.

    I mostly agree with this, although I would again state that rather than relying on the possibility of unequal treatment, we should be requiring proof of actual unequal treatment before we fire someone. If it is basically a foregone conclusion that someone who made racist remarks (say, Wax) cannot, or will not, treat minorities the same in terms of grading or communication, then it should not be that hard to prove those things are happening. In which case you can very easily fire them for being unequal inside the classroom, instead of firing them for making comments outside of it that may indicate a bias that could possibly affect their ability to do their job.

    Jebus314 on
  • AngelHedgie Registered User regular
    Jebus314 wrote: »

    I mostly agree with this, although I would again state that rather than relying on the possibility of unequal treatment, we should be requiring proof of actual unequal treatment before we fire someone. If it is basically a foregone conclusion that someone who made racist remarks (say, Wax) cannot, or will not, treat minorities the same in terms of grading or communication, then it should not be that hard to prove those things are happening. In which case you can very easily fire them for being unequal inside the classroom, instead of firing them for making comments outside of it that may indicate a bias that could possibly affect their ability to do their job.

    This is arguing that nothing can be done until someone is hurt in a manner that you consider harm, because you're unwilling to believe marginalized people when they tell you how the presence of bigots in positions of authority makes them feel unsafe and that they don't belong. I find this need to protect bigots - not the targets of their bigotry, who have done nothing wrong and are yet attacked for who they are - to be baffling. Not to mention that when the school winds up in court over this, "yes we knew about what they had said, but they hadn't done anything" does not work as a defense.

    Also, in good freedom of speech news, the MAGA hat prep school goose suing the Washington Post just had his lawsuit tossed:
    The $250 million lawsuit filed by Nick Sandmann against the Washington Post has been dismissed by a federal judge.

    William Bertelsman, who heard oral arguments in the case earlier this month, issued the ruling on Friday.

    Nick and his attorneys, Todd McMurtry and L. Lin Wood, alleged that the gist of The Washington Post’s first article conveyed that Nick had assaulted or physically intimidated Nathan Phillips, engaged in racist conduct, and engaged in taunts.

    But, Bertelsman wrote, “this is not supported by the plain language in the article, which states none of these things.”

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
    Youtube
  • PaladinPaladin Registered User regular
    DarkPrimus wrote: »
    Yes, that clarifies things nicely, thank you Paladin. I essentially agree with your summary. Where we diverge is about whether or not "repugnant and hateful" speech is worthy of defense.

    This is the essential disagreement, isn't it? I think it has always been there, and it has created a fundamental schism between the history of law in the United States of America and that of English common law, dating back to 1066. That European law grounds freedom of expression in the tenets of common law rather than in the philosophy of the American Constitution can be seen in this quote from Sir William Blackstone, the savior of common law and the reason it became the predominant guiding legal foundation of western law:
    The liberty of the press is indeed essential to the nature of a free state; but this consists in laying no previous restraints upon publications, and not in freedom from censure for criminal matter when published. Every freeman has an undoubted right to lay what sentiments he pleases before the public; to forbid this, is to destroy the freedom of the press: but if he publishes what is improper, mischievous, or illegal, he must take the consequences of his own temerity. To subject the press to the restrictive power of a licenser, as was formerly done, both before and since the Revolution, is to subject all freedom of sentiment to the prejudices of one man, and make him the arbitrary and infallible judge of all controverted points in learning, religion and government. But to punish as the law does at present any dangerous or offensive writings, which, when published, shall on a fair and impartial trial be adjudged of a pernicious tendency, is necessary for the preservation of peace and good order, of government and religion, the only solid foundations of civil liberty. Thus, the will of individuals is still left free: the abuse only of that free will is the object of legal punishment. Neither is any restraint hereby laid upon freedom of thought or inquiry; liberty of private sentiment is still left; the disseminating, or making public, of bad sentiments, destructive to the ends of society, is the crime which society corrects.

    This is a logical argument encompassing the foundations of all rationale for legal restriction on harmful speech. It gets to the meat of an essential and unassailable freedom - prior to the paradox of tolerance, prior to free speech - the will of the individual, and states that this will can be preserved despite legal limits on absolute freedom of speech, as long as society is the law. However, both Sir William Blackstone and the honorable Oliver Wendell Holmes would be rolling in their graves at my suggestion that this 250-year-old quote be treated as gospel, as the latter once said:
    It is revolting to have no better reason for a rule of law than that so it was laid down in the time of Henry IV. It is still more revolting if the grounds upon which it was laid down have vanished long since, and the rule simply persists from blind imitation of the past.

    Which I suppose you could again turn around as saying that precedent sufficiently antiquated is suitable to challenge, even in such a treasured maxim as "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances."

    So, if you were feeling glum that the status of US law makes it impossible to see a US hate speech law in your lifetime, an honest reckoning of the experiences of society as they relate to the laws we have set for ourselves and the rights we endorse may inch you closer to your dream. Which I suppose is a lengthy way of saying that the discussion we are having may have some merit after all.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty five years into the future. I've always dreamed on seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five world series.
  • AngelHedgieAngelHedgie Registered User regular
    Julius wrote: »
    Julius wrote: »
    Julius wrote: »
    Julius wrote: »
    And here's the thing - I think tenure is a good thing. But I also realize that when a principle is used to justify harm, the result is that it delegitimizes the principle in the eyes of the harmed. If people use tenure to defend Wax, why should the people targeted by her position, targeted by those whom she is working to rehabilitate consider it to be legitimate? It's being used as a bludgeon against them, after all.

    How is it used to bludgeon them? How is it used to justify harm? Nobody is saying that it is just that she says that shit. Nobody is claiming that it is good that she is hurting people with her speech.

    No, you just have people saying that nothing should be done about it, because she has tenure. (And yes, people have done such, like the Penn board member who resigned in protest and threatened to withdraw his donations over Wax's removal from 1L teaching assignments.) Why should the people whom she's targeting be interested in preserving a principle that's protecting their interlocutor?

    ?
    This guy protesting by threatening to withdraw donations is not appealing to tenure but directly to the value of her differing views to the debate. Because he is an idiot and didn't read up on the case. I can't stress enough how ridiculous it is to defend your claim by providing an article that itself notes this guy is an idiot and also not doing what you say he did. He is saying she shouldn't lose her classes because her views are valuable, not because she has tenure.

    You can tell because tenure is strictly about employment and termination. Again, the statement you yourself linked provided a list of actions including removal of teaching assignments, yet did not touch tenure. They do not call for her to be fired.

    Why should the people whom...etc.? Because an important number of those people made a statement that preserved that principle!

    Isn't the whole purpose of tenure to protect diversity of thought in academia by allowing professors the ability to articulate controversial ideas? Saying "he wasn't talking about tenure, just the underpinnings of tenure" strikes me as a distinction without difference.

    Tenure is just one of the measures taken to protect academic freedom, it is not itself academic freedom. But my point is that he is specifically claiming that her statements have some (academic) merit and her "attackers" are condemning them without reasons or facts. That is, he is appealing to the idea that controversial/radical views contribute to scientific/philosophical developments. But the principle of academic freedom is much broader. It doesn't claim all views contribute to the debate/search for truth. Some views are clearly without merit, some may even actively hinder! The idea, though, is that to ensure the true free search for truth (which benefits from antagonistic views) you have to have complete freedom. The dumb and offensive views are the price paid for this.

    Now, you may disagree with this idea! It seems dubious! I myself think the free search for truth isn't meaningfully harmed by firing academics for making absurd and hurtful claims like how the earth revolves around the sun. Science requires us to consider honestly and without prejudice ideas that seem controversial and radical, but not clearly meritless ones.

    The point is: You haven't yet shown anyone actually appealing to tenure or the principle of academic freedom to claim Wax should face no consequences and no actions should be taken. Not that it would matter if you did because the existence of alternative responses/claims/views is enough to dismiss your point. These "harmed people" you continue to point to can just look to those alternatives.

    Oh, you'd like me to point you to people arguing that Wax shouldn't be treated as normalizing white supremacy - alright:

    No I'd like you to stop doing that because it is making the opposite of the point you think it is. People arguing along the lines of "she never said such a thing" aren't appealing to the principle of academic freedom. You need someone who says she is saying such a thing but nothing should be done because of academic tenure.
    But of course, it doesn't "matter" that Wax is, as the folks over at LGM define it, part of the "right wing academic martyr racket", in which right wing academics make outrageous, unsupported (you did say that we are not obliged to consider meritless ideas, after all), and bigoted comments, refuse to engage with actual scholarly criticism, and claim "persecution" by academia in the right wing media. It doesn't matter that Wax is part of a system on the right wing designed to normalize and defend white supremacy, which has been fueling the rise of white supremacists we've been seeing. Because there are "alternative views".

    Ignoring for a moment that I said none of that, my point is that those " "alternative views" " actually have merit. It doesn't matter if someone offers a bad defense of academic freedom if someone else offers a good one.

    Well, here's Reason outright saying "yes, what she said is racist, but firing her for it would harm academic freedom":
    The thinking here must be that Wax's immigration views are so odious that students should not be required to take any of her classes—that college is meant to be a place where students never feel uncomfortable or offended by the views of their professors.

    Well no, that's not the thinking. The thinking is that allowing professors who openly espouse bigoted views makes students who are targeted by them feel unwelcome and unsafe in the university, that they don't actually belong - and that in turn impacts their freedom to participate and engage in the academic community. It's funny how the argument for "academic freedom" winds up ignoring so many people.

  • PhillisherePhillishere Registered User regular
    AngelHedgie wrote: »
    Well no, that's not the thinking. The thinking is that allowing professors who openly espouse bigoted views makes students who are targeted by them feel unwelcome and unsafe in the university, that they don't actually belong - and that in turn impacts their freedom to participate and engage in the academic community. It's funny how the argument for "academic freedom" winds up ignoring so many people.

    What conservatives miss is that this isn't an intellectual debate about their values. You simply can't run a modern education system that takes money from students from all races, and then put them under the direct supervision of someone who is openly advocating against them as human beings. Only an idiot would trust a bigot to evaluate them fairly.

  • NyysjanNyysjan FinlandRegistered User regular
    Phillishere wrote: »
    What conservatives miss is that this isn't an intellectual debate about their values. You simply can't run a modern education system that takes money from students from all races, and then put them under the direct supervision of someone who is openly advocating against them as human beings. Only an idiot would trust a bigot to evaluate them fairly.

    I don't think they're missing it.

  • AngelHedgieAngelHedgie Registered User regular
    Telegram becomes an object lesson in the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

  • KamarKamar Registered User regular
    edited August 11
    I'm not entirely sure this is the right place for this, just based on what the focus of the conversation's been, but...

    https://www.politico.com/story/2019/08/07/white-house-tech-censorship-1639051
    https://www.commondreams.org/news/2019/08/11/leaked-draft-trump-executive-order-censor-internet-denounced-dangerous
    https://www.cnn.com/2019/08/09/tech/white-house-social-media-executive-order-fcc-ftc/index.html

    tl;dr Trump administration has a number of different drafts floating around of an executive order to 'fight anti-conservative bias on social media'.

    Possibilities apparently include giving the FTC and FCC broad authority to assess how sites moderate and punish those moderating 'inappropriately'.



    Youtube
  • FencingsaxFencingsax It is difficult to get a man to understand, when his salary depends upon his not understanding GNU Terry PratchettRegistered User regular
    Would that include here?

  • kimekime Queen of Blades Registered User regular
    Fencingsax wrote: »
    Would that include here?

    Looks like the leaked draft is specifically targeting "social media sites." I don't think this would traditionally be counted, but probably technically fits the super vague definition?

    PA HotS Group
    Battle.net ID: kime#1822
    3DS Friend Code: 3110-5393-4113
    Steam profile
  • Phoenix-DPhoenix-D Registered User regular
    That will get injunctioned before the ink is dry if they ever actually issue it.

  • PaladinPaladin Registered User regular
    heh, everybody wins and everybody loses no matter what happens. Section 230 takes a hit but Trump gets the power to shape social media as he wants, or it gets bolstered and he has to sit and stew

  • MillMill Registered User regular
    Funny thing is, a fair bit of the reporting I've seen on this Trump social media nonsense hints that even a fair number of Republicans aren't comfortable with the idea. I suspect enough of them are smart enough to realize this won't be a win for them, because Trump will pretty much use it to fuck them over; Trump only cares about Trump and doesn't tolerate dissent against him. Not to mention, they might be smart enough to realize that a significant chunk of their voters won't accept IOIYAR as an excuse to go down this road.

    I also doubt there are even four people on SCOTUS who would go for this. So if Trump follows through, this EO might end up being the fastest in history to get shut down via injunction, or damn close to it.

  • A Kobold's KoboldA Kobold's Kobold He/Him MississippiRegistered User regular
    I get that it's highly unlikely for this to actually take effect, but shit like this makes me sick to my stomach. It should be clear to anybody who isn't in Trump's cult of personality and who can read between the lines even a little bit that this is entirely constructed out of persecution complex. And it really, really seems like this order is designed to keep social media sites as vectors for radicalization. Which is, of course, fucking goosey.

    Switch Friend Code: SW-3011-6091-2364
    Twitter delenda est
  • TryCatcherTryCatcher Registered User regular
    Kamar wrote: »
    I'm not entirely sure this is the right place for this, just based on what the focus of the conversation's been, but...

    https://www.politico.com/story/2019/08/07/white-house-tech-censorship-1639051
    https://www.commondreams.org/news/2019/08/11/leaked-draft-trump-executive-order-censor-internet-denounced-dangerous
    https://www.cnn.com/2019/08/09/tech/white-house-social-media-executive-order-fcc-ftc/index.html

    tl;dr Trump administration has a number of different drafts floating around of an executive order to 'fight anti-conservative bias on social media'.

    Possibilities apparently include giving the FTC and FCC broad authority to assess how sites moderate and punish those moderating 'inappropriately'.



    Been checking the news on Trumpland, apparently some more friends of Damore at Google decided to leak Google info to O'Keefe and the DOJ. For those that don't want to click the link, Google CEO Sundar Pichai said to Congress that "the algorithm is impartial", which is obvious bullshit (the algorithm is made by people, and people are not impartial) and those leaks contain the info that is, in fact, obvious bullshit. So Pichai can get into hot water for lying to Congress.

  • themightypuckthemightypuck MontanaRegistered User regular
    TryCatcher wrote: »
    Been checking the news on Trumpland; apparently some more friends of Damore at Google decided to leak Google info to O'Keefe and the DOJ. For those that don't want to click the link, Google CEO Sundar Pichai said to Congress that "the algorithm is impartial", which is obvious bullshit (the algorithm is made by people, and people are not impartial), and those leaks confirm that it is, in fact, obvious bullshit. So Pichai could get into hot water for lying to Congress.

    I think Pichai will be fine. There is plenty of wiggle room around what "the algorithm is impartial" means. My take was they didn't have people messing with algo results after the fact since, as you say, algorithms are obviously not impartial in how they are designed.

    “Reject your sense of injury and the injury itself disappears.”
    ― Marcus Aurelius

    Path of Exile: themightypuck
  • AngelHedgie Registered User regular
    TryCatcher wrote: »
    Kamar wrote: »
    I'm not entirely sure this is the right place for this, just based on what the focus of the conversation's been, but...

    https://www.politico.com/story/2019/08/07/white-house-tech-censorship-1639051
    https://www.commondreams.org/news/2019/08/11/leaked-draft-trump-executive-order-censor-internet-denounced-dangerous
    https://www.cnn.com/2019/08/09/tech/white-house-social-media-executive-order-fcc-ftc/index.html

    tl;dr Trump administration has a number of different drafts floating around of an executive order to 'fight anti-conservative bias on social media'.

    Possibilities apparently include giving the FTC and FCC broad authority to assess how sites moderate and punish those moderating 'inappropriately'.



    Been checking the news on Trumpland; apparently some more friends of Damore at Google decided to leak Google info to O'Keefe and the DOJ. For those that don't want to click the link, Google CEO Sundar Pichai said to Congress that "the algorithm is impartial", which is obvious bullshit (the algorithm is made by people, and people are not impartial), and those leaks confirm that it is, in fact, obvious bullshit. So Pichai could get into hot water for lying to Congress.

    I think Pichai will be fine. There is plenty of wiggle room around what "the algorithm is impartial" means. My take was they didn't have people messing with algo results after the fact since, as you say, algorithms are obviously not impartial in how they are designed.

    Also, it turns out that the "whistle-blower" is an anti-Semitic conspiracy theorist:
    What O’Keefe’s video leaves out, though, is that his much-hyped insider is not as credible as he claims. On social media, Vorhies is an avid promoter of anti-Semitic slanders that banks, the media, and the United States government are controlled by “Zionists.” He’s also pushed conspiracy theories like QAnon, Pizzagate, and the discredited claim that vaccines cause autism.
    [...]
    YouTube, where Vorhies worked until recently, has been criticized for a video recommendation algorithm that can push users towards more extreme content — and thus towards radicalization and conspiracy theories.

    Vorhies himself saw YouTube as a reliable source of evidence for his conspiracy theories, according to one tweet he sent promoting anti-vaccine paranoia. In a January tweet, he urged other Twitter users to look up anti-vaccine information on YouTube.

    XBL: Nox Aeternum / PSN: NoxAeternum / NN:NoxAeternum / Steam: noxaeternum
  • Phoenix-D Registered User regular
    He also went to O'Keefe so there's a pretty decent chance the "evidence" is falsified or hilariously misleading.

  • Mill Registered User regular
    Also, pretty sure what Google was alluding to with their algorithm being impartial is that it's not biased towards one viewpoint over another. And given this is an alt-right shit weasel, the accusation they are trying to push is that Google unfairly buries alt-right-favorable bullshit, which is absolutely false.

  • Jephery Registered User regular
    edited August 15
    I do want to mention that, while I think hate speech should be policed, I think any policing of hate speech should be super specific and narrow, and have little or no executive privilege on that policing.

    Trump ordering the FTC to police social media at their own whim is fucking stupid and way out of bounds for the executive.

    Jephery on
    }
    "Orkses never lose a battle. If we win we win, if we die we die fightin so it don't count. If we runs for it we don't die neither, cos we can come back for annuver go, see!".
  • tinwhiskers Registered User regular
    Jephery wrote: »
    I do want to mention that, while I think hate speech should be policed, I think any policing of hate speech should be super specific and narrow, and have little or no executive privilege on that policing.

    Trump ordering the FTC to police social media at their own whim is fucking stupid and way out of bounds for the executive.

    All law enforcement in the US has large amounts of "executive" privilege baked in. At the federal level you can see how Sessions/Trump gutted the DOJ divisions responsible for civil rights or checking racist police departments.

    On the local level, not only are there the city hall dynamics, but DAs have extremely broad discretion in what crimes to prosecute and what charges to bring for those crimes. This is only growing as ALEC/GOP state houses push more laws criminalizing protest, increasing the range of potential charges that can be levied.

    How do you spell Justice? B D S: Non-Violent Resistance to Israel Apartheid & Occupation.
  • Jephery Registered User regular
    Jephery wrote: »
    I do want to mention that, while I think hate speech should be policed, I think any policing of hate speech should be super specific and narrow, and have little or no executive privilege on that policing.

    Trump ordering the FTC to police social media at their own whim is fucking stupid and way out of bounds for the executive.

    All law enforcement in the US has large amounts of "executive" privilege baked in. At the federal level you can see how Sessions/Trump gutted the DOJ divisions responsible for civil rights or checking racist police departments.

    On the local level, not only are there the city hall dynamics, but DAs have extremely broad discretion in what crimes to prosecute and what charges to bring for those crimes. This is only growing as ALEC/GOP state houses push more laws criminalizing protest, increasing the range of potential charges that can be levied.

    Yeah, it's a pretty huge source of corruption in our justice system, but that discussion would go pretty far off topic.

  • Julius Registered User regular
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

  • AngelHedgie Registered User regular
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

  • Phillishere Registered User regular
    edited August 16
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

    Both leftism and liberalism have a huge weakness in that they can be turned into weapons and shields by white supremacists who can speak their language and manipulate their ideas. It is another reason why diversity and intersectionality are important - to call out the bullshit.

    If your leftist principles support enabling white supremacists in creating a safe space to organize attacks against the oppressed, they aren’t leftist principles. You are just practicing another form of white supremacy.

    Phillishere on
  • Paladin Registered User regular
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

    That article kind of ignores that Gamergate was a long time coming. The type of people involved in that weren't new; they were the product of limited diversity, festering in silence. With rampant social media, a lot of this kind of stuff boiled over at once. But the ingredients were always there.

    Marty: The future, it's where you're going?
    Doc: That's right, twenty-five years into the future. I've always dreamed of seeing the future, looking beyond my years, seeing the progress of mankind. I'll also be able to see who wins the next twenty-five World Series.
  • Phillishere Registered User regular
    edited August 16
    Paladin wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

    That article kind of ignores that Gamergate was a long time coming. The type of people involved in that weren't new; they were the product of limited diversity, festering in silence. With rampant social media, a lot of this kind of stuff boiled over at once. But the ingredients were always there.

    I have a longstanding suspicion that it will eventually come out that Gamergate, Sad Puppies, and some of the other earlier outbreaks of online fascism were actually testbeds for the Russian troll farms and similar conservative political actors to test out techniques and refine how best to recruit and radicalize Americans via the Internet.

    Phillishere on
  • Julius Registered User regular
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you think the particular politically motivated censorship is good.

    HamHamJFrankiedarlingElvenshae
  • PhillisherePhillishere Registered User regular
    edited August 16
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you think the particular politically motivated censorship is good.

    I think the point being made, which you refuse to address directly, is that the price isn’t being paid by you. That’s why you get to use the lofty rhetoric, while minorities, women, LGBT people, and other marginalized voices have to leave online spaces for fear of being harassed or murdered by fascists.

    Phillishere on
    mrondeauAngelHedgieA Kobold's KoboldAistanMan in the MistsYoutubeFANTOMAS
  • Jebus314Jebus314 Registered User regular
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you think the particular politically motivated censorship is good.

    Telegram just doesn't seem to be a good example of free speech absolutism (and the "failures" thereof). The article paints it mostly as a messaging app and a way for groups to organize. The harms that absolute free speech might entail just don't really show up in that landscape. I really don't think the existence of Nazis or their use of your system is a default failure state, the same way I wouldn't call a restaurant a failure in free speech because Nazis like to eat there (while not bothering anyone else).

    The harms come when the Nazis try to drive out other viewpoints/people. Which doesn't seem to be happening on Telegram, or at least I didn't see any evidence of it in the article. We can certainly continue the discussion on when and where speech should be limited, but Telegram doesn't seem like a good place to start.

    "The world is a mess, and I just need to rule it" - Dr Horrible
    Phoenix-DJuliusGnome-Interruptus
  • JuliusJulius Registered User regular
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

    Both leftism and liberalism have a huge weakness in that they can be turned into weapons and shields for white supremacists who can speak their language and manipulate their ideas. It is another reason why diversity and intersectionalism are important - to call out the bullshit.

    If your leftist principles support enabling white supremacists in creating a safe space to organize attacks against the oppressed, they aren’t leftist principles. You are just practicing another form of white supremacy.

    I mean I also hold the principle that the police shouldn't be allowed to just bust open your door and invade your home, despite the fact that that also enables white supremacists to create a safe space in their home to organize attacks.

    I am not really sure opposing government oppression is just practising another form of white supremacy, but I'm willing to hear you out. I assume your argument is more sophisticated than "believing in rights for everyone includes believing in rights for white supremacists -> white supremacists can organize within their rights -> you are thus also practicing white supremacy"?

    Phoenix-DMrMisterNSDFRandJebus314FrankiedarlingElvenshaeMonwynLanlaorntinwhiskersGnome-Interruptus
  • HamHamJHamHamJ Registered User regular
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you think the particular politically motivated censorship is good.

    I think the point being made, which you refuse to address directly, is that the price isn’t being paid by you. That’s why you get to use the lofty rhetoric, while minorities, women, LGBT people, and other marginalized voices have to leave online spaces for fear of being harassed or murdered by fascists.

    And the price you are willing to pay is dissidents in authoritarian regimes actually being jailed and murdered?

    While racing light mechs, your Urbanmech comes in second place, but only because it ran out of ammo.
    FrankiedarlingLanlaorntinwhiskers
  • PhillisherePhillishere Registered User regular
    edited August 16
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities (which, let's remember, has a chilling effect on free speech as those targeted leave the discussion to protect their own safety)?

    After reading the New York Times' retrospective on Giggledygoop 5 years out, and the damage that they did, the way they made people fear for their lives, fear to actually speak out - I don't find the "hate speech is the price of free speech" argument to be a compelling one. Bigots will, by their very nature, use their speech to chill the speech of their targets. There is no coexistence.

    Both leftism and liberalism have a huge weakness in that they can be turned into weapons and shields for white supremacists who can speak their language and manipulate their ideas. It is another reason why diversity and intersectionalism are important - to call out the bullshit.

    If your leftist principles support enabling white supremacists in creating a safe space to organize attacks against the oppressed, they aren’t leftist principles. You are just practicing another form of white supremacy.

    I mean I also hold the principle that the police shouldn't be allowed to just bust open your door and invade your home, despite the fact that that also enables white supremacists to create a safe space in their home to organize attacks.

    I am not really sure opposing government oppression is just practising another form of white supremacy, but I'm willing to hear you out. I assume your argument is more sophisticated than "believing in rights for everyone includes believing in rights for white supremacists -> white supremacists can organize within their rights -> you are thus also practicing white supremacy"?

    The point is that we are talking about issues with regulating the Internet (which should be a common carrier, anyway) and not police search and seizure rules. You keep thrusting up the red banner and marching around, talking about oppression in China, while avoiding the topic at hand - oppression of your brothers and sisters right here and now. We are talking about harassment in the public square.

    Like, if the entire point of leftism is to crouch in the shadows hiding from the cops alongside the white supremacists, then what's the fucking point? The tyrants are going to shape laws to hold power, and it will take more than principles to stop them. That's not an excuse to block rational laws against incitement, harassment, and literal conspiracies to commit murder.

    Phillishere on
    mrondeauBigJoeMMan in the MistsYoutubeFANTOMASAngelHedgie
  • mrondeaumrondeau Montréal, CanadaRegistered User regular
    HamHamJ wrote: »
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this and I have to say I really don't see why this is an abject example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what Professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you think the particular politically motivated censorship is good.

    I think the point being made, which you refuse to address directly, is that the price isn’t being paid by you. That’s why you get to use the lofty rhetoric, while minorities, women, LGBT people, and other marginalized voices have to leave online spaces for fear of being harassed or murdered by fascists.

    And the price you are willing to pay is dissidents in authoritarian regimes actually being jailed and murdered?
    ... You might want to learn about causes and effects.

    Phillishere
  • Phoenix-DPhoenix-D Registered User regular
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    […]

    I'm reading this, and I have to say I really don't see why this is an object example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you believe the particular politically motivated censorship is good.

    I think the point being made, that you refuse to address directly, is that the price isn’t being paid by you. That’s why you get to use the lofty rhetoric, and minorities, women, LGBT, and other marginalized voices have to leave online spaces for fear of being harassed or murdered by fascists.

    That's... not how Telegram works, and breaking it would be worse. (Guess what ICE would love to have? The Telegram logs of every protest planning chat. I'm sure Saudi Arabia would love those LGBT hookup chat logs, etc.)

    This isn't Twitter, where they are deliberately enabling the dipshits; it's a side effect. Nor is anyone really able to be chased off it by bigots.

  • PhillisherePhillishere Registered User regular
    edited August 16
    mrondeau wrote: »
    HamHamJ wrote: »
    Julius wrote: »
    Julius wrote: »
    Telegram becomes an object example of the failure of free speech absolutism:
    This development might be inevitable on a platform designed to circumvent censorship in countries like Russia and mass surveillance in the United States. “An absolutist approach to free speech is a classic door that white supremacists have been driving their trucks through for decades,” said Jessie Daniels, a professor of sociology at Hunter College, CUNY, who has written several books about white supremacy and technology. “They know that there’s no real counter to that stance if they show up.” Free speech, says Daniels, is often something that people on the left want to protect no matter what as a bedrock of democracy, even when white supremacists have always used free speech to elevate anti-democratic, racist, and even genocidal ideas. White supremacists are often quick to adopt new technologies where there aren’t rules or community standards that will get in their way, Daniels said. For that reason, she calls these groups “innovation opportunists.”
    In some ways, Telegram faces the same challenge as all social media: It wants to provide a safe platform for everyone, but some of those people want to cause harm. One big difference between Telegram and other platforms is that Telegram isn’t American (its developers are based in Dubai), so it’s not beholden to pressures from Congress to clean up its act or face regulation. It’s also not trying to make money, so fears of a tarnished brand probably don’t factor into the app’s governance, either.

    As with the ISIS episode, Telegram will sometimes intervene. Last year, Telegram got booted from Apple’s App Store because it hosted “inappropriate content”; it was restored within 48 hours after new filters were put in place. Now, when attempting to visit some neo-Nazi or racist meme channels on an iPhone—for example, one called “There Is No Political Solution”—you’ll encounter a message that reads: “Unfortunately, this channel couldn’t be displayed on your device because it violates Apple App Store Review Guidelines, section 1.1.1.” The same channel can be found if you aren’t using an Apple device. Telegram, which wants to be available to any possible user, needs to interoperate with major infrastructure companies, like Apple and Google. Which means if it faces a form of accountability, it’s from Silicon Valley. (Telegram did not respond to questions about when it decides to moderate content.)

    I'm reading this, and I have to say I really don't see why this is an object example of the failure of free speech absolutism or whatever you wanna call it.

    Yes, obviously a messaging app that protects users' privacy and does not hand over info is appealing to any group that wants that, and that includes white supremacists. Contrary to what professor Daniels seems to be suggesting, this is not an unforeseen and unintended consequence. It is a predictable and acceptable consequence. Anyone could have told you that! This is in no way a failure. Protecting against free speech interference is protecting against free speech interference.

    What you probably meant to say was "Telegram becomes an example of something I don't like".

    The fact that you call providing white supremacists with safe places to organize and coordinate harassment "a predictable and acceptable consequence" continues to show why free speech absolutism is a failed concept. Why should the price of "free speech" be the enabling of harassment and violence against minorities?

    Because as the article notes and you conspicuously failed to quote, this "free speech" also protects political and human rights activists from prosecution and censorship in places like Russia, Hong Kong and Iran. It allows socialists and antifascists everywhere to communicate and organize in spite of persecution. The app is designed to ensure this.

    The price paid for principled protection against politically motivated censorship is that it protects even when you believe the particular politically motivated censorship is good.

    I think the point being made, that you refuse to address directly, is that the price isn’t being paid by you. That’s why you get to use the lofty rhetoric, and minorities, women, LGBT, and other marginalized voices have to leave online spaces for fear of being harassed or murdered by fascists.

    And the price you are willing to pay is dissidents in authoritarian regimes actually being jailed and murdered?
    ... You might want to learn about causes and effects.

    There is something kind of darkly amusing about the idea that liberal and leftist governments must refrain from acting against these fascists because this will somehow moderate authoritarian regimes worldwide. It's been my experience that the kind of governments that freely murder dissidents aren't exactly looking to the world's liberal governments for permission, or stopping themselves because of principled stands by Western leftists.

    Does anyone really think that regulating 4chan out of existence is going to somehow make North Korea and China more repressive and not doing so will make them better?
