Yeah, social platforms should have bigger, more coherent moderation structures and clear rules that are enforced strictly yet fairly. Social media moderation on those sites is a joke. Plus, it'd get a lot of people employed right now.
Soft moderation is hard.
They have a pill for that.
knitdan:
There's some possibly interesting conversation to be had about the Doug Adler vs. ESPN lawsuit that dovetails nicely with the question of what responsibility a content platform has to both the audience and those who provide the content, but I'll be damned if I have the brainpower at the moment to express it adequately.
“I was quick when I came in here, I’m twice as quick now”
-Indiana Solo, runner of blades
Hard moderation is me pressing buttons to hand out infractions.
Soft moderation is me telling people to cut that shit out.
I would say that the infraction system here is also a form of soft punishment, though. It is more severe than a verbal warning, less severe than an account suspension and much less severe than a ban.
It seems to work quite effectively. I notice a similar efficacy on other forums I visit that have similar tiers of punishment.
Facebook, Twitter, and YouTube have nothing comparable.
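The tiered structure described above (verbal warning, then infraction points, then suspension, then ban) can be sketched as a minimal escalation tracker. This is only an illustration; the class, point thresholds, and method names are invented here, not any forum's actual system:

```python
from dataclasses import dataclass

@dataclass
class ModerationRecord:
    warnings: int = 0
    infraction_points: int = 0
    suspended: bool = False
    banned: bool = False

    def warn(self) -> None:
        # Soft moderation: costs almost nothing, just signals "cut that out".
        self.warnings += 1

    def infract(self, points: int = 1) -> None:
        # Hard moderation: points accumulate toward harsher tiers.
        self.infraction_points += points
        if self.infraction_points >= 10:   # invented threshold: ban
            self.banned = True
        elif self.infraction_points >= 5:  # invented threshold: suspension
            self.suspended = True

user = ModerationRecord()
user.warn()        # tier 1: verbal warning
user.infract(3)    # tier 2: infraction, still below suspension
user.infract(3)    # 6 points: crosses the suspension threshold, not the ban
```

The point of the sketch is simply that escalation gives a user several cheap signals before the "chopping block", instead of a binary fine/banned state.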
Also, like, more reasonable levels of punishment on behalf of platform holders rather than a binary, 'You're perfectly fine... oh wait, now you're completely banned forever,' would probably go a long way.
PewDiePie should have been in trouble plenty of times in the past, but there was no punishment for it because he didn't yet go past the point of absolutely no return. That's a pretty poor way of managing expectations & behavior.
Ooooon the other hand: Is managing the behaviour of the creators really the responsibility of the platform holders?
Because, after all, everything that is not across the line, is not across the line. If somebody seems to be moving towards that line, that is essentially their issue to fix.
Well, if it isn't the responsibility of the platform holders, why ban them at all then? If we prefer the Libertarian approach, then ultimately responsibility should rest on the viewer to curate their own experience and media should just be utterly untamed by design.
Yes, IMHO, it is the responsibility of YouTube and Twitter and Facebook, etc, to have reasonable punishment structures in place that modify the behavior of their users & manage expectations. It is certainly a shared responsibility with content creators, but right now I think far too much slack is cut for the platforms and the enforcement mechanisms / penal codes are often ridiculous.
Why ban them at all? Because certain content is unacceptable, of course.
I'm not saying anything libertarian-ish. Just that, I'd say, the only people responsible for modifying the behaviour of content creators are the content creators themselves.
The platform holder's responsibility is whatever is on their platform.
What I'm saying is, I don't see anything wrong in principle with a penal system for your platform where the one and only punishment is the chopping block for anything that crosses the line of what's acceptable on that platform.
If someone seems to be edging closer to it, the only ones with an actual responsibility for heading that off are themselves. That's my point.
EDIT: I will agree that far too much slack is cut for them. I'm not at all on board with the idea that editorial responsibility shouldn't apply at all, which currently seems to be the case.
But that seems moralistic rather than practical? Like, yes, ideally a person would engage in self-reflection and back off rather than steaming full ahead towards the red line... but people aren't ideal constructs, and when you consider the relatively low cost of telling someone, 'Hey, you're going in the wrong direction. This is not good behavior,' I think the onus must be on the platform to establish this kind of tiered punishment system.
I mean, the current paradigm certainly isn't working very well regardless of any oughts about human behavior.
No, you misunderstand me. There's no moralism here. The only difference is practical.
I'm saying, steam full ahead towards the red line, or head yourself off, if you want. If you want to start submitting work that gets rejected, then you do you.
I have set up some branching narrative paths... and I don't really want any of them to feel like the 'wrong' choice, but it seems really difficult to preserve a sense of verisimilitude this way (especially if the stakes in a story are such that things are very zero-sum)?
Like, if I play through one plotline and it tells me I made the right choice (and by extension, that the other choices would have been wrong, because we're in zero sum land) at the end, and then I play through a different plot and it tells me this was also the right choice... I feel like that kind of cheapens the experience?
But I'm not sure what to do. My purist intuition is telling me that the 'bold' thing to do is pick a good path for the sake of having a consistent world and let all of the others just be bad, even though that will almost certainly lead to a shitty endgame experience for players.
/mutters
Will they have an opportunity to try the other paths?
Yes.
You spend most of the game navigating through arguments from different parties about the right approach to solving a conflict, and can choose to experience the perspective of each party more intimately / get to know some of their key players... but then at a critical threshold you basically have to lock in a choice for what approach you will take.
I don't want to piss on players after they've gone through the trouble of making a choice & carrying out a given solution... but I also don't want to make it feel phony.
HM.
/consults Fallout narrative branches.
The solution is realism
All paths lead to disaster
You could always make the choice actually matter.
They choose side A, side A wins battle X and marches to the sea through Atlanta. They choose side B, side B wins battle X and eventually takes New York. They aren't privy to the other content, and it makes an actual difference.
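For what it's worth, the lock-in structure being described (freely sample every party's perspective, then commit at a critical threshold and lose access to the rest) can be sketched like this. The faction names are made up for illustration:

```python
class Story:
    """Toy model of a branching narrative with a one-time lock-in choice."""

    FACTIONS = {"militants", "diplomats", "isolationists"}  # invented names

    def __init__(self):
        self.locked_path = None  # no commitment yet

    def available_perspectives(self):
        # Before the lock-in, every perspective is explorable;
        # afterwards, only the chosen path's content remains.
        if self.locked_path is None:
            return set(self.FACTIONS)
        return {self.locked_path}

    def lock_in(self, faction):
        # The critical threshold: the choice is irreversible.
        if self.locked_path is not None:
            raise RuntimeError("choice is already locked in")
        if faction not in self.FACTIONS:
            raise ValueError(f"unknown faction: {faction}")
        self.locked_path = faction

s = Story()
assert len(s.available_perspectives()) == 3  # everything explorable pre-threshold
s.lock_in("diplomats")
assert s.available_perspectives() == {"diplomats"}  # other branches now sealed
```

The sealed-off branches are what make the choice "matter": no authorial verdict of right or wrong is needed, only divergent consequences the player can't compare in a single playthrough.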
Twitter is still losing over a hundred million a year. I doubt they can afford a new moderation staff, unfortunately.
Then they're absolutely useless and had better find a way to turn a profit so that can happen. Not having a sufficient moderation team isn't a solid selling point at this stage.
I disagree.
The onus is on the user to be aware of the environment/customs/expectations of the platform they're using.
I wouldn't even say the platform is required to post a clear list of rules, though that's certainly a helpful thing to have.
Basically, my opinion is that no user is owed a platform, and they do not have anything resembling a right to one.
If they fail to properly understand the environment/customs/expectations of the privately owned platform they are using and get banned? Tough shit. That's how it should work.
YouTube and Reddit are flirting more with harder moderation. It's nowhere near what it should be, but there have been results. I've seen accounts that engage in racist behavior punished repeatedly until they're finally banned on YT, for instance.
Soft moderation is hard.
And, my entire point: that's entirely a voluntary choice, not a responsibility of the platform holder.
So the question is: Should it be merely voluntary?
I see it like a newspaper: your devolving mental state doesn't matter to the editorial staff, only whether any given letter you submit can be published.
Yeah, in the context of the conversation we've been having, infractions are absolutely soft moderation.
But why is having the red line a responsibility, then? Like, if you're going to say that it's ultimately up to the content creators to rein themselves in, I'm confused why there should be any kind of muzzling mechanism in place at all?
With Love and Courage
Shivahn (Moderator):
Because it's your platform: what goes on it is your responsibility. You're publishing it. Editorial responsibility. Which is why some things would be rejected.
To put it bluntly: because the platform owner(s) run said platform to make money.
The moment a creator's content becomes potentially more costly than profitable, that's a problem for them.
Killing the relationship with that creator is arguably the correct move in that instance.
I'd argue that having readily accessible guidelines for all content creators is a really good idea, so that the platform doesn't end up in the above situation.
But I'm also not going to complain when a platform decides the guy saying "KILL THE JEWS LAWL THAT'S HILARIOUS RIGHT?" can fuck right the fuck off
Ludious:
Well, I can likewise state that nobody is owed a platform to listen to, either. Why should anyone be held accountable just because you were offended or upset by a blog post or video? You can choose not to watch YouTube, after all. Let the Invisible Hand smite the unworthy.
These platforms now exist, people are using them, and in many ways it is not going well. The school of strictly hard knocks isn't working, and we probably should have guessed it wouldn't, because historically, judicial systems that reserved extreme punishments for the worst possible crimes and did nothing else didn't do well at curbing crime rates. I mean, sure, in the abstract I agree that someone like PewDiePie should just understand intuitively what goes too far for his shock-jock gig... but he clearly didn't, and neither do his myriad imitators. They need someone to teach them and smarten them up, regardless of whether we think they ought to need that kind of education / supervision.
Shivahn (Moderator):
also you do actually have a legal responsibility for what you publish
on youtube and stuff, mainly expressed through copyright violations
but long before that your standards will probably be "what will drive other people away"
Also, like, more reasonable levels of punishment on behalf of platform holders rather than a binary, 'You're perfectly fine... oh wait, now you're completely banned forever,' would probably go a long way.
PewDiePie should have been in trouble plenty of times in the past, but there was no punishment for it because he didn't yet go past the point of absolutely no return. That's a pretty poor way of managing expectations & behavior.
Ooooon the other hand: Is managing the behaviour of the creators really the responsibility of the platform holders?
Because, after all, everything that is not across the line, is not across the line. If somebody seems to be moving towards that line, that is essentially their issue to fix.
Well, if it isn't the responsibility of the platform holders, why ban them at all then? If we prefer the Libertarian approach, then ultimately responsibility should rest on the viewer to curate their own experience and media should just be utterly untamed by design.
Yes, IMHO, it is the responsibility of YouTube and Twitter and Facebook, etc, to have reasonable punishment structures in place that modify the behavior of their users & manage expectations. It is certainly a shared responsibility with content creators, but right now I think far too much slack is cut for the platforms and the enforcement mechanisms / penal codes are often ridiculous.
Why ban them at all? Because certain content is unacceptable, of course.
I'm not saying anything about some libertarian-ish anything. Just that, I'd say the only people responsible for modifying the behaviour of content creators are the content creators themselves.
The platform holder's responsibility is whatever is on their platform.
What I'm saying is, I don't see anything principally wrong with having a penal system for your platform where the one and only punishment is the chopping block, for anything that goes past the line of acceptable for that platform.
If someone seems to be edging closer to it, my point is that the only ones with an actual responsibility for heading that off are the creators themselves.
EDIT: I will agree that far too much slack is cut for them - I'm not at all on board with the idea that editorial responsibility shouldn't apply at all, as seems to be the case
But that seems moralistic rather than practical? Like, yes, ideally a person would engage in self-reflection and back off rather than steaming full ahead towards the red line... but people aren't ideal constructs, and when you consider the relatively low cost of telling someone, 'Hey, you're going in the wrong direction. This is not good behavior,' I think there must be an onus put on the platform to establish this kind of tiered punishment system.
I mean, the current paradigm certainly isn't working very well regardless of any oughts about human behavior.
I disagree.
The onus is on the user to be aware of the environment/customs/expectations of the platform they're using.
I wouldn't even say it's required for the platform to have a clear list of rules posted, though that's certainly a helpful thing to have.
Basically my opinion is no user is owed a platform and they do not have anything resembling a right to a platform.
If they fail to properly understand the environment/customs/expectations of the privately owned platform they are using and get banned?
Tough shit, that's how it should work.
Well, I can likewise state that nobody is owed a platform to listen to, either. Why should anyone be held accountable just because you were offended or upset by a blog post or video? You can choose not to watch YouTube, after all. Let the Invisible Hand smite the unworthy.
These platforms now exist, people are using them, it is not going so well in many ways. The school of strictly hard knocks isn't working, and we probably should have been able to guess it wouldn't work because historically judicial systems that only use extreme methods of punishment in response to the worst possible crimes didn't do so well at curbing crime rates. I mean, sure, in abstract I agree that someone like PewDiePie should just understand intuitively what is going too far and what isn't for his shock jock gig... but he clearly didn't, and neither do his myriad imitators. They need someone to teach them & smarten them up, regardless of whether or not we think they ought to need that kind of education / supervision.
the analogy of judicial systems and crime rates is a bad one.
I'd argue having readily accessible guidelines available for all content creators is a really good idea so that the platform doesn't end up in the above situation
But I'm also not going to complain when a platform decides the guy saying "KILL THE JEWS LAWL THAT'S HILARIOUS RIGHT?" can fuck right the fuck off
Yes, that was a reasonable place to draw a line and drop the ban hammer.
I'm saying that we wouldn't have even got there, I don't think, if less severe punishment had been administered earlier when PewDiePie was doing some of his shock routines that went slightly too far but not quite into 'pay some guys to hold up a Kill The Jews sign' territory.
and, my entire point, entirely a voluntary choice and not a responsibility of the platform holder.
Yeah, social platforms should have bigger and more coherently structured moderation, with clear rules that are enforced strictly yet fairly. Social media moderation on those sites is a joke. Plus, it'd get a lot of people employed right now.
So the question is: Should it be merely voluntary?
Wait, what ritua
(User was sent to the shadow realm for this post)
They have a pill for that.
-Indiana Solo, runner of blades
I would say that the infraction system here is also a form of soft punishment, though. It is more severe than a verbal warning, less severe than an account suspension and much less severe than a ban.
It seems to work quite effectively. I notice a similar efficacy on other forums I visit that have similar tiers of punishment.
Facebook / Twitter / YouTube have nothing comparable.
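The tiered structure being described is basically just an escalating penalty ladder; a minimal sketch (the tier names and the one-strike-per-step escalation are hypothetical, not any platform's actual policy):

```python
from dataclasses import dataclass

# Escalating penalty ladder: each infraction moves the user one step up,
# and the final tier repeats (you stay permanently banned).
LADDER = ["verbal warning", "infraction", "temporary suspension", "permanent ban"]

@dataclass
class UserRecord:
    name: str
    strikes: int = 0

    def infract(self) -> str:
        """Record one infraction and return the penalty applied at this tier."""
        penalty = LADDER[min(self.strikes, len(LADDER) - 1)]
        self.strikes += 1
        return penalty

user = UserRecord("example_creator")
print(user.infract())  # verbal warning
print(user.infract())  # infraction
print(user.infract())  # temporary suspension
print(user.infract())  # permanent ban
```

The point of the intermediate tiers is exactly the expectation-management argument above: the user gets cheap, early signals well before the chopping block.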
Gotta love those dark energy disks that will clearly not saw off your legs
Twitter should have one guy generating banning data and then just train a neural net
Mostly because I want to see what happens when people are banned by an algorithm
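Mechanically, "one guy generating banning data" plus a model is just supervised text classification: past moderator decisions become labelled training examples. A toy sketch with a hand-rolled naive Bayes classifier (the training examples are invented for illustration; a real system would use far more data and a proper model):

```python
import math
from collections import Counter

# Toy training set: past moderator decisions as (post_text, was_banned) pairs.
DECISIONS = [
    ("racial slurs and harassment", True),
    ("threats against another user", True),
    ("spam links repeated harassment", True),
    ("funny cat video compilation", False),
    ("speedrun commentary and tips", False),
    ("cooking tutorial for beginners", False),
]

def train(decisions):
    """Count word frequencies and document totals per class."""
    counts = {True: Counter(), False: Counter()}
    totals = Counter()
    for text, banned in decisions:
        counts[banned].update(text.split())
        totals[banned] += 1
    return counts, totals

def predict(text, counts, totals):
    """Return True if the model would ban this post (naive Bayes decision)."""
    vocab = set(counts[True]) | set(counts[False])
    scores = {}
    for label in (True, False):
        # Log prior plus Laplace-smoothed log likelihood of each word.
        score = math.log(totals[label] / sum(totals.values()))
        n = sum(counts[label].values())
        for word in text.split():
            score += math.log((counts[label][word] + 1) / (n + len(vocab)))
        scores[label] = score
    return scores[True] > scores[False]

counts, totals = train(DECISIONS)
print(predict("harassment and threats", counts, totals))  # True
print(predict("cat video tips", counts, totals))          # False
```

Which also hints at why algorithmic banning gets weird: the model only generalizes from whatever the one guy happened to label, with no notion of context or intent.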
No, you misunderstand me. There's no moralism here. The only difference is practical.
I'm saying, steam full ahead towards the red line, or head yourself off, if you want. If you want to start submitting work that gets rejected, then you do you.
you could always make the choice actually matter.
they choose side a, side a wins battle x, and they march to the sea through Atlanta. they choose side b, side b wins battle x and they eventually take New York. they aren't privy to the other content and it does make an actual difference.
Geth, report in
Then they're absolutely useless and had better find a way to profit to make that happen. Not having a sufficient moderation team isn't a solid selling point at this stage.
I disagree.
The onus is on the user to be aware of the environment/customs/expectations of the platform they're using.
I wouldn't even say it's required for the platform to have a clear list of rules posted, though that's certainly a helpful thing to have.
Basically my opinion is no user is owed a platform and they do not have anything resembling a right to a platform.
If they fail to properly understand the environment/customs/expectations of the privately owned platform they are using and get banned?
Tough shit, that's how it should work.
They've also had potential buyers backing off due to the hostility caused by lack of moderation.
YouTube and Reddit are flirting more with harder moderation, nowhere near what it should be, but there have been results. I've seen accounts that engage in racist behavior punished repeatedly until they're finally banned on YT, for instance.
I see it like a newspaper: Your devolving mental state doesn't matter to the editorial staff. Only whether or not any given letter you submit can be published.
yeah in the context of the conversation we've been having, infractions are absolutely soft moderation
But why is having the red line a responsibility, then? Like, if you're going to say that it is ultimately up to the content creators to rein themselves in, I am confused why there should be any kind of muzzling mechanism in place at all?
Geth doesn't take pre-emptive action, though :P
I'm mostly interested in the psychology and sociology that'd be involved
https://www.youtube.com/watch?v=WlgJs_G8Co8
Because it's your platform, what goes on it is your responsibility. You're publishing it. Editorial responsibility. Which is why some things would be rejected.
I'm confused at your confusion!
Geth, design a character that can defeat Data
To put it bluntly: because the platform owner(s) run said platform to make money
The moment a creator's content becomes potentially more costly than profitable that's a problem for them
Killing the relationship with that creator is arguably the correct move in that instance
I'd argue having readily accessible guidelines available for all content creators is a really good idea so that the platform doesn't end up in the above situation
But I'm also not going to complain when a platform decides the guy saying "KILL THE JEWS LAWL THAT'S HILARIOUS RIGHT?" can fuck right the fuck off
Point
If you guys want a trained NN I can hook you up.
Well, I can likewise state that nobody is owed a platform to listen to, either. Why should anyone be held accountable just because you were offended or upset by a blog post or video? You can choose not to watch YouTube, after all. Let the Invisible Hand smite the unworthy.
These platforms now exist, people are using them, it is not going so well in many ways. The school of strictly hard knocks isn't working, and we probably should have been able to guess it wouldn't work because historically judicial systems that only use extreme methods of punishment in response to the worst possible crimes didn't do so well at curbing crime rates. I mean, sure, in abstract I agree that someone like PewDiePie should just understand intuitively what is going too far and what isn't for his shock jock gig... but he clearly didn't, and neither do his myriad imitators. They need someone to teach them & smarten them up, regardless of whether or not we think they ought to need that kind of education / supervision.
Wouldn't you have to train it on this forum's data, specifically?
also you do actually have a legal responsibility for what you publish
on youtube and stuff, mainly expressed through copyright violations
but long before that your standards will probably be "what will drive other people away"
I got a little excited when I saw your ship.
the analogy of judicial systems and crime rates is a bad one.
Yes, that was a reasonable place to draw a line and drop the ban hammer.
I'm saying that we wouldn't have even got there, I don't think, if less severe punishment had been administered earlier when PewDiePie was doing some of his shock routines that went slightly too far but not quite into 'pay some guys to hold up a Kill The Jews sign' territory.
https://www.youtube.com/watch?v=USVEz0vx690
I feel like shit
Are the antibiotics not helping?