It's not so simple for people to figure out which company is doing it. If it was simply a matter of choice, transparent choice, there would be no problem.
Don't you mean "What right do people have to transparent truth?" I'm not even asking them to present an idea. I'm just asking them to give some good idea of the types of things they are removing, and to make that up front and obvious to readers. And then only when it is a public collaboration of regular members of the public writing things. (What I'm talking about wouldn't apply to a regular news site.)
I would love for people to have a right to demand transparent truth; it would stop all the religious channels and rubbish on social media, for a start!
I agree that from a consumer/user point of view that can be frustrating. I have seen a lot of examples of this in the gaming world, where game companies will suspend or ban players without offering any details. They will send you a stock email telling you that you violated the terms of service, but they won't give details of how or when it happened. Some will even have an appeal process where you can send them an appeal, but the reply you get back is usually the stock "We have reviewed your case and the ban stands". It seems like the bigger the company, the less detail they give. People have speculated they just do not have the resources to have humans review the cases individually, so they rely on algorithms to identify the offense.
That is a good analogy, but a very different situation from the one we are talking about here. Gaming is purely for entertainment, does not involve information relevant to society, and mostly tends not to be political in nature. If a gaming company bans a player, it only harms the individual player.
I like the spirit of your post, but you have one relationship backwards. Corporations are not controlled by government, government is controlled by corporations. Corporatism as described by Mussolini, fascism as described by most others.
Approximately 35% of the world's population (2,700,000,000) uses Facebook. Is there some social, or even moral, obligation for them not to censor people's opinions and ideas or do we just let Mark Zuckerberg decide what we should read?
Nothing is stopping others from creating alternatives - donating to sites like this helps them create alternatives. Forcing a Christian site to allow Muslims to post there is not the answer.
The TOS still do not necessarily say what is actually going to happen. Many times rules are quite vague and can be interpreted in all sorts of ways, which are not obvious/apparent from reading them. (I've made this type of point repeatedly in the Law & Justice section. People think things run by the law, but in reality in some ways it would be more accurate to say the law allows the enforcers of that law to do things, which might be very far from the intended purpose of the law, or very far from a plain and obvious simple reading of the law. Apparently that's "too complicated" for many people to understand.) And then there's the fact that the majority of people never bother to read all the fine print, so they don't know what is being banned. (My point is not that the rules are being applied to them, but rather that they don't realize what is being excluded from their view.) Look, if the rules say something plain and obvious like "No talking about alien abductions", that is fine, because it's an obvious rule and everyone knows what it is preventing. I'm talking about mods censoring things that are not such obvious violations of the rules.
And I wonder when people will stop being stupid and realize that Facebook is not a fair and open venue for discussing controversial political/social issues. Probably half of them don't even realize censorship on this level is going on. I think it originally started when tons of people in the Middle East were flocking to Facebook to read things, about their own countries, that their own countries had censored. That's how Facebook started becoming connected to political discussions. That is so ironic, of course.
I do not disagree with you; seeing the rich get hand-slaps for the same crimes the rest of us would pay dearly for is an example. But Facebook is a private, free site, same as this one. They are not the government, so they can do with their site as they wish, and we can go elsewhere if we choose.
I feel like we're going around in circles and you're missing the point. Yes, I would say a private site does have the right to enforce whatever rules it wants that are in the fine print. My issue is not with individuals getting their own individual posts deleted, but rather with everyone else reading the site not realizing that's happening. The way I see it, it's kind of a form of deception. Because, as a reader, you get the impression that everyone else can post things, when in reality there may be an invisible filter, skewing the perspective of everything, making everything one-sided. Yes, that exists on the news too, but at least with the news you don't get the impression that just any person could share their opinion.
I agree to a point, and I think some stuff is deleted that should not be, by moderators with an agenda at Facebook, same as with any site - but hate speech, I think we can all agree, should be removed.
No, even there we cannot agree. However, I would be more receptive to that idea if the discussion site at least showed that someone had made a post, but that it was removed for supposed "hate speech" reasons. That would at least give everyone else some idea of what had been removed from the conversation. Even then, people reading might not actually know the type of things that were being removed as supposed "hate speech".
How do you be transparent about a post that is no longer there? I personally left facebook early on. There were things about the company I did not like and there was just too much drama/politics among some of my FB friends. Now I just text the people I want to text and otherwise avoid any contact with the ones I am over staying in touch with.
That is an excellent question. There has got to be a creative way to inform readers and at least give them some vague idea of what they are not being allowed to read. Maybe at least show them there was a deleted post, and show them what post (or quote) that post was responding to? Maybe small red letters that say "deleted post", which a reader could optionally click on to see a little more information, even though they might still not get to see the exact content of what was deleted. At least a reader would know there were deleted posts in that discussion.

If it was simply a trash post, the moderator deleting it could tag the post as "Deleted post. Reason: trash post" and still leave a way for the reader to see what that post was if they really wanted to, so they could confirm that the moderators really are using correct judgement in removing obvious trash posts from the discussion. That could help establish some level of trust between the readers and the moderators, because you could verify, if you wanted to, whether the moderator's stated reason for deleting a post was accurate. If posts started being deleted with absolutely no way for the reader to click on something and see what those posts were, then readers might start growing suspicious of that site.

So maybe you are reading the posts in a discussion, and every few posts you see a line in tiny red letters that says "deleted post". If you click on it, it gives the reason the moderator deleted it, and if you click once more, you could actually go to another page and see what that deleted post was - if you really wanted to, and for some reason did not trust the moderators.
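The "visible deletion" scheme described above could be sketched in a few lines. This is only an illustration, not any real forum's API; the `Post` record, field names, and placeholder text are all hypothetical choices made for this example:

```python
# Minimal sketch of a thread renderer where deleted posts leave a visible
# placeholder (with the moderator's stated reason) instead of vanishing
# silently. All names here are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Post:
    author: str
    body: str
    deleted: bool = False
    delete_reason: Optional[str] = None  # e.g. "trash post"


def render_thread(posts: List[Post]) -> List[str]:
    """Render each post; a deleted post becomes a small placeholder line
    that preserves the stated reason, so readers can see that something
    was removed and why."""
    lines = []
    for p in posts:
        if p.deleted:
            lines.append(f"[deleted post - reason: {p.delete_reason}]")
        else:
            lines.append(f"{p.author}: {p.body}")
    return lines
```

A fuller version could make the placeholder a link to an archive page holding the removed content, which is the click-through-to-verify idea from the post above.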
Perhaps they can begin by not letting algorithms decide these things. There was a lot of online brigading last time and apparently there is almost as much this time.