Facebook Bans Good Content Due To Bad Reporting
By Matt Algren. November 15, 2011, 1:00 PM CDT
In what has become an ongoing problem, a blogger has come forward with reports of troubling information suppression on Facebook. Jeremy Ryan, of the site Addicting Info, recently reported on the repeated reprimands he has received from Facebook as a result of people submitting false abuse reports.
Trolls are now having activists removed by filing fake Facebook complaints. That is right, people are suppressing information in Wisconsin by actively reporting people they deem to be a threat on Facebook. I myself have been reported and banned for one to three days for simply posting “Good job” or “The majority of Wisconsin doesn’t like Scott Walker.” People have been reported on pages for saying nothing more than my name and have been reprimanded by Facebook. The strategy is simple and Facebook lets it continue. If someone reports something as abusive to Facebook they don’t actually look at it, they just remove it and warn the person who posted it. If you get enough you are not able to dispute them at all, and with no admin contacts and no one at Facebook actually looking at the posts reported as “abusive,” the person gets blocked.
With no warning, and for no discernible reason, after posting a link to the piece on a very few “Occupy” sites (perhaps three?), I received a notice (image, right) from Facebook that claimed I was posting “spam and irrelevant content on Facebook pages.” The notice also came with the note that my account was being disabled for fifteen days from posting any content on pages that were not mine.
In other words, Facebook had “decided” that the content I was sharing wasn’t content, that it was “spam,” or “irrelevant,” despite the very clear fact that it was neither.
Facebook offered no opportunity to challenge its decision, no option to protest, no option to appeal. Almost as bad, Facebook did not offer a credible reason as to why it deemed original news information as spam and irrelevant. And quite frankly, censoring a report of questionable police actions is a chilling notion.
And then again on November 6, 2011:
After editing our “Week In Review” segment, I attempted to share it on Facebook on my personal page. I received an even more draconian censorship notice: “Warning: This message contains blocked content. Some content in this message has been reported as abusive by Facebook users.” My first thought was, “that’s odd,” as this is an original post and was just published a few seconds prior. My second thought was, OK, that’s their problem, and I’ll take the heat if a Facebook user wants to report my site’s content as “abusive.” So I tried to post it again.
(Full disclosure: David Badash is a Facebook/Twitter friend of mine.)
Earlier this year, Facebook (whether by administrator or by algorithm) removed a picture of two men kissing because, according to their notice, “Shares that contain nudity, or any kind of graphic or sexually suggestive content, are not permitted on Facebook,” though the removed picture contained none of these things. After a substantial amount of publicity, a Facebook spokesperson quietly released a statement to one blogger stating that the removal was in error. And you know what? I believe Facebook. I don’t think Mark Zuckerberg is interested in suppressing information (Lamebook notwithstanding), but the company’s policies and procedures are far from adequate in protecting against the suppression of information. Returning to Jeremy Ryan for a moment:
Suppression of Information is a new low. Anyone with money can now buy the suppression of the message delivered by anyone they choose. We cannot stand for this and we must call for Facebook to change these policies and allow an appeal process or someone to look before banning someone. Otherwise we will set up a situation in which social media goes to the highest bidder as well.
Ryan is right. Facebook currently boasts over 800 million active users worldwide. The company’s engineers need to understand the responsibility that comes with such influence and act accordingly. At the very least, an appeals process should be put in place to keep good information flowing while actual spam continues to be minimized.
About Matt Algren
Matt is a self-taught tinkerer who's fallen madly in love with social media and neato Android stuff. He writes on an eight-year-old computer that constantly freezes up on him, leading him to teach the neighborhood kids many new swear words when he has his windows open. He's probably eating chocolate ice cream in his home in Southwest Ohio right now. It's delicious.