This morning I read an interesting post from my friend Trevin about the election and wrote a comment in response. His post was thoughtful and not at all combative, in part expressing surprise at the low voter turnout and the makeup of voters, citing:
Clinton won 65% of Latinos and only 54% of women. Even more surprising, Clinton only won 51% of “White women college graduates”.
If you read the comment, I think it’s pretty clear that it’s civil… you might even go so far as to say that it’s thoughtful and furthers the discussion! But… you won’t see that comment on Facebook:
It was “deemed abusive or otherwise disallowed” by Facebook’s algorithms (the block happened far too fast to have been done by anything but a machine).
It reminded me of two things:
- Mark Zuckerberg calling it a “crazy idea” that the spread of fake news on Facebook influenced the election. Obviously this post isn’t about fake news, but it is part of a conversation about how trustworthy polling is. And Facebook blocked that conversation from occurring.
- My time in China, where it was well known that WeChat had certain keywords that would get you flagged or banned.
Of course, this could be just a bug. But, it was a good reminder to me that Facebook is not mine. Your Facebook wall is not a blog under your control. Your Facebook messages are not private communications between you and the recipient. Your speech and behavior within their walled garden are subject to their terms and conditions.
I have basically accepted the echo chamber that comes from seeing things shared by people I already associate and identify with. But I’ve been increasingly concerned about digital redlining, and only recently have I considered the fact that I’m not even seeing the “real” echo chamber, but some subtly modified version of it.