This morning a friend invited me to read an equally harrowing and enlightening blog post by someone who works as a Facebook moderator: a person who must make a decision on every report of ‘offensive’ content that Facebook’s algorithms couldn’t resolve on their own for one reason or another.
I’m glad I read it, because I had never before considered that people whose lives are in danger, or who are being abused, reach out to the human beings behind Facebook every day for help. The author goes on to angrily lament that the large majority of the reports they have to sift through each day are trivial non-issues, such as “I don’t believe in this coin. It goes against what I believe in.” or “This is not true, my God would never let this happen”. Tragically, this means they don’t see the cries for help from people whose lives are in danger, or who are being abused, as quickly as they otherwise would, and so can’t get help to them (by contacting the relevant authorities) as quickly as may be necessary. The author cites one example in particular: a young girl who was afraid she was going to be sexually abused, but whose call for help was buried under a deluge of triviality:
“…after 5 hours of reading through your butthurt sob stories, I find the 7-year-old girl who, too scared to tell an authority has figured out how to report a picture her uncle posted of her that was blatantly, obviously sexual, and begged for help.
“He says he’s coming over today at 2. Can you please help me, I don’t know what to do.””
“Thanks to your delicate sensibilities, I didn’t get to read her report until it was 5 her time. Four hours ago I could have possibly prevented another one of her rapes. I could have had the police waiting for him when he arrived at her house. Four hours from now I’ll probably cry myself to sleep because I didn’t get to. Thanks again, to people who obviously need to stay the fuck off of the internet. Why did her report not come up faster? Because she’s seven, she reported the picture as a lesser offense that reported properly would have bumped it up the queue past all of your cries of visual injustice. Those pieces of ugliness that are top-o-the-queue every morning.”
Having read this moderator’s tragic account, I realised how much responsibility for the welfare of millions of people around the world Facebook has taken on its shoulders, thanks to its moderators’ ability to help those being abused or in danger of being abused. I also realised how glad I am that I have never felt the desire or the need to report anything on my Facebook timeline as ‘offensive’, and how remarkable that is. It is testament to the great job Facebook has done of providing a structure within which individuals can create their ideal, or almost ideal, communities.
The author reveals that:
“True cries for help come from FB users every day, all day long. Facebook moderators spend a significant part of each workday forwarding ‘acts in progress’ to the authorities local to those accounts. When those accounts are legitimate (not ghosted or proxies) those reports can and do save people’s lives.”
That’s a sobering thought. Facebook is not a fantasy world. It’s not a game and it’s not just a bunch of algorithms. Through it real people reach out for help and real people respond in the hope that they can prevent something horrible from happening.
This account by a frustrated Facebook moderator is a wake-up call for all of us who constitute the remarkable globe-spanning online community that Facebook has enabled us to create. Facebook can and does help those in real danger who call for help through its reporting system, but for moderators like our author to respond as quickly and efficiently as possible, we need to play our part and take responsibility for the offence we take at things.
Facebook allows us to prevent content from any given person or page from appearing in our timelines. We can mark posts as spam, unfriend someone, or unlike a page. This lets us achieve, more or less, the same results we get from ostracism in the real world: we have relationships with people we like and none with people we don’t. Facebook does a great job of replicating the nature of real-world communities, and in fact gives us more control than we can have in the real world. In the real world there’s always the possibility that you’ll have to endure, or be within earshot of, the offensive ramblings of some idiot; on Facebook that rarely happens, and if it does you can make sure it never happens again.
It’s not the job of Facebook’s moderators to configure our communities to our individual liking, because Facebook provides the functionality for us to do that ourselves; nor are they gods whom we can command to punish people we don’t like. They are human beings in a unique position to help other human beings who are being abused or who are in some kind of danger, so for goodness’ sake let us all get out of their way.