This morning a friend invited me to read an equally harrowing and enlightening blog post by someone who works as a facebook moderator – a person who has to make a decision on every report of 'offensive' content that facebook's algorithms couldn't resolve automatically for one reason or another.
I'm glad I read it, because I had never before considered the fact that people whose lives are in danger, or who are being abused, reach out to the human beings behind facebook every day for help. The author goes on to angrily lament that the large majority of the reports they have to sift through every day are trivial non-issues, such as "I don't believe in this coin. It goes against what I believe in." or "This is not true, my God would never let this happen". Tragically, this means they don't get to see the cries for help from people whose lives are in danger, or who are being abused, as quickly as they otherwise would, and so can't get help to them (by contacting the relevant authorities) as quickly as may be necessary. The author cites one example in particular of a young child who was afraid that she was going to be sexually abused, but whose call for help was buried under a deluge of triviality:
“…after 5 hours of reading through your butthurt sob stories, I find the 7-year-old girl who, too scared to tell an authority has figured out how to report a picture her uncle posted of her that was blatantly, obviously sexual, and begged for help.
“He says he’s coming over today at 2. Can you please help me, I don’t know what to do.””
“Thanks to your delicate sensibilities, I didn’t get to read her report until it was 5 her time. Four hours ago I could have possibly prevented another one of her rapes. I could have had the police waiting for him when he arrived at her house. Four hours from now I’ll probably cry myself to sleep because I didn’t get to. Thanks again, to people who obviously need to stay the fuck off of the internet. Why did her report not come up faster? Because she’s seven, she reported the picture as a lesser offense that reported properly would have bumped it up the queue past all of your cries of visual injustice. Those pieces of ugliness that are top-o-the-queue every morning.”
Having read this moderator's tragic account I realised how much responsibility for the welfare of millions of people around the world facebook has taken on its shoulders, through its moderators' potential to help those being abused or in danger of being abused. I also realised how glad I am that I have never felt the desire or the need to report anything as 'offensive' on my facebook timeline, and how remarkable that fact is. It is testament to the great job facebook has done of providing a structure for individuals to create their ideal, or almost ideal, communities.
The author reveals that:
“True cries for help come from FB users every day, all day long. Facebook moderators spend a significant part of each workday forwarding ‘acts in progress’ to the authorities local to those accounts. When those accounts are legitimate (not ghosted or proxies) those reports can and do save people’s lives.”
That’s a sobering thought. Facebook is not a fantasy world. It’s not a game and it’s not just a bunch of algorithms. Through it real people reach out for help and real people respond in the hope that they can prevent something horrible from happening.
This account by a frustrated facebook moderator is a wake-up call for all of us who constitute this remarkable globe-spanning online community that facebook has enabled us to create. Facebook can and does help those in real danger who call for help through its reporting system, but in order for moderators like our author to be able to respond as quickly and efficiently as possible, we need to play our part and take responsibility for the offence we take at things.
Facebook allows us to prevent content from any given person or page from appearing in our timelines. We can mark posts as spam, and we can unfriend someone or unlike a page. This allows us to achieve, more or less, the same results we get from ostracism in the real world: we have relationships with people we like and don't have relationships with people we don't like. Facebook does a great job of replicating the nature of real-world communities, and in fact gives us more control than we can have in the real world. In the real world, for example, there's always the possibility that you'll have to endure, or be within earshot of, the offensive ramblings of some idiot, but on facebook that rarely happens – and if it does, you can make sure it never happens again.
It's not the job of facebook moderators to configure our communities to our individual liking, because facebook provides the functionality for us to do this ourselves, and they are not gods whom we can command to punish people we don't like. They are human beings in a unique position to help other human beings who are being abused or who are in some kind of danger, and so for goodness sake let us all get out of their way.
But surely it is up to Facebook to strengthen rules about what people should complain about? Or even provide a better filtering system for life-threatening stuff. Why is a 7-year-old getting raped the fault of people who thought FB was a fun social networking tool?
I agree. Now that facebook has become so widely used for communicating and socialising, it does have an ongoing responsibility to keep improving its moderation systems, which I'm sure it is doing. If FB can create an algorithm that can somehow scan all incoming offence reports for life-threatening stuff, identify it, and then escalate it to the attention of a moderator, then that might come to be – IF users don't object, because I'd guess such a system would have privacy implications.
What happened to that girl isn’t the fault of FB users, I agree. I don’t think I implied that it is. My point was that, given what we now know about the plight of facebook moderators, the very serious stuff they’re dealing with on a daily basis, and the potential they have to prevent/curtail harm to people like that little girl, we now have a responsibility to stop creating mountains of trivial stuff that obscures the life-threatening stuff from the eyes of moderators like our author who would dearly have loved to help that girl. Knowledge is responsibility, I heard someone say – and it’s true.
“What happened to that girl isn’t the fault of FB users, I agree. I don’t think I implied that it is.” – Didn’t mean you Gary, I was referring to the author you were quoting.
However, in reply to you I would say that your hopes about our collective responsibility are about as likely to be realised as a plea that we all just love each other. And as that is unrealistic, it should be the responsibility of these mega-billion-dollar companies to solve the issue, not their users. They should stop blaming their customers – without those customers talking and posting absolute trivial crapola 24/7, they would not have a job.
Saying it 'should' be the responsibility of FB is, I'm afraid, trying to derive an 'ought' from an 'is', which is not meaningful.
Until such time as facebook finds a way to stop the tonnes of silly 'this offends me' reports from drowning out the serious cries for help, then in reality it *is* the responsibility of the FB community, to some degree at least, whether they like it or not. They can ignore this reality if they choose, but they cannot ignore the consequences of ignoring this reality – which, we now know, are potentially tragic.