Nightmare of Social Media Moderation 05-13-2021, 08:05 PM
#1
![[Image: ModeratingFacebookInfographicTeaser.jpg]](https://www.adweek.com/wp-content/uploads/sites/2/2015/04/ModeratingFacebookInfographicTeaser.jpg)
Isabella, a Facebook moderator hired through Covalen, one of the company's largest Irish contractors, recently gave evidence to a parliamentary committee. Moderators must sign strict NDAs and cannot discuss the contents of their tickets with anyone other than doctors and counsellors. A law firm and a union are fighting for moderators' freedom to speak out and for better psychological support, arguing that the 24/7 support team available to staff consists of wellness coaches rather than qualified psychiatrists and is not sufficient. Read the full article here.
Code:
“The high priority queues - the graphic violence, the child stuff, the exploitation and the suicides, people working from home don’t get that - the burden is put on us.”
Despite having family shielding at home, she was told to come into the office and developed anxiety, for which she now takes antidepressants.
“Every day was a nightmare,” she said, adding that the support given was “insufficient.”
Facebook says psychological help is available to all its moderators 24 hours a day, but Isabella claims its wellness coaches are not qualified psychiatrists.
“I was seeing the wellness team but didn’t feel I got the support I needed. I can’t say I left work feeling relieved or knowing I could go home and have a good night's sleep - that’s not possible,” she added.
“It would follow me home. I could just be watching TV at home and think back to one of the horrible, really graphic tickets.”
Sub-contracted staff are given 1.5 hours of "wellness" time a week, she says, which can be used for speaking to a wellness coach, going for walks or taking time out when feeling overwhelmed.
“It’s not enough. I’m now seeing the content I view in work in my dreams. I remember it, I experience it again and it is horrible.
“You never know what is going to come next and you have to watch it the full way through because they might have violators.”
Here's an interesting infographic (from Forbes, 2020) that breaks down the categories of content moderators deal with:
![[Image: 960x0.jpg?fit=scale]](https://specials-images.forbesimg.com/imageserve/5ee0209139ca790006b69b27/960x0.jpg?fit=scale)
Facebook and other social media sites are under massive pressure, especially recently, to control the content on their platforms. However, we ought to consider how that moderation actually happens, and therefore who does the job. While platforms are investing in automated moderation algorithms, those systems are still at an early stage and make many mistakes, so the hardest content ends up in front of human reviewers. What are your thoughts on this issue?
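To make that last point concrete, here is a minimal, purely illustrative sketch of how a threshold-based pipeline might route content between automation and human review. Everything here is an assumption for illustration: the keyword "classifier", the threshold values, and the queue names are made up, not Facebook's actual system. The point is just that whatever the model is unsure about lands in a human queue.

Code:
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    post_id: int
    text: str

@dataclass
class ModerationResult:
    post: Post
    score: float   # classifier's estimated probability that the post violates policy
    decision: str  # "removed", "allowed", or "human_review"

# Hypothetical thresholds: only very confident predictions are auto-actioned;
# everything in between lands in the human review queue.
AUTO_REMOVE_THRESHOLD = 0.95
AUTO_ALLOW_THRESHOLD = 0.05

def classify(post: Post) -> float:
    """Stand-in for a real ML model; returns a toy violation probability."""
    banned_terms = {"violence", "abuse"}  # keyword matching, not a real classifier
    hits = sum(term in post.text.lower() for term in banned_terms)
    return min(1.0, 0.4 * hits)

def moderate(posts: List[Post]) -> List[ModerationResult]:
    results = []
    for post in posts:
        score = classify(post)
        if score >= AUTO_REMOVE_THRESHOLD:
            decision = "removed"       # automation handles the clear-cut cases
        elif score <= AUTO_ALLOW_THRESHOLD:
            decision = "allowed"
        else:
            decision = "human_review"  # uncertain cases go to moderators
        results.append(ModerationResult(post, score, decision))
    return results

if __name__ == "__main__":
    sample = [Post(1, "Lovely day at the beach"),
              Post(2, "clip contains graphic violence and abuse")]
    for r in moderate(sample):
        print(r.post.post_id, f"{r.score:.2f}", r.decision)

Under this kind of design, the better the automated model gets, the narrower the human queue becomes, but what remains in it is by definition the ambiguous and often most disturbing material, which is exactly the burden Isabella describes.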