
Facebook moderators have nightmares every day


To reduce work pressure, Facebook advised moderators to draw pictures and sing karaoke, advice that has caused outrage.

Facebook content moderators review large volumes of posts, pictures, and videos containing offensive material such as violence and pornography, removing them to keep the platform safer. They do this work all day and face serious psychological problems. Yet, according to one moderator's testimony, Facebook's advice for relieving the pressure is to sing karaoke and draw pictures after work.

At a hearing before the Irish Parliament on Facebook's treatment of contract moderators, moderator Isabella Plunkett said the company's advice was well-intentioned but did not help. Plunkett works for Covalen, one of Facebook's largest contractors in Ireland.

“To help us, they provide ‘health coaches’. They mean well, but they’re not doctors. They suggest singing karaoke or drawing pictures, but you don’t always want to sing after watching someone get beaten up online,” the 26-year-old said.

Child abuse, suicide, and violence are just a few of the types of content Plunkett views on a daily basis. She said she has had many nightmares about what she has watched and took antidepressants for months. She was offered a visit to the company's doctor, but no appointment was ever scheduled.

Facebook has been harshly criticized for treating workers this way. Recently, a content moderator in Texas leaked internal information showing that the company advised employees to "practice breathing" after watching horrific content.

Like other content moderators, Plunkett is not allowed to tell friends and family about her work because she signed a non-disclosure agreement when she started. She said she always "feels lonely". Moderators are supposed to be exposed to self-harm or child-abuse content for no more than two hours a day, but in practice that limit is not observed.

According to a Facebook spokesperson, the company provides support to moderators because certain types of content can be difficult to view, and they receive intensive training and psychological support.

Plunkett says part of her job is to "train the algorithm", flagging specific violent and hateful videos so that one day machines can take the place of humans.

In 2019, Facebook CEO Mark Zuckerberg said some of the moderators' stories were "somewhat dramatic", a comment he made during a companywide meeting that summer.

Du Lam (According to BI)

