An information-screening machine must not only be efficient; it must grow more productive over time.
On the night Dr. Li Wenliang drew his last breath at Wuhan Central Hospital, Li An, a former ByteDance employee, along with many other Chinese internet users, kept refreshing for updates on his condition.
Dr. Li was the first person to speak out about the epidemic, but he was silenced. On the morning of February 7, 2020, he passed away from Covid-19. When Li An and the Chinese online community expressed their grief and outrage on Weibo, their posts and accounts were deleted.
“I felt guilt more than anger. At the time, I was developing content moderation tools for ByteDance. In other words, I had helped build a censorship system that deleted accounts like my own, burying them in a virtual graveyard that keeps expanding in this country,” Li told Protocol.
Dr. Li Wenliang passed away from Covid-19 on the morning of February 7, 2020. Photo: BBC.
A company walking a tightrope
Weibo received no explicit order to delete the posts and accounts related to Dr. Li, and it is not the only social network to act this way. “I knew the team at ByteDance used the algorithms I created to censor content. Every day I felt more and more like a cog in a ruthless machine,” Li said.
ByteDance, the parent company of Douyin, is one of China’s best-known technology unicorns. Last year, after being questioned about sharing data with Beijing’s authorities, ByteDance blocked domestic engineers’ access to its international products, including TikTok.
TikTok then announced plans to open two “transparency centers” in the US, openly demonstrating its content moderation methods to authorities. At home, however, censorship remains largely in the dark.
“I worked in ByteDance’s main data department. Since the beginning of 2020, the technology we created has supported content moderation across the entire company, both inside and outside China. ByteDance also employs about 20,000 people to monitor content domestically,” Li said.
While working there, Li’s team received repeated requests to build an algorithm that would detect accounts speaking the language of Xinjiang and immediately cut off their livestreams. In the end, no such algorithm was built, because the team did not have enough data on the language.
Political content accounts for only a small portion of deleted posts. Chinese netizens know how to censor themselves: what may be said, and what may not.
Douyin is primarily an entertainment platform. The company censors content the government says violates fine customs and traditions: pornography, nudity, offensive images, profanity, illegal livestreams and piracy.
Politics is always a sensitive topic. What Chinese social networks fear most is failing to remove all sensitive content, which would invite stricter government scrutiny. For them, it is a matter of life and death.
Sometimes ByteDance’s moderation system crashes. “Even a few minutes of downtime left us worrying whether any political videos had been uploaded in that window. ByteDance is still a young unicorn with no close ties to the state, so the company is always walking a tightrope,” Li said.
Overall, moderation contributes to the company’s commercial success. Few companies have a dedicated content monitoring team like ByteDance’s.
Normally, the Cyberspace Administration of China issues directives to ByteDance’s content quality center, sometimes more than 100 in a single day. The center then assigns tasks to each team, which reviews both current and past content.
During a livestream, the audio is automatically transcribed to text so that algorithms can search for offending words. An analysis algorithm then decides whether the account requires targeted monitoring.
If a user utters a sensitive word, moderators receive the original video clip along with the timestamp at which the word was spoken. If a moderator judges the content sensitive or inappropriate, they cut the livestream and temporarily lock or delete the account.
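The pipeline described here transcribes livestream audio, scans the text for sensitive words, and queues flagged segments, with timestamps, for human review. A minimal sketch of that flow follows; every name, the word list, and the data shapes are illustrative assumptions, since ByteDance’s actual system is not public:

```python
# Hypothetical sketch of the moderation flow described above:
# speech-to-text output -> keyword scan -> review queue with
# timestamps for human moderators. All names are illustrative.

SENSITIVE_WORDS = {"example_banned_word", "another_banned_word"}

def scan_transcript(segments):
    """segments: list of (timestamp_seconds, text) pairs produced
    by a speech-to-text step. Returns hits for moderator review."""
    hits = []
    for ts, text in segments:
        lowered = text.lower()
        for word in SENSITIVE_WORDS:
            if word in lowered:
                hits.append({"time": ts, "word": word, "text": text})
    return hits

def review_queue(stream_id, segments):
    """Package flagged segments so a moderator sees the stream
    reference and the moment each sensitive word was spoken."""
    hits = scan_transcript(segments)
    return {"stream": stream_id, "needs_review": bool(hits), "hits": hits}

# Example: a short transcript with one flagged segment.
queue = review_queue("live-001", [
    (12.5, "hello everyone"),
    (47.0, "this contains an Example_Banned_Word here"),
])
print(queue["needs_review"])     # True
print(queue["hits"][0]["time"])  # 47.0
```

In this sketch the human stays in the loop: the algorithm only flags and timestamps, while the decision to cut a stream or lock an account, as the article describes, rests with the moderator.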
“My colleagues and I never met the moderation center or the moderators directly. After major events there would be a performance review meeting; only then would we be present, to see what could be done to improve the censorship operation,” Li said.
Not many places have a dedicated content monitoring team like ByteDance. Photo: Tech Up.
The censorship team’s role is to ensure moderators find “malicious and dangerous content” as quickly as possible, a job like fishing a needle from the sea. The team must also keep improving efficiency, that is, using fewer people to detect more content that violates ByteDance’s community guidelines.
“This is not the kind of work I would proudly tell friends and family about. When they ask what I do at ByteDance, I usually say I am the one who deletes posts,” Li said.
The tools Li built could help block fake news, but in China that function is rarely used. Their main purpose remains censorship, sometimes deleting all information related to a sensitive event.
Dr. Li had warned colleagues and friends about an unidentified virus spreading through hospitals in Wuhan. In response, he received only reprimands. “For weeks we didn’t know what was going on. The government was covering up the severity of the crisis,” Li said.
According to Zing / Protocol