A citizen group is standing up to discriminatory speech on Twitter, demanding that the social networking service take urgent measures to block hate speech and other abusive communication.
Experts believe specific guidelines are needed to curb discriminatory tweets. Human rights advocates are concerned that unfettered abuse could lead to major nationalist rallies in Japan similar to those recently seen in the United States.
Earlier this month, protesters gathered in front of the Japanese unit of Twitter Inc. in Chuo Ward, Tokyo, demanding that the company step up efforts to immediately remove hate speech from the platform, including the many offensive posts targeting people of Korean descent.
The group, Tokyo No Hate, catalogued over 1,000 discriminatory comments that its supporters found on Twitter and deemed hate speech. The protesters took to the streets and placed some 400 printouts of the comments on the pavement in front of the Twitter office so people could step on them.
“If we leave these kinds of posts (on Twitter) and society continues to allow such hate speech to be published, an incident similar to that in Charlottesville would likely take place in Japan,” Masayuki Ishino, one of the main members of Tokyo No Hate, said during the two-hour protest held on the evening of Sept. 8. He was referring to the white supremacist rally last month in Virginia that turned violent, leaving one counterprotester killed in a car-ramming attack and two state troopers dead in a helicopter accident.
Twitter guidelines prohibit tweets that threaten a person’s life or could foment violence against any targeted ethnic group or minority. A user account deemed in violation of this rule could be suspended temporarily or permanently.
Users can report abuses, but Twitter can be slow to respond, Ishino said.
“It takes about a month after a report is sent to Twitter before the firm responds. The response is slow, and in some cases, there is no response,” Ishino said.
“When we send reports, we want (Twitter) to deal with the situation thoroughly. Rather than just suspending accounts based on the words used, (the company) should actually interpret their context, so it can judge properly whether the post is abusive or a violation of human rights,” he said.
In response to questions by The Japan Times, a spokeswoman at the Japanese unit said in an email that the company is dealing with a large volume of tweets every day and receives many kinds of feedback on hate speech.
“We’re making our best effort to improve (screening), and began revising Twitter policy from various perspectives,” she said.
According to Reuters news agency, companies including Google Inc., Facebook Inc. and Twitter agreed in 2016 to an EU code of conduct requiring the removal of hate speech within 24 hours. In June, the German parliament passed a bill that would fine social media companies up to €50 million ($53.62 million) if they fail to remove discriminatory postings from their platforms in a timely manner.
Japan’s first law aimed at curbing racial discrimination, known as the hate speech law, took effect in June 2016.
However, experts on freedom of expression said social media companies such as Twitter should establish their own standards and guidelines aside from the law. The self-imposed rules should be specific on what content is inappropriate, and at the same time the social media companies should educate users.
“It’s necessary for (social networking service) operators, including Twitter, to establish and share common guidelines. It’s desirable that each company establish its own guideline and share it with the rest,” said Shojiro Sakaguchi, a professor at Hitotsubashi University’s graduate school of law.
However, he said it is hard to draw a clear line between hate speech and political statements.
Some political commentary could be construed as hate speech due to inflammatory content, he said, adding that eliminating such political tweets is undesirable.
Kenta Yamada, a professor of media law at Senshu University, said Twitter’s Japanese unit should cooperate with other website operators in the country and act as one to combat hate speech.
It will be a tough decision for Twitter, however, Yamada said.
“They should be responsible for what’s posted on their platform. But on the other hand, they must guarantee freedom of expression as much as they can. That is also their social role,” Yamada said.