Afghanistan-Taliban crisis: Facebook, Twitter and YouTube face a new challenge following the group's takeover of the country


The Taliban’s swift takeover of Afghanistan presents large US tech companies with a new challenge in dealing with content created by a group that some world governments consider a terrorist organization.

Social media giant Facebook confirmed on Monday that it designates the Taliban a terrorist group and bans it, along with content that supports it, from its platforms.

But Taliban members reportedly continued to use Facebook’s end-to-end encrypted messaging service WhatsApp to communicate directly with Afghans, despite the company’s rules barring dangerous organizations from using its services.

A Facebook spokesman said the company is closely monitoring the situation in the country and that WhatsApp would take action against any accounts associated with sanctioned organizations in Afghanistan, which could include removing them.

Taliban spokesmen with hundreds of thousands of followers tweeted updates on Twitter during the country’s takeover.

When asked about the Taliban’s use of the platform, the company pointed to its policies against violent organizations and hateful conduct, but did not answer Reuters questions about how it classifies the group. Twitter’s rules state that groups promoting terrorism or violence against civilians are not allowed.

The return of the Taliban has raised fears that they will crack down on freedom of expression and human rights, especially women’s rights, and that the country could once again become a haven for global terrorism.

Taliban officials have issued statements declaring that they want peaceful international relations and have promised to protect Afghans.

Major social media firms have made high-profile decisions this year about how to deal with incumbent world leaders and groups in power.

These include the controversial banning of former US President Donald Trump for inciting violence around the January 6 Capitol riot, and bans on Myanmar’s military amid the coup in that country.

Facebook, which has long been criticized for failing to tackle hate speech in Myanmar, said the coup had escalated the risk of offline harm and that the military’s history of human rights abuses contributed to its decision to ban the ruling military, known as the Tatmadaw.

The companies, which have come under fire from global lawmakers and regulators over their outsized political and economic influence, often rely on official national and international designations to determine who is allowed on their sites.

These designations also help determine which accounts can be verified or run as official government accounts, and which rule-breaking posts receive special treatment on newsworthiness or public-interest grounds.

However, the companies’ differing responses show that the approach is far from uniform.

Alphabet’s YouTube declined to comment when asked whether it bans or restricts the Taliban, but said the video-sharing service relies on governments’ definitions of Foreign Terrorist Organizations (FTOs) to guide enforcement of its rules against violent criminal groups on the site.

YouTube pointed to the US State Department’s list of FTOs, which does not include the Taliban. The United States instead classifies the Taliban as a “Specially Designated Global Terrorist,” a designation that freezes the US assets of those blacklisted and bars Americans from working with them.

Complicating matters further, while most countries show little sign of diplomatically recognizing the group, the Taliban’s position on the world stage could change as it tightens its control.

“The Taliban are a reasonably accepted actor at the level of international relations,” said Mohammed Sinan Siyech, a researcher on security in South Asia and a PhD student at the University of Edinburgh, referring to discussions that China and the US have held with the group.

“When that recognition comes, the subjective decision that this group is bad and that we won’t host it is a complication for a company like Twitter or Facebook.”

© Thomson Reuters 2021
