
Why Facebook Won't Kick Off A Warlord

Lt. Gen. Mohamed Hamdan Dagalo is a social media personality. He's also the leader of the paramilitary group that attacked thousands of pro-democracy protesters on June 3, leaving more than 100 dead. (STR / AFP/Getty Images)

In one Facebook post, he stands tall in yellow camouflage, decorated with badges. He promises he'll increase the salaries of teachers in Sudan. In another post, he hunches over a fire, cooking food with locals. And in another, published days after he oversaw a bloodbath, he's standing on top of his jeep, brimming with joy as throngs of men, women and children dance around him.

Lt. Gen. Mohamed Hamdan Dagalo, better known as Hemeti, is a social media personality. He is also the leader of the Rapid Support Forces — the paramilitary group that attacked thousands of pro-democracy protesters this month, leaving more than 100 dead. This is a bit of a second act for Hemeti, who got his start with the Janjaweed, the militia group considered responsible for the genocide in Darfur about 15 years ago, according to Foreign Policy magazine.

On Facebook, multiple pages promote Hemeti as a formidable yet kind authority figure.

Sudanese activists have petitioned Facebook to remove Hemeti and his extremist group from the platform. But the tech giant says it cannot take action because Hemeti is now second in command in Sudan's transitional government. Even if he is a warlord, Facebook leaders reason, he may be a state actor. The company is reluctant to make decisions that either anoint or knock down government officials.

The conflict in Sudan is just the latest example of Facebook's chronic uncertainty over how to wield its vast power in volatile regions where lives are at stake — while also shielding itself from the charge that the private company is simply too powerful.

The massacre that has put Facebook in the hot seat this time happened on June 3, the last day of Ramadan, among the holiest of days for Muslims. RSF soldiers led the attack, along with police and some special forces, using live ammunition, tear gas, whips and sticks to raid a months-long sit-in in Sudan's largest city and capital, Khartoum.

Health workers on the ground say more than 100 people were killed. Those responsible tried to conceal the carnage by dumping the bodies into the Nile River. Humanitarian groups charge that the junta's foot soldiers also raped women, and point to pictures of RSF troops flaunting female underwear on Facebook.

The African Union suspended Sudan in response to the killings. Amnesty International called it a "horrific slaughter." Sudan's transitional government admitted to ordering the attacks.

Facebook is "giving a pulpit to what is essentially a terrorist organization," says Ahmed el-Gaili, a Sudanese attorney who practices international law and is based in Dubai. "You cannot give a forum to an organization that has committed such crimes, even if all they are posting are pictures of cats and dogs."

Facebook has a track record of banning extremists who have come under fire in the U.S. The company recently expanded its definition of hate speech to include white nationalism. Multiple Sudanese activists said they find it baffling that the company would ban far-right activists like Alex Jones, and yet allow a paramilitary leader like Hemeti to use the platform as his propaganda machine.

"Unfortunately, reactive changes in policy are the norm at Facebook, and they probably haven't felt enough pressure yet on Sudan," says Susan Benesch, director of the Dangerous Speech Project, which tracks extremist content online.

Facebook has a different explanation. Company leaders do not dispute Hemeti's track record as a warlord. But his position has changed over the years. He's moved from the fringes of Sudanese society into Sudan's main political circles. He was appointed second-in-command of a transitional government that ousted dictator Omar al-Bashir earlier this year.

According to Brian Fishman, a Facebook spokesman who leads efforts to track dangerous organizations, different rules apply to state actors and non-state actors. The company has artificial intelligence designed to identify and deplatform (or boot) groups that may be affiliated with ISIS, for example. Even if certain posts might seem innocuous, they're banned because their activities in the real world are harmful.

But if the questionable Facebook user also represents the state, and is not explicitly breaking speech rules set by the company, Facebook is hesitant to intervene. The international community is already worried about the company's inordinate power to publish and censor the speech of more than 2 billion people. If Facebook bans a government official, Fishman says, that could make other governments even more wary of the Silicon Valley giant.

CEO Mark Zuckerberg has talked openly about the expectation that social media companies should protect society "from broader harms" by censoring or banning content. He is calling for a new independent body to be created to take on these decisions. "I've come to believe that we shouldn't make so many important decisions about speech on our own," Zuckerberg explained in a Washington Post op-ed.

His remarks came after his company acknowledged its slow response to the genocide in Myanmar. Civil society and human rights organizations in that country reached out to Facebook as early as 2014, asking repeatedly for the platform to intervene as extremist leaders built their social media personalities, and later moved beyond propaganda to incite violence against the Rohingya Muslim population.

"Facebook ignored the warnings," says Michael Lwin, a technologist and lawyer based in Myanmar. The problem in that country was not that Facebook didn't know that its platform was being used for propaganda, Lwin says. It's that the company didn't act despite knowing. The company kept a handful of Burmese-language human reviewers based in Singapore, he says, who didn't understand the local context and reached out to local civil society groups infrequently and reactively. "There are commonalities in how this story unfolds," Lwin adds.

Facebook is one of the world's largest companies. Its revenue topped $15 billion in the first quarter of this year. Facebook does not have an office in Sudan. But spokesman Fishman says a team is tracking the situation on the ground and they've made substantial investments in hiring Arabic-speaking content reviewers.

Shortly before the Khartoum massacre, one paramilitary page on Facebook featured a video in which critics of the pro-democracy sit-in claimed it was failing. In the days after, the RSF said on Facebook that it was acting in the interest of the country, and that activist groups lacked patriotism. In another post, it took credit for bringing stability to Darfur and securing the Sudanese borders against illegal immigrants.

While these military leaders use Facebook to promote their message, they've cut off Internet access to the rest of the country, imposing a digital blackout and citing security reasons. Sudanese citizens who had relied on Messenger, WhatsApp and Instagram (all owned by Facebook) to publicize meeting dates, post footage of human rights abuses or message one another are now unable to do so.

"It bothers me a lot," says Mohamed Suliman, an expat based in the U.S. He helped launch the online petition after Facebook failed to respond to his and other users' requests to take down Hemeti's pages. "It's like every day, you see the one who killed your sister, who raped innocent women, being promoted as the leader. And the people he's attacking cannot speak," Salih said.

Editor's note: Facebook is among NPR's recent sponsors.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Corrected: June 26, 2019 at 12:00 AM EDT
A previous version of this story incorrectly referred to Mohamed Suliman as Mohamed Salih. Additionally, the quote about story commonalities was from Lwin, not Lewis.
Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.