Democracy Gone Astray

Democracy, being a human construct, needs to be thought of as a direction rather than an object. As such, understanding it requires not so much a description of existing structures and related phenomena as a declaration of intent.
This blog aims to create labeled lists of published infringements of that intent – points in time where democracy strays from its intended direction. In addition to outright infringements, this blog also collects important contemporary information and discussions that impact our socio-political landscape.

All the posts here were published in the electronic media – mainstream as well as fringe – and maintain links to the original texts.

[NOTE: Due to changes in the blogging software that I did not catch in time, all of the 'Original Article' links between September 11, 2012 and December 11, 2012 were nullified. My apologies.]

Saturday, October 14, 2023

Hamas hate videos make Elon Musk Europe’s digital enemy No. 1

Elon Musk has made himself Europe's digital public enemy No. 1.

Since Hamas attacked Israel on Saturday, the billionaire's social network X has been flooded with gruesome images, politically motivated lies and terrorist propaganda that authorities say appear to violate both its own policies and the European Union's new social media law.

Now Musk is facing the threat of sanctions — including potentially hefty fines — as officials in Brussels start gathering evidence in preparation for a formal investigation into whether X has broken the European Union's rules. Authorities in the U.K. and Germany have joined the criticism.

The tussle represents a critical test for all sides. Musk will be keen to fight any claim that he's failing to be a responsible owner of the social network formerly known as Twitter — all while upholding his commitment to free speech. The EU will want to show its new regulation, known as the Digital Services Act (DSA), has teeth.

Thierry Breton, Europe's commissioner in charge of social media content rules, demanded that Musk explain why graphic images and disinformation about the Middle East crisis were widespread on X.

"I urge you to ensure a prompt, accurate and complete response to this request within the next 24 hours," Breton wrote on X late Tuesday.

"We will include your answer in our assessment file on your compliance with the DSA," said Breton, who also wrote to Meta’s Mark Zuckerberg to remind him of his obligations under Europe’s rules. TikTok’s head Shou Zi Chew was also asked on October 12 to explain how his platform was dealing with misinformation and graphic content.

"I remind you that following the opening of a potential investigation and a finding of non-compliance, penalties can be imposed," Breton said. Those fines can total up to 6 percent of a company's global revenue.

In response, Linda Yaccarino, X’s chief executive, wrote to Breton early on Thursday to outline how the social media giant had responded to the ongoing Middle East conflict. That included removing or labelling potentially harmful content, working with law enforcement agencies and adding so-called “community notes,” or crowd-sourced fact-checks, to posts.

The Commission late on Thursday sent X a "request for information" about how it has been handling problematic content — a preliminary step to launching a formal probe that could lead to fines. It's the first time the European Union executive has taken this step.

The heat on Twitter did not begin with the Hamas attacks. Ever since Musk bought the platform, he's been hit by criticism that he's failing to stop hate speech from spreading online.

X has cut back on its content moderation teams, in the spirit of promoting free speech; pulled out of a Brussels-backed pledge to tackle digital foreign interference; and tweaked its social media algorithms to promote often shady content over verified material from news organizations and politicians.

Musk has responded — via his social media account with 159 million followers — with jeers and attacks on his naysayers. But the latest uproar over content apparently inciting and praising terrorism has made it a surefire bet that X will be one of the first companies to be investigated under the EU's social media rules.

In response to Breton’s demand, Musk asked the French commissioner to outline how X had potentially violated Europe’s content regulations. “Our policy is that everything is open source and transparent,” he added. In the U.K., Michelle Donelan, the country's digital minister, also met with social media executives Wednesday to discuss how their firms were combating online hate speech.

The probe is coming

In truth, an investigation into X's compliance with Europe's new content rulebook has been on the cards for months. Over the summer, Breton and senior EU officials visited the company's headquarters in San Francisco for a so-called "stress test" to see how it was complying.

Under the EU's legislation, tech giants like X, TikTok and Facebook must carry out lengthy risk assessments to figure out how hate speech and other illegal content can spread on their platforms. These firms must also allow greater access to external auditors, regulators and civil society groups that will track how social media companies are complying with the new oversight.

Investigations into potential wrongdoing under Europe's content rules will likely involve months-long inquiries into a company's behavior, a legal decision by the Commission on whether to levy fines or other sanctions, and an appeal from the firm in response. Such cases are expected to take years to complete.

Within Brussels, the Commission has been compiling evidence of potential wrongdoing across multiple social media companies, even before the EU's new content legislation came into full force in August, according to five officials and other individuals with direct knowledge of the matter.

The goal is to start at least three investigations linked to the Digital Services Act by early next year, according to three of those people. They spoke on condition of anonymity because the discussions are not public and remain ongoing.

In recent days, Commission officials have been compiling evidence associated with Hamas' attacks on Israel — much of which has been shared on X with little, if any, pushback from the company.

That content included verified X accounts with ties to Russia and Iran reposting graphic footage of alleged atrocities targeting Israeli soldiers. Some of these posts have been viewed hundreds of thousands of times. Other accounts linked to Hezbollah and ISIS have similarly posted widely with few, if any, removals.

It is unclear whether such footage will lead to a specific investigation into X's handling of the most recent violent content. But it has reaffirmed the likelihood Musk will soon face legal consequences for not removing such material from his social network.

Combating violent and terrorist content requires "people sitting at a computer screen and looking at this and making judgments," said Graham Brookie, senior director of the Atlantic Council’s Digital Forensic Research Lab, which has tracked the online footprint of Hamas' ongoing attacks. "It used to be that there were dozens of people that do that at Twitter, and now there's only a handful."

Original Article
Source: politico.eu
Author: Mark Scott 
