2024, a year in which almost half the world's population will be called to vote in elections, is a “decisive year” in the fight against disinformation, European foreign policy chief Josep Borrell said on Tuesday, AFP reported.

Fake news depicting an American military convoy moving from Romania to Ukraine. Photo: MApN

Disinformation is “one of the most important threats” facing democracies, Borrell told reporters as he presented his department’s second report on the subject.

Over the years, Russia has built “a large infrastructure designed to lie, manipulate and destabilize on an industrial scale,” Borrell warned.

It is a “threat to the security” of democracies that requires “a fight against the industry that manufactures lies,” he added.

Some 945 million Indians are voting in May’s general election in a country that last year became the world’s most populous, overtaking China.

At the beginning of June, more than 400 million voters across the 27 EU member states are to elect 720 MEPs in a large-scale transnational election.

Use of artificial intelligence

Disinformation is certainly nothing new, but its means and capabilities have increased tenfold thanks to social media and the development of artificial intelligence (AI), even if the use of AI remains limited so far, according to the report.

However, a few examples stand out, including a video generated entirely by artificial intelligence showing the President of the Republic of Moldova, Maia Sandu, appearing to speak on an official government channel, and a doctored video calling on Ukrainians to stage a coup d’état.

This second report, compiled by the European External Action Service (EEAS), examined a total of 750 cases of disinformation around the world between November 2022 and December last year.

“Factory of Lies”

In the months before Poland’s elections last year, for example, Belarusian state media launched several social media news channels targeting Polish voters, showing fake videos attacking the candidates.

According to Borrell, Russia and China are the main culprits behind this “factory of lies”, with Ukraine becoming the main target as Moscow tries to justify the invasion launched by its military on February 24, 2022 under the banner of a “special military operation”.

Politicians and public figures were targeted, as were celebrities such as Margot Robbie and Nicolas Cage, whose images were used to reach a wider audience.

Faced with such attacks, the report recommends a series of measures to combat the phenomenon more effectively.

Defenses against disinformation should be prepared “a few months in advance” of an election, and it is important to extend and adapt these measures after the vote, the report advises.

Four possible responses

The report outlines four possible responses:

  • ignore,
  • master,
  • minimize,
  • redirect.

During an election campaign, those involved in the fight against disinformation must be able to assess threats accurately in order to decide whether to ignore them or respond. The report’s authors note that taking action can sometimes end up lending importance to the very fake news one is trying to combat.

If steps are to be taken to limit the threat, the report suggests alerting the platforms on which the fake news circulates as early as possible, for example by asking them to monitor content related to the election in question more closely.

The report also states that sanctions should be possible if certain networks or social platforms are negligent.

“Dangerous content is spreading like a cancer, endangering the health of our democracy,” Borrell said. “But we have the necessary tools to effectively fight this disease. We have the opportunity to do that, but we need to do more.”