
Symbolic image: Manipulated content on the internet
  • Industry News
  • Management, Awareness and Compliance
  • Artificial intelligence (AI)

Algorithms as a weapon against democracy

Automated propaganda networks use AI to undermine democratic societies. Targeted disinformation and fake news are shaking society's trust in institutions and the media. In the super election year, the activities of fake news networks are on the rise, while there is a lack of regulation, especially on social media platforms. The question is how we can defend ourselves against this threat without jeopardising our open society.

Riots such as those currently taking place in the UK show that disinformation and fake news are increasingly becoming a security problem for society as a whole. The search for countermeasures is underway.

Automated propaganda networks misuse current technical possibilities such as artificial intelligence for their purposes. They have only one goal: to undermine democratic societies. The war in Ukraine has fuelled these activities, and by now even the intelligence services are dealing with the issue of disinformation. However, the possible countermeasures are still limited and not always harmless.

In the current super election year, the operators of fake news networks are ramping up their activities. In some cases, they are even making common cause with cyber criminals: "For example, they hack the account of a politician and use it to send fake news," warns Germany's Federal Office for Information Security (BSI).
Decisions on election day often depend on what information is available before the election. Fake news therefore represents "a major threat to democracy", according to the BSI. The diverse possibilities for manipulation in cyberspace are attracting masses of dubious actors, who often spare no effort to manipulate people and opinions on behalf of autocratic regimes. The so-called doppelganger campaign is a spectacular example of the effort that goes into this and what is possible.

Meta described the so-called doppelganger operation as the "largest and most aggressive operation of Russian origin". Suspected Russian actors went to great lengths to imitate well-known media, ministry and government websites. They proceeded very skilfully: for example, the German online news magazine "t-online.de" became the similar-sounding "t-online.online", and the website of the news magazine "spiegel.de" became the fake site "spiegel.ltd". On the fake websites, they not only posted copies of the original news pages garnished with fake news, but also, among other things, large numbers of manipulated photos. They also produced deepfake videos using specialised AI platforms and published them on the fake websites. (How deepfake videos can be used successfully is described in the article "Real or fraud?")
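The imitation technique described above is a form of typosquatting: the fake domain keeps the recognisable name and only swaps the ending. As an illustration only (not a method attributed to Meta or the BSI), a minimal sketch of how such lookalike domains can be flagged by comparing the registrable name against a list of known media sites; the trusted list and the similarity threshold are assumptions for this example:

```python
# Illustrative sketch: flagging lookalike ("doppelganger") domains by
# comparing the registrable name against a hypothetical trusted list.
from difflib import SequenceMatcher

TRUSTED = ["t-online.de", "spiegel.de"]  # assumed example list

def base_name(domain: str) -> str:
    """Strip the last label (TLD), keeping the recognisable name."""
    return domain.rsplit(".", 1)[0]

def lookalike_of(candidate: str, threshold: float = 0.8):
    """Return the trusted domain the candidate imitates, or None."""
    for trusted in TRUSTED:
        if candidate == trusted:
            return None  # the genuine site itself, not a fake
        ratio = SequenceMatcher(
            None, base_name(candidate), base_name(trusted)
        ).ratio()
        if ratio >= threshold:
            return trusted
    return None

print(lookalike_of("t-online.online"))  # -> t-online.de
print(lookalike_of("spiegel.ltd"))      # -> spiegel.de
```

Real brand-protection tooling works on registered-domain feeds and uses more robust distance measures, but the core idea is the same: the name stem of the fake is nearly identical to the original.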

The underground actors even conducted their own interviews with politicians and scientists, but gave false information about the medium and their true background. The content was then taken out of context and presented with completely different framing. Masses of social media bots ensured dissemination and attention via social networks.


"The ground should shift"

This is not without success: "We see the effect of disinformation" everywhere in society, said Sinan Selen recently at the Potsdam Conference on National Cyber Security, organised by the German Hasso Plattner Institute (HPI). Fake news and disinformation were a major topic at this year's event. "We are seeing an erosion of trust," stated the Vice President of Germany's domestic intelligence service (BfV), "and that has something to do with disinformation". Christian Dörr, professor at HPI and head of its "Cybersecurity and Enterprise Security" department, also confirms this: "We see the results in a radicalisation of society, for example, or in difficulties in conducting an open discourse".

These effects are intended by the actors. They want to undermine trust in social and political institutions and the media, polarise society and stir it up. To this end, they try to manipulate the political mood, especially before elections. New technologies are changing the campaigns: "While disinformation used to be used to make people believe something specific, today it is about making sure that people no longer believe anything at all. The ground should shift," explained journalist and former editor-in-chief of the news magazine "Der Spiegel", Georg Mascolo, during the HPI event. For German Green Party politician Konstantin von Notz, "the main aim is to withdraw trust from the rule of law".


Algorithms as a weapon 

This radicalisation of aims in the spread of misinformation is made possible by technological advances, from social media to artificial intelligence. These allow far more effective measures than during the Cold War, for example. "Disinformation is not a new topic, but now I can scale and customise; I know exactly what I need to send you to make it work," explains Christian Dörr. He adds: "What is new is that we can, in principle, be controlled in a targeted manner."

The creators of fake news campaigns skilfully exploit the interests of social media platform operators. The logic of these platforms is one of profit maximisation, as Konstantin von Notz puts it in a nutshell. The basis for this is formed by algorithms that "rank everything high that is exciting, offensive or sensational", i.e. prioritise it, explains Mascolo. For the platforms, "polarisation is a very good business model", which is why they have no problem with disinformation, he adds. Holger Münch, head of the German Federal Criminal Police Office (BKA), summarises the situation as follows: "We are chasing after a problem where people are provided with one-sided information in order to earn money with clicks." Dag Baehr, Vice President of the German foreign intelligence service (BND), even speaks of the "weaponization of algorithms".

The deeper disinformation penetrates society, the more important the search for countermeasures becomes. The focus immediately turns to the issue of regulation. "Every sausage stand is more strictly regulated than social media platforms," criticises von Notz. In his opinion, the experiences of the German past should serve as a warning: "After National Socialism, we imposed a tough regulatory framework on public broadcasting. This does not exist in the area of social media. That's crazy," says von Notz.

However, the difficulty with the regulatory approaches is that they almost exclusively involve US operators. "We will not be able to solve the issue of disinformation on our own," concludes Markus Richter, State Secretary at the German Federal Ministry of the Interior, as it not only has a federal dimension, but also a European one. However, he sees "a great deal of support for this in the EU". The EU is already promoting various projects to combat disinformation and has set up "euvsdisinfo.eu", a platform for analysing fake news campaigns. "I believe that we will soon be able to find a common solution at EU level," Richter is convinced.

In addition, awareness campaigns, such as those run by the German Federal Centre for Civic Education (bpb), remain important. It is also necessary to increase media literacy. Georg Mascolo, for example, is calling for media skills to be integrated into school lessons. BND Vice President Baehr, on the other hand, would like to see a "mechanism to bring more truth to the people", but leaves it unclear what this should look like.

Caution is required with all measures so that we do not destroy what we actually want to preserve. "We have to be careful that, in our defence against disinformation, we don't abolish what makes up our society. That could be the target of the attackers," warns Martin Wolff, Head of the International Clausewitz Centre at the German Bundeswehr Command and Staff College. Because "disinformation is always an attack on an open society. If you close it down and turn it into a dictatorship, you won't have these problems," he adds. However, as this cannot be desirable, he ends with the question: "What standards do we want to set ourselves in order to defend what is important to us?"


What you should know about the use of AI!

In the cyber underground, we see AI systems that specialise in different attack scenarios, making social engineering and phishing attacks, for example, even more dangerous. But AI is now also being used intensively in cybersecurity, where it increases the efficiency of defence measures in security solutions such as threat detection, incident response, phishing protection and SIEM.
