Riots such as those currently taking place in the UK show that disinformation and fake news are increasingly becoming a security problem for society as a whole. The search for countermeasures is underway.
Automated propaganda networks misuse current technologies such as artificial intelligence for their own ends. They have one goal: to undermine democratic societies. The war in Ukraine has fueled these activities, and by now even the intelligence services are dealing with disinformation. However, the possible countermeasures are still limited and not always harmless.
In the current super election year, the operators of fake news networks are ramping up their activities. In some cases, they are even making common cause with cyber criminals: "For example, they hack the account of a politician and use it to send fake news," warns Germany's Federal Office for Information Security (BSI).
Decisions on election day often depend on what information is available before the election. Fake news therefore represents "a major threat to democracy", according to the BSI. The diverse possibilities for manipulation in cyberspace are attracting masses of dubious actors, who often spare no effort to manipulate people and opinions on behalf of autocratic regimes. The so-called Doppelganger campaign is a spectacular example of the effort involved and of what is possible.
Meta described the so-called Doppelganger operation as the "largest and most aggressive operation of Russian origin". Suspected Russian actors went to great lengths to imitate well-known media, ministry and government websites. They proceeded very skilfully: the German online news magazine "t-online.de", for example, became the similar-sounding "t-online.online", and the website of another German news magazine, "spiegel.de", became the fake site "spiegel.ltd". On the fake websites they posted not only copies of the original news pages garnished with fake news, but also large numbers of manipulated photos. They also produced deepfake videos using specialised AI platforms and published them on the fake sites. (How deepfake videos can be used successfully is described in the article "Real or fraud?".)
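The lookalike domains described above reuse a trusted site's name under a different top-level domain. A minimal, illustrative sketch (not any defender's actual tooling) of how such spoofs could be flagged against a list of known legitimate domains:

```python
# Illustrative sketch: flag domains that borrow a trusted site's name but
# sit under a different TLD, as in the Doppelganger examples
# "t-online.online" and "spiegel.ltd". The LEGITIMATE set is a stand-in
# for a real allow-list; real checks would also handle multi-part TLDs
# and near-miss spellings.

LEGITIMATE = {
    "t-online.de",
    "spiegel.de",
}

def second_level_label(domain: str) -> str:
    """Return the label left of the final TLD, e.g. 'spiegel' for 'spiegel.ltd'."""
    parts = domain.lower().strip(".").split(".")
    return parts[-2] if len(parts) >= 2 else parts[0]

def is_lookalike(domain: str) -> bool:
    """True if the domain reuses a trusted site's name under another TLD."""
    if domain in LEGITIMATE:
        return False  # the genuine site itself
    name = second_level_label(domain)
    return any(second_level_label(real) == name for real in LEGITIMATE)

print(is_lookalike("spiegel.ltd"))   # True: known name, wrong TLD
print(is_lookalike("spiegel.de"))    # False: the real site
print(is_lookalike("example.org"))   # False: unrelated domain
```

This only catches the exact-name-different-TLD pattern used in the campaign; typo variants ("spiege1.de") would need fuzzy matching on top.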
The actors even conducted their own interviews with politicians and scientists, but gave false information about the medium and their true background. The content is then taken out of context and presented with entirely different framing. Masses of social media bots ensure dissemination and attention across the networks.
"The ground should shift"
This is not without success: "We see the effect of disinformation" everywhere in society, said Sinan Selen at the recent Potsdam Conference on National Cyber Security, organised by the German Hasso Plattner Institute (HPI). Fake news and disinformation were a major topic at this year's event. "We are seeing an erosion of trust," stated the Vice President of Germany's domestic intelligence service (BfV), "and that has something to do with disinformation". Christian Dörr, professor at HPI and head of its "Cybersecurity and Enterprise Security" department, confirms this: "We see the results in a radicalisation of society, for example, or in difficulties in conducting an open discourse".
These effects are intended by the actors. They want to undermine trust in social and political institutions and the media, polarise society and stir it up. To this end, they try to manipulate the political mood, especially before elections. New technologies are changing the campaigns: "While disinformation used to be used to make people believe something specific, today it is about making sure that people no longer believe anything at all. The ground should shift," explained journalist and former editor-in-chief of the news magazine "Der Spiegel", Georg Mascolo, during the HPI event. For German Green Party politician Konstantin von Notz, "the main aim is to withdraw trust from the rule of law".
Algorithms as a weapon
This radicalisation of objectives in the spread of disinformation is made possible by technological advances, from social media to artificial intelligence. These allow far more effective measures than during the Cold War, for example. "Disinformation is not a new topic, but now I can scale and customise; I know exactly what I need to send you to make it work," explains Christian Dörr. He adds: "What is new is that we can, in principle, be controlled in a targeted manner."
The creators of fake news campaigns skilfully exploit the interests of social media platform operators. The logic of these platforms is one of profit maximisation, as Konstantin von Notz puts it in a nutshell. The basis for this is formed by algorithms that "rank everything high that is exciting, offensive or sensational", i.e. prioritise it, explains Mascolo. For the platforms, "polarisation is a very good business model", which is why they have no problem with disinformation, he adds. Holger Münch, head of the German BKA (Federal Criminal Police Office), summarises the situation as follows: "We are chasing after a problem where people are provided with one-sided information in order to earn money with clicks." German BND (foreign intelligence service) Vice President Dag Baehr even speaks of the "weaponization of algorithms".
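The ranking logic described here can be sketched in a few lines. This is a toy model with hypothetical weights, not any platform's real algorithm: content that provokes reactions scores higher, so sensational posts rise to the top of the feed.

```python
# Toy model of engagement-driven feed ranking (hypothetical weights):
# reactions that spread content or provoke debate are weighted above
# passive likes, so polarising posts get prioritised.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Assumed weights for illustration only: shares and comments
    # count more than likes toward visibility.
    return post.likes * 1.0 + post.comments * 3.0 + post.shares * 5.0

feed = [
    Post("Calm policy analysis", likes=120, shares=4, comments=10),
    Post("Outrageous fake claim", likes=80, shares=60, comments=90),
]
ranked = sorted(feed, key=engagement_score, reverse=True)
print(ranked[0].text)  # the sensational post ranks first
```

Despite fewer likes, the sensational post wins because its shares and comments dominate the score, which is the mechanism Mascolo's "polarisation is a very good business model" remark points at.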