Information Warfare: Disinformation

 

The Problem:

Disinformation is a type of misinformation in which someone shares information they know to be incorrect in order to influence individual, group, or public opinion, or to obscure the truth. Disinformation may include the distribution of forged documents, videos, manuscripts, and photographs, or the spreading of dangerous rumors and fabricated intelligence. China has been a major player in state-sponsored disinformation.

  • China created a “keyboard army”: a large group of Chinese citizens paid to monitor the internet and influence public opinion online at massive scale. The end goal is to aggressively defend and protect China’s image overseas[1].
  • Spamouflage Dragon is a pro-Chinese political spam network that camouflages its political messaging with innocent content (showing, for instance, cute animals and dancing girls)[2]. The innocent content creates simple clickbait, but once people click, Spamouflage Dragon serves its political message.
  • Another tactic is the use of fake or hijacked social media accounts, which become the nexus for disinformation. China’s “wolf warrior” diplomats aggressively defend their home country online by building an audience with viral content, leveraging the influence networks of other autocrats, manufacturing the appearance of popular backing, posting conflicting conspiracy theories, and using ‘positive’ content to drown out criticism[3].

Disinformation may contain false information or true information taken out of context, but its key components are that it always carries malicious intent, is deliberately deployed, and is often part of a larger influence campaign. These campaigns are typically sustained over extended periods with concrete, continuous effort; the “Big Lie” playbook is a good example. Its media-manipulation cycle runs from campaign planning, to seeding information, eliciting responses, and adjusting tactics, then restarting the cycle again and again.

Possible Solutions:

Understanding that first stage of a campaign (the source and its intent) can go a long way toward tackling disinformation. First, always confirm the information against multiple reputable sources. Second, find out who benefits the most and how they are related to the spread of the information. If you cannot identify who is pushing the information or verify whether it is true, treat it as untrustworthy.

Combating disinformation at a national level is a hard problem to solve. However, it is possible with time and a strategic approach. Grey Market Labs engineer Dhaval Vyas states that “education is key when it comes to combating disinformation. A well-rounded education teaches critical thinking skills, which are extremely helpful in identifying disinformation. Younger people are particularly vulnerable to fake news and disinformation. Therefore, developing critical thinking skills early on, along with the ability to manage propaganda, fake news, and disinformation effectively, can go a long way in combating disinformation.” However, this is a longer-term process.

Technology has amplified the problem of disinformation; however, it can also offer potential solutions. One approach is blockchain. A blockchain uses a decentralized, immutable ledger to manage information, and it can provide transparency into the lifecycle of content by verifying its origin and the reputation of its source. The New York Times’s News Provenance Project is utilizing this approach. Another approach is the use of global registries of labeled fake news. Websites that help identify fake news already exist, such as factcheck.org and politifact.com; integrating them with social media platforms and news organizations through APIs could be very helpful. A lot of fake news is AI-generated, and AI can also be used to identify it. Neural networks that generate synthetic text are familiar with the habits, quirks, and traits of that text, which makes them well suited to detecting content emerging from similar networks.
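To make the provenance idea concrete, here is a minimal sketch of the core mechanism an immutable ledger relies on: each record commits to a hash of the content and a hash of the previous record, so altering any past entry breaks every later link. This is an illustrative toy (the class name `ProvenanceLedger` and its fields are invented for this example), not the News Provenance Project's actual design, which runs on a distributed blockchain rather than a single in-memory list.

```python
import hashlib
import json


def sha256(data: str) -> str:
    """Hex digest of a UTF-8 string."""
    return hashlib.sha256(data.encode("utf-8")).hexdigest()


class ProvenanceLedger:
    """Toy append-only, hash-chained record of a piece of content's lifecycle.

    Each entry stores a fingerprint of the content, its claimed source, and
    the hash of the previous entry, so tampering with any past entry
    invalidates every hash that follows it.
    """

    def __init__(self):
        self.entries = []

    def record(self, content: str, source: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "content_hash": sha256(content),  # fingerprint of the content itself
            "source": source,                 # claimed publisher / origin
            "prev": prev_hash,                # link to the previous entry
        }
        entry["hash"] = sha256(json.dumps(entry, sort_keys=True))
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute every link; any edit to a past entry breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: e[k] for k in ("content_hash", "source", "prev")}
            if e["prev"] != prev or e["hash"] != sha256(json.dumps(body, sort_keys=True)):
                return False
            prev = e["hash"]
        return True
```

In use, a publisher would record each revision of an article or photo caption as it happens; a reader-facing tool could then call `verify()` and compare the content hash to show whether the item they are seeing matches what the original source actually published.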

The terms propaganda, misinformation, and disinformation need to be well defined, and a legal framework needs to be built around them so that organizations and individuals who spread disinformation can be held accountable under those laws. Presently there is little deterrence against spreading disinformation. At a global scale, formulating shared terminology for combating disinformation, and deliberately and continuously responding to foreign-sponsored disinformation, is necessary to reduce the impact and potential harm of state-sponsored campaigns.

 

***The next article in The New Battlefront 101 series will discuss cyber attacks on individuals.

___________________________________________________________________________________

Grey Market Labs® is a Certified B-Corp founded with the mission to protect life online. Our Replica™ platform orchestrates, automates, and secures Environments-as-a-Service, making organizations more protected with our patented privacy and Zero Trust architecture, and more productive by increasing access to critical data, tools, and workflows simply, on-demand, anywhere. Replica™ supports dozens of use cases that span industries: from disrupting fraud on the dark web, to supporting military operations, combating human trafficking, and enabling trusted data sharing in healthcare.

Grey Market Labs® is the first cybersecurity product company recognized as a Certified B-Corp organization.

Contact us to see how we can work together.