For example, the COVID-19 information war was hugely successful due to narrative engineering and the controlling or blocking of information flows on social media.
To navigate this climate of deception and weaponized data, it’s important to scrutinize information and verify facts and sources for ourselves.
According to a recent PLA article, controlling public perception involves four social-media tactics, dubbed “confrontational actions”: Information Disturbance, Discourse Competition, Public Opinion Blackout, and Block Information. The goal is to achieve an “invisible manipulation” and “invisible embedding” of information production “to shape the target audience’s macro framework for recognizing, defining, and understanding events.”
Link To Full Article HERE
The recent article frames China’s social-media attacks as part of a larger ‘cognitive warfare’ campaign; the authors describe this invisible warfare domain:
“Chinese government and military writings say cognitive operations aim to “capture the mind” of one’s foes, shaping an adversary’s thoughts and perceptions and consequently their decisions and actions. Unlike U.S. defense documents and strategic thinkers, the People’s Liberation Army puts cognitive warfare on par with the other domains of warfare like air, sea, and space, and believes it key to victory—particularly victory without war.”
By Josh Baughman and Peter W. Singer
The phrase “cognitive warfare” doesn’t often appear in news stories, but it’s the crucial concept behind China’s latest efforts to use social media to target its foes.
Recent stories have ranged from Meta’s “Biggest Single Takedown” of thousands of false-front accounts on Facebook, Instagram, TikTok, X, and Substack to an effort to spread disinformation about the Hawaii fires to a campaign that used AI-generated images to amplify divisive U.S. political topics. Researchers and officials expect similar efforts to target the 2024 U.S. election, as well as to appear in any Taiwan conflict.
Chinese government and military writings say cognitive operations aim to “capture the mind” of one’s foes, shaping an adversary’s thoughts and perceptions and consequently their decisions and actions. Unlike U.S. defense documents and strategic thinkers, the People’s Liberation Army puts cognitive warfare on par with the other domains of warfare like air, sea, and space, and believes it key to victory—particularly victory without war.
Social media platforms are viewed as the main battlefield of this fight. China, through extensive research and development of their own platforms, understands the power of social media to shape narratives and cognition over events and actions. When a typical user spends 2.5 hours a day on social media—36 full days out of the year, 5.5 years in an average lifespan—it is perhaps no surprise that the Chinese Communist Party believes it can, over time, shape and even control the cognition of individuals and whole societies.
A recent PLA Daily article lays out four social-media tactics, dubbed “confrontational actions”: Information Disturbance, Discourse Competition, Public Opinion Blackout, and Block Information. The goal is to achieve an “invisible manipulation” and “invisible embedding” of information production “to shape the target audience’s macro framework for recognizing, defining, and understanding events,” write Duan Wenling and Liu Jiali, professors of the Military Propaganda Teaching and Research Department of the School of Political Science at China’s National Defense University.
Information Disturbance (信息扰动). The authors describe it as “publishing specific information on social media to influence the target audience’s understanding of the real combat situation, and then shape their positions and change their actions.” Information Disturbance uses official social media accounts (such as CGTN, Global Times, and Xinhua News) to push and shape a narrative in specific ways.
While these official channels have recently taken on a more strident “Wolf Warrior” tone, Information Disturbance is not just about appearing strong, the analysts advise. Indeed, they cite how during 2014’s “Twitter War” between the Israel Defense Forces and the Palestinian Qassam Brigades, the Palestinians managed to “win international support by portraying an image of being weak and the victim.” The tactic, which predates social media, is reminiscent of Deng Xiaoping’s Tao Guang Yang Hui (韬光养晦), literally translated as “hide brightness, nourish obscurity.” Under this official messaging, the CCP crafted a specific narrative targeting the United States (and the West more broadly): that China was a humble nation focused on economic development and friendly relationships with other countries. This narrative was very powerful for decades; it shaped U.S. and other nations’ policy toward China.
Discourse Competition (话语竞争). The second type is a much more subtle and gradual approach to shaping cognition. The authors describe a “trolling strategy” [拖钓], “spreading narratives through social media and online comments, gradually affecting public perception, and then helping achieve war or political goals.”
Here, the idea is to “fuel the flames” of existing biases and manipulate emotional psychology to influence and deepen a desired narrative. The authors cite the incredible influence that “invisible manipulation” and “invisible embedding” can have on social media platforms such as Facebook and Twitter in international events, and recommend that algorithm recommendations be used to push more and more information to target audiences with desired biases. Over time, the emotion and bias will grow and the targeted users will reject information that does not align with their perspective.
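The feedback loop described above — a recommender that keeps serving content matching a user’s existing lean, so the lean hardens until dissenting information is rejected — can be illustrated with a toy simulation. This is a hypothetical model for intuition only, not any platform’s actual algorithm; `alignment` and `lr` are invented parameters:

```python
def step(bias, alignment=0.9, lr=0.05):
    """One round of the recommend-and-consume loop.

    bias: the user's leaning in [-1, 1] (sign = direction, magnitude = strength).
    alignment: how strongly the recommender mirrors the existing lean.
    Each served item nudges the bias further in the item's own direction.
    """
    expected_slant = alignment * bias              # recommender echoes the lean
    return max(-1.0, min(1.0, bias + lr * expected_slant))

bias = 0.1                                         # a mild initial lean
for _ in range(100):
    bias = step(bias)
print(bias)                                        # → 1.0: the lean has hardened

neutral = 0.1
for _ in range(100):
    neutral = step(neutral, alignment=0.0)         # recommender ignores the lean
print(neutral)                                     # → 0.1: no drift
```

The contrast between the two runs captures the authors’ point: the drift comes not from any single item but from the compounding alignment between what the user already believes and what the algorithm chooses to show next.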
Public Opinion Blackout (舆论遮蔽). This tactic aims to flood social media with a specific narrative to influence the direction of public opinion. The main tool for “blacking out” public opinion is bots that drive the narrative viral, stamping out alternative views and news. Notably, given the growing use of AI in Chinese influence operations, the authors reference studies showing that a common and effective method of exerting cognitive influence is to use machine learning to mine user emotions and prejudices, screen and target the most susceptible audiences, and then quickly and intensively “shoot” customized “spiritual ammunition” at the target group.
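One observable signature of such a “blackout” flood — many accounts pushing near-identical text in a short window — can be flagged with a simple grouping heuristic. The sketch below is illustrative only (the function name, threshold, and sample posts are all invented, and real platforms use far more sophisticated detection):

```python
import re
from collections import defaultdict

def flag_floods(posts, min_accounts=3):
    """Group posts by normalized text; flag messages pushed by many accounts.

    posts: iterable of (account_id, text) pairs.
    Returns {normalized_text: accounts} for groups above the threshold.
    """
    groups = defaultdict(set)
    for account, text in posts:
        # Normalize: lowercase, strip punctuation and extra whitespace,
        # so trivially varied copies of the same message still collide.
        key = re.sub(r"[^\w\s]", "", text.lower())
        key = " ".join(key.split())
        groups[key].add(account)
    return {msg: accts for msg, accts in groups.items()
            if len(accts) >= min_accounts}

posts = [
    ("bot1", "The fires were NOT natural!!"),
    ("bot2", "the fires were not natural"),
    ("bot3", "The fires were not natural."),
    ("user9", "Air quality alerts issued for the region."),
]
print(flag_floods(posts))   # one flagged message, backed by three accounts
```

The point of the normalization step is that coordinated campaigns often vary capitalization and punctuation to evade exact-match filters, while the underlying narrative text stays the same.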
This aligns with another PLA article, “How ChatGPT Will Affect the Future of Warfare.” Here, the authors write that generative AI can “efficiently generate massive amounts of fake news, fake pictures, and even fake videos to confuse the public” at a societal level of significance. The idea is to create a “flooding of lies,” while dissemination by Internet trolls creates “altered facts.” The goal is to confuse the target audience’s cognition regarding the truth of “facts” and play on emotions of fear, anxiety, and suspicion. The end-state for the targeted society is an atmosphere of insecurity, uncertainty, and mistrust.
Block Information (信息封锁). The fourth type focuses on “carrying out technical attacks, blockades, and even physical destruction of the enemy’s information communication channels.” The goal is to monopolize and control information flow by preventing an adversary from disseminating information. In this tactic alone, the Chinese analysts believe, the United States has a huge advantage. They cite how in 2009, for example, the U.S. government authorized Microsoft to cut off the Internet instant-messaging ports of Syria, Iran, Cuba, and other countries, paralyzing their networks and trying to “erase” them from the world Internet. The authors also mention that in 2022, Facebook announced restrictions on some media in Russia, Iran, and other countries, but falsely claim that the company did so to delete posts negative toward the United States, helping the U.S. gain an advantage in “cognitive confrontation.”
Continue Reading Article HERE
5G/6G (and beyond), wireless infrastructure, and Artificial Intelligence are the core of these systems.
Wireless infrastructure (including satellites) is the key to this weaponized surveillance grid, linking the Internet of Things (IoT) and the Internet of Bodies (IoB).
Using cash and unplugging from ‘smart’ devices, social media, and wi-fi are steps we can all take to start exiting from the digital prison.
Civilian leadership and regulatory oversight of infrastructure are urgently needed to rein in out-of-control military operations and weapons of war.