This is NATO, folks; they are not our friends
Executive Summary
Warfare has shifted dramatically over the past several decades, moving away from the physical threats of conventional conflict and toward the social and ideological threats brought about by mass media and advances in technology. This new type of warfare is unlike anything we have seen before. Although it borrows elements from previous forms of hybrid warfare, its reach and level of impact make it far more dangerous than its predecessors. We have dubbed this new way of war cognitive warfare.
Cognitive warfare, although it shares similarities with other non-conventional and non-kinetic types of warfare and operations, is ultimately unique in its execution and purpose. In this
paper, we examine the origins of non-kinetic warfare by first looking at the Cold War and the use
of psychological operations (PsyOps). We follow the evolution of warfare, noting that
advancements in technology gave rise to electronic warfare and subsequently cyber warfare. As cyber capabilities matured, intelligence became a growing field and information warfare emerged. Cognitive warfare, however, goes a step further than merely fighting to
control the flow of information. Rather, it is the fight to control or alter the way people react to
information. Cognitive warfare seeks to make enemies destroy themselves from the inside out.
We define cognitive warfare as the weaponization of public opinion, by an external entity, for
the purpose of (1) influencing public and governmental policy and (2) destabilizing public
institutions.
Destabilization and influence are the fundamental goals of cognitive warfare. These goals serve to sow discontent within a society or to encourage particular beliefs and actions. The 2016 Democratic National Committee (DNC) leaks are a good example of a foreign
power exploiting divisions to destabilize a society. Terrorist groups like Al-Qaeda demonstrate
how civilians can be influenced and recruited by radical ideologies. Never before has such
insidious manipulation been as easy to accomplish as today. Advances in connectivity,
digitization, neurology, and psychology have provided society with a great many boons. Yet, with every new opportunity, a new threat emerges. Today we are faced with the problems that
come with social media’s ability to broadcast information to billions of willing people in a matter
of minutes. We must defend against algorithms that can identify who would be the most
susceptible to posted material, and who is most willing to spread it. The present-day ability to
fake and manipulate information is unprecedented, and recent advancements in artificial
intelligence have now made video and audio suspect as well. People are unsure of what to
believe, and even governmental institutions are not spared this loss of faith. Simultaneously, we are revolutionizing our understanding of how our brains and emotions function, even as individuals experiment with different forms of control. [As inept and corrupt as the government has been since WW2, is it really a surprise that so many are disgusted with it? NATO/UN, all globalist pigs, and commie pigs at that. dc]
Therefore, it is our belief that NATO must adapt quickly and forcefully to defend against current
threats in the sphere of cognitive warfare and work to curtail future threats. While democratic
society is both complicated and amazing, it is also vulnerable. To get ahead of these threats,
NATO must respond defensively in three ways. First, NATO must work to develop a working
definition or framework for cognitive acts of war. This includes a set of criteria for discovering
cognitive attacks as they are taking place. Second, the alliance must assess vulnerabilities to
cognitive attacks at a national and personal level in hopes of creating and inspiring a more
resilient population. Third, NATO must establish organizations to liaise with tech companies and
handle the challenges of the future of warfare. A final consideration would be an analysis of potentially hostile states against whom we might employ cognitive warfare in an offensive strategy or as a deterrent.
The foundation for democracy lies not only in laws and civil order, but also in trust and mutual
respect: the trust that we will follow those laws, respect civil institutions, and respect each other
and our differing opinions. Trust is now at risk, truth is being attacked, and democracy is being
threatened. The time to prepare is now, and the whole world is watching. [NATO, you have nothing to do with the truth, and we sure do not trust you or that den of vipers in that s*#thole New York; the state government there is just as corrupt as the Feds. dc]
Introduction
People have attempted to influence public opinion since the rise of civilization. It is an essential
component of the political structures into which we have evolved. However, the weaponization
of public opinion is a novel, threatening development in how we interact. The advent of the internet and mass media has made possible the large-scale manipulation of populations via targeted, accessible, multimodal messaging, which can now exist under the guise of anonymity. In a sea of a billion voices, pinpointing individual sources has become incredibly difficult [1], an effort comparable, in some ways, to identifying who screamed “Fire!” in a crowd. Some will argue that this is intentional, contending that anonymity is required for the
resources the internet provides. Others, however, fret about the unintended consequences that
this lack of accountability might bring about in the long-term [2].
No matter which perspective is correct, it is our opinion that NATO must be aware of the threat
that our interconnectedness has wrought. Tactics targeting the public will not fade with time;
they will become more efficient. They will aim at broader audiences. Moreover, they will
become increasingly convincing. Already, technological advancements have showcased their
ability to do exactly this. One does not have to look further than the 2016 DNC information leaks
to confirm the rapidity with which information is acquired and spread to the advantage of an
opposing party.
Social media, news networks, automation algorithms, artificial intelligence, mental health
guidance, and, perhaps, even our own physiology are expected to evolve rapidly in the near
future. All of these are working to make us more connected, more data-driven, and more curious.
It will be an exciting new era of human interaction. However, the roads in our minds are not
one-way streets. Whilst people receive information, they are simultaneously giving away
information and data. As it stands, simple lines of code will one day be able to identify and describe everything about us: our habits, our friends, our faiths, our cultures, our preferences, and even our vices. For the first time, war will deal not with exposed bodies but with exposed minds. It is this new avenue of war that we have dubbed cognitive warfare. [They have no intention of stopping their madness. dc]
Evolution of Non-Kinetic Warfare
Origins
It starts, as with most things in the nature of modern war, in the Cold War. Mutually assured
destruction (MAD) became the accepted global doctrine, rendering total war on the scale of
WWII improbable. Proxy warfare became a dinnertime discussion. Subversion and espionage became prevalent in daily international interactions, and “plausible deniability” became the term of the time. Thus, the CIA and FBI expanded far beyond their initial capabilities, and actions in the shadows became the norm [3]. These new methods became the “civilized” approach to conflict, ostensibly better than the “barbarities” of a nuclear holocaust. It is also here that the power of words and ideas, and of non-kinetic war, was first seen in full force. Millions, or even billions, witnessed this when the Iron Curtain collapsed and the Soviet Union proved unable to withstand the power of “blue jeans and rock and roll” [4]. Democratic nations have always had a “home advantage” in utilizing the voice of the public. Able to tout their messages of individual freedoms and abundant resources, Western democracies have consistently used their words and ideas as ammunition against more authoritarian regimes. [And now on their own people, with this COVID plandemic. dc]
Perhaps the proof of the efficacy of such tactics lies in the reactions they have elicited from
non-democratic powers. Restrictions, bans, and general censorship have long been the policy of
countries such as China, Russia, and, much more drastically, North Korea. The age of the
internet has only reinvigorated their concerns. Unsurprisingly, Facebook and other social media
platforms face restrictions, if not outright bans, in these and similar countries worldwide
[5][6][7]. However, it is these very ideals of free press and free speech that have left democratic nations vulnerable to powers attempting to control public thought. These nations have been forced onto the defensive, as the protection of resources and individual freedoms no longer holds the same persuasiveness it once did. The global economy has seen both the US and
China prosper [8]. The most important change since the Cold War, however, has been the shift in
how we communicate and share ideas. We seek to illustrate this change by recounting past shifts in the way that people have fought over minds and information, concluding with a new era and
avenue of war.
Psychological Warfare (PsyOps)
In the United States, PsyOps specifically relates to the use of white, gray, and black products
produced by various branches of the military and the CIA or its predecessors. White products are
officially identifiable as being sourced from the US, gray products have an ambiguous source
element, and black products are meant to seem as if they originate from a hostile source.
Operations include the likes of propaganda radio, providing insubordination manuals to the
militia, and even encouraging child soldiers to defect to avoid conflict [9][10].
There are quite a few key differences between PsyOps and cognitive warfare. First, cognitive warfare
deals mostly with gray products. White and black products are either too transparent or too risky
to be reliable methods of affecting public opinion. Additionally, there is a certain element of
deniability inherent to cognitive warfare that is lost in white products and endangered by black
products. Moreover, PsyOps has rarely dealt with large sections of the public in the past. There is
an emphasis on military or subversive activity in PsyOps that is not usually the goal of cognitive
warfare tactics, which tend to target civilian social infrastructure and governments [9].
Electronic Warfare (EW)
EW is defined by the use of the electromagnetic spectrum to attack the enemy, impede enemy
attacks, or identify and scout for specific assets. In some ways, electronic warfare was the
precursor to cyberwarfare. Its origins date back to the 1900s with the invention of early wireless
communication. Infrared homing, radio communications, and the increasing use of wireless
technologies make this an important logistical domain within the armed forces. However, this field deals heavily with instrumentation and tactical advantages. It does not deal with public opinion, nor does it interact heavily with the civilian space beyond disrupting household electricity and radio [11][12].
Cyberwarfare
Cyberwarfare is defined as the use of cyberattacks with the intention of causing harm to a
nation’s assets. Cyberwarfare, and its military classification, is still highly debated [13].
Nevertheless, numerous NATO member states and other countries have invested in developing
cyber capabilities, both offensive and defensive [14][15]. Some worry about defining such
actions as war because they “only” target computers. However, the global trend towards
digitization and the Internet of Things (IoT) has meant that more functions are controlled now by
computers than many would imagine. Everything from construction equipment, to financial institutions, to civilian infrastructure, and even to military installations now depends on complex computer networks [16]. The loss of such computer assets can cause, and already has caused, massive damage, not just in terms of time and data loss but in physical harm that can be measured in dollars and lives [17].
Cyberwarfare’s relation to cognitive warfare is mostly that they share an avenue of operations.
There have been instances of computer viruses spreading themselves through social media by
targeting the friends and/or contacts of the afflicted individual. However, these instances are
better described as cybercrimes than as targeted acts of cyberwarfare. Cognitive warfare
utilizes social media networks in a completely different way. Instead of spreading malicious
software, agents of cognitive warfare spread malevolent information. Utilizing similar tactics to
those used in DDoS attacks, namely botnets, cognitive warfare agents can spread an
overwhelming amount of false or misleading information through accounts that look and interact
in a human fashion [18]. However, this is only one tactic employed in cognitive warfare and is
largely where the similarities with cyberwarfare end.
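The botnet analogy invites a concrete illustration. The following is a minimal sketch, not taken from the paper, of how a defender might flag accounts behaving like amplification bots using two crude signals implied above: an inhuman posting tempo and near-verbatim repetition of the same message across accounts. The Account structure, the thresholds, and the equal weighting are assumptions made purely for demonstration.

# Minimal illustrative sketch (not from the paper): flag accounts that behave like
# amplification bots using two crude signals: posting tempo and message repetition.
# All field names, thresholds, and example data are hypothetical.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    post_hours: list[float]  # timestamps of posts, in hours since first observation
    posts: list[str]         # post text

def bot_likeness(account: Account, corpus: Counter) -> float:
    """Return a 0..1 score; higher means more bot-like (fast tempo plus copied text)."""
    if not account.posts:
        return 0.0
    span = (max(account.post_hours) - min(account.post_hours)) or 1.0
    tempo_score = min((len(account.posts) / span) / 20.0, 1.0)  # 20+ posts/hour saturates
    copied_share = sum(corpus[p] > 1 for p in account.posts) / len(account.posts)
    return 0.5 * tempo_score + 0.5 * copied_share

accounts = [
    Account("organic_user", [0.0, 5.0, 26.0], ["lunch photo", "my take on the news", "weekend plans"]),
    Account("amplifier_01", [0.0, 0.1, 0.2, 0.3], ["the virus was made in a lab by the enemy"] * 4),
    Account("amplifier_02", [0.0, 0.2, 0.4, 0.5], ["the virus was made in a lab by the enemy"] * 4),
]
corpus = Counter(p for a in accounts for p in a.posts)
for a in accounts:
    print(f"{a.name}: bot-likeness {bot_likeness(a, corpus):.2f}")

In practice, platform defenders combine far more signals (account age, follower graphs, coordinated timing), but even this toy heuristic separates the synthetic amplifiers from the organic account in the example.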
Information Warfare
Information warfare is the type of warfare most closely related to, and thus most often conflated with, cognitive warfare. However, there are key distinctions that make cognitive warfare distinct enough to address on its own terms. As former US Navy Commander Stuart Green
described it, “Information operations, the closest existing American doctrinal concept for cognitive warfare, consists of five ‘corps capabilities’, or elements. These include electronic
warfare, computer network operations, PsyOps, military deception, and operational security.”
[19]. Succinctly, information warfare works to control the flow of information.
The main distinction between information warfare and cognitive warfare is that the former does
not draw a distinction between battlefield tactical information and information aimed toward the
public. For example, information warfare deals with DDoS attacks and ghost armies, while neither of these falls within the purview of cognitive warfare. Perhaps a sharper delineation is that information warfare seeks to control pure information in all its forms, whereas cognitive warfare seeks to control how individuals and populations react to the information presented to them [20].
Cognitive Warfare
In a recent definition from December 2019, Oliver Backes and Andrew Swab of Harvard’s Belfer Center described cognitive warfare thus: “Cognitive Warfare is a strategy that focuses on altering how a target population thinks – and through that how it acts” [21]. Despite
the intentional vagueness of this definition, it serves as a more-than-suitable framework for a
further examination of cognitive warfare. Our own research and analysis of past, present, and
potential future use cases of the term have allowed us to further segment cognitive warfare into
two operational fields. We have also developed a quick-reference list for validating whether or not something falls within the realm of cognitive warfare. [If you are not aware that the American people ARE the target population, please seek help. dc] [See page 10 of the PDF for the list chart. dc]
To summarize, cognitive warfare is the weaponization of public opinion by an external entity, for
the purpose of influencing public and/or governmental policy or for the purpose of destabilizing
governmental actions and/or institutions.
Goals of Cognitive Warfare
Cognitive warfare, at its core, can be seen as having the same goal as any type of warfare. As
Carl von Clausewitz states, “War [is an] act of force to compel our enemy to do our will.”
Cognitive warfare, unlike traditional domains of war, does not primarily operate on a physical
plane. Therefore, it does not utilize a physical force in order to compel its enemies. However, it
could also be argued that the goal of cognitive warfare is unlike that of any other type of warfare. Rather than “compel our enemy to do our will,” the goal is to get the enemy to destroy himself from within, rendering him unable to resist, deter, or deflect our goals.
In either case, the goals of cognitive warfare are achieved through different methods than the
goals of conventional warfare. Cognitive warfare has two separate, but complementary, goals:
destabilization and influence. While each of these goals can be pursued separately, they can also be attained jointly, with one serving as a means to the other, in order to successfully weaponize public opinion. The targets of cognitive warfare attacks may range from whole populations to individual leaders in politics, the economy, religion, and academia. Further, the role of
lesser-known social leaders must not be overlooked. So-called connectors, mavens, and
salespeople can be instrumental in the application of cognitive warfare [22].
To better classify cognitive warfare attacks, Figure 1 presents a pair of axes by which events can
be characterized. In the following section, we will analyze each goal individually and describe
how they are intertwined, generating a new, more dangerous, more pervasive type of warfare.
Then, we will detail examples of cognitive warfare battles and skirmishes that have occurred or
have the potential to occur in the future. These campaigns will propel cognitive warfare to the
global stage. Action must be taken, opposition campaigns created, and defensive measures
implemented, to prevent the perpetrators’ success.
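Figure 1 itself is not reproduced in this excerpt (it appears in the source PDF). As a purely illustrative stand-in, the sketch below encodes the paper’s two stated goals, destabilization and influence, as the pair of axes along which an event could be characterized. The event names and scores are hypothetical placeholders.

# Illustrative stand-in for Figure 1 (which appears only in the source PDF):
# characterize events along the two goal axes the paper names, destabilization
# and influence. The example events and their 0..1 scores are hypothetical.
from dataclasses import dataclass

@dataclass
class CognitiveEvent:
    name: str
    destabilization: float  # 0 = no disruption of unity/institutions, 1 = severe
    influence: float        # 0 = no steering of beliefs/policy, 1 = strong

    def quadrant(self) -> str:
        d = "high destabilization" if self.destabilization >= 0.5 else "low destabilization"
        i = "high influence" if self.influence >= 0.5 else "low influence"
        return f"{d}, {i}"

events = [
    CognitiveEvent("example event A", destabilization=0.9, influence=0.3),
    CognitiveEvent("example event B", destabilization=0.2, influence=0.8),
]
for e in events:
    print(f"{e.name}: {e.quadrant()}")

Real events would be placed on these axes by analysts rather than by fixed thresholds; the point is only that the two goals give a simple coordinate system for comparing incidents.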
Destabilization
The first fundamental goal of cognitive warfare is to destabilize target populations.
Destabilization is achieved by disrupting the organization and unity of a population’s systems and people. This results in a drastic drop in productivity and a loss of cooperation, as that population
is now overwhelmed by internal issues and less focused on reaching common goals. Perpetrators
disrupt the organization and unity of their target populations by accelerating pre-existing
divisions within groups of the population or introducing new ideas designed to pit different
groups against each other and increase polarization.
Leaders can be seen as targets of destabilization when they become the source of polarizing ideas. Perpetrators can also target the general population, introducing divisive ideas that play on previously held beliefs or pushing false narratives against groups of people. Strategies of cognitive warfare that align with the goal of destabilization include, but are not limited to, the following:
● Increase polarization
● Reinvigorate movements/issues
● Delegitimize government/leadership
● Isolate individuals/groups
● Disrupt key economic activities
● Disrupt infrastructure
● Confuse communication [sound familiar? dc]
Below are several cases showcasing examples of destabilization as a goal of cognitive warfare
and the circumstances surrounding them.
Case 1: Destabilization through Confusion
Cognitive warfare campaigns may strive to destabilize populations of people by causing mass
confusion. Chaos is bred when a population no longer knows what is right and who to trust. As a
result, civilians may begin to lose faith in the leadership of the nation that is meant to oversee
their safety and freedoms. Undermining leadership and generating chaos pose a threat to Western democracies, with one of the most recent and glaring examples coming from the outbreak of, and early events surrounding, COVID-19. [The following paragraph is 100% bullshit. I do not take my lead from any of those countries; I am just an American who has had enough of the corporate and government corruption that has plagued this country since the Civil War. dc]
Russia, China, and Iran took the whirlwind of confusion brought about by the virus as an
opportunity to initiate a cognitive warfare campaign against the West. It is a multidirectional and
multifaceted campaign meant to undermine public confidence in Western states [23]. This
campaign started with the outbreak of the virus and confusion surrounding its origin, which is
where we began to see cognitive warfare tools, such as disinformation and false narratives, being
employed. Chinese Foreign Ministry spokesman Zhao Lijian opened up with a barrage of questions in a tweet
from early March targeted at the US asking, “When did patient zero begin in the US? How many people are infected? What are the names of the hospitals? It might be the US Army that brought
the epidemic to Wuhan” [24]. He then urged followers to read and spread a conspiracy theory
from Global Research, a Canadian website, that claimed the virus originated at the US Army Medical Research Institute of Infectious Diseases at Fort Detrick, Maryland [24]. The
dissemination of these narratives, with little to no concrete evidence at such an early stage of the outbreak, functioned only to sow doubt in the minds of American citizens and those of US allies. These tools of cognitive warfare are designed to affect the way people interpret and react to information in a way that makes them doubt their own leadership. [Trust me, you assholes, our government has earned every bit of flak it is getting; these outrageous vaxx mandates by the demented Biden are the last straw. dc]
Russia reacted in a similarly malicious way through its government-owned news agency,
Sputnik. It released propaganda in over thirty languages, in line with many of the narratives coming out of China, arguing that the virus originated in the U.S. or that the U.S. developed and released the virus as a bioweapon intended to weaken China’s economy [24]. The constant
stream of false narratives varied from somewhat plausible stories to outlandish accusations
targeted at Western states and government organizations. Herein lies the danger of cognitive
campaigns: it becomes increasingly difficult to differentiate between credible and non-credible
stories, especially when they originate from government-backed news sites like Sputnik. The
Kremlin can send out thousands of different stories with various levels of credibility and
plausibility and see what sticks. This strategy quickly undermines Western governments, which their populations may come to see as either dishonest about the virus or unable to protect them from it.
Iran is the third major player conducting its own cognitive warfare campaign against the citizens
of Western states. News stories emerging from Tehran contained themes similar to the stories
coming out of Beijing and Moscow. Press TV, an English- and French-language news network associated with the Islamic Republic of Iran Broadcasting, released numerous pieces tying the coronavirus outbreak to the U.S. military [24]. The commander of the Islamic Revolutionary Guard Corps, Hossein Salami, went so far as to proclaim that COVID-19 is the spearhead of
a U.S. biological invasion [24].
The cognitive campaigns of China, Russia, and Iran surrounding the outbreak of the Coronavirus
are all targeted against Western states and contain nearly identical messaging. The danger in
these campaigns is their tremendous reach and support from government leaders and institutions.
Similar stories are released in dozens of languages all over the world and either come directly out
of state-controlled news sites or government-supported media outlets [24]. But that is just the
origin, with tens of thousands of independent sites and users spreading the same narratives to
undermine the credibility of Western societies, whether intentionally or not. Americans encounter these stories over and over and are encouraged to believe their government is hiding information from them, or they are overwhelmed with stories that make them doubt the nation’s ability to protect them from serious threats to their safety and personal liberties. The threat lies not only in the information being spread but in the destabilizing reactions and beliefs of those receiving it. [You assholes think we are all dumbed down. We don’t believe, we KNOW the government is lying to us, as well as holding crap under the national security umbrella to save themselves from the people. Washington DC has to go. dc]
Case 2: Destabilization by Sowing Division
Cognitive warfare often seeks to divide a population and increase polarization. Pre-existing
divisions along political party lines may seem to be the most obvious to exploit. However, this is
not always the case as cognitive campaigns can be aimed at sowing internal divisions within a
group. We see a strong example of this in the case of the 2016 DNC email leaks. [still trying to sell the lie, smfh dc]
Russia had been honing its cyber capabilities for several decades, experimenting on countries in Europe, such as Ukraine, to test their impact on elections. As the director of the National Security Agency and commander of US Cyber Command, Adm. Michael Rogers, stated, “this was
not something that was done by chance, this was not a target that was selected purely arbitrarily.
This was a conscious effort by a nation-state to achieve a specific effect” [25]. And it did have a
profound effect on the DNC and the Democratic party at large.
In April of 2016, Russian cyber units gained access to the DNC’s internal servers, allowing them
to steal confidential emails and documents. Several months later, these communications were
leaked on WikiLeaks. It was at this point that the operation turned from typical espionage to
political sabotage. Campaigns were being uprooted and division began to grow within the party.
It was revealed that the DNC was favoring Hillary Clinton in the time leading up to the selection
of an official presidential candidate. This turned progressive Democrats and moderate Democrats
against each other. The rift within the Democratic Party resulted in a shift in the candidates’ support. Instead of focusing on the Russian attack itself, political figures were much more
concerned with the content and what it might mean for their careers [26]. Voters began shifting
due to the issues brought out by these new documents and released statements of discontent.
Hillary Clinton’s campaign admitted that this attack had a significant effect on the outcome of
the overall election [23]. This is not to say there was no further polarizing impact along party lines as well. President Trump’s campaign likewise focused on the content of the hacks rather than on the Russian attack itself. At a time when the country could not afford to react in a partisan way, the two major parties turned on each other and on themselves, resulting in one of the most polarized periods in American politics. [Lies, lies, lies. Seth Rich paid the price for releasing those emails; the Russia lie comes from the DNC. dc]
The Russian attack goes far beyond an act of cyberwarfare. The cyber capabilities and lack of
appropriate security were merely the tools and opportunities that made the attack possible. The
attack itself, however, was aimed at a much larger end, one accomplished with flying colors.
American politics were thrown into disarray and Russia now has influence in two of the most
important institutions in American democracy: elections and independent media [25]. The
divisions created and escalated by this event destabilized politics and the election process.
Unfortunately, the media was only too eager to cover the content being released, with WikiLeaks being the most-searched political term for the month of October 2016 [25]. This was precisely the reaction Russia had hoped for. Its cognitive campaign was perfectly targeted at exploiting the divisions of American politics and resulted in a state of blame
and uncertainty.
Case 3: Destabilization as a Means to Influence
Go to page 17 of the linked PDF to read the rest of the propaganda.