Counter-Disinformation: The New Snake Oil

Authored by Tom Wyatt via Racket News

The San Diego Convention Center was packed with the defense industry elite. Boeing, Northrop Grumman, Booz Allen Hamilton, and a myriad of other arms industry salesmen, hungry to peddle their wares. WEST Conference 2023 is billed as the “premier naval conference and exposition on the West Coast.” A collective of military leaders and titans of the defense industry, intermingled in incestuous harmony.

The WEST 2023 conference in San Diego.

It was a world with which I was well acquainted. After all, I had spent the past fifteen years in the Navy as a Special Warfare Boat Operator, using tools and weapons built by these very defense companies. My call to service came unexpectedly at the tail end of my high school senior year. I left for bootcamp on Valentine’s Day, 2007, and immediately entered the world of Naval Special Warfare upon completion. While the rest of my graduating class received tutelage at universities around the country, mine came by way of the military elite. Over a decade and a half, I received an education in Special Reconnaissance, Unconventional Warfare and tradecraft.

Many of the friends and former colleagues I met along the way now worked for these defense contractors, and perhaps, in another life, I’d be a participant in this conference. As it turned out, I was there as a spectator only.

I watched as a sea of suits and lanyards moved in waves throughout the lobby outside the convention floor. Thousands of exhibitors and participants waited in line to get their credentials, exchanging cards and networking as they sized up the competition. As I surveyed the crowd, I noticed something familiar in the attendees’ faces. There was a look of out-of-place discomfort. Tattoos peeking out of cuffs and collars of the business attire their bodies seemed to be rejecting. It was the look of the defense industry’s freshman class, those who had just made the leap from serving in the armed forces to being arms proliferators.

The conference embodied the idea of the military industrial complex’s self-licking ice cream cone structure. There was no discernible line between merchandiser and consumer, just a single organism supporting itself.

In 2021, the Revolving Door Project released a report titled “The Military-Industrial-Think Tank Complex: Conflict of Interest at the Center for a New American Security,” that trained a troubling spotlight on one of the most prominent defense-minded think tanks. 

According to its website, the Center for a New American Security (CNAS) “is an independent, bipartisan, nonprofit organization that develops strong, pragmatic, and principled national security and defense policies.” The defense industry-funded group claims to “elevate the national security debate” by providing innovative research to policymakers and experts in the field. But given the whiff of defense industry influence around the organization, its high-level engagement with Washington’s most powerful figures raises numerous red flags.

A major point of concern presented by the Revolving Door Project was, ironically, CNAS’ own revolving door. According to the report, there are “16 CNAS alumni who have been selected for foreign policy and national security policy-making positions in the Biden administration.” Among them: Avril Haines, a former CNAS Board of Directors member who became Biden’s Director of National Intelligence in 2021, and Colin Kahl, the current Under Secretary of Defense for Policy and a former CNAS Senior Fellow.

The report also includes instances of the think tank pushing agendas that directly benefit its membership, such as the collusion between CNAS and the United Arab Emirates to promote relaxed restrictions for exporting US drones. Not surprisingly, CNAS board member Neal Blue’s company, General Atomics, had an existing contract worth nearly $200 million with the UAE for drone production. The report reads:

CNAS receives large contributions directly from defense contractors, foreign governments, and the US government; publishes research and press material that frequently supports the interests of its sponsors without proper disclosure; and even gives its financial sponsors an official oversight role in helping to shape the organization’s research.

WEST 2023 offered yet another venue for private companies to seek out such connections, pushing an agenda that “supports the interests” of the defense industry: a kind of speed dating for the military industrial complex. And while the familiar mechanisms of war, like drones and submarines, were on display, the spotlight was on weapons of the information space.

Panel after panel featured cybersecurity and electronic warfare experts discussing information operations, artificial intelligence and machine learning capabilities. The conference seemed to embody the new horizon for the defense industry: Information Warfare.

Counter-Terrorism to Great Power Competition

“In hindsight, we should’ve never taken our eye off the Great Power Competition,” the counter-disinformation expert said, referring to the historical focus on traditional preparation for conflict against countries like China and Russia. Her mastery of the subject was honed over a decade through the study of forensic psychology and counter-extremism strategies. She had worked across the public and private sectors, countering the dangerous narratives of violent extremists.

At the end of World War II, the power that had previously been distributed across multiple nations was now consolidated by the US and Soviet Union. American foreign policy entered the era of Great Power Competition (GPC), a contest for global dominance and influence pitting the two former allies in a Highlander-style deathmatch to see who prevailed as the one true superpower. After the fall of the Soviet Union in 1991, the US focused on maintaining this strategic edge, until a decade later when the towers fell. 

After 9/11, the US reoriented its foreign policy around a new acronym: GWOT (Global War on Terror). The Spy vs. Spy tactics of the Cold War were obsolete now that the deadly effects of adversarial narratives had been demonstrated. While propaganda used by the Soviet Union (and the US, for that matter) was aimed at deceiving, disrupting and undermining the adversary, terrorist organizations focused their messaging campaigns on radicalization, aiming to turn at-risk Muslim communities into armies of holy warriors.

The seemingly archaic, global network of radical Islamists tapped into the far-reaching technology of the world wide web to spread their message and indoctrinate would-be jihadists. To combat this ideological plague, the US began crafting counter-messaging tools and methodologies, giving birth to what would become an updated version of a counter-disinformation industry that had existed as far back as 1942, when Voice of America began broadcasting counter-narratives into Nazi Germany. These efforts ranged from a “whac-a-mole” style process of detecting and eliminating terrorist propaganda to enlisting moderate Muslim leaders to push a counter-message.

A 1947 Voice of America broadcast

It’s no revelation that you can’t carpet bomb an ideology, so while the concept of fighting extremist narratives online to tamp down on global terrorism seems logical on its face, according to the industry expert I interviewed, “In practice, it’s not very effective.” 

This sentiment was affirmed by another expert in countering extremist narratives, Caroline Moreno, who formerly ran the counter-terrorism training program at the FBI Academy in Quantico, VA. “Counter messaging, when it comes from the US government, loses its credibility,” said Moreno. 

It is understandable, in an unfamiliar world of ideology-based violence and radicalization, that some trial and error would occur along the path to understanding such a complex adversary. Tactics, of course, are developed over time and shaped by the situation at hand. Before 9/11, the military was focused on the most logical adversary, a conventional state actor, but had to adapt to the irregular warfare landscape of counter-terrorism operations.

The operators and ground-pounders on the frontlines get a real-world education in the necessary fluidity of such tactics, but the military monolith is often slow to adopt lessons gleaned from battle. To further complicate the matter, there are always plenty of defense industry opportunists promoting their tactics as dogma, such as the failed “hearts and minds” approach to counterinsurgency, further setting back any notion of catching up with the current threat. The shift from the war on terror to the GPC has only exacerbated this strategic buffering.

The shift from counter-terrorism to GPC occurred long before the official end of the GWOT. Although some within the defense and intelligence communities saw the writing on the wall for some time, the declared pivot came in 2018 with the new National Defense Strategy, issued by then Secretary of Defense James Mattis.

Former Defense Secretary James Mattis

“Long-term strategic competitions with China and Russia,” the guidance reads, “are the principal priorities for the Department [of Defense].”

The Pentagon, in other words, was announcing plans to take us back to a Cold War mindset. In a space-race type fashion, we would need to outfox the competition in arms, technology and influence in order to maintain our world power monopoly. Unfortunately, our drawdown in the Middle East and the swan song of the war on terror would mean a natural decrease in the defense budget, hampering any lofty dreams of competition.

Unless, of course, the new threat required a level of spending generally associated with kinetic warfare. A little thing like the absence of active armed conflict shouldn’t stop the growth of the defense industry. And in that spirit, the Pentagon’s budget reached its highest level last year, a whopping $816 billion. 

A Brief PRIMER on Information Operations

During my final six years of service, in a move that seemed to parallel the national defense strategy, I shifted from Counter-Terrorism focused special boat operations to conducting sensitive intelligence activities aimed at the Great Power Competition. Our task was to clandestinely prepare the battlespace, establishing an operational environment that would give us an advantage in the event of a hot war, but more importantly, shaping the environment so that our adversary couldn’t do the same. Our best tool in this endeavor was Information Operations.

A Joint Chiefs of Staff publication on Information Operations, Joint Publication 3-13, reads:

Information Operations (IO) are described as the integrated employment of electronic warfare (EW), computer network operations (CNO), psychological operations (PSYOP), military deception (MILDEC), and operations security (OPSEC), in concert with specific supporting and related capabilities, to influence, disrupt, corrupt, or usurp adversarial human and automated decision making while protecting our own.

I’d left this world behind when I found myself at WEST 2023. Now a mere observer of the military industrial complex, I picked up a copy of Signal to orient myself to the occasion. Signal is the official magazine of the Armed Forces Communications and Electronics Association, or AFCEA, one of WEST 2023’s many sponsors. In fact, the conference had so many sponsors that it developed a funder caste system, segregating the donors into categories such as Premier, Platinum, Gold and Silver. 

Among defense heavyweights Lockheed Martin and General Dynamics was a lesser-known company down in the silver category named Primer. Primer is one of the many artificial intelligence and machine learning-focused companies orbiting the defense industry. Aside from AT&T, the silver sponsors blended together in an indistinguishable list of obscure defense contractors, and perhaps Primer would’ve remained obscure, too, had the company not acquired Yonder, an Austin, Texas-based “information integrity” company.

Primer had already entered the disinformation space in 2020, when it won a Small Business Innovation Research, or SBIR, contract with the Air Force and Special Operations Command, SOCOM, to develop the first machine learning platform to automatically identify and assess suspected disinformation. This evolution into the disinformation world was fully realized with its 2022 acquisition of Yonder, a firm focused on detecting and disrupting disinformation campaigns online.

Yonder, originally New Knowledge, rose to prominence when it co-authored a report to the Senate Intelligence Committee on Russian influence campaigns leading up to the 2016 Presidential election. Ironically, New Knowledge’s own foray into election meddling would make it a household name. During the 2017 Alabama Senate race, New Knowledge’s CEO, Jonathon Morgan, created a fake Facebook page and Twitter “botnet” with the intent of swaying votes toward the Democratic candidate.

“We orchestrated an elaborate ‘false flag’ operation that planted the idea that the Moore campaign was amplified on social media by a Russian botnet,” said an internal document from Morgan’s project. 

In another bit of controversy, New Knowledge, in the wake of the alleged Russian election meddling of 2016, helped develop a disinformation dashboard with the German Marshall Fund’s Alliance for Securing Democracy, or ASD. The dashboard, named Hamilton 68, acted as a repository for Twitter accounts supposedly linked to Russian influence operations, with access limited to a select few. This was ASD’s golden tablet, and only journalists and academics could wield the seer stone.

Unfortunately for all involved, including the media who treated the information as gospel, the dashboard proved most successful at identifying overzealous conservatives from middle America. This legacy, and all its implications, came part and parcel with Yonder’s acquisition.

In Primer’s catalog of machine learning products, Yonder is billed as a tool capable of identifying bad actors and narrative manipulation — Primer’s very own weapon against Information Operations. 

Disinformation, propaganda, active measures — whatever you call it, the name of the game is Information Operations, or IO. In a war where battles are fought left of boom, where the strategy is to manipulate the information landscape to gain a competitive advantage over your adversary, a cat-and-mouse game develops. But with the advent of the internet, and its compounding stores of information, the task of determining what is real and what is fake is too much to ask of us mere mortals. Thus the need for a Yonder-style solution.

“I’m not a fan of the term disinformation,” the counter-disinformation expert said. 

The statement came as a bit of throat clearing for the industry expert, as she digressed into a brief indictment of the trade she very much believes in.

“It’s been politicized. Even though disinformation has a distinct definition, it’s now being used as a label for any unwelcome information that someone doesn’t like, even when that information is true.” 

A new industry has developed out of the great disinformation scare: a mishmash of government, academic and private industry experts who come together to identify what is true and what isn’t. Or at least their idea of what they would like to be true and not true.

Most, if not all, countries dabble in information manipulation, not to mention non-state tricksters and deception artists, so it makes sense that you would need a cross-functional team of experts for such an undertaking. And given the implications of an information governing body, a kind of truth authority, you would damn well expect that all parties involved would be aboveboard.  

But…

“Any industry has hucksters,” the expert said.

The term huckster, perhaps because of its old world feel, brought to mind a scene from the Clint Eastwood film The Outlaw Josey Wales:

The camera pans across a dusty frontier town as a carpetbagging salesman in a white suit holds the attention of a crowd, proselytizing about his magic elixir.

“What’s in it?” a man from the crowd asks.

“Ehh…I don’t know, various things. I’m only the salesman,” the man in the white suit replies.

He looks the salesman up and down.

“You drink it,” the man says as he walks off.

The man in white looks shaken by the question. He recomposes himself before returning to the crowd.

“Well, what can you expect from a non-believer?”

Enter the New Snake Oil Salesman

“You can charge more if you call it information operations,” said one veteran working in commercial Information Operations. He asked that I not reveal the company he works for, but needless to say, it is one of the many government-contracted companies peddling AI-driven counter-disinformation products.

His voice was flat, with a dry, matter-of-fact delivery. There was no surprise in this revelation, not for him at least.

“And the higher the clearance level [of the project], the more money.”

The second part was less surprising. Anyone familiar with government contracts will share my lack of shock. It has been my experience that government contracts are reliably unreliable. The only thing you can count on is a cronyistic leveraging of relationships within the Department of Defense (DOD). 

One former Primer employee referred to this as the “kabuki theater”: a vague request from the government, limited by over-classification and reluctance to “give too much away,” followed by a lavish set of promises from the contract winner, despite not having the specificity to deliver a quality product. These hollow requests make it impossible for the company to even know whether it can deliver on its promises.

“We need a machine learning tool, capable of dissecting media posts to identify and catalog state sponsored disinformation,” a DOD contracting officer might say to a Primer account executive. 

“What systems will this work on?” the account executive asks.

“Classified ones.”

“Can we have access to them? It will help us.”

“Nope, classified.”

“Can we at least…”

“Nope. Here’s $10 million, good luck!”

The end result is just what you’d expect: an overly expensive useless doodad. Any oversight from the requesting agency is unlikely to catch this before it’s too late, given they rarely have the technical expertise to know what they are asking for in the first place. 

“I don’t even use my company’s products,” the IO expert said. “I don’t find any of it useful.”

Even though his work is in information operations, he says it is more akin to public relations than traditional IO, just layered in secrecy for effect. This was not the first time I’d heard the comparison to the PR/marketing world. As a matter of fact, the counter-disinformation expert suggested that many of the tools marketed to counter disinformation are simply recycled marketing products. Companies use social listening software, designed to analyze consumer wants by monitoring their online behavior, and marketing tools to construct the perfect narrative to sell their product. So to detect “bad” narratives, one only has to reverse engineer the process.

“Marketers use these types of social listening tools to understand who their target audience is, how to best reach them, and how best to craft a viral message. Analysts can use them to understand how state-sponsored messages traverse across the internet, hopping from platform to platform, and if they’re resonating with target audiences,” the counter-disinformation expert said.
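The reverse engineering the expert describes can be sketched in miniature. The toy Python below uses entirely synthetic posts and an invented 10-minute threshold (it is not any vendor's actual product) to show the two questions a social listening tool asks of a narrative: where has it spread, and does its posting pattern look coordinated?

```python
# Illustrative sketch only: a toy "social listening" pass over synthetic
# post data. The posts, the phrase, and the 10-minute window are invented
# for demonstration, not drawn from any real counter-disinformation tool.
from datetime import datetime, timedelta

posts = [
    {"platform": "twitter", "text": "buy brand x now", "time": datetime(2023, 5, 1, 12, 0)},
    {"platform": "twitter", "text": "buy brand x now", "time": datetime(2023, 5, 1, 12, 2)},
    {"platform": "reddit",  "text": "buy brand x now", "time": datetime(2023, 5, 1, 13, 5)},
    {"platform": "reddit",  "text": "an unrelated post", "time": datetime(2023, 5, 2, 9, 0)},
]

def narrative_spread(posts, phrase):
    """Platforms a phrase appears on, in order of first appearance."""
    first_seen = []
    for p in sorted(posts, key=lambda p: p["time"]):
        if phrase in p["text"] and p["platform"] not in first_seen:
            first_seen.append(p["platform"])
    return first_seen

def burst_score(posts, phrase, window=timedelta(minutes=10)):
    """Count posts carrying the phrase that land within `window` of the
    previous one, a crude proxy for coordinated amplification."""
    times = sorted(p["time"] for p in posts if phrase in p["text"])
    return sum(1 for a, b in zip(times, times[1:]) if b - a <= window)

print(narrative_spread(posts, "buy brand x"))  # platform-hop order
print(burst_score(posts, "buy brand x"))       # tightly timed post pairs
```

Marketers run the same arithmetic to watch a slogan travel from platform to platform; flip the intent, and the identical machinery becomes a "narrative manipulation" detector, which is the expert's point.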

So what? Many popular products were originally meant for some other use. Viagra was supposed to lower blood pressure, now it just redirects it. What’s the problem in finding a secondary use for a product that already exists? Nothing, provided you don’t make outlandish claims about the “new” product.

“With Yonder, you can slice through streams of social media to gain contextual intelligence on narratives – including their authenticity and likely trajectory of amplification…Understanding the intent, affiliations, and influence of adversarial networks provides you with critical insight into emerging topics and events before they go viral.”

Other companies in the counter-disinformation industry, like Graphika, Two Six Technologies and PeakMetrics, make similar claims using comparable marketing terminology for their AI-driven products. It turns out, however, that regardless of the efficacy of these tools, the Defense Department is not equipped to handle them.

“The tech solutions aren’t all that great,” the IO expert said. “The DOD isn’t advanced enough, doesn’t have the infrastructure to keep up with the contracted solutions.” The bureaucracy of DOD acquisitions leaves the military in a perpetual state of obsolescence, always behind the power curve of technology and innovation of its defense industry counterparts.

“The US is doing a shit job at [countering] disinformation,” the IO expert said, in another bit of optimistic revelation.

One problem is the lack of organization in the effort. While government agencies like the Department of State’s Global Engagement Center (GEC) claim to “direct, lead, synchronize, integrate, and coordinate U.S. Federal Government efforts to recognize, understand, expose, and counter foreign state and non-state propaganda and disinformation,” the reality is a scattered series of efforts across the government and private entities. This resulting chaos is exacerbated by the fact that the GEC is largely staffed by Foreign Service Officers and contractors, who typically don’t have a background or deep understanding of IO or disinformation.

There are, of course, actual IO professionals in the space, both on the government and private side of things. These subject matter experts are a hot commodity in the task-saturated world of the counter-disinformation industry, but that doesn’t mean that any IO hack can make the grade. To ensure they get the very elite, the industry has stringent requirements: X number of years as an IO expert, an active Top Secret security clearance, SCIF (sensitive compartmented information facility) access, and, of course, the ever-important connections to the right people. What you end up with is a small pool of the same industry experts, playing a game of musical chairs from place to place, ensuring the same entrenched mindset manifests across all aspects of the industry.

A Collective Paranoia

The military is a paranoid organization. If you need proof, look no further than the posters plastered all over military installations.

“Loose lips…might sink ships,” reads one poster.

“He’s Watching You,” reads another. 

The paranoia, however, is justifiable in many instances. Being the most powerful military inspires competition, and competition can be ugly. Espionage, sabotage and propaganda are always on the menu of adversary tactics to deceive and compromise.

The modern disinformation scare has, no doubt, exacerbated this paranoia. Healthy suspicion, the kind that keeps the evildoers at bay, sometimes turns toxic, resulting in operational paralysis and rampant mistrust, setting off a chain of reactionary measures run amok. Since the military is reactive by nature, only really employed when there is a tangible threat, our tactics to disrupt the flow of disinformation are inherently reactive as well.

It is due to this failed strategy that the counter-disinformation expert recommends a more proactive approach to countering, or more appropriately, preventing the fallout from potential disinformation campaigns. Her recommendations: media literacy and civic engagement. 

A more critical media consumer, although hard to imagine at the moment, isn’t a bridge too far, yet the civic engagement angle seems to be. As skeptical as people are of the media, it pales in comparison to the distrust many Americans have for the government. But rather than entertain ideas of how to rebuild that trust, and possibly embolden local leaders and everyday citizens to take ownership of their relationship with the information they ingest and propagate, Washington seems content with deflecting the blame onto a third party. But for any of it to work, the two recommendations would need to be approached in tandem.

“The majority of actions in the ‘proactive’ realm revolve around making people aware of the potential for misleading/manipulated information and encouraging them to engage critically with the content they consume,” the expert explains.

To support her claim, she directs attention to Taiwan, which reportedly receives more fake news than any other country. Rather than succumb to the overwhelming disinformation campaign, or use artificial intelligence to detect and dismantle it, Taiwan employs the counter-disinformation expert’s proactive strategy.

To achieve this, Taiwan partners with non-governmental organizations, or NGOs, to promote early-age media literacy, which the Ministry of Education has incorporated into its teaching guidelines, along with creating fact-checking tools for messaging apps and social media. It is hard to know whether these tools are any more successful than the myriad offered by the burgeoning counter-disinformation industry, but Taiwan’s proactive measures signal a move in a better direction.

Former Cybersecurity and Infrastructure Security Agency Director Chris Krebs, an architect of modern DHS anti-disinformation policies.

Perhaps initiatives like this, employed in the US and globally, can put an end to the counter-disinformation industry. It is, after all, beginning to show cracks. The recent dismantling of the Misinformation, Disinformation and Malinformation subcommittee, the murky remnants of the Department of Homeland Security’s failed Disinformation Governance Board, indicates a possible sea change. No doubt a more malevolent entity will take its place, but maybe, just maybe, this represents a retreat from the manic steps towards an Orwellian nightmare.

Tyler Durden
Sat, 05/20/2023 – 15:30
