The Alarm Bells of Disinformation – How are Bad Actors Evolving?
Source: Naphtali Rodriguez

Last week, I was asked to keynote at the “Digital Troubleshooting” global conference on how to address and counter disinformation. My talk was titled “Truth Seeking and Disinformation Alarm”. I thought I would share the highlights of the talk.

Alarm is the key word in the title of my keynote. 

Yes, we all want truth seeking. We all want to combat disinformation. That is pretty easy to agree to. 

But as we look forward, “alarm” is the right word to use. 

Here is why. 

If we think of disinformation and propaganda as a giant iceberg, we are spending too much time focusing on what is above the surface. Below the surface, technology is rapidly advancing what is possible and how it can impact how we work and play.

#1 – Innovative censorship is a growing problem – as technology advances, it is easier than ever to prevent information from reaching the public. We tend to focus on what is said…but what about what is never said?

Examples of how censorship is evolving include the following:

Shadow Banning – a user’s access to their larger, normal community is quietly reduced, lowering their ability to reach that community. Basically, you say something the platform doesn’t like, and they reduce your reach.

Account Elimination – in the gaming world, for example, we see people losing their accounts if they criticize certain countries. If you don’t say the right things, your voice is taken away.

Country Favoritism via Algorithm – it is easy to tweak an algorithm so that information supportive of a country ranks higher in search results and information critical of that country is suppressed. It’s important to remember that algorithms are only as good as the humans who create them and the standards that govern their behavior.

Those who want to suppress will get increasingly sophisticated about doing “just enough” to evade us, so we must up-level our ability to identify anomalies faster via analytics and data science techniques.
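To make that concrete, here is a minimal sketch in Python of the kind of anomaly check an analyst could run: it flags stretches of days where a creator’s reach falls well below their own trailing baseline, one possible signal of reach suppression or shadow banning. The data format, window size and threshold are illustrative assumptions, not a reference to any platform’s actual systems.

```python
from statistics import mean

def flag_reach_anomalies(daily_reach, window=14, drop_factor=0.5, min_run=3):
    """Flag runs of days where reach falls well below the trailing baseline.

    daily_reach: list of (date, reach_count) tuples, oldest first.
    window:      number of prior days used as the baseline.
    drop_factor: a day counts as suppressed if reach < drop_factor * baseline mean.
    min_run:     only report suppression lasting at least this many days.
    """
    flagged, run = [], []
    for i in range(window, len(daily_reach)):
        baseline = [reach for _, reach in daily_reach[i - window:i]]
        date, reach = daily_reach[i]
        if reach < drop_factor * mean(baseline):
            run.append(date)
        else:
            if len(run) >= min_run:
                flagged.append((run[0], run[-1]))
            run = []
    if len(run) >= min_run:
        flagged.append((run[0], run[-1]))
    return flagged

# Hypothetical creator whose daily reach collapses for a week.
history = [(f"2024-01-{d:02d}", 10_000) for d in range(1, 21)]
history += [(f"2024-01-{d:02d}", 1_000) for d in range(21, 28)]
print(flag_reach_anomalies(history))  # -> [('2024-01-21', '2024-01-27')]
```

A real effort would layer on seasonality, cross-account comparisons and platform-specific context, but even a baseline check like this makes “quiet” suppression visible.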

Think of it this way. We need more open-source collaboration to “censor-check” and document the different types of censorship that are occurring, and then report on them and educate the general public on a) how the censorship is occurring, b) what technology approach is used, c) what the result is and d) what to watch out for as a citizen.
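As one hypothetical way for such an open-source effort to share findings consistently, the groups involved could agree on a common record format covering points a) through d). The field names below are assumptions for illustration, not an existing standard.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class CensorshipReport:
    """Illustrative record for one documented censorship incident."""
    incident_id: str
    how_it_occurred: str        # a) how the censorship is occurring
    technology_approach: str    # b) the technology approach used
    observed_result: str        # c) what the result is
    citizen_guidance: str       # d) what to watch out for as a citizen
    platforms: list = field(default_factory=list)
    evidence_links: list = field(default_factory=list)

# Toy example record.
report = CensorshipReport(
    incident_id="2024-0001",
    how_it_occurred="Posts using a specific keyword stopped appearing in followers' feeds.",
    technology_approach="Suspected down-ranking of keyword-matched content in the feed algorithm.",
    observed_result="Average reach for affected accounts fell sharply within a week.",
    citizen_guidance="Compare your posts' reach against your own historical baseline.",
    platforms=["example-platform"],
)
print(json.dumps(asdict(report), indent=2))
```

A shared, machine-readable format is what would let separate groups pool their evidence instead of each keeping anecdotes in isolation.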

#2 – Gaming is the new family room – nearly three billion people engage in gaming. China, in particular, is investing heavily in gaming platforms to control the narrative. Concurrently, the gaming platform is becoming a new social center for the user, with a range of services that include e-commerce, email, text, links to social channels and more. Future gaming platforms will be as powerful as Facebook is today in terms of agility and reach.

What will we do to keep disinfo from reaching citizens in the gaming world? Can we innovate with game modifications that teach people how to spot disinfo while playing Grand Theft Auto? How can we deliver useful information inside a gaming platform? If we think of gaming as the next Facebook, we will think much harder about what’s next and how to get there. Facebook has pioneered online communities, image sharing, messaging and more. There are no limits if we think it through.

#3 – We spend a lot of time on the “moment”, but not as much time understanding the cumulative impact of disinformation on young minds – yes, we take down accounts and pressure social media companies to do more. But if we step back for a second, we realize most of our world is quite young. Our brains undergo their most important transformation between birth and age 25, and the impact of living in an environment continually hit with disinfo can take a long-term toll.

This is alarming when we think of it this way. It means we need to find new ways to reach youth, to understand and employ psychological models that shape behavior for the positive, and to realize that bad actors are playing the long game here.

The great brands of the world realize they are building a lifetime experience vs. a transactional one. What is the equivalent for combating disinfo?

#4 – We need to work as a team – in the recent past, I had an opportunity to teach 40+ groups battling extremism of some type, in both Brussels and DC: groups from the EU, Eurasia, the Middle East and the U.S. What they had in common was their commitment to battling extremism. What they also had in common was their lack of knowledge sharing, their lack of a common digital plan and their continued fragmentation.

Bad actors work in a more unified manner, which impacts search, impacts messaging and increases their chances of success.

We need a good actor curriculum that teaches how to use keywords, how to build algorithms that identify influencers, how to truly impact search engine optimization, when to use paid media and much more. We need a digital media curriculum and a way to share it. For example, if those groups all use the same keywords with common content, they will be on the first page of most searches. If they work as separate entities, bad actors win.
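As a small illustration of the “identify influencers” piece of that curriculum, here is a sketch that ranks accounts by how often others mention or share them in a toy interaction log. In-degree is a deliberately simple proxy chosen for illustration; a real curriculum would go further with PageRank-style scoring, topic filters and time windows.

```python
from collections import Counter

def top_influencers(interactions, k=3):
    """Rank accounts by how often others mention, reply to, or share them.

    interactions: list of (source_account, target_account) pairs,
    one pair per mention, reply or share.
    """
    in_degree = Counter(target for _, target in interactions)
    return in_degree.most_common(k)

# Toy interaction log across several counter-extremism groups (names are made up).
log = [
    ("group_a", "researcher_x"), ("group_b", "researcher_x"),
    ("group_c", "researcher_x"), ("group_a", "group_b"),
    ("group_c", "outlet_y"), ("group_b", "outlet_y"),
]
print(top_influencers(log, k=2))  # -> [('researcher_x', 3), ('outlet_y', 2)]
```

The point is less the specific algorithm than the shared skill: every group in the coalition should be able to run the same analysis and compare notes.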

#5 – Technology is advancing much faster than we realize – we can create billions of likenesses, which will make it easier for bad actors to hide. We can compress audio with ease. We can slightly change copy to disinform, yet do it in plain sight. As AI improves our ability to create content, it improves bad actors’ ability just as quickly.
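As a rough example of catching “slightly changed copy”, here is a sketch that uses Python’s standard difflib to flag texts that read as near-duplicates of a source passage with a few words altered. The 0.85 similarity threshold and the sample texts are illustrative assumptions, not a vetted detection rule.

```python
import difflib

def near_duplicates(original, candidates, threshold=0.85):
    """Return candidate texts that look like minor rewrites of the original.

    Uses difflib's SequenceMatcher ratio as a rough similarity score.
    """
    matches = []
    for text in candidates:
        score = difflib.SequenceMatcher(None, original, text).ratio()
        if score >= threshold and text != original:
            matches.append((round(score, 2), text))
    return sorted(matches, reverse=True)

source = "The election results were certified by independent observers on Friday."
variants = [
    "The election results were NOT certified by independent observers on Friday.",
    "Observers attended a certification event.",
]
# The first variant is flagged as a near-rewrite; the second is not.
print(near_duplicates(source, variants))
```

Detecting a one-word flip that reverses the meaning of a sentence is exactly the kind of “in plain sight” manipulation that humans skim past but simple tooling can surface.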

We need the equivalent of a pipeline, an R&D pipeline, to see what is next.

We need to work as a team in new ways to accelerate knowledge sharing.

We need to assume that bad actors are always a step ahead of us, even if that is not always true.

We need to assume that we are exiting 1.0 and about to enter 2.0 of combating disinformation. We will need to be on our A game. 

Thank you.

If you are interested in the full webinar with all of the talks, you can go to YouTube for more.

