Dear Colleagues,
As always, we hope that these weekly links focused on what is currently happening in the information environment prove to be helpful to you and the good work you do.
The Council on Foreign Relations published a brief on how the U.S. can counter adversaries' disinformation campaigns. The article describes the elements a disinformation campaign comprises. First, a false narrative is laid out, and part of that narrative may include 'pre-positioning' sources, such as Russia publishing a report about the U.S. conducting bioweapons research and then spreading a false media narrative that cites the pre-placed, and false, report. Second, a disinformation campaign seeks to amplify the false narrative. This can be done with the same methods used to spread true information, such as repetition or credible spokespeople (also deemed "useful idiots"). Third, an effective campaign hides the original source of the falsehood. Reposting, retweeting, and repeating on social media circle the narrative back to its own sources and can effectively obscure where the information originated. The report advises that the best way to counter false narratives is to pre-bunk them and amplify the truth.
The Department of Justice has seized Kremlin-supported internet domains alleged to have been spreading Russian propaganda to the U.S. population, aimed at weakening support for Ukraine and altering the outcome of the 2024 U.S. election. These domains were found to be using social media sites to deploy 'fabricated influencers' spreading Russian propaganda, as well as spreading AI-generated false narratives. Most interesting about this seizure are the exhibits, which include the Russian perpetrators' own internal notes and project orders for conducting propaganda focused on U.S. citizens. Links to the exhibits appear in the report of the domain seizure below.
When the picture of Pope Francis in a white puffer coat went viral, it made more than a few of us look twice! The article from New Scientist discusses how easily disinformation can be created with all the new AI tools and offers some tips for identifying AI-generated material. Here are some things to look for when trying to determine whether what you're seeing is real:
- Sociocultural implausibilities: e.g., the Pope wearing a long, modern puffer jacket
- Anatomical implausibilities: Hands too small, too large, or with too many fingers?
- Stylistic oddities: Does the background look weird? Does the lighting not match the subjects?
- Functional implausibilities: Do objects look like they wouldn't work as they are placed? Buttons in the wrong spot?
- Physics violations (my personal favorite): Check whether the shadows and reflections align with the laws of physics.
It's a lot of work to maintain integrity in your resources and information gathering - keep up the good work!
Stay safe, stay healthy, and stay diligent,
Dr. Ryan Maness
Rebecca Lorentz
DoD ISRC at The Naval Postgraduate School
Compiled and summarized by Rebecca Lorentz. Please email tips or contributions to DoDISRC@nps.edu.
Disinformation campaigns can be a powerful tool to shape beliefs on matters of great geopolitical importance. Bad actors can deploy them against rivals to sow costly discord, create political uncertainty, and deepen divides within a community.
The Justice Department today announced the ongoing seizure of 32 internet domains used in Russian government-directed foreign malign influence campaigns colloquially referred to as “Doppelganger,” in violation of U.S. money laundering and criminal trademark laws.
It can be difficult to spot AI-generated images, video, audio and text at a time when technological advances are making them increasingly indistinguishable from much human-created content, leaving us open to manipulation by disinformation.