Cognitive Warfare: The New Face of Disinformation – How Americans Are Being Polarized by Foreign Nations

The United States enters the mid-2020s facing an unprecedented challenge to its internal stability, characterized by the systematic exploitation of domestic political and social divisions by foreign state and non-state actors. This report, synthesized from the collective perspectives of national security, foreign affairs, and intelligence analysis, identifies a shift from traditional election interference toward a more pervasive doctrine of “cognitive warfare.” The primary objectives of these foreign adversaries—most notably the Russian Federation, the People’s Republic of China, the Islamic Republic of Iran, and North Korea—are to degrade the social fabric of American life, paralyze the federal government through internal discord, and undermine global confidence in the democratic model.1

The methodology of these actors involves the synchronization of deceptive narratives with significant geopolitical milestones and the weaponization of emerging technologies like generative artificial intelligence. By leveraging the “attention economy” of social media, which prioritizes engagement over accuracy, foreign entities have effectively “outsourced” the distribution of propaganda to unsuspecting American citizens and domestic influencers.4 The result is a fractured information ecosystem where “shared epistemic foundations”—the basic agreement on facts required for governance—are increasingly absent.7

The intent of this report is to provide an analysis of the threat landscape to facilitate civilian awareness. It details the specific actors involved, the psychological and technical tactics they employ, and the resulting impacts on public safety and institutional trust. Crucially, the analysis concludes that technical and governmental solutions alone are insufficient; the primary line of defense is an informed and analytically rigorous public. By adopting strategies such as lateral reading and psychological “pulse checks,” Americans can guard against deception and ensure that their democratic decisions are informed by reality rather than synthetic manipulation.9

The Strategic Environment: Polarization as a Weapon of War

The contemporary threat to the United States homeland is no longer confined to kinetic or traditional cyber-attacks. National security analysis indicates that polarization itself has been operationalized by foreign adversaries as a strategic weapon.7 The intelligence community defines this environment through the lens of Foreign Malign Influence (FMI), encompassing subversive, covert, or coercive activities conducted by foreign governments or their proxies.11 Unlike historical “active measures” that were often limited in scope and speed, modern FMI leverages digital connectivity to achieve global reach at minimal cost.12

The Philosophy of Cognitive Warfare

Foreign affairs analysis suggests that adversaries have shifted their focus to “cognitive warfare,” a doctrine that targets the human mind as the “final domain” of conflict. This approach operates in the psychological and informational spheres, exploiting human cognition to manipulate beliefs, emotions, and decision-making processes.13 The objective is not necessarily to convince the public of a specific lie, but rather to create a state of perpetual confusion and skepticism where “seeing is no longer believing”.5

| Tactical Concept | Intelligence Definition | Strategic Objective |
| --- | --- | --- |
| Cognitive Warfare | Exploitation of human vulnerabilities to induce behavioral and perceptual shifts. | Erosion of democratic norms and institutional trust. |
| Narrative Synchronization | Aligning manipulative content with geopolitical events (e.g., NATO summits). | Creating “information asymmetry” during high-stakes moments. |
| Algorithmic Targeting | Using social media data to deliver tailored content to specific demographics. | Reinforcing “echo chambers” and accelerating “sorting” of the public. |
| Active Measures | Covert operations to influence world events (mimicry, disinformation, agents of influence). | Weakening U.S. global standing and internal cohesion. |
Source: 13

The Crisis of Democratic Legitimacy

The integration of foreign disinformation into the domestic political discourse has resulted in what scholars term a “crisis of democratic legitimacy”.7 Intelligence assessments from 2024 and 2025 reveal that when citizens are repeatedly exposed to narratives questioning the integrity of electoral processes or the competence of mainstream institutions, they develop “affective polarization”—an intense, emotional hostility toward those with different political views.2 Foreign actors do not “create” these divisions; instead, they act as “force multipliers,” identifying existing societal “fault lines” and driving wedges into them to ensure they remain unbridgeable.2

Principal Actors: Motivations and Strategic Intent

A coordinated “Axis of Autocracy”—consisting of Russia, China, Iran, and North Korea—is increasingly working in concert to challenge the U.S.-led international order.3 While their specific methods vary, their shared goal is to create a more permissive environment for authoritarianism by distracting the United States with internal crises.1

The Russian Federation: The Architect of Disinformation

Russia remains the pre-eminent and most active foreign influence threat to the United States.2 Moscow’s overarching goal is to weaken the United States, undermine Washington’s support for Ukraine, and fracture Western alliances.2 Intelligence analysis shows that the Kremlin views election periods as moments of extreme vulnerability for democracy and seeks to amplify divisive rhetoric that makes the U.S. system look weak.2

The “Doppelgänger” campaign remains one of the most significant Russian operations identified in recent years. This campaign involves the creation of dozens of websites that mimic legitimate U.S. news organizations, such as The Washington Post and Fox News, to publish fabricated articles that align with Russian interests.4 Furthermore, Russia has adopted a “laundered” approach to influence, funneling millions of dollars to domestic companies to pay American influencers to spread Kremlin talking points under the guise of independent commentary.4

The People’s Republic of China: Comprehensive Economic and Cyber Pressure

The People’s Republic of China (PRC) represents the “most comprehensive and robust” strategic competitor to the United States.15 Beijing’s influence operations are often “whole-of-government” campaigns designed to fend off challenges to its legitimacy, gain an edge in economic and military power, and silence criticism from diaspora communities.1

While the PRC has historically been more cautious than Russia in its direct influence of U.S. domestic politics, recent reports indicate a shift toward more assertive tactics. During the 2024 election cycle, the PRC used bot accounts to post negative content about congressional candidates it deemed anti-China.4 Beyond information manipulation, the PRC’s strategy involves “weaponizing supply chain dependencies” and pre-positioning cyber actors on U.S. critical infrastructure to exert coercive pressure in the event of a conflict.15

The Islamic Republic of Iran: Escalation of Malign Activity

As of 2025, Iran has significantly increased its efforts to influence the American public and political environment.2 Tehran’s strategy is multi-pronged, seeking to stoke social discord, undermine confidence in the electoral process, and retaliate for U.S. and Israeli military actions in the Middle East.2 Iranian operations have evolved from simple social media propaganda to sophisticated cyber-espionage and direct physical threats.

In late 2024, the Department of Justice announced criminal charges against members of Iran’s Islamic Revolutionary Guard Corps (IRGC) for hacking into a presidential campaign and leaking stolen documents to the media.4 Perhaps most concerning to the intelligence community is Iran’s orchestration of “murder-for-hire” plots intended to assassinate high-profile U.S. officials, including Donald Trump, representing a dramatic escalation from digital influence to physical violence.4

| State Actor | Primary Motivation | Core Tactic in 2025 | Key Impact on U.S. Public |
| --- | --- | --- | --- |
| Russia | Halting aid to Ukraine; fracturing NATO. | Mimicking news outlets; paying domestic influencers. | Deepened partisan hostility; distrust of mainstream news. |
| China | Protecting CCP legitimacy; economic dominance. | Cyber pre-positioning; targeting anti-China candidates. | Economic anxiety; concerns over infrastructure safety. |
| Iran | Retaliation for strikes; ending the U.S. presence in the Middle East. | Hacking and leaking campaign data; assassination plots. | Political chaos; fear for the safety of public leaders. |
| North Korea | Normalizing nuclear status; financial theft. | Cyber theft and money laundering via transnational criminal organizations (TCOs). | Financial instability; critical infrastructure vulnerability. |
Source: 1

Methodologies of Deception: Tactics and Technologies

Adversaries leverage a combination of psychological triggers and advanced technologies to bypass rational scrutiny and ensure their narratives gain traction within the American public.

The Rise of Generative Artificial Intelligence (AI)

The proliferation of generative AI has revolutionized the “manufacture of reality.” Tools that were once in the realm of experimental science are now routine parts of the disinformation toolkit.18

  1. Deepfake Audio and Video: AI can create near-photorealistic visuals and clone voices with high precision. In 2025, bad actors used a voice clone of Secretary of State Marco Rubio to contact U.S. and foreign officials, attempting to gain access to sensitive accounts.18 Similarly, deepfake videos have been used to show political figures making statements they never said, such as JD Vance criticizing Elon Musk or Barack Obama expressing concerns about Donald Trump’s health.18
  2. Disaster Porn and Clickbait: AI tools like OpenAI’s Sora 2, released in late 2025, have been used to capitalize on natural disasters. During Hurricane Melissa, viral videos depicted sharks swimming in hotel pools and the destruction of Kingston Airport—events that never happened but were shared millions of times because of their sensational nature.6
  3. Chatbot Unreliability: AI chatbots, often viewed as neutral arbiters, frequently repeat information from low-quality social media posts. During a political rally in October 2025, chatbots amplified false claims that genuine news coverage was “old footage,” misleading the public about crowd size.18

Narrative Synchronization: Timing the Attack

Intelligence analysis reveals that adversaries do not release disinformation randomly. Instead, they use “narrative synchronization”—aligning their messaging with real-world geopolitical events to maximize psychological impact.13 For example, Russian narratives regarding nuclear threats or Western “provocations” are often synchronized with NATO summits or announcements of military aid to Ukraine.13 This temporal relevance increases the perceived credibility of the disinformation, as it appears linked to tangible, current events.13
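
To make this concrete, the toy sketch below measures how tightly a set of post timestamps clusters around known geopolitical events. All names, dates, and the three-day window are assumptions for illustration; real attribution work relies on far richer signals than timing alone.

```python
from datetime import date, timedelta

def posts_near_events(post_dates, event_dates, window_days=3):
    """Count posts that land within +/- window_days of any listed event.
    A narrative whose volume clusters around summits or aid announcements
    suggests timed messaging rather than organic discussion."""
    window = timedelta(days=window_days)
    near = sum(
        any(abs(post - event) <= window for event in event_dates)
        for post in post_dates
    )
    return near, len(post_dates)

if __name__ == "__main__":
    # Hypothetical data: posts pushing a "provocation" narrative.
    events = [date(2025, 6, 24)]  # illustrative summit date
    posts = [date(2025, 6, 23), date(2025, 6, 25), date(2025, 5, 2)]
    near, total = posts_near_events(posts, events)
    print(f"{near}/{total} posts within 3 days of an event")  # 2/3
```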

The Psychology of Susceptibility: Targeting the Mind

Foreign influence operations are effective because they exploit fundamental “neutral and normal cognitive processes”.12 Adversaries systematically target specific psychological vulnerabilities:

  • Confirmation Bias and Motivated Reasoning: Individuals are more likely to believe and share information that aligns with their pre-existing beliefs, regardless of its accuracy.5
  • Affective Polarization: When people have strong negative feelings toward an opposing group, they are more susceptible to “politically aligned disinformation” that reinforces their hatred.7
  • The Power of Emotions: Content that triggers awe, amusement, or, most commonly, anger and anxiety is shared much more frequently than neutral, factual content.5
  • Fuzzy-Trace Theory: People often remember the “gist” (the general feeling) of a story rather than the “verbatim” details. Even if a story is later debunked, the negative “gist” remains in the individual’s memory.23

Case Study: Hurricane Melissa and the Chaos of 2025

The landfall of Hurricane Melissa in Jamaica in late October 2025 serves as a primary case study for how foreign-influenced narratives and AI-generated “synthetic slop” can paralyze domestic response systems.6

The Information Surge

Within thirty minutes of the hurricane’s landfall, AI-generated videos began trending on X, TikTok, and Instagram. These videos, often depicting spectacular but entirely fake destruction, racked up millions of views.6 National security analysts note that while many of these videos were created for financial gain (clickbait), they served the strategic interests of foreign actors by “clogging” official communication channels and drowning out safety information.6

Real-World Consequences

The disinformation surge had tangible safety costs:

  • Emergency Response Delays: False videos showing the destruction of Kingston Airport caused an unnecessary rush of citizens toward inland roads, creating traffic jams that delayed medical convoys by almost an hour.25
  • Resource Diversion: Emergency managers were forced to divert valuable time and personnel to debunking rumors—such as the “sharks in the pool” video—rather than tracking storm surge data and coordinating rescues.24
  • Erosion of Trust in Real Data: The prevalence of AI fakes led the public to question the validity of genuine videos, such as those from the U.S. Air Force “Hurricane Hunters”.26

This event highlights the “liar’s dividend”—a state where the presence of many fakes allows individuals to deny the authenticity of real evidence.25

The Shifting Institutional Landscape of Defense

The ability of the United States to defend against foreign malign influence has undergone significant changes in 2025, primarily due to shifts in executive policy and agency mandates.

The Dissolution of the Foreign Influence Task Force (FITF)

Historically, the FBI’s Foreign Influence Task Force (FITF) served as the primary bridge between the intelligence community and social media companies. Its role was to share actionable intelligence about specific foreign-backed accounts so that platforms could use their discretion to remove them.11 However, in February 2025, Attorney General Pam Bondi ordered the dissolution of the FITF, signaling a retreat from the government’s role in investigating foreign disinformation on social media.27

Gutting of Election Security and Global Engagement

Simultaneously, the Cybersecurity and Infrastructure Security Agency (CISA) saw its election security mission significantly curtailed. Operations focused on countering disinformation and protecting voting systems were “paused” for review in early 2025, and many expert staff members were placed on administrative leave.27 At the State Department, the Global Engagement Center (GEC), founded in 2016 to coordinate communications against Russian and Chinese influence, had its budget mandate expire and its activities reduced to a “zero-content-involvement” policy.27

| Agency | Former Role (Pre-2025) | Current Status (2026) | Operational Impact |
| --- | --- | --- | --- |
| FBI (FITF) | Real-time identification of foreign accounts; briefings to social media companies. | Dissolved February 2025. | Loss of centralized intelligence sharing with tech companies. |
| DHS (CISA) | Securing election infrastructure; debunking fakes. | Election security activities “paused”; staff on leave. | Vulnerability of local officials to cyber and influence threats. |
| State (GEC) | Global counter-propaganda efforts. | Funding expired; “zero-content” policy adopted. | Reduced U.S. voice in countering autocratic narratives abroad. |
| FBI (Election Command Post) | 24/7 monitoring of threats during voting cycles. | Operations limited to criminal acts only. | Narrower window for identifying “perception hacking” campaigns. |
Source: 4

National security analysts warn that these institutional rollbacks represent a “gift on a silver platter” to adversaries like Russia and China, who are now more active than ever in their interference efforts.28 In the absence of federal coordination, the responsibility for defense has shifted to fragmented civil society actors who lack the intelligence and resources of the federal government.27

Civilian Defense: Guarding Against Deception

In an environment of reduced institutional protection, the individual citizen must act as a primary node of defense. Foreign affairs and intelligence analysts recommend a series of practical, “cognitive-first” strategies to mitigate the impact of disinformation.

The Core Strategy: Lateral Reading

Research from the Stanford History Education Group (SHEG) demonstrates that “lateral reading” is the most effective way to determine the truthfulness of online information.9 Unlike “vertical reading”—scrolling down a single webpage and looking for professional-looking fonts or “About” pages—lateral reading involves leaving the source to see what other trusted sources say about it.9

  1. Open New Tabs: When you encounter a sensational claim, don’t read the article yet. Instead, open three or four new browser tabs.
  2. Search the Source: Search for the name of the organization or the author. Use Wikipedia or specialized news literacy sites to see if the source has a history of bias or spreading misinformation.9 (A small automation sketch of this step follows the list.)
  3. Cross-Reference the Facts: Check if major, reputable news outlets are reporting the same story. If a “massive scandal” or “disaster” is only being reported by one obscure website or social media account, it is likely false.32
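
For readers comfortable with a little code, step 2 can be partly automated. The minimal sketch below queries Wikipedia’s public REST summary endpoint for a publisher’s lead paragraph; the function name and the example outlet are illustrative, and a human still has to judge what the returned summary says.

```python
import json
import urllib.parse
import urllib.request

WIKI_SUMMARY = "https://en.wikipedia.org/api/rest_v1/page/summary/{title}"

def source_summary(source_name: str) -> str:
    """Lateral reading, step 2: instead of trusting a site's own "About"
    page, fetch what an independent reference says about it."""
    title = urllib.parse.quote(source_name.replace(" ", "_"))
    request = urllib.request.Request(
        WIKI_SUMMARY.format(title=title),
        headers={"User-Agent": "lateral-reading-demo/0.1"},  # Wikipedia asks for one
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        data = json.load(response)
    return data.get("extract", "(no summary found)")

if __name__ == "__main__":
    # Illustrative lookup of an outlet before reading its article.
    print(source_summary("The Washington Post"))
```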

Technical Checks for Deepfakes and AI Content

While AI tools are improving, deepfakes still exhibit physical and geometric inconsistencies that can be identified with a “gut check” and careful observation.26 A minimal programmatic heuristic follows the table below.

| Verification Area | Deepfake Indicator (Red Flag) | Authentic Indicator |
| --- | --- | --- |
| Facial Texture | Overly smooth “airbrushed” skin; pores missing; unnatural blinking. | Natural asymmetries; visible pores; irregular blinking patterns. |
| Lighting/Shadows | Shadows pointing toward the light source; flickering around the eyes. | Consistent lighting based on identifiable light sources. |
| Geometric Physics | Buildings with multiple “vanishing points”; garbled text on signs. | Consistent architectural perspective; legible signage. |
| Audio Patterns | Lack of breathing; robotic inflection; mouth movements out of sync. | Natural cadence; rhythmic breathing; synchronized lip movements. |
| Logic/Context | Magazine-quality beauty in a crisis zone; anachronistic vehicles. | Visuals match the setting; historical/weather data matches the claim. |
Source: 19
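
Beyond eyeballing the cues in the table, one classic programmatic heuristic is error-level analysis (ELA): re-save a JPEG and see how unevenly it degrades. The sketch below, written against the Pillow imaging library, is a rough screen under stated assumptions, not a verdict; modern generators can evade it, and the file name is hypothetical.

```python
import io
from PIL import Image, ImageChops  # pip install Pillow

def error_level(path: str, quality: int = 90) -> int:
    """Crude error-level analysis: re-compress the image and measure the
    largest per-channel change. Authentic single-pass photos tend to
    degrade fairly uniformly; spliced or synthesized regions often do not."""
    original = Image.open(path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)
    diff = ImageChops.difference(original, resaved)
    # getextrema() returns a (min, max) pair per color band.
    return max(value for band in diff.getextrema() for value in band)

if __name__ == "__main__":
    print(error_level("suspect_image.jpg"))  # hypothetical file name
```

A high score only flags an image for the human checks in the table; it proves nothing on its own.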

Psychological Resilience: The Emotional “Pulse Check”

Because disinformation is designed to bypass logic and trigger emotion, the most powerful defense is self-awareness.10 Before clicking “share” or forming a hardened opinion, citizens should ask themselves the following questions (a toy screening sketch follows the list):

  1. Am I having a heightened emotional reaction? Disinformation is often “emotional and arousing,” designed to make the reader feel awe, amusement, anxiety, or anger.12
  2. Does this align too perfectly with my existing beliefs? If a story seems “too good to be true” because it makes your political rivals look bad, it is a prime candidate for disinformation targeting your confirmation bias.7
  3. Would I question this if it came from the “other side”? Applying a neutral standard to all information, regardless of the source, is the foundation of digital citizenship.10
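
As a rough illustration of question 1, the sketch below flags posts that lean heavily on high-arousal language before the user shares them. Everything here is an assumption for demonstration: the word list, the threshold, and the function name are invented, and real affect lexicons are far larger. No script replaces the self-awareness the questions above describe.

```python
import re

# Hypothetical, hand-picked lexicon of high-arousal trigger words.
# Purely illustrative; a real screen would use a published affect lexicon.
HIGH_AROUSAL = {
    "outrage", "shocking", "destroyed", "exposed", "terrifying",
    "traitor", "disaster", "banned", "corrupt", "fury",
}

def pulse_check(post: str, threshold: int = 2) -> bool:
    """Return True if a post leans on emotional-arousal language,
    i.e. it deserves a pause and a lateral read before sharing."""
    words = re.findall(r"[a-z']+", post.lower())
    hits = [word for word in words if word in HIGH_AROUSAL]
    return len(hits) >= threshold

if __name__ == "__main__":
    sample = "SHOCKING footage EXPOSED: corrupt officials react with fury"
    print(pulse_check(sample))  # True: four trigger words
```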

Verification Tools for the Public

Several free tools are available to help civilians perform their own forensic analysis:

  • Reverse Image Search (Google/TinEye): Allows users to find the original source of an image and see if it was taken from a different context or an old event.10
  • TrueMedia.org: A free service that analyzes images, audio, and video for hidden mathematical signatures of AI generation.34
  • RumorGuard / Checkology: Platforms that provide real-world practice in spotting common tactics used to mislead and evaluate sources for credibility.33
  • Metadata Check: By right-clicking an image and selecting “Properties” (PC) or “Get Info” (Mac), users can sometimes see the original creation date and the software used, which may contradict the claimed story (a scripted version follows this list).34
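
The same metadata check can be scripted. The sketch below uses the Pillow imaging library to read the EXIF fields most useful for verification; the file name is hypothetical, and note that many AI-generated or platform re-encoded images carry no EXIF data at all, which is itself a signal worth noting rather than proof of anything.

```python
from PIL import Image, ExifTags  # pip install Pillow

def exif_report(path: str) -> dict:
    """Read the EXIF fields most relevant to verification: when the image
    was captured and what software last touched it. A 'Software' stamp
    newer than the claimed event date contradicts the story."""
    exif = Image.open(path).getexif()
    readable = {ExifTags.TAGS.get(tag_id, tag_id): value
                for tag_id, value in exif.items()}
    return {field: readable.get(field)
            for field in ("DateTime", "Software", "Make", "Model")}

if __name__ == "__main__":
    print(exif_report("viral_photo.jpg"))  # hypothetical file name
```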

Conclusion: Rebuilding the Shared Reality

The analysis conducted by this joint team of analysts indicates that the United States is currently the target of a sustained, multi-front campaign of cognitive warfare. Foreign adversaries—principally Russia, China, and Iran—have moved beyond the era of simple “fake news” into a period of sophisticated “synthetic reality” designed to exacerbate domestic polarization.2 By weaponizing the psychological mechanisms of confirmation bias and moral outrage, and amplifying them through generative AI, these actors have successfully turned the American information ecosystem against itself.7

The institutional shifts of 2025, which have reduced federal oversight of foreign influence operations, have effectively decentralized the defense of the homeland. The stability of the American democratic system now rests more than ever on the “epistemic resilience” of its citizens. The 2025 Hurricane Melissa disinformation crisis serves as a stark warning: in a digital world, information failure leads directly to physical danger.24

For the average American, the path forward is not to stop consuming information, but to change how it is consumed. By prioritizing analytical scrutiny over emotional reaction and adopting the rigorous verification habits of professionals—such as lateral reading and technical cross-referencing—citizens can neutralize the “force multiplier” effect of foreign adversaries.9 The goal of foreign influence is to make the public believe that nothing is true and everything is possible. The civilian defense, therefore, is to insist on a shared reality based on evidence, skepticism of the sensational, and an unwavering commitment to the truth.


Sources Used

  1. Annual Threat Assessment of the U.S. Intelligence … – DNI.gov, accessed January 31, 2026, https://www.dni.gov/files/ODNI/documents/assessments/ATA-2025-Unclassified-Report.pdf
  2. Election Security: U.S. Government’s Efforts to Protect the 2024 U.S. Election from Foreign Malign Influence – United States Department of State, accessed January 31, 2026, https://2021-2025.state.gov/briefings-foreign-press-centers/protecting-the-2024-election-from-foreign-malign-influence/
  3. CHAPTER 3: AXIS OF AUTOCRACY: CHINA’S REVISIONIST AMBITIONS WITH RUSSIA, IRAN, AND NORTH KOREA Executive Summary, accessed January 31, 2026, https://www.uscc.gov/sites/default/files/2025-11/Chapter_3–Axis_of_Autocracy.pdf
  4. Interference Interrupted: The US Government’s Strides In Defending …, accessed January 31, 2026, https://www.gmfus.org/news/interference-interrupted-us-governments-strides-defending-against-foreign-threats-2024
  5. How foreign actors are using media to influence opinion before Election Day – AJC.com, accessed January 31, 2026, https://www.ajc.com/news/business/how-foreign-actors-are-using-media-to-influence-opinion-before-the-election/52IZR4P7SFGYXKJIE2WZUC5URE/
  6. The Shark in the Pool: How AI Weaponized the Attention Economy – rbb Communications, accessed January 31, 2026, https://rbbcommunications.com/blog/how-ai-weaponized-the-attention-economy/
  7. Disinformation as a driver of political polarization: A strategic framework for rebuilding civic trust in the U.S, accessed January 31, 2026, https://journalwjarr.com/sites/default/files/fulltext_pdf/WJARR-2025-2564.pdf
  8. Disinformation as a driver of political polarization: A strategic framework for rebuilding civic trust in the U.S – ResearchGate, accessed January 31, 2026, https://www.researchgate.net/publication/393669858_Disinformation_as_a_driver_of_political_polarization_A_strategic_framework_for_rebuilding_civic_trust_in_the_US
  9. Teaching Lateral Reading | Civic Online Reasoning – Digital Inquiry Group, accessed January 31, 2026, https://cor.inquirygroup.org/curriculum/collections/teaching-lateral-reading/
  10. Spotting AI-Generated Disinformation and Deepfakes Online – Anomali, accessed January 31, 2026, https://www.anomali.com/blog/spotting-ai-generated-disinformation-and-deepfakes
  11. Voting | Foreign Malign Influence – Department of Justice, accessed January 31, 2026, https://www.justice.gov/archives/voting/foreign-malign-influence
  12. (U) The Psychology of (Dis)information: A Primer on Key …, accessed January 31, 2026, https://www.cna.org/reports/2021/10/The%20Psychology-of-(Dis)information-A-Primer-on-Key-Psychological-Mechanisms.pdf
  13. Decoding manipulative narratives in cognitive warfare: a case study …, accessed January 31, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC12460417/
  14. FBI and CISA Issue Public Service Announcement Warning of …, accessed January 31, 2026, https://www.cisa.gov/news-events/news/fbi-and-cisa-issue-public-service-announcement-warning-tactics-foreign-threat-actors-are-using
  15. ODNI Releases 2025 Threat Assessment: What it Means for CFIUS Reviews – A Fresh Take, accessed January 31, 2026, https://blog.freshfields.us/post/102k8mb/odni-releases-2025-threat-assessment-what-it-means-for-cfius-reviews
  16. Homeland Threat Assessment 2025, accessed January 31, 2026, https://www.dhs.gov/sites/default/files/2024-10/24_0930_ia_24-320-ia-publication-2025-hta-final-30sep24-508.pdf
  17. About the unravelling of Iran’s social contract – Clingendael, accessed January 31, 2026, https://www.clingendael.org/publication/about-unravelling-irans-social-contract
  18. 2025 year in review: AI misinformation – The News Literacy Project, accessed January 31, 2026, https://newslit.org/news-and-research/2025-year-in-review-ai-misinformation/
  19. Don’t Get Fooled: Your Guide to Spotting Deepfakes, accessed January 31, 2026, https://it.ucsb.edu/news/dont-get-fooled-your-guide-spotting-deepfakes
  20. With New AI Resources Fake News Is Challenging Real Events – Like With Hurricane Melissa, accessed January 31, 2026, https://www.klove.com/faith/news/trending/with-new-ai-resources-fake-news-is-challenging-real-events-like-with-hurricane-melissa-56951
  21. Psychological factors contributing to the creation and dissemination of fake news among social media users: a systematic review – NIH, accessed January 31, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC11575416/
  22. FAKE NEWS´ COGNITIVE EFFECTS IN COMPLEX DECISION-MAKING AND POLITICAL POLARIZATION – SciELO, accessed January 31, 2026, https://www.scielo.br/j/psoc/a/kpWpjbhsCvfszBp76TyFnDM/
  23. The Psychology of Misinformation Across the Lifespan – Annual Reviews, accessed January 31, 2026, https://www.annualreviews.org/content/journals/10.1146/annurev-devpsych-010923-093547?crawler=true&mimetype=application/pdf
  24. The Impact of AI-Generated Content on Natural Disaster Response: Hurricane Melissa, accessed January 31, 2026, https://catalystmcgill.com/the-impact-of-ai-generated-content-on-natural-disaster-response-hurricane-melissa/
  25. AI Crisis Detection Under Fire: Lessons From Hurricane Melissa – AI CERTs News, accessed January 31, 2026, https://www.aicerts.ai/news/ai-crisis-detection-under-fire-lessons-from-hurricane-melissa/
  26. AI-generated images of Hurricane Melissa bog down social media – The Weather Network, accessed January 31, 2026, https://www.theweathernetwork.com/en/news/weather/severe/melissa-ai-generated-images-of-hurricane-melissa-are-clogging-social-media
  27. The Trump Administration’s Withdrawal from the Fight Against Foreign Interference—Strategic Implications | INSS, accessed January 31, 2026, https://www.inss.org.il/publication/trump-influence/
  28. Issue One criticizes Trump administration’s rollback of safeguards against foreign influence operations, accessed January 31, 2026, https://issueone.org/press/issue-one-criticizes-trump-administrations-rollback-of-safeguards-against-foreign-influence-operations/
  29. Trump Is Gutting Efforts to Combat Foreign Election Interference – Mother Jones, accessed January 31, 2026, https://www.motherjones.com/politics/2025/02/trump-cisa-foreign-election-interference/
  30. Teaching Lateral Reading – No Shhing Here, accessed January 31, 2026, http://noshhinghere.blogspot.com/2022/01/teaching-lateral-reading.html
  31. Intro to Lateral Reading | Civic Online Reasoning, accessed January 31, 2026, https://cor.inquirygroup.org/curriculum/lessons/intro-to-lateral-reading/
  32. Lateral Reading Resources & Practice | Civic Online Reasoning, accessed January 31, 2026, https://cor.inquirygroup.org/curriculum/lessons/lateral-reading-resources-practice/?cuid=teaching-lateral-reading
  33. The Insider: November 2025 – The News Literacy Project, accessed January 31, 2026, https://newslit.org/news-and-research/the-insider-november-2025/
  34. Reporter’s Guide to Detecting AI-Generated Content – Global …, accessed January 31, 2026, https://gijn.org/resource/guide-detecting-ai-generated-content/
  35. Detect DeepFakes: How to counteract misinformation created by AI – MIT Media Lab, accessed January 31, 2026, https://www.media.mit.edu/projects/detect-fakes/overview/
  36. How to detect deepfakes: A practical guide to spotting AI-Generated misinformation – ESET, accessed January 31, 2026, https://www.eset.com/blog/en/home-topics/cybersecurity-protection/how-to-detect-deepfakes/
  37. Insights & Impact: Aug. 2025 – The News Literacy Project, accessed January 31, 2026, https://newslit.org/news-and-research/insights-impact-aug-2025/