
Forensic Investigation Report • September 2025

Moldova's Parliamentary Elections: Hybrid Interference at Scale

A Comprehensive Analysis of Cross-Platform Disinformation Operations and Coordinated Influence Campaigns
Andra-Lucia Martinescu
Lead Researcher & Principal Investigator
Contributions by: [Consortium Partners]
Acknowledgements: [External Contributions & Partnerships]
Executive Summary

Moldova's September 2025 parliamentary elections occurred at a critical geopolitical juncture, marked by unprecedented levels of coordinated interference. This forensic investigation reveals that electoral interference now functions less as communicative persuasion than as a geopolitical spatial strategy routed through digital infrastructures.

Rather than advancing along conventional territorial boundaries, influence is exerted through networked geographies composed of platform architectures, language corridors, and algorithmically mediated publics. The objective is not persuasion but participatory deterrence, achieved by exhausting civic agency rather than converting opinion.

Through systematic analysis of 5,220 posts across seven platforms, encompassing 693 semantic clusters and 112 million impressions, this study documents the operational architecture, tactical evolution, and transnational coordination of influence operations targeting Moldova's democratic processes.

Keywords
Information warfare, electoral interference, hybrid threats, disinformation networks, semantic clustering, Moldova, Russia, coordinated inauthentic behavior, cross-platform analysis, narrative manipulation
Key Research Findings
5,220 posts analyzed: comprehensive cross-platform monitoring across Telegram, TikTok, Facebook, X, VK, YouTube, and web sources from June-September 2025.
3,600+ disinformation posts: identified through state-affiliation analysis, narrative tracing, and documented malign actor networks.
112M total impressions: measured reach across all platforms, demonstrating the scale of the influence operation targeting domestic and diaspora constituencies.
693 semantic clusters: distinct narrative groupings revealing coordinated amplification patterns and strategic message deployment.
Section 1

Cross-Platform Distribution and Engagement Architecture

Telegram as Operational Core in Multi-Platform Ecosystem

The monitored online ecosystem spans 5,220 posts collected between June and September 2025 across seven major platforms: Telegram, TikTok, Facebook, X (formerly Twitter), VK, YouTube, and select web domains. While comprehensive in design, the dataset is strategically curated rather than exhaustive, focusing on relevance, engagement metrics, and disinformation-linked activity. Within this scope, approximately 3,600 posts were tagged as disinformation or coordinated influence operations based on multiple indicators including state-affiliation, recycled narrative structures, origin tracing, and previously documented networks of malign actors.

Telegram emerged as the central operational layer, with over 3,000 posts—more than 77% of which were attributed to disinformation-linked actors or compromised accounts. The platform served a dual function, as both the primary dissemination channel for initial narrative seeding and a redistribution vector for cross-platform amplification. This architecture aligns with established patterns in hybrid information warfare, where closed or semi-closed messaging platforms function as coordination hubs for subsequent public-facing campaigns.

Figure 1. Distribution of monitored content across seven platforms showing Telegram's dominance as primary dissemination channel (n=3,000+, 77% disinformation-linked). Data collected June-September 2025.

While the dataset captures only a partial facet of TikTok activity due to platform access limitations, preliminary figures indicate disproportionate engagement intensity. Just 217 TikTok entries generated over 17.8 million views and 1.85 million reactions, highlighting the platform's exceptional capacity for viral mobilization and audience reach. This engagement rate significantly exceeds other platforms on a per-post basis, suggesting algorithmic amplification mechanisms that warrant further investigation.

Comparatively, Telegram content amassed 13.7 million views and 229,000 reactions, with narratives frequently originating from high-audience Russian state-affiliated or proxy channels. The lower reaction-to-view ratio on Telegram reflects the platform's architectural differences—it functions more as a broadcast medium than an interactive social space—while still maintaining substantial reach among target demographics.

17.8M TikTok views (217 posts)
13.7M Telegram views (3,000+ posts)
1.85M TikTok reactions (high engagement rate)
229K Telegram reactions (broadcast model)

Cumulatively, the disinformation segment of the ecosystem exceeds 11 million measured impressions (views and reactions combined), with activity surges temporally clustered around late July and mid-September. These peaks correspond with specific offline events including mobilization attempts, decisions from electoral authorities regarding candidate eligibility, and proxy campaign escalations. The temporal correlation suggests responsive rather than purely autonomous information operations, indicating human-in-the-loop coordination responsive to real-world developments.

Geographic origin analysis, where traceable, reveals a concentration of activity in Moldova and Russia, with Moldovan accounts responsible for the highest volume of disinformation-tagged posts while Russian-origin accounts generated disproportionately high engagement relative to their output. Specifically, 559 Russian-origin posts generated over 5.75 million measured impressions, suggesting deployment of amplification infrastructures, pre-established captive audiences, or both. This engagement disparity—approximately 10,300 impressions per Russian-origin post versus substantially lower rates for other geolocations—indicates systematic audience cultivation or bot-assisted amplification.

A smaller but strategically significant share of disinformation-linked content originated from accounts geolocated across Europe and North America. Geolocation tags from Romania (34 accounts), the United States (37 accounts), Germany, the United Kingdom, France, and Italy suggest targeting of diaspora clusters, proxy amplification networks, or regionally coordinated influence assets. The geographical footprint, spanning at least 15 distinct national contexts, underscores the transnational character of the ecosystem, which blends localized seeding with cross-border mobilization tactics.

Methodological Note: Geographic Attribution

Geographic origin attribution was performed through multiple indicators including account metadata, IP geolocation data (where available), linguistic markers, time-zone analysis of posting patterns, and declared user location. Approximately 35 accounts lacked identifiable geolocation markers, representing 12% of the analyzed set. Attribution confidence levels vary, with highest confidence assigned to accounts displaying multiple consistent geographic indicators.
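The confidence tiering described in this note might be sketched as a simple consistency count across indicators. The field names, tier labels, and thresholds below are hypothetical illustrations, not the study's actual rubric:

```python
def attribution_confidence(indicators):
    # indicators: mapping of signal name -> country code or None, e.g.
    # {"metadata": "MD", "ip": "MD", "timezone": "MD", "declared": None}.
    # Confidence rises with the number of non-missing signals that agree
    # on a single country (hypothetical tiering, not the study's rubric).
    values = [v for v in indicators.values() if v]
    if not values:
        return "unattributed"
    top = max(set(values), key=values.count)
    if values.count(top) >= 3:
        return "high"
    if values.count(top) == 2:
        return "medium"
    return "low"
```

Under this scheme, the roughly 35 accounts with no identifiable geolocation markers would fall into the "unattributed" tier rather than receiving a forced country label.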

Diaspora as Strategic Vector: Targeting Extraterritorial Constituencies

The Moldovan diaspora, estimated at 1.2 million individuals, emerged as a decisive political force in recent electoral cycles. The Moldovan state responded by expanding overseas voting infrastructure to a record 301 polling stations across 41-45 countries, with postal voting mechanisms implemented in designated states. By closing time on election day, over 275,000 diaspora voters had cast ballots—representing approximately 23% of total voter turnout and potentially decisive in tight electoral contests.

This demographic presence became a primary target for information warfare campaigns involving what we term "Matryoshka networks"—nested layers of cloned media properties, proxy outlets, and online personas mutually reinforcing each other through circular citation and coordinated amplification. The objective encompassed multiple strategic aims: demobilizing diaspora participation through manufactured cynicism, undermining trust in electoral administration through fraud allegations, and staging or amplifying protests both abroad and domestically to create perception of contested legitimacy.

On voting day, offline destabilization tactics escalated beyond information operations to include bomb threats at multiple polling stations abroad. These threats, whether credible or performative, served to induce fear among voters, suppress turnout through logistical disruption, and provoke administrative chaos that could later be weaponized in fraud narratives. The integration of physical intimidation with information operations represents textbook hybrid warfare methodology, blurring boundaries between digital and kinetic domains.

Figure 3. Geographic distribution of disinformation-linked accounts showing transnational coordination across at least 15 national contexts. Russian-origin accounts demonstrate disproportionate reach-to-post ratio, suggesting infrastructural amplification capabilities.
Section 2

Geopolitical Context: Strategic Recalibration

The Kiriyenko Doctrine and Genealogy of Subversion

Russia's tactical approach toward Moldova visibly adapted following the 2024 presidential election and Constitutional referendum, which narrowly affirmed European integration as a constitutional objective despite significant interference efforts. Dmitry Kozak's removal as Deputy Chief of Staff around mid-September 2024 and the corresponding ascendance of Sergei Kiriyenko within the Presidential Administration signaled a decisive recalibration in Russia's management of the near abroad, with profound implications for Moldova in the immediate lead-up to the parliamentary vote.

The shift represents more than personnel rotation; it embodies distinct operational philosophies. Kozak favored transactional approaches characteristic of traditional statecraft—elite brokerage, energy inducements, and formal architectures such as the federalization plan for Moldova (the "Kozak Memorandum" of 2003) designed to play out over extended timeframes. This approach prioritized maintaining leverage through structural dependencies while avoiding overt confrontation that might accelerate Western integration.

Kiriyenko brings fundamentally different methodology, refined through domestic opposition suppression and tested across occupied territories of Ukraine. His approach emphasizes accelerated destabilization through deniable means, prioritizing tempo over sustainability. What distinguishes these strategies is not merely intensity but temporal orientation: Kiriyenko's interventions favor rapid-cycle operations designed to produce immediate political effects, even at cost of burning intelligence assets or exposing operational networks.

Analytical Framework: Hybrid Threat Indicators

Our analysis employs a multi-dimensional framework for identifying hybrid threat operations, integrating: (1) financial network analysis tracking illicit funding flows; (2) social network analysis mapping coordination structures; (3) content forensics identifying manipulation techniques; (4) temporal pattern analysis revealing synchronized activities; and (5) cross-platform correlation detecting narrative migration patterns. This framework enables detection of coordinated campaigns that might appear organic when viewed through a single analytical lens.

The modus operandi observed in Moldova combined multiple hybrid vectors: coordinated influence operations sustained through illicit funding infrastructures documented by multiple investigative outlets; proxy mobilization concentrated in Russian-speaking localities and autonomous regions; recruitment of domestic actors for civil unrest both within Moldova and among diaspora communities; and weaponization of ecclesiastical networks, particularly Orthodox parishes subordinate to the Moscow Patriarchate.

The ecclesiastical dimension merits particular attention given its historical precedents. Clerical hierarchies and parish priests were mobilized to preach against European integration, framing it as spiritual threat to traditional values and Orthodox identity. These religious narratives were then amplified through coordinated Telegram channels, local media outlets, and diaspora community networks, creating multi-layered reinforcement that blurred lines between genuine religious conviction and manufactured political messaging.

"The weaponization of ecclesiastical networks echoes the KGB's documented playbook, whereby Soviet front organizations such as the Christian Peace Conference provided religious facades for influence operations abroad. By failing to connect present-day influence operations with their historical precedents, much scholarship on information threats risks treating symptoms in isolation while missing the structural persistence of a modus operandi."
— From methodology section on historical analysis

This genealogy of subversion—traceable through declassified archives documenting Soviet active measures—reminds us that ostensibly novel tactics often represent adapted versions of long-standing operational repertoires. The Christian Peace Conference, World Peace Council, and similar Soviet-era fronts cultivated sympathetic clergy, legitimized Soviet foreign policy in ecumenical forums, and penetrated international institutions under the guise of interfaith dialogue and peace activism. Contemporary operations follow remarkably similar patterns, substituting digital platforms for print publications while maintaining the core strategic logic.

Russian-funded activist NGOs have similarly instrumentalized claims of religious persecution within international fora, including submissions to UN human rights mechanisms. These narratives serve dual purpose: manufacturing international controversy around Moldova's domestic policies while providing pretext for Russian "protection" of co-religionists—a framing with obvious parallels to pretexts employed in Georgia (2008) and Ukraine (2014).

Historical Context: KGB Active Measures and Contemporary Operations

Declassified documents from multiple intelligence archives reveal systematic Soviet exploitation of religious networks for political warfare purposes. The Christian Peace Conference (CPC), founded in Prague in 1958, exemplifies this approach. While presenting itself as an ecumenical peace movement, the CPC was penetrated and directed by KGB assets who shaped its agenda to align with Soviet foreign policy objectives, particularly opposition to NATO and Western military presence in Europe.

Contemporary operations display striking architectural similarities. Religious networks subordinate to Moscow Patriarchate function as influence vectors, disseminating politically aligned messaging under cover of pastoral care. Financial flows from Russian state or state-adjacent sources support these networks, mirroring Cold War-era subsidization of front organizations. The adaptation lies primarily in exploitation of digital platforms for amplification—Telegram channels replacing printed bulletins, social media campaigns supplanting speaking tours—but fundamental operational logic remains consistent.

This continuity suggests that countering contemporary information operations requires historical literacy. Treating each manifestation as a novel phenomenon obscures the enduring strategic culture and institutional memory that shape Russian active measures. An effective response demands engagement with this genealogy, recognizing present threats as evolved iterations of documented historical patterns rather than unprecedented developments requiring entirely new analytical frameworks.

Section 3

Dominant Narrative Categories

Thematic Analysis of 693 Semantic Clusters

The cross-platform dataset encompasses 693 semantic clusters aggregating approximately 2,300 disinformation posts circulated between June and mid-September 2025. Each cluster represents a grouping of identical or semantically similar posts across languages (Romanian, Russian, English, French, Italian, and others), with metadata recording post frequency, cumulative impressions, temporal duration, and up to three narrative categories assigned through manual coding.

This clustering methodology differentiates between isolated viral bursts—single posts achieving organic reach—and coordinated long-term amplification campaigns involving multiple actors pushing similar messages across extended timeframes. Semantic similarity was calculated using multilingual sentence-embedding models (specifically, multilingual BERT variants) with a cosine similarity threshold of 0.75: posts whose embeddings matched at or above that threshold were grouped into a common cluster.

Semantic Clustering Methodology

Semantic clustering employed multilingual sentence embeddings (mBERT and LaBSE models) to calculate cosine similarity across posts in multiple languages. Posts exceeding 0.75 similarity threshold were grouped into clusters, with temporal correlation analysis distinguishing coordinated campaigns from organic virality. Manual validation of 15% sample confirmed 89% accuracy rate for automatic clustering, with disagreements primarily involving culturally specific idioms or novel formulations. Narrative categories were assigned through structured manual coding by trained analysts using predefined taxonomy, with inter-rater reliability (Cohen's kappa) of 0.82.
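The grouping logic in the methodology note can be illustrated with a short sketch. The study's pipeline used multilingual sentence embeddings (mBERT, LaBSE); the stand-in below substitutes a simple bag-of-words cosine so the example stays self-contained, and the greedy single-pass strategy and function names are illustrative assumptions rather than the study's actual implementation:

```python
import math
from collections import Counter

def cosine(a, b):
    # Dot product over shared tokens, normalised by vector magnitudes.
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def cluster_posts(posts, threshold=0.75):
    # Greedy single-pass grouping: a post joins the first existing cluster
    # whose seed it matches at or above the similarity threshold,
    # otherwise it starts a new cluster.
    clusters = []  # list of (seed_vector, member_posts)
    for post in posts:
        vec = Counter(post.lower().split())
        for seed_vec, members in clusters:
            if cosine(vec, seed_vec) >= threshold:
                members.append(post)
                break
        else:
            clusters.append((vec, [post]))
    return [members for _, members in clusters]
```

With real sentence embeddings, the `Counter` vectors would be replaced by model outputs; the 0.75 threshold and the cluster-membership logic stay the same.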

Temporal correlation analysis within clusters enabled measurement of narrative persistence, distinguishing three distinct lifecycle patterns: synchronous bursts (multiple actors posting within narrow timeframe, typically under 12 hours), sustained campaigns (continuous or frequent posting over 12-72 hours), and episodic reactivation (baseline narratives resurging at intervals across weeks). This temporal dimension proves crucial for understanding operational intent—rapid bursts aim for immediate agenda-setting, while episodic reactivation maintains baseline narratives that can be amplified opportunistically in response to events.
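As a minimal sketch, the three lifecycle patterns reduce to duration thresholds on a cluster's first and last observed timestamps. The function name and the exact boundary handling are assumptions for illustration:

```python
from datetime import datetime

def classify_lifecycle(first_seen, last_seen):
    # Duration thresholds mirror the three patterns described in the text:
    # synchronous bursts (<12h), sustained campaigns (12-72h), and
    # episodic reactivation (anything longer).
    hours = (last_seen - first_seen).total_seconds() / 3600
    if hours < 12:
        return "synchronous burst"
    if hours <= 72:
        return "sustained campaign"
    return "episodic reactivation"
```

In practice, episodic reactivation would also require gap detection between posting waves, not just total span; the sketch keeps only the duration dimension.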

Election Interference: 311 clusters
Identity & Sovereignty: 183 clusters
External Influence: 158 clusters
Economic Crisis: 62 clusters
Violence & Chaos: 38 clusters

Election Interference and Voter Suppression emerged as the most frequent narrative category, encompassing 311 unique semantic clusters (45% of the total). Nearly half of all identified clusters sought to preemptively neutralize or invalidate the election before it occurred, suggesting a strategic objective of manufacturing contested legitimacy regardless of outcome. These narratives followed four dominant sub-frames:

Opposition Repression and Censorship (289 instances) constituted the single most frequent sub-category, portraying administrative decisions—including media outlet closures and Central Electoral Commission rulings on candidate eligibility—as authoritarian overreach. Pro-Russian parties and populist platforms were framed as "political dissidents" or anti-establishment resistance facing persecution. The arrest of Evghenia Gutsul (associate of oligarch Ilan Shor and former governor of Gagauzia) prompted particularly intensive campaigns alleging human rights violations across Russian-affiliated networks and their multi-language offshoots.

Voter Disenfranchisement and Diaspora Instrumentalization (258 posts) accused the government of systematically suppressing specific constituencies, particularly referencing Transnistria and Gagauzia. These allegations were frequently paired with identity-based grievances, asserting discrimination against these regions based on their pro-Russian orientation, linguistic preference, or ethnic composition. Comparative framing contrasted limited polling stations in Russia with expanded diaspora voting in Western Europe, constructing narrative of differential treatment favoring pro-EU constituencies.

Fraud and Manipulation Allegations (141 posts) circulated evidence-free claims of pre-planned ballot stuffing, postal vote manipulation, and organized vote-buying, often featuring imagery such as "white vans full of bribed voters" without corroborating documentation. These narratives typically incorporated pre-emptive mobilization elements, calling for protests and civil resistance while prophesying government-orchestrated repression—creating self-fulfilling dynamics where any crowd control measures could be framed as validation of authoritarian claims.

Institutional Capture and Foreign Meddling (114 posts) portrayed electoral authorities, law enforcement, media, and civil society as systemically compromised, subservient to the incumbent Party of Action and Solidarity (PAS). Over 40 posts reversed foreign interference allegations, claiming Romanian, Ukrainian, EU, and broader Western involvement in manipulating Moldova's elections. This reversal tactic—accusing others of one's own documented behaviors—represents classic projection strategy frequently employed in Russian information operations.

Figure 4. Network visualization showing co-occurrence patterns among narrative categories. Node size represents cluster frequency; edge thickness indicates co-occurrence frequency. Election interference narratives show strong connections with identity/sovereignty and external influence frames, creating mutually reinforcing hybrid messages.

Temporal analysis reveals that over 80% of election interference narratives appeared as high-intensity bursts within 12-hour windows, with significant subsets reinforced through periodic reactivation over subsequent weeks. This pattern suggests coordinated launch followed by maintenance phase, keeping baseline narratives available for opportunistic amplification around relevant events.

Identity and Sovereignty narratives (183 clusters, 26% of total) crystallized around three strategic frames designed to manipulate perceptions of geopolitical orientation and national identity. These narratives proved more temporally durable than average, with persistent clusters maintaining baseline messaging for weeks rather than days.

The primary frame (122 posts) advocated reorienting Moldova toward Russia while invoking traditional Orthodox identity and historical linkages. These posts frequently referenced Moldova's participation in Russian-led structures (the Eurasian Economic Union, the Commonwealth of Independent States, BRICS expansion), framing such integration as a return to natural geopolitical alignment rather than subordination. Pro-Russian parties and leaders were quoted asserting intent to restore constitutional neutrality and exclude European integration clauses.

A secondary frame (44 posts) manufactured perception of sovereignty loss through Romanian annexation or Western absorption, portraying Romania as harboring "imperialist ambitions" toward Moldova. This historical grievance narrative proved particularly potent among constituencies sensitive to questions of linguistic and national identity, exploiting legitimate debates about Romanian-Moldovan relations for geopolitical purposes.

Claims of national decline under pro-EU leadership (25 posts) completed this triad, often paired with anti-LGBTQI rhetoric and historical revisionism. References to the "Great Patriotic War" and allegations of its erasure from collective memory served to position pro-Russian stance as defense of historical truth against Western revisionism.

"In line with Russia's information warfare doctrine, rooted in the concept of reflexive control and the fusion of psychological, informational and political instruments, Moldova's policy achievements were deliberately recast as vulnerabilities in a bid to legitimize their reversal: EU alignment depicted as loss of sovereignty, security cooperation and support afforded to Ukraine as provocation, energy diversification as economic sabotage, NATO as deliberate war proxy."

External Influence and Occupation narratives (158 clusters) deployed proxy war framing to manufacture anxiety about Moldova's imminent involvement in the Ukraine conflict. The largest subset (324 posts) alleged NATO-orchestrated attacks, portraying Moldova as being prepared as "the next Ukraine" through covert Western military presence, Romanian complicity, and staged provocations in Transnistria. NATO exercises were systematically misrepresented as preparation for aggression rather than deterrence.

A secondary frame (85 posts) portrayed the Moldovan government as "foreign puppet," with President Maia Sandu characterized as "Romanian agent" or "Brussels handler's" subordinate. This delegitimization strategy sought to nullify governmental agency, recasting every policy decision as externally imposed rather than domestically chosen.

Over 250 semantic clusters (36% of total dataset) were assigned two or more narrative categories, reflecting the hybrid nature of coordinated messaging. The highest co-occurrence frequency emerged between "Election Interference" and "Identity & Sovereignty" categories, where procedural delegitimization was reinforced as existential civilizational betrayal. Another prominent combination paired "Election Interference" with "External Influence," reframing domestic political actors as foreign agents staging controlled takeover.

The "Violence & Chaos" category, while smaller in absolute terms (38 clusters), disproportionately appeared as modifier to other narratives, escalating emotional intensity and urgency. These hybrid constructions prove particularly effective at manufacturing perception of crisis, whereby electoral administrative decisions become harbingers of civil conflict or foreign occupation.
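Counting category co-occurrence of the kind described above is straightforward once each cluster carries its manually coded categories; a minimal sketch, with illustrative data rather than the study's coding:

```python
from collections import Counter
from itertools import combinations

def category_cooccurrence(cluster_categories):
    # Count unordered pairs of narrative categories that appear together
    # on the same semantic cluster (each cluster carries up to three
    # manually coded categories).
    pairs = Counter()
    for cats in cluster_categories:
        for pair in combinations(sorted(set(cats)), 2):
            pairs[pair] += 1
    return pairs
```

The resulting pair counts are exactly what a co-occurrence network visualization encodes as edge thickness.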

Table 1. Distribution of semantic clusters across five primary narrative categories, including subcategory breakdowns, temporal lifecycle patterns (burst/sustained/episodic), and average cluster persistence duration. Election interference narratives dominate by frequency but identity/sovereignty narratives show greater temporal persistence.
Section 4

Coordination Patterns and Amplification Network Topology

Synchronous Bursts, Sustained Operations, and the Transnistria Case

Semantic clustering revealed three distinct temporal lifecycle patterns demonstrating coordinated influence operations. The 693 clusters collectively generated 112 million impressions and involved an average of 8-9 unique actors per semantic cluster (median=3), indicating most operations employed relatively tight coordination cells rather than diffuse networks. A small subset showed high actor dispersion (50-120 contributors each), indicative of coordinated mass-mobilization moments designed to manufacture perception of widespread grassroots support.

82.5% burst clusters (<12 hours)
14.4% sustained operations (12-72 hours)
3% multi-week episodic campaigns
8-9 average actors per cluster
Section 5

Tactics, Techniques, and Procedures

Manipulation as Cognitive Infrastructure

Analysis of a 2,349-entry corpus subset using the PROMPT analytical tool detected 6,823 instances of manipulation techniques and 3,671 rhetorical figures embedded in the discourse. This quantitative analysis enables systematic identification of persuasion mechanisms that might otherwise remain implicit or undetected through qualitative reading alone.

1,210 name-calling & labeling
896 casting doubt & uncertainty
767 guilt by association
6,823 total techniques identified

The most recurrent manipulation techniques were name-calling and labeling (1,210 instances), casting doubt (896), and guilt by association (767)—all functioning to erode trust in institutional actors while preemptively discrediting alternative viewpoints. These techniques operate less by argument than by positioning: targeted political figures are reduced to hostile archetypes such as "globalists," "foreign puppets," or "traitors," while entire communities are tarnished through associative blame.

Dominant rhetorical figures consisted of amplification and exaggeration (1,603 instances), followed by false equivalencies (1,105), repetition and redundancy (355), and anecdotal storytelling (213). This repertoire served as a force multiplier, intensifying emotional resonance while lending speculative claims an air of inevitability. Posts typically layer multiple manipulation techniques and rhetorical devices within single sentences, creating messages that feel persuasive irrespective of empirical substantiation.

Analytical Tool: PROMPT Framework

The PROMPT (Propaganda Multi-dimensional Analysis and Pattern Recognition Tool) employs natural language processing combined with structured taxonomies of manipulation techniques derived from propaganda studies, rhetorical analysis, and cognitive psychology research. The tool identifies 23 distinct manipulation categories and 18 rhetorical figure types, with machine classification validated through expert manual review achieving 87% agreement rate.

Obfuscation patterns were also systematically observed, representing tactical adaptation to platform detection mechanisms. Variants retained over 90% lexical overlap with seed posts but introduced emoji substitutions, bullet reformatting, and punctuation encoding changes to bypass duplication filters while carrying identical functional narratives. This practice demonstrates sophisticated understanding of both algorithmic detection systems and human pattern recognition limitations.
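A minimal sketch of how such near-duplicate variants can be flagged: normalise away emoji, punctuation, and bullet glyphs, then compare token sets. The Jaccard measure and the `normalize` helper are illustrative assumptions; the study reports lexical overlap above 90% without specifying the exact metric used:

```python
import re

def normalize(text):
    # Keep word tokens only: emoji, punctuation, and bullet glyphs all
    # fall away, defeating the substitutions used to dodge duplicate filters.
    return re.findall(r"[\w']+", text.lower())

def lexical_overlap(seed, variant):
    # Jaccard overlap on the normalised token sets (illustrative metric).
    a, b = set(normalize(seed)), set(normalize(variant))
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

A variant scoring at or above 0.9 against a seed post would then be flagged as a templated repeat rather than an independent formulation.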

Templated Amplification as Cognitive Infrastructure

The significance of templated amplification extends beyond mere repetition. From an operational standpoint, it serves as force multiplier for disinformation by manufacturing illusory consensus. When identical or near-identical framings appear synchronously across channels, languages and domains, they simulate organic public outrage, coercing undecided audiences into perceiving certain narratives as dominant or inevitable.

In fragile information environments where institutional trust is already compromised, the appearance of "multiple independent sources" repeating similar claims enables unverified allegations to cross the threshold from speculation to perceived fact—even when they trace back to single-origin disinformation campaigns. Templated amplification thus functions not merely as propaganda but as cognitive infrastructure, shaping how events are interpreted, which actors are trusted, and which futures are deemed plausible.

This approach also provides operational efficiency for hostile actors: once a narrative template proves effective, it can be redeployed with variable substitutions (different political candidates, electoral contexts, countries, crises) while maintaining core structure. This reduces the cost of influence campaigns while expanding their lifespan and geographic reach, enabling transnational coordination with minimal adaptation overhead.
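The template-redeployment logic is mechanically simple, which is precisely what makes it cheap to scale. A hypothetical sketch follows; the narrative text and placeholder names are invented for illustration, not drawn from the monitored corpus:

```python
from string import Template

# Hypothetical narrative template: the core delegitimisation structure
# stays fixed while targets and contexts are swapped in.
NARRATIVE = Template("$authority's ruling against $target proves the $context is rigged.")

def redeploy(authority, target, context):
    # Produce a new variant of the template for a different political
    # actor, electoral context, or country.
    return NARRATIVE.substitute(authority=authority, target=target, context=context)
```

A single proven template can thus generate variants for different candidates, elections, or countries at near-zero marginal cost.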

Figure 5a. Manipulation techniques distribution
Figure 5b. Rhetorical figures distribution
Case study: semantic cluster ID_2981

Semantic Cluster ID: ID_2981
Total Disinfo Posts: 4
Content Extract: 'Moldovan citizens are ambivalent toward EU integration. In a referendum held last October, the government managed to secure only a minimal advantage in favour of joining the EU, mainly due to votes from abroad. [...] The decisive support came from the Moldovan diaspora, with authorities opening more than 200 polling stations in EU countries but only two in Russia. Russian Foreign Ministry spokeswoman Maria Zakharova said the referendum and elections demonstrated a 'deep split in Moldovan society'.

[...]

'I think that after the change of power, the members of this Constitutional Court will also resign, and then real lawyers who do not act under the influence of [President] Maia Sandu or some Brussels handlers will take their place. We must achieve this. Last year's referendum is illegitimate; [...] it must be cancelled'
First Seen: 20/09/2025 12:15 UTC
Last Seen: 20/09/2025 13:58 UTC
Duration: 1.5 hrs (synchronous burst, repeated within hours)
Main Category 1: External influence & occupation
Subcategory 1: Foreign puppet master
Main Category 2: Identity & sovereignty
Subcategory 2: Geopolitical public opinion (pro-Russian)

Type / Label / Trigger or Example from Text

Manipulation Techniques
Casting doubt / delegitimisation: 'Last year's referendum is illegitimate...it must be cancelled'
Guilt by association: 'real lawyers who do not act under the influence of [...] Sandu or some Brussels handlers'
Appeal to fear / scapegoating (diaspora): implicit in '[...] deep split in Moldovan society.'
Name calling / labelling: 'Brussels handlers' (implies puppet control)

Rhetorical Figures
Repetition / redundancy: 'illegitimate', 'illegal', 'must be cancelled'
Amplification / exaggeration: suggests the entire democratic process is invalidated by disproportionate diaspora votes
False equivalence: equates voting logistics with foreign control / occupation
Prophetic future-tense assertion: 'After the change of power...they will resign...real lawyers will take their place'
Section 6

PPDA TikTok Network

Case Study in Coordinated Mobilization and Ecosystem Reshaping

Vasile Costiuc, president of the Democratia Acasa (PPDA) political platform, significantly amplified his electoral visibility through a network of affiliated TikTok accounts disseminating manipulative content and disinformation. The TikTok dataset of 2,171 entries, extracted and parsed through the AI-based Factory software deployed by investigative outlet Context.ro (part of the EU FACT Hub), revealed approximately 337 duplicate entries or identical content repeats across multiple accounts—not occasional overlaps but systematic cross-posting indicating central management or coordinated distribution pipelines.

Total TikTok posts analyzed: 2,171
Duplicate/identical cross-posts: 337
Coordinated accounts: 11
Redundant republications: 274

At least 11 TikTok accounts recycled their own content heavily, with nearly 200 unique transcripts reposted, adding up to 274 redundant pushes. Even without cross-account coordination, individual accounts employed content-flooding tactics: reposting identical scripts multiple times to game TikTok's recommendation algorithm, which may interpret repeated posting as a signal of content relevance or viral potential.
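The duplicate counts reported above can be approximated with a normalize-then-hash pass over transcripts; a minimal sketch (account handles and transcripts are hypothetical):

```python
import hashlib
import re
from collections import defaultdict

def norm_key(transcript: str) -> str:
    """Hash a canonical form (lowercased, punctuation stripped, whitespace
    collapsed) so trivially edited reposts collide on the same key."""
    t = re.sub(r"[^\w\s]", "", transcript.lower())
    t = re.sub(r"\s+", " ", t).strip()
    return hashlib.sha1(t.encode("utf-8")).hexdigest()

def find_duplicates(posts):
    """Group (account, transcript) pairs by key; buckets with more than
    one entry are duplicate / cross-post candidates."""
    buckets = defaultdict(list)
    for account, transcript in posts:
        buckets[norm_key(transcript)].append(account)
    return {k: accts for k, accts in buckets.items() if len(accts) > 1}

posts = [
    ("acct_a", "Our farmers are abandoned by the system!"),
    ("acct_b", "our farmers are abandoned by the system"),  # near-identical repost
    ("acct_a", "Vote for change."),
]
dupes = find_duplicates(posts)  # one bucket: acct_a and acct_b cross-post
```

Hashing catches exact and trivially edited repeats; the semantic-similarity methods described below are needed for paraphrased variants.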

Network analysis revealed two distinct coordination patterns. "Twin accounts", pairs posting almost identical content, suggest single-operator control or tightly coordinated management teams. These account pairs function as mirrors, multiplying the visibility of identical messages while creating the illusion of independent validation. "Cluster accounts" form larger groups in which each account maintains strong ties to multiple others through shared scripts, creating dense webs of cross-posting that simulate grassroots mobilization while remaining centrally managed.

Network Analysis Methodology

Account coordination was assessed through multiple metrics: transcript similarity (cosine distance <0.1 indicating near-identical content), temporal correlation of posting patterns (posts within 24-hour windows), engagement pattern analysis (similar like/share ratios suggesting coordinated behavior), and linguistic fingerprinting (consistent phrasing, hashtags, and calls-to-action). Accounts scoring high across multiple metrics were classified as coordinated networks, with validation through manual review of sample content confirming systematic coordination rather than organic content overlap.
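A minimal sketch of the first two metrics described above (near-identical transcripts at cosine distance below 0.1, posting within a 24-hour window), using simple bag-of-words vectors; the engagement-pattern and linguistic-fingerprint metrics are not reproduced here:

```python
import math
from collections import Counter
from datetime import datetime, timedelta

def cosine_distance(a: str, b: str) -> float:
    """1 - cosine similarity over term-frequency vectors (0.0 = identical)."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[t] * vb[t] for t in va)
    na = math.sqrt(sum(c * c for c in va.values()))
    nb = math.sqrt(sum(c * c for c in vb.values()))
    return 1.0 - dot / (na * nb) if na and nb else 1.0

def coordinated(post_a, post_b, max_dist=0.1, window=timedelta(hours=24)):
    """Flag a pair of (text, timestamp) posts as coordination candidates."""
    (text_a, t_a), (text_b, t_b) = post_a, post_b
    return (cosine_distance(text_a, text_b) < max_dist
            and abs(t_a - t_b) <= window)

# Hypothetical posts echoing the synchronous burst documented above.
p1 = ("the referendum is illegitimate and must be cancelled",
      datetime(2025, 9, 20, 12, 15))
p2 = ("the referendum is illegitimate and must be cancelled now",
      datetime(2025, 9, 20, 13, 58))
flagged = coordinated(p1, p2)  # near-identical text, 1h43m apart
```

As the methodology notes, no single metric suffices on its own; a production classifier would require agreement across several such signals before labeling accounts as a coordinated network.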

A significant distinguishing feature was the direct mobilization of users as amplifiers. Audiences were explicitly instructed to repost content, flood TikTok with new accounts, and share clips widely across other platforms. Amplification was framed not merely as engagement but as political struggle: viewers were told they were participating in resistance against a corrupt establishment by spreading messages. This rhetorical strategy transforms passive content consumption into active political participation, creating psychological investment in narrative propagation.

Rhetorical analysis identified 8,587 rhetorical figures across the corpus. Over 1,390 information manipulation techniques were detected, with narratives centered on deep emotional appeals: struggles of local farmers, economic grievances, victimhood and persecution. Many repetitive scripts focused on personal tragedies—family members unable to afford medical care, evictions, modest disability pensions—aiming to trigger emotional responses and cultivate outrage among audiences.

Figure 6a. Costiuc repeat network
Figure 6b. Costiuc/PPDA rhetorical figures


Posts deployed symbols of everyday life and rural identity, asserting nativist tropes: local grapes, honey, and pears contrasted with foreign-imported bananas, producing calibrated victimhood and betrayal narratives demanding regime change and mass mobilization. This symbolic framing resonates with post-Soviet nostalgia while avoiding explicit ideological positioning, enabling appeal across diverse constituencies united primarily by economic grievance and institutional distrust.

Beyond Electoral Politics: Ecosystem Reshaping

This network represents a crucial conceptual shift in understanding contemporary influence operations. PPDA-affiliated accounts are not merely competing in electoral politics but actively working to reshape the entire ecosystem of public discourse—redefining who is trustworthy, which narratives are legitimate, and which forms of civic participation are acceptable.

The discourse moves beyond winning parliamentary seats into socio-political transformation. Civil society organizations are systematically delegitimized as corrupt agents of foreign interests, allegedly funded to protect the regime. Journalists and NGOs are collapsed into a single "system" that ignores popular suffering. This rhetorical strategy positions independent oversight as a collective enemy, undermining the institutional counterweights essential to democratic function.

Watchdog groups are specifically targeted with claims of Soros funding or Western control, recycling longstanding conspiracy narratives that have proven effective across multiple national contexts. By delegitimizing civil society monitoring, these campaigns create a permissive environment for electoral manipulation, knowing that detection and documentation capacity has been preemptively undermined through manufactured distrust.

For PPDA, the parliamentary threshold breakthrough (securing six seats after three failed attempts) demonstrates the effectiveness of platform-engineered grassroots mobilization. Procedural legitimacy now affords the populist platform an institutional foothold to mainstream its narratives beyond digital spaces, potentially normalizing rhetoric that previously occupied fringe positions. This trajectory mirrors patterns observed in Romania's far-right movements, suggesting transnational learning and adaptation of successful tactics.

Vasile Costiuc also featured in the Russian-affiliated Pravda network and its Romanian-language affiliates at least seven times between July and August 2025. These web outlets amplified the same victimhood and persecution narratives circulating on TikTok, portraying Costiuc as silenced or marginalized by politically complicit state institutions. The Pravda network repeatedly cited as sources Telegram channels that are themselves documented disinformation vectors: Triunghiul Basaraben, Sputnik necenzurat, Gagauz News. This layered citation produces the illusion of corroboration while masking a common origin, enabling narrative laundering across platform and domain boundaries.

Section 7

Key Disinformation Actors and Networks

State-Affiliated Seed Actors and Narrative Laundering Infrastructure

Analysis revealed a structured architecture of state-affiliated seed actors, narrative laundering nodes, and local amplification proxies operating across platforms and languages. This multi-layered structure enables deniability while maintaining coordination, with each layer serving distinct operational function within the broader influence ecosystem.

Rybar / Mikhail Zvinchuk
Primary Seed Actor • Russian Military Blogger
1.3 million followers. A documented disinformation channel associated with sanctioned military blogger Mikhail Zvinchuk, tied to Russia's Ministry of Defence. Public EU documents attest to his participation in a high-level working group convened in 2022 by Vladimir Putin to coordinate Russia's mobilization against Ukraine. The channel has expanded its reach through multi-language spinoffs targeting transnational audiences. Zvinchuk maintains a physical presence in Republika Srpska (Bosnia & Herzegovina), conducting media training and Telegram operations courses.
The Islander
Narrative Laundering Node • Anglophone Gateway
Operated by Gerry Nolan and Chay Bowes—both Irish nationals with extensive histories in geopolitical influence operations and direct affiliations with Russian state media (RT/Sputnik). Functions as narrative laundering node, repackaging Russian-origin messaging for Anglophone audiences. Posts referencing Moldova's electoral process amassed over 450,000 views with targeted English-language messaging designed to influence international perception and diaspora communities.

The original voter suppression message invoking the "redrawing of the electoral map by PAS" was seeded by Rybar and then underwent systematic amplification through documented network pathways. Parallel amplification occurred through The Islander channel, demonstrating coordinated multi-platform, multi-language dissemination designed to achieve saturation across distinct audience segments while maintaining the appearance of independent corroboration.

Actor Attribution Methodology

Actor identification and attribution employed multiple verification layers: known sanctions lists and intelligence community designations; previous documentation by investigative journalism outlets; network analysis revealing coordination patterns; linguistic and stylistic fingerprinting; temporal correlation with known Russian state media narratives; and cross-referencing with leaked documents (where available) confirming operational relationships. High-confidence attributions required convergence of at least three independent verification sources.

The Pravda network represents a particularly sophisticated component of this infrastructure, functioning as a bridge between Telegram/TikTok environments and a conventional web presence. Pravda-affiliated domains repeatedly cite as sources Telegram channels that are themselves notorious disinformation vectors, creating nested layers of apparent verification. By moving content from closed platforms (Telegram) through semi-public platforms (TikTok) onto indexed web domains, the network produces multiple discovery pathways while obscuring common coordination.

Figure 7a. Actor network topology Transnistria case

Network topology analysis reveals that certain amplification clusters (DD Geopolitics, Two Majors – English Channel, Eurasia & Multipolarity) recurrently amplify multiple seeding actors, indicating cross-cluster redundancy. This pattern suggests the ecosystem is not only cross-platform and multilingual but densely interconnected, enabling reinforcement and repetition at scale. The effect is strategic saturation of the information environment, particularly around manipulative narratives such as voter suppression, NATO aggression, and Western proxy interference.

Regional Integration and Multi-Theater Operations

The online ecosystem revealed not an isolated country-specific interference effort but a regionally integrated information strategy simultaneously targeting Moldova, Romania, Ukraine, EU/NATO institutions, and other European countries, fused into a single geopolitical battlespace. Elections were systematically reframed from procedural democratic events into externally orchestrated power contests, frequently referenced alongside Romania's 2024 presidential ballot to create a manufactured sense of electoral interdependence or shared illegitimacy.

Allegations of Ukrainian and Western interference abounded, employing reversal tactics to preemptively inoculate against documentation of Russian operations. Claims of external meddling were paired with sustained attacks on diaspora voting processes, depicted as manipulated, externally controlled, or procedurally illegitimate. This multi-vector approach ensures that, regardless of which specific allegations are disproven, a broader atmosphere of suspicion and contested legitimacy persists.

A prominent tactic involved framing Romania's economic challenges as direct cost of supporting Moldova and Ukraine through energy exports, military/humanitarian aid, and refugee assistance. Solidarity was recast as self-inflicted harm, particularly potent across Romanian-language channels and web platforms associated with far-right movements. This economic anxiety narrative proved especially effective given genuine fiscal pressures facing Romanian households, demonstrating how disinformation operations exploit authentic grievances for geopolitical advantage.

Section 8

Narrative Lifecycle and Persistence Patterns

Strategic Latency and Long-Term Scaffolding

While surface-level textual analysis highlights bursts of synchronized posting, closer examination of thematic recurrence reveals a deliberate and sustained strategy of narrative reinforcement through distributed paraphrasing. To differentiate between isolated viral bursts and coordinated long-term amplification efforts, each post was grouped into semantic similarity clusters using an embedding-based cosine-similarity threshold of 0.75, with temporal correlation analysis measuring narrative persistence over time.
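Assuming precomputed post embeddings, the grouping step can be sketched as a greedy single-pass assignment against cluster exemplars. This is an illustrative simplification of embedding-based clustering, and the vectors below are toy 2-D stand-ins for real sentence embeddings:

```python
import math

def cos_sim(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def cluster_posts(embeddings, threshold=0.75):
    """Assign each post to the first cluster whose exemplar it matches
    at or above the similarity threshold; otherwise seed a new cluster.
    Returns one cluster id per input embedding."""
    exemplars, labels = [], []
    for vec in embeddings:
        for cid, exemplar in enumerate(exemplars):
            if cos_sim(vec, exemplar) >= threshold:
                labels.append(cid)
                break
        else:  # no existing cluster is close enough
            labels.append(len(exemplars))
            exemplars.append(vec)
    return labels

vecs = [[1.0, 0.0], [0.9, 0.1], [0.0, 1.0]]  # toy embeddings
labels = cluster_posts(vecs)  # first two posts share a cluster
```

Once posts carry cluster labels, each cluster's timestamps can be analyzed for the temporal lifecycle patterns described next.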

Three distinct temporal lifecycle patterns emerged from this analysis, each serving different strategic functions within the broader influence operation architecture:

Synchronous Bursts (82.5% of clusters) represent rapid-deployment operations where multiple actors post within narrow timeframes, typically under 12 hours. These bursts aim for immediate visibility and agenda-setting, capitalizing on algorithmic amplification mechanisms that privilege recent, rapidly-engaging content. Synchronous bursts prove particularly effective around breaking news or events, enabling disinformation to circulate before authoritative information establishes dominant framing.

Sustained Campaigns (14.4% of clusters) maintain continuous or frequent posting over 12-72 hours, creating a persistent presence in information streams without appearing as an artificial spike. This pattern proves effective for narratives requiring gradual normalization rather than shock impact, for instance complex conspiracy theories that benefit from repeated exposure to achieve familiarity and plausibility.

Episodic Reactivation (3% of clusters) involves baseline narratives that resurface intermittently across weeks or months, often in response to related events or as proactive maintenance of long-term frames. These persistent narratives function as strategic scaffolding—pre-positioned interpretive frameworks that can be rapidly reactivated when relevant events occur, providing immediate context that shapes public interpretation.

Figure 8. Temporal persistence patterns (burst, sustained, episodic) across 693 semantic clusters. Synchronous bursts dominate by frequency (82.5%), but episodic reactivation demonstrates strategic narrative maintenance enabling rapid response to events. Duration is measured from first to last observation within the dataset window.
Temporal Pattern Analysis

Lifecycle classification employed multiple temporal metrics: initial burst intensity (posts per hour in the first 12 hours), persistence duration (time from first to last observation), reactivation intervals (gaps between posting clusters), and decay rates (engagement decline over time). Clusters were classified through a decision-tree algorithm validated against manual review, achieving 91% classification accuracy. Temporal data were UTC-standardized to enable cross-timezone comparison.
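The decision-tree classifier itself is not reproduced here; a rule-of-thumb sketch using only span and reactivation-gap thresholds approximates the three classes. The 12-hour and 72-hour cut-offs follow the pattern definitions above, while the 7-day reactivation gap is an assumed parameter:

```python
from datetime import datetime, timedelta

def classify_lifecycle(timestamps, reactivation_gap=timedelta(days=7)):
    """Classify a cluster's posting timeline:
    span < 12h -> synchronous burst; span 12-72h -> sustained campaign;
    longer spans, or any inter-post gap exceeding `reactivation_gap`,
    -> episodic reactivation."""
    ts = sorted(timestamps)
    span = ts[-1] - ts[0]
    has_gap = any(b - a > reactivation_gap for a, b in zip(ts, ts[1:]))
    if has_gap or span > timedelta(hours=72):
        return "episodic_reactivation"
    if span < timedelta(hours=12):
        return "synchronous_burst"
    return "sustained_campaign"

# Hypothetical cluster timelines.
burst = [datetime(2025, 9, 20, 12, 15), datetime(2025, 9, 20, 13, 58)]
episodic = [datetime(2025, 7, 1), datetime(2025, 8, 15), datetime(2025, 9, 20)]
```

A real classifier would also weigh burst intensity and decay rates, but even this coarse rule separates the dominant burst pattern from the strategically maintained episodic baselines.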

Viral bursts alone do not prove coordination—organic content can achieve similar patterns. However, when identical or near-identical framings recur episodically across discrete time windows, languages, and platforms, they signal templated orchestration combining both automated systems and human oversight. The combination of high semantic similarity (>0.75 cosine) with temporal clustering and cross-platform presence provides strong evidence of coordinated rather than organic dissemination.

This reveals critical insight for election monitoring and threat assessment: coordinated influence campaigns are no longer defined primarily by volume but by controlled repetition with strategic latency. The observed amplification modes—from synchronous bursts to strategically reactivated baselines—demonstrate that influence operations around Moldova's elections were not merely reactive or opportunistic but structured to maintain long-term narrative scaffolding across borders and languages.

Strategic Implications of Persistence Patterns

The persistence pattern analysis challenges conventional threat assessment frameworks that prioritize high-volume, high-velocity disinformation events. While synchronous bursts receive disproportionate attention from fact-checkers and platform moderators due to their visibility, episodically reactivated narratives may pose a greater long-term threat to information integrity.

Episodic narratives function as cognitive infrastructure—pre-positioned interpretive frameworks that shape how audiences process subsequent information. When events occur that could be interpreted through multiple frames, audiences default to familiar narratives already normalized through prior exposure. This creates path dependency in information processing, where initial false framings constrain subsequent interpretation even when contradicted by evidence.

For election monitoring, this suggests the need for longitudinal tracking systems capable of detecting narrative maintenance operations, not just immediate threat assessment focused on volume spikes. Effective response requires understanding not only which narratives circulate but how their persistence creates durable interpretive communities resistant to correction.

The temporal distribution also reveals operational adaptation to platform moderation systems. Burst patterns dominate because they exploit algorithmic amplification windows before moderator review, while episodic reactivation evades pattern detection by introducing temporal gaps that break automated monitoring. This demonstrates sophisticated understanding of both technical systems and human attention limitations, adapting tactics to exploit structural vulnerabilities in content moderation architectures.

Section 9

Regional Integration and Cross-Border Coordination

Transnational Ideological Franchising and Multi-State Operations

Cross-border spillovers proved particularly forceful between Moldova and Romania, with connections between far-right populist and irredentist movements displaying converging agendas and thematic overlaps. George Simion, far-right leader of AUR (Alliance for the Union of Romanians, a political party with organizational spinoffs in Moldova), emerged as a vocal supporter of Vasile Costiuc and his PPDA platform, demonstrating coordinated transnational mobilization rather than parallel but independent movements.

The rhetorical arsenal deployed across both contexts follows remarkably consistent patterns, suggesting either direct coordination or successful ideological franchising. Core themes include: grievances of local farmers and producers positioned as victims of globalized trade; sovereignty assertions framed as "taking back control" from supranational institutions; anti-establishment rhetoric targeting "corrupt elites"; and ethno-nationalist identity claims rooted in defense of "traditional values."

AUR-PPDA Convergence
Transnational Movement Coordination
Alliance for the Union of Romanians (AUR) maintains organizational presence in Moldova through affiliated structures supporting PPDA. Shared platform events, coordinated messaging, and mutual endorsements demonstrate systematic cooperation. Rhetorical analysis reveals over 75% thematic overlap across core campaign messaging, with nearly identical framing of EU integration as sovereignty threat, similar economic nationalism, and parallel anti-establishment positioning.
External Influencer Networks
Transnational Amplification Assets
Ecosystem of foreign influencers and political technologists, some visible, others operating through pseudonymous accounts. Jackson Hinkle (American commentator with documented Russian state media affiliations) actively amplified polarizing narratives during Romania's 2024 elections, subsequently directing similar messaging toward Moldova. Represents ideological franchising—tactics proven effective in one context adapted and deployed in another.
Transnational Ideological Franchising

The circulation of coordinated narratives has relied on an ecosystem of foreign influencers and political technologists operating across multiple national contexts. In 2024, during Romania's presidential elections, Jackson Hinkle, an American commentator openly aligned with Russian state media, played an active role in amplifying polarizing frames around electoral fraud and institutional illegitimacy. Following Romania's elections, similar messaging patterns appeared in Moldova-focused content, suggesting strategic adaptation of successful tactics.

In both cases, unfounded accusations of election fraud and vote theft were deployed preemptively to discredit results before votes were counted, incite civil unrest, and manufacture contested legitimacy regardless of outcome. This represents a form of ideological franchising in which successful disinformation templates are repackaged for local consumption with variable substitutions but a maintained core structure.

The adaptation process demonstrates learning across contexts: tactics that prove effective in one electoral environment are documented, refined, and redeployed in similar contexts. This creates cumulative sophistication, as each iteration incorporates lessons from previous operations' successes and failures. The result is increasingly optimized influence operations that exploit well-understood vulnerabilities in information ecosystems and electoral administration.

An illustrative case of rhetorical camouflage is the Alternative Electoral Bloc (BeA), ostensibly self-declared as pro-European while operating as a pro-Russian conduit aligned with Moscow's strategic interests. The Bloc's leadership includes several controversial figures with documented pro-Russian positioning, yet its campaign messaging adopted European integration rhetoric designed to appeal to moderate voters without alienating core pro-Russian constituencies.

Throughout its campaign, BeA systematically avoided clear positioning on core geopolitical issues: Russia's aggression against Ukraine, Moldova's relationship with NATO, and concrete timelines for EU accession. The adoption of pro-European rhetoric appears to be an electoral tactic: strategic ambiguity enabling simultaneous appeal to contradictory constituencies through carefully calibrated messaging in different contexts and platforms.

"Essentially, 'nothing is what it seems': ideological lines become deliberately blurred, with 'sovereignist' movements reframing Kremlin positions as nationalism or anti-establishment resistance, and self-declared pro-European blocs treading carefully curated ambiguity that obscures external alignment. In practice, such political formations employ camouflage strategies, rebranding hostile agendas in pro-European vernacular to preserve influence under shifting electoral and geopolitical constraints."

The regional integration extends beyond bilateral Moldova-Romania dynamics to encompass broader Eastern European patterns. Similar coordination signatures appear across multiple national contexts: Poland's far-right movements, Slovakia's pro-Russian political shifts, Hungarian government-aligned media operations, and Bulgarian disinformation networks. While each maintains distinct national characteristics, the underlying architecture—coordination mechanisms, funding sources, narrative templates—displays striking consistency.

Cross-Border Coordination Indicators

Regional coordination was assessed through multiple analytical dimensions: temporal synchronization of messaging across national contexts; linguistic analysis revealing translation patterns or shared source materials; network analysis identifying cross-border amplification pathways; funding investigations tracing financial connections; and thematic analysis demonstrating consistent narrative frameworks despite variable national adaptation. High-confidence coordination assessments required convergence across at least three independent indicator categories.
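The convergence requirement can be expressed as a simple gate; the indicator names follow the analytical dimensions listed above, and the boolean inputs are illustrative rather than drawn from the actual dataset:

```python
def coordination_confidence(indicators, required=3):
    """Return 'high' only when at least `required` independent indicator
    categories fire, along with the list of categories that fired."""
    fired = sorted(name for name, hit in indicators.items() if hit)
    return ("high" if len(fired) >= required else "insufficient"), fired

# Hypothetical assessment for one cross-border narrative.
signals = {
    "temporal_synchronization": True,
    "linguistic_overlap": True,
    "network_pathways": True,
    "funding_links": False,
    "thematic_consistency": False,
}
verdict, evidence = coordination_confidence(signals)  # exactly three fire
```

Requiring convergence across independent categories guards against false positives from any single noisy signal, at the cost of missing operations that leave evidence in only one or two dimensions.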

This regional architecture suggests that treating information threats as isolated national problems fundamentally misunderstands the operational environment. Influence operations increasingly function as transnational systems with shared infrastructure, coordinated timing, and mutual reinforcement across national boundaries. Effective response therefore requires corresponding transnational coordination among defenders—shared threat intelligence, coordinated attribution, and synchronized countermeasures.

The implications for policy and research are significant. Current approaches to information integrity largely operate within national jurisdictions, limiting effectiveness against transnational operations. Platform companies, while operating globally, primarily respond to national regulatory frameworks that may not address cross-border coordination. Civil society monitoring similarly tends toward national focus, missing regional patterns that only become visible through comparative analysis.

Future research and operational response must therefore adopt explicitly transnational frameworks, recognizing that Moldova's information integrity cannot be secured in isolation from broader regional dynamics. This requires enhanced cross-border cooperation among researchers, civil society organizations, electoral authorities, and platform companies—sharing data, coordinating analysis, and developing response strategies that match the scale and sophistication of the threats they address.

Section 10

Wikipedia as Information Warfare Vector

Moldova Barometer Dataset and Epistemic Integrity Analysis

The Moldova Barometer dataset offers a unique opportunity for empirical study of Wikipedia as a vector for information warfare and epistemic instability, with a focus on contextualized, election-related narratives. While Wikipedia is commonly assumed to function as a neutral, crowd-sourced knowledge environment, various investigative reports have demonstrated that it is increasingly targeted by coordinated editing efforts, both overt and covert, designed to shape public perception of contested political events.

Although in an incipient phase, the Moldova Barometer dataset offers a quantitative, multi-factor lens for assessing the health and integrity of articles related to Moldova across multiple language editions. The dataset's particular value lies in the convergence of editorial behavior, sourcing quality, and attention dynamics (factors typically studied in isolation), enabling the modeling of multi-dimensional risk vectors for disinformation.

Dataset Construction and Analytical Framework

The Moldova Barometer dataset monitors Wikipedia articles across 25 distinct variables spanning editorial patterns (edit frequency, contributor diversity, reversion rates), sourcing characteristics (citation density, source reliability, circular referencing), and attention metrics (view patterns, edit spikes, temporal clustering). Articles were selected based on relevance to Moldova's electoral context, geopolitical positioning, and historical controversies. Analysis covered articles in Romanian, Russian, English, and other language editions to capture cross-linguistic manipulation patterns.

Three core composite metrics were developed using normalized indicators to summarize various classes of risk, enabling systematic identification of articles exhibiting patterns consistent with coordinated manipulation or information warfare tactics:

Composite Risk Metrics: Methodological Framework

1. Manipulation Risk Score (MRS): Captures potential coordinated editing behavior through six normalized variables: sockpuppet flags, edit spikes, view spikes, edit reversion probability, contributor anonymity ratio, and contributor add/delete ratio. Computed as the mean of the normalized indicators, with scores ranging from 0 to 1, where higher values indicate elevated behavioral risk.

Operational Logic: Disinformation actors typically avoid overt vandalism, instead operating through patterned editorial behaviors designed to simulate organic participation. Sockpuppet networks enable artificial consensus-building; edit spikes correspond to coordinated pushes; high reversion rates indicate contested narratives; anonymity obscures actor attribution; extreme add/delete ratios suggest narrative insertion attempts.
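A minimal sketch of the computation as described: each raw indicator is min-max-normalized across the article set, and an article's MRS is the mean of its six normalized indicators. The indicator values below are illustrative, not actual Barometer data.

```python
def min_max_normalize(values):
    """Scale one raw indicator across all articles to [0, 1]."""
    lo, hi = min(values), max(values)
    return [0.0 if hi == lo else (v - lo) / (hi - lo) for v in values]

def manipulation_risk_score(normalized_indicators):
    """MRS for one article: mean of its six normalized indicators
    (sockpuppet flags, edit spikes, view spikes, reversion probability,
    anonymity ratio, add/delete ratio)."""
    return sum(normalized_indicators) / len(normalized_indicators)

# Normalizing one raw indicator (e.g. edit spikes) across three articles.
edit_spikes = min_max_normalize([2, 10, 6])  # -> [0.0, 1.0, 0.5]

# Hypothetical normalized indicator vector for one article.
article = [0.8, 0.4, 0.6, 0.3, 0.5, 0.4]
mrs = manipulation_risk_score(article)  # ~0.5
```

The mean keeps the score interpretable (each indicator contributes equally), though a weighted variant could privilege stronger signals such as sockpuppet flags.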

Preliminary Findings: Top-ranked MRS articles include "Молдова" (Moldova, Russian edition), Vladimir Plahotniuc (Romanian), Tiraspol (English), and "Dinastia Basarabilor" (Romanian). These pages experienced intense editing by both registered and anonymous users, coupled with sockpuppet flags, suggesting strategic manipulation. However, the elevated activity around Plahotniuc may partially reflect legitimate interest following his extradition from Greece, demonstrating the challenge of distinguishing coordinated manipulation from organic attention around newsworthy developments.

Figure 10a. MRS indicator risk

The Russian-language Moldova article achieved a normalized MRS of 0.62, an elevated behavioral risk profile indicating intense or irregular editorial activity consistent with coordinated narrative shaping. Cross-language comparison revealed that Russian and Romanian editions exhibited systematically higher MRS than English editions for equivalent topics, suggesting language-specific targeting strategies.

2. Sourcing Risk Score (SRS): Evaluates epistemic integrity using three dimensions: citation gaps (uncited claims), suspicious source prevalence (unreliable or partisan sources), and source concentration (over-reliance on limited sources). Wikipedia's sourcing model renders it vulnerable to plausible-sounding claims backed by unreliable or cherry-picked sources, particularly for regional topics with limited journalistic or academic coverage.

Operational Logic: Information warfare on Wikipedia often exploits sourcing vulnerabilities rather than direct content manipulation. By citing obscure, biased, or fabricated sources that superficially meet Wikipedia's verifiability standards, actors can embed false or misleading claims that resist easy detection. Source concentration enables narrative control through strategic selection of sympathetic sources.

Preliminary Findings: Articles with the highest SRS scores showed not only reliance on unreliable sourcing but complete citation voids, particularly for historical and ethnic identity topics. Some high-SRS articles maintained low manipulation scores, indicating that content-based disinformation can occur independently of editorial coordination and underscoring the importance of multi-metric assessment over single-dimension analysis.

Figure 10b. SRS indicator risk

Historical entries and territorial dispute articles proved particularly vulnerable to sourcing manipulation, with several displaying circular citation patterns in which multiple Wikipedia articles in different languages cite each other or derivative sources, creating the illusion of independent verification while obscuring a common origin in partisan or fabricated materials.
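The three SRS dimensions lend themselves to simple per-article formulas. The sketch below assumes equal weighting, a Herfindahl-Hirschman index for source concentration, and a pre-built list of flagged domains; all of these are illustrative choices, not the report's published methodology.

```python
from collections import Counter

def sourcing_risk_score(claims_total, claims_uncited, citations, flagged_domains):
    """Illustrative composite of the three SRS dimensions.

    claims_total    - number of checkable claims in the article
    claims_uncited  - claims lacking any supporting citation
    citations       - cited source domains, one entry per citation
    flagged_domains - domains judged unreliable or partisan
    """
    # 1. Citation gap: share of claims with no citation at all.
    gap = claims_uncited / claims_total if claims_total else 0.0
    # 2. Suspicious-source prevalence: share of citations to flagged domains.
    suspicious = (sum(1 for d in citations if d in flagged_domains) / len(citations)
                  if citations else 0.0)
    # 3. Source concentration: Herfindahl-Hirschman index over cited domains
    #    (1.0 means every citation points at a single source).
    counts = Counter(citations)
    n = len(citations)
    concentration = sum((c / n) ** 2 for c in counts.values()) if n else 0.0
    return round((gap + suspicious + concentration) / 3, 2)

# Hypothetical article: 40 claims, 10 uncited, citations dominated by one outlet.
cites = ["outlet-a.md"] * 8 + ["outlet-b.ru"] * 2
print(sourcing_risk_score(40, 10, cites, flagged_domains={"outlet-b.ru"}))  # → 0.38
```

Note how the concentration term penalizes an article even when every cited source is individually acceptable, capturing the "narrative control through sympathetic sources" pattern described above.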

3. Behavioral Volatility Index (BVI): Assesses editorial irregularities using sporadicity (irregular edit timing), contributor concentration (a small number of editors dominating an article), and the add/delete ratio. The index captures erratic engagement patterns that may signal chaotic or conflict-driven content changes.

Operational Logic: High volatility may indicate either organic controversy around contentious topics or coordinated manipulation campaigns. Distinguishing between the two requires contextual analysis, but elevated BVI serves as a useful screening criterion for identifying articles that warrant closer investigation.

Preliminary Findings: High-BVI pages did not consistently align with high manipulation or sourcing risk scores, indicating that behavioral chaos—such as mass edits clustered around political events or crises—is not inherently manipulative. However, such volatility provides favorable conditions for hostile actor insertion, as irregular activity patterns mask coordinated operations within seemingly organic controversy. Articles experiencing sustained high BVI over extended periods proved more likely to contain sourced but misleading content, suggesting that volatility creates opportunities for strategic narrative insertion.
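The three BVI components can likewise be sketched from an article's revision history. The scaling choices below (coefficient of variation squashed into [0, 1], top-editor share, absolute add/delete imbalance) are assumptions for illustration, not the report's exact definitions.

```python
from statistics import mean, pstdev

def behavioral_volatility_index(edit_times, editors, bytes_added, bytes_deleted):
    """Illustrative composite of the three BVI components.

    edit_times    - edit timestamps in hours, sorted ascending
    editors       - editor id for each edit, same order
    bytes_added / bytes_deleted - total bytes inserted/removed
    """
    # 1. Sporadicity: coefficient of variation of inter-edit gaps, squashed
    #    into [0, 1] (steady cadence -> low, bursty spikes -> high).
    gaps = [b - a for a, b in zip(edit_times, edit_times[1:])]
    cv = pstdev(gaps) / mean(gaps) if gaps and mean(gaps) > 0 else 0.0
    sporadicity = cv / (1.0 + cv)
    # 2. Contributor concentration: share of edits by the most active editor.
    top = max(editors.count(e) for e in set(editors)) if editors else 0
    concentration = top / len(editors) if editors else 0.0
    # 3. Add/delete imbalance: 0 when balanced, approaching 1 when one-sided.
    total = bytes_added + bytes_deleted
    imbalance = abs(bytes_added - bytes_deleted) / total if total else 0.0
    return round((sporadicity + concentration + imbalance) / 3, 2)

# Hypothetical article: a burst of edits after a two-day lull, one dominant
# editor, and heavily insertion-skewed changes.
bvi = behavioral_volatility_index(
    edit_times=[0, 1, 2, 3, 50, 50.5, 51],
    editors=["a", "a", "a", "b", "a", "a", "a"],
    bytes_added=9000, bytes_deleted=1000)
print(bvi)
```

As the findings above caution, a high value from such a formula flags an article for review; it does not by itself establish coordination.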

Figure 10c. BVI
Figure 10d. Correlation matrix
Scatterplot 1 displays a two-dimensional PCA (Principal Component Analysis) projection of the top-10 articles per cluster, based on 12 normalised risk features spanning behavioural manipulation, sourcing gaps, and editorial instability. Each point represents a Wikipedia article/entry, colour-coded by cluster (0–3) derived from K-means clustering. The axes (pca_x and pca_y) capture the principal components that explain the greatest variance in the data, allowing articles with similar risk profiles to group spatially. Cluster 1 (dark blue) captures the most epistemically vulnerable articles, while clusters 0, 2, and 3 display varying combinations of sourcing, behavioural, and manipulation-related risk dimensions. The plot shows how disinformation exposure is distributed across topics and languages, with certain articles forming dense, high-risk groups.
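The PCA-plus-K-means pipeline behind such a scatterplot can be reproduced in a few lines. The feature matrix below is synthetic stand-in data (the real Moldova Barometer features are not published here), and the minimal Lloyd's-algorithm loop stands in for whatever clustering implementation the study used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 12 normalised risk features across 40 articles,
# drawn around four separated centres so that clusters exist to find.
X = np.vstack([rng.normal(loc=c, scale=0.08, size=(10, 12))
               for c in (0.2, 0.4, 0.6, 0.8)]).clip(0, 1)

# --- PCA: project the 12-D risk profiles onto their top two components ---
Xc = X - X.mean(axis=0)                      # centre each feature
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ vt[:2].T                       # (40, 2): the pca_x / pca_y axes

# --- K-means (k=4): a minimal Lloyd's-algorithm loop ---
k = 4
centers = X[rng.choice(len(X), k, replace=False)]
for _ in range(50):
    # Assign each article to its nearest centre, then recompute centres.
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    new = np.array([X[labels == j].mean(0) if (labels == j).any() else centers[j]
                    for j in range(k)])
    if np.allclose(new, centers):
        break
    centers = new

print(coords.shape, np.bincount(labels, minlength=k))
```

Plotting `coords` coloured by `labels` yields exactly the kind of two-dimensional risk map the scatterplot describes, with spatially grouped high-risk articles.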
Cross-Language Comparative Analysis

The Moldova Barometer dataset enables systematic comparison of how identical topics are treated across Wikipedia language editions. This cross-linguistic analysis revealed systematic patterns suggesting that coordinated manipulation efforts target specific language communities. Russian and Romanian editions showed elevated risk scores for politically sensitive topics compared to English editions, likely reflecting strategic prioritization of audiences with direct stakes in Moldova's geopolitical orientation. Articles about Soviet-era history, ethnic minorities, and territorial disputes exhibited the greatest cross-language divergence, with Russian editions consistently framing events from a Moscow-aligned perspective while Romanian editions adopted a pro-European stance—demonstrating Wikipedia's vulnerability to becoming contested terrain for geopolitical narratives.

Temporal analysis revealed that risk scores spiked around key political events—elections, referendums, diplomatic controversies—suggesting responsive manipulation rather than sustained baseline operations. This pattern indicates that actors monitor the political calendar and strategically time Wikipedia editing campaigns to coincide with moments of elevated public attention, maximizing the impact of narrative interventions.
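Event-linked spikes of this kind can be surfaced mechanically with a rolling-baseline z-score over a daily activity series. The window length, threshold, and example series below are illustrative choices, not parameters from the study.

```python
from statistics import mean, pstdev

def spike_days(daily_counts, window=7, z_threshold=3.0):
    """Flag day indices whose activity exceeds the trailing `window`-day
    baseline by more than `z_threshold` standard deviations."""
    flagged = []
    for i in range(window, len(daily_counts)):
        base = daily_counts[i - window:i]
        mu, sigma = mean(base), pstdev(base)
        if sigma > 0 and (daily_counts[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Quiet editing baseline with a hypothetical burst on day 10, e.g. around
# an election-related announcement:
series = [4, 5, 3, 4, 6, 5, 4, 5, 4, 3, 40, 6, 4]
print(spike_days(series))  # → [10]
```

Cross-referencing flagged days against a calendar of political events is then a join, not a judgment call, which makes the "responsive manipulation" pattern auditable.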

Case Study: "Dinastia Basarabilor" Manipulation Campaign

The Romanian-language Wikipedia article "Dinastia Basarabilor" (Basarab Dynasty) exhibited elevated scores across all three risk metrics during the pre-electoral period. Editorial analysis revealed coordinated insertion of historical revisionism positioning the medieval Moldavian principality as distinct from the Romanian historical continuum—a narrative serving contemporary Russian geopolitical interests by undermining claims of Romanian-Moldovan cultural unity.

The manipulation campaign employed sophisticated tactics: citations to obscure Soviet-era historical sources presented as academic authorities; coordinated editing by accounts with limited prior Wikipedia contribution history; strategic timing of major edits during low-activity periods (weekends, holidays) to minimize immediate scrutiny; and rapid reversion of corrections by established Wikipedia editors, creating edit wars that deterred sustained counter-efforts.

This case demonstrates how Wikipedia manipulation extends beyond immediate electoral content to encompass broader historical narratives shaping national identity and geopolitical orientation. By contesting foundational historical claims, actors seek to undermine the legitimacy of contemporary political positions rooted in those historical narratives.

The research trajectory suggested by the Moldova Barometer dataset offers a unique opportunity for empirical study of Wikipedia as an electoral interference vector. The convergence of editorial behavior, sourcing quality, and attention dynamics—factors typically studied in isolation—enables multi-dimensional risk modeling that transcends content-focused approaches. This structural analysis can identify manipulation patterns even when individual edits appear superficially legitimate, providing an early warning system for coordinated information warfare campaigns.

For the Wikipedia community and platform governance, the findings suggest a need for enhanced monitoring of articles related to contested geopolitical situations, particularly during electoral periods. Existing mechanisms focused on vandalism and obvious manipulation prove insufficient against sophisticated campaigns employing subtle narrative shaping through strategic sourcing and coordinated editing within Wikipedia's rule structures. The development of automated risk scoring systems incorporating behavioral, sourcing, and volatility metrics could enable proactive identification of manipulation campaigns before they achieve their desired narrative shifts.

The Wikipedia dimension of Moldova's information environment demonstrates that electoral interference extends far beyond social media platforms and messaging apps to encompass knowledge infrastructure itself. By systematically manipulating crowd-sourced encyclopedia content, actors shape the baseline factual environment within which electoral debates occur, pre-positioning interpretive frameworks that advantage preferred candidates or geopolitical outcomes. Protecting democratic processes therefore requires defending epistemic integrity across all information spaces, including those traditionally assumed to be self-regulating through community governance mechanisms.

Conclusion

Implications for Democratic Resilience

This forensic investigation of Moldova's September 2025 parliamentary elections reveals an information ecosystem characterized by systematic manipulation, coordinated cross-platform operations, and transnational integration unprecedented in scope and sophistication. The documented 5,220 posts, 693 semantic clusters, and 112 million impressions represent not merely quantitative scale but a qualitative transformation in how electoral interference operates in contemporary digital environments.

The research establishes that electoral interference has evolved from communicative persuasion into a geopolitical spatial strategy routed through digital infrastructures. Influence is no longer primarily exerted through territorial boundaries but through networked geographies composed of platform architectures, language corridors, and algorithmically mediated publics. The objective is not persuasion in the traditional sense but participatory deterrence—exhausting civic agency through information overload, manufactured confusion, and erosion of epistemic confidence.

Key findings demonstrate that Telegram functions as the operational core for multi-platform campaigns, that Russian-origin accounts achieve disproportionate reach through systematic amplification infrastructure, and that coordination patterns reveal sophisticated understanding of both algorithmic systems and human cognitive vulnerabilities. The identification of 6,823 manipulation techniques and 3,671 rhetorical figures provides an empirical foundation for understanding how persuasion operates at industrial scale.

The study's contribution extends beyond documenting specific operations to revealing the underlying architecture and operational logic. The three-tier network structure—state-affiliated seed actors, narrative laundering nodes, local amplification proxies—enables deniability while maintaining coordination. Temporal persistence patterns demonstrate strategic narrative scaffolding designed for long-term effect rather than immediate impact. Cross-border integration reveals regional rather than national operational scope.

For policymakers, these findings suggest the need for fundamentally revised approaches to information integrity and electoral security. Current frameworks treating disinformation as a content problem amenable to fact-checking and removal prove insufficient against coordinated operations employing strategic ambiguity, temporal persistence, and cross-platform adaptation. Effective response requires systemic interventions addressing platform architectures, algorithmic amplification, cross-border coordination, and the financial networks enabling sustained operations.

For researchers, the methodology demonstrates the value of integrated approaches combining quantitative network analysis, qualitative content analysis, temporal pattern recognition, and cross-platform correlation. The semantic clustering approach proves particularly valuable for detecting coordinated campaigns that might appear organic when viewed through a single analytical lens. Future research should prioritize longitudinal tracking, comparative analysis across national contexts, and development of automated detection systems incorporating insights from manual forensic analysis.

For journalists and civil society, the documentation provides actionable intelligence for investigative work and advocacy. The identified actors, networks, and tactics enable more effective monitoring and rapid response. The demonstration of transnational coordination underscores the need for cross-border collaboration among watchdog organizations, sharing threat intelligence and coordinating the exposure of influence operations.

Moldova's democratic resilience will depend not only on specific defensive measures but on a broader transformation of information ecosystem governance. This includes platform accountability for algorithmic amplification of manipulative content, transparency regarding recommendation systems and content moderation, international coordination on attribution and response, financial system interventions disrupting funding flows, media literacy initiatives building public resilience, and legal frameworks establishing consequences for coordinated manipulation.

The 2025 parliamentary elections, while producing a pro-European majority, demonstrate that electoral outcomes alone do not resolve underlying vulnerabilities. The documented influence infrastructure remains operational, capable of rapid reactivation around future events. Sustained vigilance, continued research, and coordinated response will prove essential to protecting democratic processes against evolving hybrid threats.