How ‘Chat Control’ Smuggles Authoritarianism Into Your Phone
An Investigative Report
A Whispered Warning and a Déjà Vu
At an Amsterdam café on a grey September morning in 2025, a Dutch mother noticed that her WhatsApp account had been locked. A pop‑up on her phone alleged that she had been reported for sharing inappropriate images of her own newborn. The app’s AI had mistakenly classified the father’s proud photo of their son’s first bath as child sexual abuse material (CSAM). Appeals failed: the platform purged the entire chat history and warned that authorities had been notified computerweekly.com. The experience felt like a dystopian déjà vu. The mother had followed earlier debates about the European Commission’s “Chat Control” proposal and wondered whether her account had become a testing ground for the mass scanning regime she had feared.
Meanwhile, in Brussels, the Council of the European Union adopted Decision (CFSP) 2025/966 on 20 May 2025. Framed as a routine amendment to Russian sanctions, the decision quietly introduced new provisions for “coercive measures” against information warfare eur-lex.europa.eu. It gives Brussels and member states a legal scaffold to target anyone seen as “aligned” with sanctioned entities. This does not just mean oligarchs; it can spill over onto academics, journalists, activists and other civil society actors whose speech or research does not support the mainstream narrative.
At the same time, NATO’s public‑relations arm unveiled a glossy brochure about cognitive warfare—the alliance’s program to “weaponise human cognition” by controlling information flows, discreetly launched in 2022 act.nato.int. In Brussels, these disparate initiatives were presented as unrelated policy tools. Yet a deeper investigation reveals a coherent trajectory.
This report will dissect how the EU’s Chat Control proposal fits into a broader architecture of state and supranational control: NATO’s cognitive warfare doctrine, the EU’s CFSP 2025/966 foreign and security policy decision, and the rollout of digital IDs for internet access. Drawing on dozens of sources, we expose how these frameworks coalesce into a creeping digital panopticon.
The Chat Control Proposal – Client‑Side Scanning as a Mandate
Official Justification and Mechanisms
In 2022, the European Commission proposed the Regulation laying down rules to prevent and combat child sexual abuse (nicknamed Chat Control), which was subsequently revised in 2024 and 2025. Its declared objective is to require online platforms to detect, report and remove CSAM. The draft envisions detection orders—binding orders compelling messaging services (WhatsApp, Twitter/X, Telegram and others) to scan all communications, including encrypted ones, to identify known and “new” child abuse content euronews.com. The Commission argued that voluntary efforts by companies were insufficient and that a mandate would protect children.
However, the proposal goes far beyond targeted investigations. It establishes a regime in which every message, photo and video would be scanned before being encrypted and after being decrypted. This client‑side scanning would run on users’ devices, circumventing end‑to‑end encryption and building a universal surveillance infrastructure. As a parliamentary question to the Commission noted, the proposal “envisages the possibility of scanning mass communications, including encrypted and private chat services” europarl.europa.eu. Automated systems are expected to identify “grooming patterns” and “self‑generated images” using machine‑learning models, then alert platforms and law enforcement.
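To make the mechanism concrete, the sketch below models the flow critics describe: content is inspected on the device against a hash list and a classifier before encryption is applied. Everything here is illustrative; the function names, hash list and threshold are invented for this example, and real deployments would use perceptual hashing and opaque ML models rather than SHA‑256 and a stub.

```python
import hashlib

# Hypothetical illustration of client-side scanning as critics describe it:
# content is inspected on the device *before* end-to-end encryption is applied.
# KNOWN_HASHES, classify_image and report_to_authority are invented for this sketch.

KNOWN_HASHES = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}  # placeholder entry

def classify_image(image_bytes: bytes) -> float:
    """Stub for an on-device ML classifier returning an 'abuse probability'."""
    return 0.0  # placeholder; a real model would be opaque to the user

def report_to_authority(reason: str) -> None:
    print(f"[flagged before encryption] {reason}")

def send_message(image_bytes: bytes, encrypt_and_send) -> None:
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_HASHES:
        report_to_authority("matched known-content hash")
    elif classify_image(image_bytes) > 0.8:   # threshold chosen by the operator, not the user
        report_to_authority("classifier flagged as 'new' abusive content")
    encrypt_and_send(image_bytes)             # encryption happens only after inspection

send_message(b"family photo", encrypt_and_send=lambda data: None)
```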
Critics’ Warnings: False Positives and Function Creep
More than 500 scientists—including cryptographers, digital rights advocates and privacy scholars—issued an open letter on 9 September 2025 warning that Chat Control is “dangerous and technically infeasible” patrick-breyer.de. They argued that scanning all private messages violates the EU Charter of Fundamental Rights, undermines encryption and inevitably produces false positives. Algorithms trained on limited datasets cannot reliably distinguish harmless images from abuse; a father’s photo of his child could be flagged as CSAM. At scale, false positives would inundate authorities, while real predators would easily circumvent detection.
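The scale problem is easy to quantify. The calculation below applies an optimistic detector to an assumed message volume; every figure is an assumption for illustration, not a measurement, yet the conclusion the signatories draw follows from the arithmetic: when genuinely abusive content is rare, even a tiny error rate buries investigators in false alarms.

```python
# Back-of-the-envelope illustration of the base-rate problem.
# All figures below (volume, prevalence, error rates) are assumptions for illustration only.

messages_per_day = 10_000_000_000       # assumed EU-wide daily message volume
prevalence = 1e-6                       # assumed fraction of messages that are actually abusive
false_positive_rate = 0.001             # assumed 0.1% of innocent messages wrongly flagged
true_positive_rate = 0.90               # assumed 90% of abusive messages detected

abusive = messages_per_day * prevalence
innocent = messages_per_day - abusive

true_alerts = abusive * true_positive_rate
false_alerts = innocent * false_positive_rate
precision = true_alerts / (true_alerts + false_alerts)

print(f"False alerts per day: {false_alerts:,.0f}")
print(f"True alerts per day:  {true_alerts:,.0f}")
print(f"Share of alerts that are genuine: {precision:.2%}")
# Under these assumptions, roughly 10 million innocent messages are flagged every day,
# and fewer than 1 in 1,000 alerts concerns real abuse.
```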
The scientists emphasised that once such a system exists, it is impossible to confine it to child protection. Authorities would possess an unprecedented surveillance capability, ripe for expansion to political speech, dissent or whistle‑blowing patrick-breyer.de. The letter observed that similar scanning orders were rejected in earlier draft legislation because they amounted to general monitoring, which European courts have ruled unconstitutional. Yet under the new proposal, national authorities may issue detection orders without judicial oversight, enabling mass surveillance.
The Danish Compromise: Consent Theatre and Certification Illusions
In autumn 2025, Denmark pushed a “compromise” to salvage the regulation. The compromise would require platforms to deploy client‑side scanning technologies certified by an EU centre and to obtain users’ consent to monitoring computerweekly.com. If users refuse, they might be denied access to messaging services. The scanning algorithms would be kept secret and periodically updated. Users would not be informed which types of content are flagged.
Technology experts warned that this compromise still breaks encryption and legitimises universal scanning. Requiring consent is meaningless when the alternative is exclusion from essential communication tools. The scheme presumes that detection algorithms can operate accurately without undermining security. Yet researchers note that any scanning tool integrated into a device becomes a backdoor; attackers could exploit it to intercept communications or plant malicious evidence. The false positive problem remains unsolved computerweekly.com.
Corporate Surveillance and the Meta Scandal
Metadata: The Backdoor in End‑to‑End Encryption
Proponents of Chat Control often claim that scanning is necessary because law enforcement cannot access encrypted communications. In reality, companies already collect vast amounts of metadata—information about who communicates with whom, when, for how long and from which location. Metadata reveals social graphs, behaviour patterns and sensitive habits. The Electronic Frontier Foundation (EFF) notes that even without message content, metadata can “provide all the information law enforcement might need to infer your social graph and behaviour” eff.org. Another EFF briefing points out that metadata can identify not only contact networks but also users’ locations through IP addresses, giving authorities or corporations intimate knowledge of daily routines eff.org.
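A rough sketch of what such metadata analysis looks like in practice appears below; the records, names and fields are invented, and the point is simply that frequency, timing and IP addresses alone expose relationships and routines without any message content.

```python
from collections import Counter
from datetime import datetime

# Minimal sketch of what metadata alone reveals, assuming a provider retains
# (sender, recipient, timestamp, ip_address) records; the records are invented.

call_records = [
    ("alice", "clinic_hotline", "2025-09-01T08:05", "83.12.4.1"),
    ("alice", "clinic_hotline", "2025-09-03T08:07", "83.12.4.1"),
    ("alice", "journalist_bob", "2025-09-03T22:40", "83.12.4.1"),
    ("journalist_bob", "alice", "2025-09-03T22:41", "195.60.1.9"),
]

# Who talks to whom, and how often: the social graph, no message content needed.
edges = Counter((src, dst) for src, dst, _, _ in call_records)

# When and from where: routine and location inferred from timestamps and IPs.
for src, dst, ts, ip in call_records:
    hour = datetime.fromisoformat(ts).hour
    print(f"{src} -> {dst} around {hour:02d}:00 from network {ip}")

print("Strongest ties:", edges.most_common(2))
```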
WhatsApp’s own compliance documents confirm that the platform can provide subscriber information, IP addresses, device details and message logs (who communicated with whom and when) in response to government requests faq.whatsapp.com. Meta’s communications emphasise that the company can share metadata and “information about who you message, at what time and your location” when required about.fb.com. This undermines the narrative that encryption leaves law enforcement helpless. In practice, authorities can access metadata through subpoenas, court orders or data brokers eff.org.
The 2025 Meta Scandal
On 1 April 2025 (no joke), the EU Commission found that Apple and Meta had violated the Digital Markets Act (DMA), concluding that Meta’s “consent or pay” advertising model forced users to consent to cross‑service data combination digital-markets-act.ec.europa.eu. Meta offered EU users a binary choice: allow the company to harvest personal data across Facebook, Instagram and WhatsApp for targeted advertising, or pay a monthly subscription. The Commission ruled that the model failed to provide a “less personalised but equivalent” alternative and prevented users from freely consenting to, or refusing, the combination of their data across services ec.europa.eu.
By November 2024, after negotiations, Meta had introduced an allegedly more privacy‑friendly ad model, but the Commission found that the changes applied only after the DMA’s obligations became legally binding. In June 2025 the Commission warned Meta that it could face periodic penalty payments of up to 5% of its average daily global turnover for continued non‑compliance, pointing out that its new model still forced users into surveillance capitalism reuters.com. The scandal highlighted how tech giants routinely violate data protection rules, demonstrating that self‑regulation fails and that EU institutions struggle to enforce even existing privacy laws.
Profiling, Advertising and Cooperation With States
Meta and other tech companies have built business models on targeted advertising that require aggregating personal data across multiple services. The EFF warns that apps like WhatsApp and Instagram continuously collect metadata and feed it into AI systems that determine ad targeting eff.org. An IEEE analysis notes that despite encryption, Meta can still glean enough metadata to “infer behaviour patterns and enrich user profiles” spectrum.ieee.org. Google, Apple and other platforms engage in similar practices, often purchasing data from brokers or reading unencrypted data via in‑app trackers eff.org.
Law enforcement agencies exploit these corporate surveillance infrastructures. Platforms like Slack, Zoom and Telegram do not consistently use end‑to‑end encryption, meaning administrators or governments can access content eff.org. Even where encryption is used, metadata can be subpoenaed or purchased. The interplay between corporate data collection and state demands has created an ecosystem where privacy is eroded by design. Chat Control threatens to expand this to a new level by legalising direct scanning of private content.
NATO’s Cognitive Warfare Doctrine – Weaponising the Mind
In 2021, NATO’s Allied Command Transformation announced a “Cognitive Warfare” initiative. The concept describes cognition as a battlefield and the brain as both weapon and target act.nato.int. The initiative views individuals as nodes in an information network that can be shaped and influenced to affect decision‑making, trust and behaviour. NATO documents stress that modern warfare involves “deliberate, synchronised military and non‑military activities aiming at affecting the way we think and act” act.nato.int. Adversaries like Russia and China purportedly use psychological operations and disinformation campaigns to erode trust in democratic institutions.
NATO proposes countermeasures: cognitive resilience, information dominance, and responsible integration of AI. These involve identifying and countering harmful narratives, controlling the information environment and shaping public opinion in allied countries. While framed as defence against propaganda, this doctrine effectively legitimises mass monitoring of digital communications to identify and influence citizens’ thoughts.
I have published a full report on NATO’s attempt to control what you think here.
The Chat Control proposal intersects with this doctrine by providing the technical means to surveil and analyse private conversations. If every message is scanned for “harmful” content, governments can develop predictive models of dissent or ideological deviation. As algorithms identify speech patterns, authorities could label certain phrases as extremist or aligned with foreign propaganda, triggering intervention or censorship. This is not speculative; the NATO documents openly describe lawfare and psychological operations as tools to degrade adversaries’ cognitive capacities act.nato.int. Normalising client‑side scanning thus lays the groundwork for cognitive warfare waged against the EU’s own population.
Council Decision (CFSP) 2025/966 – Security Policy and Coercive Measures
On 20 May 2025, the Council of the European Union adopted Decision (CFSP) 2025/966, amending earlier measures against Russia. Hidden within the annex is a list of 21 individuals and six entities targeted for “malign activities.” The preamble justifies the decision by alleging that Russia’s actions threaten EU security and democratic processes. At first glance, the decision appears narrowly focused on sanctions.
Yet the broader CFSP framework outlines mechanisms for restrictive measures, disinformation control and sanctions that have far‑reaching implications. The decision sits within a series of new instruments allowing the Council to impose travel bans, asset freezes and information operations against individuals or organisations accused of “destabilising activities.” The language is vague, enabling its application to whistle‑blowers, journalists or activists who criticise EU policies. By embedding such provisions in CFSP decisions, the EU normalises the conflation of external threats with domestic dissent.
The synergy between the Chat Control regulation and the CFSP regime becomes apparent when we consider enforcement. If messaging platforms are compelled to scan and flag “suspicious patterns,” the data could feed into EU security databases. Individuals deemed to be spreading “malign narratives” may find themselves subject to sanctions, travel restrictions or asset freezes without due process. The CFSP thus provides the coercive apparatus to complement the surveillance infrastructure of Chat Control.
What does this mean for you? In short: Disagree With the Government? Say Goodbye to Your Money!
Digital IDs for Internet Access – From Wallets to Panopticon
The EU Digital Identity Wallet
The EU has touted digital identity wallets as a convenient way for citizens to prove their identity online. The European Digital Identity Framework came into force in May 2024, requiring each Member State to offer at least one EUDI Wallet by 2026 ec.europa.eu. These wallets would allow users to store credentials such as driving licences, diplomas or bank accounts and to sign documents digitally ec.europa.eu. The Commission promises that the wallet will give individuals “control” over their data, enabling them to decide what information to share with service providers.
However, digital rights groups warn that the wallet could become a centralised surveillance tool. An analysis of the Architecture and Reference Framework (ARF 1.4), which underpins the wallet, identifies the introduction of a “Pseudonym Provider”. This entity issues pseudonyms tied to real identities and retains a “unique identifier” that allows authorities to re‑identify users when required epicenter.works. The ARF lacks meaningful safeguards against over‑collection of data: relying parties can ask for more information than necessary, and there is no strict limitation on linking different services epicenter.works. Once pseudonyms are linkable, anonymity disappears and a comprehensive record of online activity becomes possible.
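The linkability concern can be illustrated with a small sketch. It assumes, as epicenter.works describes, that per‑service pseudonyms are derived from a single retained identifier; the key, identifier format and function names are invented for illustration.

```python
import hashlib
import hmac

# Sketch of the re-identification risk, under the assumption that a Pseudonym Provider
# derives per-service pseudonyms from one unique identifier and retains that identifier.
# All names, keys and formats here are illustrative.

PROVIDER_SECRET = b"held-by-the-pseudonym-provider"

def pseudonym(unique_identifier: str, service: str) -> str:
    """Per-service pseudonym: looks unlinkable to services, linkable to whoever holds the secret."""
    msg = f"{unique_identifier}:{service}".encode()
    return hmac.new(PROVIDER_SECRET, msg, hashlib.sha256).hexdigest()[:16]

citizen = "NL-1985-000123"   # the retained unique identifier
p_social = pseudonym(citizen, "social-network")
p_forum = pseudonym(citizen, "political-forum")

print(p_social, p_forum)   # distinct strings: the two services cannot link them...
# ...but the provider (or an authority with access to it) can recompute both from the
# identifier on file and join the citizen's activity across every relying party.
```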
My article “Scanned, Scored, Silenced!” outlines in detail what digital IDs mean for you.
Age Verification and the “Mini Wallet”
In May 2025, the Commission proposed a “mini wallet” to verify users’ ages when accessing social networks and pornographic sites. The system requires individuals to present a government‑issued ID to a third‑party provider or to their telecom operator before browsing. EDRi, a digital rights organisation, warns that this approach will have a chilling effect on access to information and exclude marginalised groups edri.org. People without smartphones or official IDs—such as undocumented migrants or young adults estranged from their families—may be locked out of social networks. The plan also introduces zero‑knowledge proofs as an optional privacy mechanism, but the technology is untested and not mandated edri.org.
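The privacy difference between the two designs can be shown schematically. The sketch below is not any real protocol; it merely models what each verifier learns under an ID‑upload scheme versus an attribute‑only attestation of the kind zero‑knowledge approaches aim to provide.

```python
from dataclasses import dataclass

# Illustrative comparison of two age-verification designs discussed around the "mini wallet".
# This models only what each verifier *learns*; it is not a real protocol.

@dataclass
class IdentityDocument:
    name: str
    date_of_birth: str
    document_number: str

def verify_by_id_upload(doc: IdentityDocument, site: str) -> dict:
    # The third-party verifier learns who is visiting which site, and when.
    return {"site": site, "name": doc.name, "dob": doc.date_of_birth, "doc": doc.document_number}

def verify_by_attestation(is_over_18: bool, site: str) -> dict:
    # An attribute-only attestation (the direction zero-knowledge schemes aim for)
    # reveals a single bit; in the current proposal it is optional and untested.
    return {"site": site, "over_18": is_over_18}

doc = IdentityDocument("A. Citizen", "1990-01-01", "X1234567")
print(verify_by_id_upload(doc, "social-network.example"))
print(verify_by_attestation(True, "social-network.example"))
```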
EDRi emphasises that linking digital identities to internet access transforms the web into a controlled space. Service providers and governments would be able to track not only who uses a service but also when, from where and for how long edri.org. This information could be combined with messaging metadata to build comprehensive behavioural profiles, enabling discrimination and targeted manipulation.
eIDAS 2.0 and Certificate Insertion
The eIDAS 2.0 regulation (Electronic Identification and Trust Services) expands the digital identity framework. Security researchers warn that Article 45 of the regulation allows Member States to insert root certificates into browsers and devices, enabling them to intercept encrypted web traffic cybernews.com. An open letter signed by hundreds of experts warns that this effectively gives governments the technical means to surveil all encrypted communications, undermining trust in TLS certificates and making the wallet a “virtual panopticon” cybernews.com.
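The underlying trust model is simple enough to sketch. TLS clients accept any certificate chain that terminates in a root present in their trust store, so a mandated root can vouch for interception certificates; the structures below are invented placeholders, not real certificates or the regulation’s actual mechanics.

```python
# Conceptual sketch of the Article 45 concern: clients accept any certificate chain that
# ends in a trusted root, so an inserted government root can vouch for interception
# certificates. Names and data structures are invented.

trust_store = {"Browser Root CA A", "Browser Root CA B"}

def chain_is_trusted(chain: list[str]) -> bool:
    """A certificate is accepted if its chain terminates at any trusted root."""
    return chain[-1] in trust_store

legit_chain = ["bank.example", "Public Intermediate", "Browser Root CA A"]
print(chain_is_trusted(legit_chain))          # True: normal TLS

# A state-mandated root is inserted under Article 45-style rules...
trust_store.add("Member State Mandated Root")

# ...and an interception proxy can now present its own certificate for any site.
intercepted_chain = ["bank.example", "Intercept Proxy", "Member State Mandated Root"]
print(chain_is_trusted(intercepted_chain))    # True: the client cannot tell the difference
```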
A report from the German consumer protection agency similarly warns that the EUDI wallet could permit tracking and profiling by governments and corporations if strict privacy safeguards are not implemented biometricupdate.com. The report notes that current specifications allow for collusion between state and commercial actors, with users having limited visibility into how their data is used biometricupdate.com.
Big Tech and Law Enforcement: A Collusive Ecosystem
WhatsApp, Signal and Data Requests
Messaging apps differ in how much data they collect and how they respond to law enforcement. Signal is designed to collect minimal metadata, storing no logs of contacts or timestamps. WhatsApp, in contrast, records contact names, profile pictures, IP addresses and message logs. The company discloses that it provides the “last seen” information and IP addresses associated with messages to authorities faq.whatsapp.com. The AtomicMail security blog notes that because metadata is unencrypted, law enforcement can reconstruct conversation patterns even without message content atomicmail.io.
The EFF explains that network traffic can also be passively monitored to collect metadata. Police may purchase location data from brokers or require service providers to install devices that record connection logs eff.org. Other platforms like Zoom and Slack lack consistent end‑to‑end encryption, enabling corporate administrators to read messages or transmit them to law enforcement eff.org.
Data Brokers, AI and the Black Box of Algorithmic Policing
An emerging surveillance industry combines data from apps, brokers and social media to build predictive models for law enforcement. AI companies scrape public posts and feed them into systems that flag “risky” individuals based on keywords. The Chat Control mandate would normalise algorithmic scanning, providing a steady flow of training data. This data could be used not only to detect CSAM but also to identify “extremism,” “misinformation” or any content flagged by governments. Without transparency, false positives become inevitable.
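A toy example shows why such keyword‑driven flagging misfires; the keyword list and posts are invented, but the failure mode is generic: rules applied to scraped text cannot see context or intent.

```python
# Tiny illustration of why keyword-driven flagging over scraped posts misfires.
# Keywords and posts are invented for this sketch.

RISK_KEYWORDS = {"attack", "encrypt", "protest", "bomb"}

def risk_score(post: str) -> int:
    words = {w.strip(".,!?").lower() for w in post.split()}
    return len(words & RISK_KEYWORDS)

posts = [
    "How to encrypt your backups before travelling",        # security advice
    "Covering the farmers' protest for tomorrow's paper",    # journalism
    "That film was the bomb!",                               # slang
]

for post in posts:
    print(risk_score(post), post)   # every innocuous post scores as 'risky'
```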
The Illusion of Child Protection – Political Deception and Authoritarian Drift
EU officials present Chat Control as a necessary measure to protect children. This rhetorical strategy mirrors the War on Terror’s use of national security to justify intrusive measures. Yet as the open letter from scientists points out, criminals can easily circumvent scanning by using obscure file formats, steganography or custom encryption patrick-breyer.de. Meanwhile, ordinary users become subjects of continuous surveillance.
The political deception lies in framing surveillance as “voluntary.” Under the Danish compromise, users may decline scanning but risk losing access to essential communication tools. This replicates Meta’s “consent or pay” model, where consent is coerced, not freely given. Similarly, the age‑verification scheme pressures users to hand over identity documents to third parties, normalising the notion that anonymous speech is inherently suspicious.
Erosion of Civil Liberties and Democracy
The consequences of this convergence are profound. End‑to‑end encryption, once a guarantee that private communications remain private, becomes meaningless when governments require client‑side scanning. Freedom of association is chilled because people fear that innocuous messages may be misinterpreted. Freedom of expression is curtailed as algorithms flag political discourse as “harmful.” Journalists, whistle‑blowers and lawyers cannot communicate securely. The presumption of innocence is inverted, as everyone is treated as a potential criminal.
Democracy itself is undermined. Trust in institutions erodes when citizens perceive them as surveilling and controlling rather than serving. The CFSP 2025/966 decision demonstrates how easily sanctions and coercive measures can be expanded to domestic dissent. NATO’s cognitive warfare doctrine reveals a mindset that sees populations as targets to be manipulated act.nato.int. The digital ID schemes entrench a society where access to services is contingent on presenting state‑issued credentials, erasing the possibility of anonymous speech edri.org.
Resistance and the Growing Opposition
Civil Society and Scientific Community
Civil society groups such as EDRi, Privacy International, epicenter.works, the EFF and the Chaos Computer Club have spearheaded resistance against Chat Control and related policies. The open letter from scientists lists the countries that support or oppose the regulation and emphasises that Germany’s stance could determine the outcome patrick-breyer.de. These advocates call on the European Parliament to reject client‑side scanning and to uphold the right to privacy.
Researchers at epicenter.works warn that the EUDI wallet’s pseudonym provider will allow re‑identification and mass surveillance epicenter.works. EDRi critiques the mini‑wallet and age‑verification scheme for threatening digital inclusion and chilling free expression edri.org. Cybernews and Biometric Update highlight the risk of eIDAS 2.0 enabling certificate insertion and state surveillance cybernews.com biometricupdate.com.
Political Resistance in Member States
Within the Council of the EU, countries like Germany, Luxembourg, Austria, Poland and the Netherlands have expressed concerns about blanket scanning. Others, such as France, Italy and Denmark, support the measure. Civil liberties committees in the European Parliament argue that detection orders should be targeted rather than universal and that scanning should occur only after a court order. However, the Council holds blocking power, and Denmark’s compromise attempts to appease opposing sides while preserving the core scanning requirement.
The EU’s digital sovereignty narrative frames Chat Control, digital IDs and cognitive warfare as necessary to protect Europe from foreign influence. But critics argue that the real threat is the emergence of a techno‑authoritarian bloc in the West.
Conclusion – Constructing the Panopticon
Returning to our Amsterdam mother, her WhatsApp ban exemplifies the new digital reality. A misclassified image triggered the suspension of her account, and she found herself powerless against an opaque algorithm. Under Chat Control, this scenario could become the norm. The regulation normalises the idea that every communication is subject to surveillance and that privacy is a privilege granted by the state, not a right.
Combined with NATO’s cognitive warfare doctrine, CFSP 2025/966, digital ID wallets, and cooperative big tech platforms, Chat Control paves the way for a totalitarian digital panopticon. The EU is building an infrastructure that conflates child protection, security and public health with universal surveillance. The rhetoric of protecting children and democracy obscures the reality that the architecture being constructed will persist long after the initial justification fades.
If the regulation passes, it will create an irreversible precedent. Once governments and corporations have the power to monitor and classify every message, the scope of monitoring will inevitably expand. Today’s target is CSAM; tomorrow’s could be dissident speech, protest coordination or “harmful content” defined by whichever coalition holds power. The EU must choose whether to uphold the principles of privacy, freedom of expression and democratic accountability or to embrace a model of governance that treats its citizens as adversaries. The choice will shape the future of communication and liberty not only in Europe but worldwide.
Comprehensive Source List (URLs verified Sept. 17, 2025)
* CFSP 2025/966: Disagree With the Government? Say Goodbye to Your Money!
* Digital IDs: Scanned, Scored, Silenced!
* Cognitive Warfare: The Silent Coup Against Your Thoughts
1 computerweekly.com: Suspicionless mass surveillance
2 eur-lex.europa.eu: Council Decision (CFSP) 2025/966
3 act.nato.int: NATO cognitive warfare (only accessible without VPN)
4 euronews.com: Viral posts are fuelling panic that the EU will soon scan text messages
5 europarl.europa.eu: Proposed Chat Control law presents new blow for privacy
6 patrick-breyer.de: ‘Danger to Democracy’: 500+ Top Scientists Urge EU Governments to Reject ‘Technically Infeasible’ Chat Control
7 computerweekly.com: Chat Control: EU to decide on requirement for tech firms to scan encrypted messages
8 eff.org: How Cops Can Get Your Private Online Data
9 eff.org: When Platforms and the Government Unite, Remember What’s Private and What Isn’t
10 faq.whatsapp.com: https://faq.whatsapp.com/808280033839222
11 about.fb.com: Our Approach to Safer Private Messaging
12 digital-markets-act.ec.europa.eu: Commission sends preliminary findings to Meta over its “Pay or Consent” model for breach of the Digital Markets Act
13 reuters.com: Meta may face daily fines over pay-or-consent model, EU warns
14 eff.org: What WhatsApp’s “Advanced Chat Privacy” Really Does
15 spectrum.ieee.org: Meta’s Global Encryption Rollout Ups Privacy Stakes
16 ec.europa.eu: A digital ID and personal digital wallet for EU citizens
17 epicenter.works: eIDAS: Building Trust or Invading Privacy?
18 edri.org: Showing your ID to get online might become a reality
19 cybernews.com: The privacy pitfalls of EU’s eIDAS framework
20 biometricupdate.com: EUDI Wallet sees progress but also criticism
21 atomicmail.io: Is WhatsApp Safe? The Truth About Your Privacy in 2025


