Est. 2026
A Framework for the Digital Age

Digital
Bill of Rights

Your data is being collected, sold, and weaponized — without your knowledge, without your consent, and without any law stopping it. That ends here.

Read the Framework
Follow the Dispatches
The Problem

No one voted on the surveillance economy. It was built through the accumulation of individually permissible decisions — terms of service agreements, procurement contracts, zoning approvals — none of which, individually, looked like the construction of a surveillance state. Collectively, they built one. This is the gap the Digital Bill of Rights addresses. Not the technology. The accountability.

The Articles

I
The Right to Informational Self-Determination

Individuals own the data generated by their lives. No government agency or private entity may collect, share, or sell personal data without informed, specific, and revocable consent — unless authorized by a judicial warrant supported by probable cause. Government purchase of data from commercial data brokers is classified as a search requiring a warrant. Individuals retain the Right of Retroactive Revocation: the right to withdraw consent and require deletion of all raw data and derived records.

II
The Right Against Algorithmic Harm

No consequential decision in housing, employment, credit, healthcare, or criminal justice may be made solely by an automated system. Every person has the right to human review, an explanation of the decision, and an appeal process. The burden of proof falls on the entity deploying the system to demonstrate its accuracy and fairness. An algorithm does not take the stand — a human being must be accountable for the machine's assessment.

III
Surveillance Boundaries

Mass surveillance conducted through private infrastructure requires the same constitutional scrutiny as direct government surveillance. The government may not acquire through commercial purchase what it could not legally compel through a warrant. Technologies deployed at scale to monitor people not under individual suspicion — including facial recognition, automated license plate readers, and cell-site simulators — require authorization equivalent to a warrant.

IV
The Right to an Authentic Information Environment

Individuals have the right to be free from covert algorithmic manipulation designed to alter their political beliefs or emotional states without disclosure. AI-generated content impersonating humans must be clearly labeled at the point of distribution. Platforms engaging in deliberate algorithmic amplification for radicalization or behavioral change are treated as publishers, not neutral conduits, and are liable for the resulting manipulation.

V
The Human Command Requirement

No machine may decide to kill a human being. Any use of lethal force initiated by an autonomous system requires a meaningful human decision point — a person with authority and situational awareness. The legal and moral chain of command must be traceable to an identifiable human commander. The speed of technological capability does not justify removing human judgment or accountability from lethal decisions.

VI
The Right to Digital Due Process

Predictive profiling and secret algorithmic classifications do not override due process. Individuals have the right to be notified if placed on government watchlists or restricted in travel based on algorithmic scores, and a meaningful opportunity to contest those classifications. Trade secret protections for algorithmic methodologies cannot be used to deny a person the right to confront evidence against them.

VII
Children's Digital Sovereignty

No data may be collected from anyone under 18 without verified parental consent. Behavioral targeting directed at children is prohibited. Platforms bear strict liability for psychological harm caused by design choices intended to maximize engagement, such as variable reward schedules and sleep-disrupting notifications. These are questions of product liability, not matters to be waived in a terms-of-service agreement.

VIII
Cognitive Sovereignty

The interior territory of the mind is protected. Cognitive signals — brainwave activity, emotional states, attention patterns — require explicit, granular consent for collection or transmission, separate from standard terms of service. Cognitive and physiological data may not be used for targeted advertising or as inputs for consequential decisions including employment screening, insurance underwriting, or credit determination.

IX
The Right to Compensation for Commercial Data Use

Personal data is a form of labor. Individuals are entitled to a share of the commercial value their data generates. Companies must offer three opt-in tiers: a Revenue Share Tier with direct payment for data use, a Fee Reduction Tier with reduced subscription costs for limited disclosed data use, and a Full Privacy Tier with no data collection. Commercial entities must publish an annual data valuation disclosure showing how much revenue is attributed to user data.

Dispatches

Subscribe on Substack →

Who Authorized This?

No one asked the American people whether they wanted to live inside a surveillance economy. It's time to ask.

Join the Movement