Surveillance Capitalism Explained: Why Your Data Is So Profitable

Discover how ‘free’ apps really make money: surveillance capitalism turns your clicks, location, social interactions, and browsing history into profit, and your privacy is the price.
A clear breakdown of the modern data economy and its impact on autonomy, democracy, and human behavior.

You are being watched. Every website you visit, every product you view, every message you send, every place you go—it’s all being recorded, analyzed, packaged, and sold.

This isn’t paranoia. It’s the business model of the modern internet.

In Enshittification, Cory Doctorow explains that surveillance capitalism—the systematic collection and monetization of personal data—isn’t a side effect of digital services. It’s the foundation that enables platform monopolies to exist, grow, and exploit users. Breaking enshittification requires breaking the surveillance economy that funds it.

The Surveillance Business Model: How “Free” Services Make Billions

When a service is free, you’re not the customer—you’re the product being sold.

Facebook, Google, TikTok, Instagram, and most major platforms operate on the same model: collect massive amounts of data about users, use that data to target advertising with precision, and sell access to user attention to advertisers willing to pay for guaranteed eyeballs.

This model produces staggering profits. Google’s parent company Alphabet generates over $200 billion annually, primarily from advertising. Meta (Facebook) generates over $100 billion. These aren’t technology companies—they’re surveillance companies that happen to use technology.

The data collection is comprehensive and intrusive:

Behavioral Tracking: Every click, scroll, hover, and pause is recorded. Platforms track how long you look at each post, which ads you view, what makes you engage, what makes you angry, what makes you click.

Location Surveillance: Your phone reports your location constantly—not just when you use maps. Platforms know where you live, where you work, where you shop, where you worship, where you seek medical care, and everywhere else you go.

Social Graph Mapping: Platforms know everyone you communicate with, how often, about what topics. They know your family relationships, friendships, professional connections, and romantic interests. They know your social network better than you do.

Predictive Profiling: Platforms use collected data to build predictive models of your behavior, preferences, vulnerabilities, and future actions. They know what you’re likely to buy, which political messages will influence you, when you’re most susceptible to manipulation.

Cross-Platform Tracking: Even when you’re not directly using Facebook or Google, they track you across the web through embedded pixels, advertising networks, and login widgets. Your browsing history is never private.

Purchase History: Platforms track every product you view, research, compare, and purchase. They know your buying patterns better than you remember them yourself.

This data collection serves one purpose: maximizing the value of your attention to advertisers. The more platforms know about you, the better they can manipulate your behavior and guarantee results to advertisers.
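To make the mechanics above concrete, here is a minimal sketch, in TypeScript, of the kind of third-party tracking script a page might embed. The domain, cookie name, and event fields are hypothetical, and real ad-network scripts are far more elaborate, but the basic pattern is the same: a persistent cross-site identifier plus a stream of behavioral events reported back to the tracker.

```typescript
// Hypothetical third-party tracking beacon. The domain, cookie name, and
// event fields are invented for this sketch; real scripts are more elaborate.

// A persistent identifier stored in a cookie lets the ad network recognize
// the same browser on every site that embeds this script.
function getOrCreateTrackingId(): string {
  const match = document.cookie.match(/(?:^|; )_track_id=([^;]+)/);
  if (match) return match[1];
  const id = crypto.randomUUID();
  // Long-lived cookie, sent with every request to the tracker's domain.
  document.cookie = `_track_id=${id}; max-age=31536000; path=/; SameSite=None; Secure`;
  return id;
}

// Every page view and interaction is reported with enough context
// (URL, referrer, timing) to reconstruct a browsing history.
function reportEvent(eventType: string, detail: Record<string, unknown> = {}): void {
  const payload = {
    trackingId: getOrCreateTrackingId(),
    eventType,                    // "pageview", "click", "scroll", ...
    url: location.href,           // which page you are on
    referrer: document.referrer,  // which page you came from
    timestamp: Date.now(),
    ...detail,
  };
  // sendBeacon delivers the data even if the user navigates away mid-request.
  navigator.sendBeacon(
    "https://track.example-adnetwork.invalid/collect",
    JSON.stringify(payload)
  );
}

// Wire the beacon to ordinary browsing behavior.
reportEvent("pageview");
document.addEventListener("click", (e) =>
  reportEvent("click", { target: (e.target as HTMLElement | null)?.tagName })
);
```

Because the same script and cookie load on every site that carries the network’s ads, pixels, or login widgets, that one identifier stitches your activity across all of them into a single browsing history.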

Why This Is Harmful: Beyond “Nothing to Hide”

Many people dismiss privacy concerns with “I have nothing to hide.” This profoundly misunderstands the harm of surveillance capitalism.

Manipulation and Behavioral Control

Surveillance data doesn’t just enable targeted advertising—it enables behavioral manipulation. Platforms use psychological insights derived from your data to show you content designed to provoke specific emotional responses and actions.

Facebook’s own research demonstrated that it could make users feel happier or sadder by manipulating which posts appeared in their feeds. This was an experiment—they did it deliberately to see if emotional manipulation worked. It did.

This manipulation extends to:

Political Influence: Platforms can target political ads with precision, showing different messages to different voters based on their vulnerabilities and susceptibilities. This micro-targeting undermines democratic discourse by preventing public accountability—everyone sees different messages, making it impossible to have shared political conversations.

Commercial Exploitation: Platforms know when you’re most likely to make impulse purchases, when you’re feeling insecure, when you’re vulnerable to status-driven consumption. They time ads and recommendations to maximize sales, regardless of whether purchases serve your interests.

Addiction Engineering: Platforms design their services to be maximally addictive using data about what triggers compulsive usage. This isn’t accidental—it’s the explicit goal. More time on platform means more ad impressions and more data collection.

Discrimination and Bias

Surveillance data enables sophisticated discrimination that would be illegal if done explicitly:

Housing Discrimination: Facebook allowed advertisers to exclude people from seeing housing ads based on race, religion, and other protected characteristics. This is straightforward housing discrimination, laundered through algorithmic targeting.

Employment Discrimination: Job ads can be targeted to exclude older workers, women, or minorities—targeting options Facebook’s ad platform long permitted even though they violate employment discrimination law.

Credit and Insurance: Data brokers compile detailed profiles used to determine creditworthiness and insurance rates. These profiles reflect and amplify existing inequalities, disadvantaging already marginalized groups.

Criminal Justice: Surveillance data feeds predictive policing algorithms that direct law enforcement to over-police communities of color while under-policing white neighborhoods, creating self-fulfilling prophecies of criminality.

Power Asymmetry

The fundamental harm of surveillance capitalism is the power imbalance it creates. Platforms know everything about you; you know almost nothing about them.

This asymmetry enables exploitation:

  • Platforms can manipulate you because they know your vulnerabilities
  • Platforms can price discriminate, charging different users different amounts based on willingness to pay
  • Platforms can suppress competition by using data advantages that startups can’t replicate
  • Platforms can evade accountability because users can’t verify what data is collected or how it’s used

Doctorow emphasizes that this power imbalance is the foundation of enshittification. Without surveillance data, platforms couldn’t achieve the lock-in and market dominance that allows them to degrade service while retaining users.

Privacy Law Failure: Why Current Protections Don’t Work

The United States has weak, fragmented privacy protection compared to other developed democracies. This isn’t accidental—it reflects successful lobbying by companies that profit from surveillance.

No Comprehensive Federal Privacy Law: Unlike Europe, which has the GDPR, the US has no general federal privacy law (California’s CCPA covers only one state). Instead, there are sector-specific laws (HIPAA for health, COPPA for children, FCRA for credit) that leave massive gaps.

Consent Theater: Platforms comply with minimal requirements through deliberately confusing privacy policies that no one reads and consent dialogs designed for user agreement rather than informed choice. “Agreeing” to surveillance isn’t meaningful consent when the alternative is being excluded from essential services.

No Private Right of Action: Most privacy laws don’t allow individuals to sue for violations. Only government agencies can enforce them, and those agencies are understaffed and outgunned by platform legal teams.

Weak Penalties: Fines for privacy violations are trivial compared to surveillance profits. Facebook faced a $5 billion FTC fine for privacy violations—enormous by historical standards but representing less than one month of revenue.

Data Broker Exemptions: Data brokers—companies that collect and sell personal information without direct user relationships—operate largely outside privacy regulation. They compile detailed profiles purchased by advertisers, insurers, employers, and government agencies.

The European Alternative: GDPR Shows What’s Possible

The European Union’s General Data Protection Regulation (GDPR) demonstrates that comprehensive privacy protection is achievable:

Consent Must Be Meaningful: Users must affirmatively opt in to data collection, with a clear explanation of how the data will be used. Pre-checked boxes and bundled consent don’t count.

Data Minimization: Companies can only collect data necessary for specified purposes. Collecting everything just in case is prohibited.

Right to Access and Deletion: Users can request copies of all data companies hold about them and demand deletion. Companies must comply within strict timeframes.

Data Portability: Users can export their data and move it to competing services. This reduces lock-in and enables competition.

Significant Penalties: GDPR violations can result in fines up to 4% of global revenue—enough to actually matter to even the largest platforms.

Enforcement by Regulators: European data protection authorities have levied billions in fines against Google, Meta, and other platforms for GDPR violations.

GDPR isn’t perfect—enforcement is uneven, loopholes exist, and platforms game requirements. But it demonstrates that surveillance capitalism can be constrained through regulation when political will exists.
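To illustrate what the access, portability, and deletion rights above mean in practice, here is a minimal sketch in TypeScript. The data model and function names are hypothetical, and a real implementation would also have to cover backups, logs, and data shared with third parties.

```typescript
// Hypothetical sketch of GDPR-style access, portability, and deletion.
// The data model and function names are invented for illustration.

interface UserRecord {
  email: string;
  profile: Record<string, unknown>;
  activityLog: Array<{ at: string; action: string }>;
}

// Stand-in for a real datastore.
const store = new Map<string, UserRecord>();

// Right to access / data portability: hand the user everything held about
// them in a structured, machine-readable format they can take elsewhere.
function exportUserData(userId: string): string {
  const record = store.get(userId);
  if (!record) throw new Error("unknown user");
  return JSON.stringify(record, null, 2); // portable JSON export
}

// Right to deletion: erase the record entirely, not just hide it.
function deleteUserData(userId: string): void {
  store.delete(userId);
}

// Example usage.
store.set("u123", {
  email: "user@example.invalid",
  profile: { language: "en" },
  activityLog: [{ at: "2024-01-01T00:00:00Z", action: "login" }],
});
console.log(exportUserData("u123"));
deleteUserData("u123");
```

The point of the portable, machine-readable export is that a competing service can import it, which is what turns a paper right into a real check on lock-in.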

What Real Privacy Protection Requires

Doctorow argues that meaningful privacy protection needs more than incremental reform—it requires fundamentally restructuring the surveillance economy.

Ban Surveillance Advertising

Prohibit the collection and sale of personal data for advertising targeting. This is the nuclear option that directly attacks the surveillance business model.

Platforms could still sell advertising—contextual advertising based on the content users are viewing rather than behavioral targeting based on comprehensive surveillance. This is how advertising worked before surveillance capitalism, and it funded plenty of profitable businesses.
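The difference is easy to see in code. The following sketch, with a hypothetical ad inventory and matching logic, contrasts the two approaches.

```typescript
// Hypothetical sketch contrasting behavioral targeting with contextual ads.
// The inventory, profile fields, and matching logic are invented.

interface Ad {
  id: string;
  topics: string[];
}

const inventory: Ad[] = [
  { id: "hiking-boots", topics: ["outdoors", "travel"] },
  { id: "budgeting-app", topics: ["finance"] },
];

// Behavioral targeting: the ad is chosen from a surveillance-derived profile
// of the person, regardless of what page they are currently reading.
function pickAdBehavioral(profile: { interests: string[] }): Ad {
  const match = inventory.find((ad) =>
    ad.topics.some((t) => profile.interests.includes(t))
  );
  return match ?? inventory[0];
}

// Contextual advertising: the ad is chosen from the content of the page
// itself. No identity, history, or cross-site profile is needed.
function pickAdContextual(pageTopics: string[]): Ad {
  const match = inventory.find((ad) =>
    ad.topics.some((t) => pageTopics.includes(t))
  );
  return match ?? inventory[0];
}

// A hiking article gets a hiking-boots ad for every reader,
// with no data collected about any of them.
console.log(pickAdContextual(["outdoors", "gear-reviews"]).id);
console.log(pickAdBehavioral({ interests: ["finance"] }).id);
```

The contextual version needs nothing but the page’s own topics, so there is no profile to build, leak, or sell.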

Mandatory Data Minimization

Require companies to collect only the data necessary for the services they provide. If you’re providing an email service, you need email addresses—you don’t need location history, browsing behavior, or purchase records.
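As a hypothetical illustration (the field names are invented), a data-minimization rule for that email-service signup might look like this:

```typescript
// Hypothetical data-minimization rule for an email-service signup flow.
// All field names are invented for illustration.

// Minimized: only what the service actually needs in order to deliver email.
interface MinimalSignup {
  emailAddress: string;
  passwordHash: string;
}

// What over-collecting signup flows often demand "just in case"; none of it
// is needed to route and store someone's mail.
interface OverCollectedSignup extends MinimalSignup {
  phoneNumber: string;
  birthDate: string;
  preciseLocation: { lat: number; lon: number };
  uploadedContacts: string[];
}

// A minimization rule is simple to state in code: copy only the
// allow-listed fields and drop everything else at the door.
function minimize(input: Record<string, unknown>): MinimalSignup {
  return {
    emailAddress: String(input.emailAddress ?? ""),
    passwordHash: String(input.passwordHash ?? ""),
  };
}
```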

Meaningful Consent Requirements

Ban dark patterns and manipulative design in privacy choices. Require clear, simple explanations of data collection. Make privacy the default, requiring active opt-in for any data sharing.

Private Right of Action

Allow individuals to sue companies for privacy violations with statutory damages significant enough to deter violations. Class action lawsuits create accountability that regulatory enforcement alone can’t achieve.

Data Deletion Requirements

Mandate deletion of personal data after specified periods unless users explicitly consent to continued retention. Data should expire by default, not persist indefinitely.

Prohibit Data Brokers

Ban the collection and sale of personal information by companies without direct user relationships. If someone didn’t choose to give you their data, you shouldn’t have it.

Liability for Data Breaches

Hold companies liable for harm caused by data breaches when they failed to implement adequate security. The current model—companies profit from collecting data, then disclaim responsibility when that data leaks—must end.

The Intersection of Privacy and Antitrust

Doctorow emphasizes that privacy protection and antitrust enforcement work together:

Data Monopolies Enable Platform Monopolies: Exclusive control of user data creates barriers to entry that protect monopolies. New competitors can’t replicate the data advantages of established platforms. Mandating data portability and interoperability reduces these barriers.

Monopolies Resist Privacy Protection: Platforms with monopoly power can ignore user privacy preferences because users have nowhere else to go. Competition on privacy only works when users can switch to more privacy-protective alternatives.

Surveillance Funds Predatory Pricing: Platforms use surveillance advertising revenue to subsidize below-cost services that crush competitors, then recoup profits through monopoly power. Constraining surveillance reduces their ability to engage in predatory pricing.

Breaking surveillance capitalism requires attacking both monopoly power and the surveillance business model. Neither alone is sufficient.

What You Can Do While Demanding Policy Change

Individual actions can’t solve structural problems, but they’re not meaningless:

Use Privacy Tools: Ad blockers, privacy-focused browsers, VPNs, and encrypted messaging reduce surveillance. These aren’t perfect solutions, but they raise the cost of tracking.

Minimize Platform Use: Delete apps you don’t need. Limit time on surveillance platforms. Use privacy-protective alternatives where they exist.

Reject “Nothing to Hide”: Push back against dismissive attitudes toward privacy. Everyone has things they don’t want corporations tracking, even if those things are perfectly legal and morally acceptable.

Support Privacy Legislation: Contact representatives about comprehensive privacy protection. Join organizations like the Electronic Frontier Foundation advocating for digital rights.

Vote with Your Wallet: Support businesses with privacy-protective practices. Avoid services that profit primarily from surveillance.

Most importantly, recognize that surveillance capitalism isn’t inevitable and that the privacy crisis isn’t your responsibility to solve individually. You’re not failing because you can’t avoid surveillance—you’re being surveilled because policy allows companies to profit from it.

Changing that requires political action to constrain corporate surveillance, not individual perfection in avoiding tracking.

The Bottom Line: Surveillance Enables Enshittification

You can’t understand enshittification without understanding surveillance capitalism. The data asymmetry created by comprehensive surveillance gives platforms the power to manipulate users, lock in customers, crush competitors, and degrade service while maintaining monopoly control.

Breaking enshittification requires breaking surveillance capitalism. That means comprehensive privacy protection, bans on surveillance advertising, mandatory data minimization, and meaningful enforcement against violations.

The platforms won’t reform themselves. Surveillance is too profitable, and they’ve invested too much in the infrastructure of tracking. Only policy change can constrain them—and only political will can create policy change.

But the tools exist. The models exist. Europe has demonstrated that comprehensive privacy protection is achievable. What’s needed is the courage to implement it in the United States and globally.

Your privacy matters. Not because you have something to hide, but because privacy is power—power to think freely, associate freely, and live without constant corporate surveillance.

Reclaiming that power requires recognizing surveillance capitalism for what it is: theft of human autonomy for corporate profit. And demanding that policymakers end it.

This article is based on “Enshittification” by Cory Doctorow
