The Path to Enshittification: How Ad-Driven Models Poison Information and Society

In the digital age, much of what we consume online is shaped by algorithms tuned to maximize one thing: ad revenue. Platforms like Facebook, Google, and YouTube have built empires by leveraging user data to deliver hyper-targeted ads. This business model, while profitable for these corporations, has a dark side—a phenomenon Cory Doctorow aptly termed enshittification. This describes the degradation of digital spaces and content as platforms prioritize profit over user experience, well-being, and the truth itself.

As these platforms evolve, their relentless focus on ad revenue has led to a toxic information ecosystem. At its core, this model feeds us content not because it’s accurate or beneficial, but because it keeps us scrolling, clicking, and consuming. It’s a system that thrives on outrage, sensationalism, and polarization—traits that undermine democracy, public discourse, and mental health. The question we must grapple with is whether an alternative is possible—and whether society is ready to embrace it.

How Big Tech’s Business Model Harms Society

Exploitation of User Data

Big Tech platforms collect massive amounts of user data to refine their ad-targeting capabilities. This data isn't just harvested for convenience—it’s a goldmine that advertisers pay billions to access. This monetization of personal information raises significant privacy concerns and turns users into products, eroding trust.

Algorithmic Manipulation

Algorithms prioritize engagement over everything else, pushing divisive and emotionally charged content to the forefront. The result is an amplification of misinformation, conspiracy theories, and propaganda, which can destabilize societies and erode democratic values.

Monopolistic Practices

With their dominance, companies like Google and Facebook stifle competition, limiting the rise of alternative platforms or models. This monopoly ensures their profit streams remain unchallenged, perpetuating the cycle of enshittification.

Erosion of Mental Health

Social media platforms exploit psychological vulnerabilities to keep users hooked. This “attention economy” fosters anxiety, depression, and addiction, further contributing to societal harm.

Are We Fed Up? The Case for Change

For decades, these issues have festered, yet the ad-driven model remains ubiquitous. Why? Because it's free—or at least it appears to be. However, the hidden costs—privacy invasion, societal polarization, and mental health crises—are too high to ignore.

The growing disillusionment with Big Tech is evident. Movements like data privacy advocacy, regulatory crackdowns, and the rise of decentralized platforms signal a desire for alternatives. But can we truly escape this trap?

Can P2P Technologies Like IPFS and Blockchain Offer a Way Out?

Peer-to-peer (P2P) technologies such as IPFS (InterPlanetary File System, including frameworks like Iroh) and blockchain networks like Polkadot represent a promising alternative to the centralized, ad-driven internet dominated by Big Tech. These systems are designed to be decentralized and community-owned, a network that no single corporation can own, making them resilient to corporate monopolization. However, their potential to revolutionize content monetization and information dissemination depends on addressing key challenges, particularly Sybil attacks and the risks of exploitation by bad actors or grifters.

One possibility is to use reputation-based and public consensus mechanisms for monetization. Unlike engagement-based monetization, which rewards sensationalism and manipulation, reputation-based systems could prioritize quality, trustworthiness, and community approval. Significant decisions, such as funding or content validation, would require at least 51% consensus, and the system could be hardened against Sybil attacks, where bad actors use fake identities to manipulate outcomes, through methods such as staking and KYC. With modern cryptographic tools and algorithms, these mechanisms can be made secure, scalable, and transparent.
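As a rough illustration of the idea, here is a minimal sketch of a stake-weighted approval vote. All names (Voter, proposal_approved, MIN_STAKE) are hypothetical and not part of any specific chain's API; the point is only to show why weighting votes by locked stake blunts Sybil attacks.

```python
from dataclasses import dataclass

MIN_STAKE = 10             # minimum stake required to vote (Sybil deterrent)
APPROVAL_THRESHOLD = 0.51  # 51% of eligible stake must approve

@dataclass
class Voter:
    voter_id: str
    stake: int      # tokens locked up; faking identities costs real stake
    approves: bool

def proposal_approved(voters: list[Voter]) -> bool:
    """Approve only if at least 51% of eligible stake votes yes.

    Weighting by locked stake (rather than one-account-one-vote)
    makes a Sybil attack expensive: spinning up fake identities
    adds no voting power without additional stake behind them.
    """
    eligible = [v for v in voters if v.stake >= MIN_STAKE]
    total_stake = sum(v.stake for v in eligible)
    if total_stake == 0:
        return False
    yes_stake = sum(v.stake for v in eligible if v.approves)
    return yes_stake / total_stake >= APPROVAL_THRESHOLD

votes = [
    Voter("alice", stake=100, approves=True),
    Voter("bob", stake=60, approves=False),
    Voter("sybil-1", stake=1, approves=False),  # below MIN_STAKE: ignored
]
print(proposal_approved(votes))  # 100/160 = 62.5% yes -> True
```

A real system would add vote privacy, slashing for misbehavior, and identity checks, but the core economics are already visible in this toy version.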

Overcoming the Stigma Around P2P

P2P technologies have been criticized for enabling scams and grifters, partly because of their association with poorly regulated projects and bad actors. Yet, these problems are not inherent to the technology—they stem from design flaws and lack of oversight. Algorithms already exist to mitigate these risks, such as proof-of-stake systems, decentralized identity protocols, and zero-knowledge proofs, which can ensure fairness and accountability without compromising decentralization.
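One building block behind several of those techniques is the commitment scheme. The sketch below shows a commit-reveal flow, a far simpler cousin of full zero-knowledge proofs; the function names are illustrative, not from any particular library. Voters first publish a hash commitment to their choice, then reveal the choice and salt after the voting window closes, so nobody can copy, front-run, or retroactively change a vote.

```python
import hashlib
import secrets

def commit(choice: str, salt: bytes) -> str:
    """Return a binding, hiding commitment to `choice`."""
    return hashlib.sha256(salt + choice.encode()).hexdigest()

def verify(commitment: str, choice: str, salt: bytes) -> bool:
    """Check that a revealed (choice, salt) pair matches the commitment."""
    return commit(choice, salt) == commitment

# Commit phase: publish only the hash.
salt = secrets.token_bytes(16)
c = commit("approve", salt)

# Reveal phase: publish choice + salt; anyone can verify.
print(verify(c, "approve", salt))  # True
print(verify(c, "reject", salt))   # False: tampering is detectable
```

Fairness here comes from ordinary hashing, not exotic cryptography, which is exactly the point: accountability mechanisms need not compromise decentralization.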

To change the narrative, developers must focus on designing systems that prevent abuse while preserving the core values of decentralization. This requires shifting away from the "monetize engagement" mindset, which has plagued Big Tech, and instead rewarding meaningful contributions and community trust.

Building on blockchain comes with inherent constraints, including limited capacity for heavy computation and storage. Similarly, off-chain computing methods like zero-knowledge proofs, while powerful for privacy, remain slow and computationally intensive: even basic computations can take hours to prove, which makes for a challenging user experience. However, these limitations are not insurmountable. Systems designed around these constraints, for example by pushing heavy work off-chain and keeping only cheap verification on-chain, can offer trade-offs that are user-friendly enough.
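The compute-off-chain, verify-on-chain pattern rests on an asymmetry: producing an answer can be expensive while checking it is cheap. The toy example below uses integer factoring purely as an illustration of that asymmetry (the function names are hypothetical); zero-knowledge proofs exploit the same gap at vastly larger scale.

```python
def factor_off_chain(n: int) -> tuple[int, int]:
    """Expensive work done off-chain by an untrusted worker:
    trial division to find a nontrivial factorization."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no nontrivial factors")

def verify_on_chain(n: int, p: int, q: int) -> bool:
    """Cheap check a smart contract could afford to run:
    a single multiplication, regardless of how hard finding
    the factors was."""
    return 1 < p < n and 1 < q < n and p * q == n

n = 1009 * 1013                  # public problem instance
p, q = factor_off_chain(n)       # slow part: done off-chain
print(verify_on_chain(n, p, q))  # fast part: done on-chain -> True
```

Because only the cheap verification step touches the chain, the heavy computation never competes for scarce block space.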

The Road Ahead

The current system has persisted because it works—for Big Tech. But as enshittification continues to corrode the digital world, the cracks are showing. Public pressure, coupled with regulatory action and innovative alternatives, could pave the way for a healthier, more democratic internet.

We are at a crossroads. Change is not only possible—it’s necessary. The question is whether we’re ready to step out of our comfort zone and invest in an internet that serves people, not profits.

The time for alternatives is now. Are we ready to build them?

Could Fascism and Dictatorship Come Back?

The passage below is adapted from Yuval Noah Harari's TED talk, "Why fascism is so tempting -- and how your data could power it":

Fascism and dictatorships might come back, but they will come back in a new form, a form which is much more relevant to the new technological realities of the 21st century. In ancient times, land was the most important asset in the world. Politics, therefore, was the struggle to control land. And dictatorship meant that all the land was owned by a single ruler or by a small oligarchy. And in the modern age, machines became more important than land. Politics became the struggle to control the machines. And dictatorship meant that too many of the machines became concentrated in the hands of the government or of a small elite. Now data is replacing both land and machines as the most important asset. Politics becomes the struggle to control the flows of data. And dictatorship now means that too much data is being concentrated in the hands of the government or of a small elite.

The greatest danger that now faces liberal democracy is that the revolution in information technology will make dictatorships more efficient than democracies.

In the 20th century, democracy and capitalism defeated fascism and communism because democracy was better at processing data and making decisions. Given 20th-century technology, it was simply inefficient to try and concentrate too much data and too much power in one place.

But it is not a law of nature that centralized data processing is always less efficient than distributed data processing. With the rise of artificial intelligence and machine learning, it might become feasible to process enormous amounts of information very efficiently in one place, to make all the decisions in one place, and then centralized data processing will be more efficient than distributed data processing. And then the main handicap of authoritarian regimes in the 20th century -- their attempt to concentrate all the information in one place -- will become their greatest advantage.

The Impact of Moore’s Law Ending: A Shift Towards Decentralized Systems

Is Moore's Law Finally Dead?

As Moore's Law ends and the pace of exponential improvements in centralized processing power diminishes, the reliance on centralized AI systems will face increasing challenges. This shift opens the door for distributed and decentralized systems to emerge as more viable and efficient alternatives. With these systems evolving and gaining prominence, we can anticipate reduced dominance by Big Tech and oligarchic control. Decentralized systems, by design, democratize computational power and decision-making, offering a future where technology is less prone to monopolization and more equitably distributed.

With the end of Moore's Law, as distributed and decentralized protocols take hold, we can expect the disruptive power of oligarchs and Big Tech to wane.