The Uncomfortable Echo
It is a deeply uncomfortable premise: that the 35,000-word manifesto of a domestic terrorist, Theodore Kaczynski, contains a sociopolitical analysis that is disturbingly relevant to our digital lives in 2025. While the Unabomber’s seventeen-year bombing campaign was criminal and indefensible, his diagnosis of technology’s trajectory warrants a serious look. Thirty years after its publication, many of his predictions mirror our reality with unsettling accuracy.
Kaczynski’s central thesis was that the industrial-technological system would erode human freedom not through overt force, but through the seductive mechanisms of comfort and convenience. He argued that we would willingly trade our autonomy for a life of ease, becoming cogs in a machine we could neither understand nor control—a system that has evolved into today’s pervasive digital hegemony.
This post will explore five of the most impactful and surprisingly accurate takeaways from his critique. We will examine how his theories map directly onto our daily experiences with social media, artificial intelligence, and the “frictionless” design philosophy that governs modern life.
1. Our Psychological Needs Are Being Hacked by “Surrogate Activities”
To understand Kaczynski’s critique, we must first understand his concept of the “power process.” He defined this as a fundamental human psychological need consisting of four elements: having a goal, exerting effort, attaining that goal, and doing so with autonomy. Rooted in our biological drive for survival, successfully completing this cycle is, he argued, essential for our well-being.
The problem, according to Kaczynski, is that modern society breaks this process. By providing for our basic survival needs with relative ease, it creates a psychological vacuum. The innate drive to set goals and expend effort remains, but its natural purpose has been removed. To fill this void, he argued, we invent artificial goals he called “surrogate activities.”
In 2025, the concept of the surrogate activity has been industrialized and weaponized on a global scale. The gamification built into our apps (streaks, points, leaderboards) and the relentless pursuit of likes and views on social media are the ultimate modern surrogate activities. They hijack our brain’s reward systems, creating a synthetic version of the power process that serves the platform’s interests, not our own. This traps us on a “hedonic treadmill,” a cycle of seeking validation without ever achieving true fulfillment. As Kaczynski warned:
“people who are deeply involved in surrogate activities are never satisfied, never at rest.”
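To make that loop concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the class name `StreakTracker`, the point values, the bonus odds); no real platform’s code is implied. What it shows is the synthetic power process: a goal (keep the streak), effort (a daily check-in), and a variable reward tuned to keep the user coming back.

```python
import random

class StreakTracker:
    """Toy model of a gamified engagement loop (hypothetical, for illustration).

    The 'goal' is an arbitrary streak, the 'effort' is a daily check-in,
    and the reward is variable, which is what makes the loop compelling.
    """

    def __init__(self):
        self.streak = 0
        self.points = 0

    def check_in(self) -> str:
        self.streak += 1
        # Variable-ratio reward: unpredictable payoffs drive more
        # repetition than fixed ones (mostly nothing, occasionally a jackpot).
        bonus = random.choice([0, 0, 0, 50])
        self.points += 10 + bonus
        if bonus:
            return f"Day {self.streak}: +{10 + bonus} points! Bonus unlocked!"
        return f"Day {self.streak}: +10 points. Don't break your streak!"

    def miss_a_day(self) -> str:
        self.streak = 0  # all accumulated 'progress' toward the goal evaporates
        return "Streak lost. Start over?"

tracker = StreakTracker()
for _ in range(5):
    print(tracker.check_in())
print(tracker.miss_a_day())
```

Note that the “goal” is manufactured and the “attainment” never arrives: the streak has no terminal state, only a reset.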
2. The Trap of “Frictionless” Design
One of Kaczynski’s most insidious warnings was that we would willingly trade our autonomy for comfort and convenience. This is now a core design philosophy in the tech world, known as creating a “frictionless” user experience. The explicit goal is to remove all cognitive and physical effort from any given task.
Consider the stark contrast between two ways of getting around a new city. Reading a map requires effort, spatial awareness, and autonomous decision-making. With GPS, the user does not “navigate”; they merely obey instructions. The frictionless experience is convenient, but it makes us less capable. The same principle applies to the “one-click” economy, which eliminates the friction between desire and attainment, making the process of acquiring things trivial.
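A toy comparison makes the design shift visible. The two functions below are schematic, not any retailer’s actual flow, and the names and steps are invented. The structural point is that the first flow has decision points where a person can reconsider, while the second has none.

```python
def checkout_with_friction(cart: list[str], confirmations: list[bool]) -> bool:
    """The 'old' flow: three explicit decision points, any of which can abort."""
    steps = ["shipping address", "payment method", "final order"]
    for step, confirmed in zip(steps, confirmations):
        if not confirmed:
            print(f"Stopped at: confirm {step}. The user reconsidered.")
            return False
    print(f"Order placed for {len(cart)} item(s).")
    return True

def one_click_checkout(item: str) -> bool:
    """The frictionless flow: desire and attainment are a single function call.
    Every decision point, every chance to reconsider, has been designed away."""
    print(f"Order placed for {item} using saved defaults.")
    return True

checkout_with_friction(["book"], [True, True, False])  # user backs out at the last step
one_click_checkout("book")                             # no step at which to back out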
By systematically stripping away the “effort” and “autonomy” components of the power process, frictionless design makes us more dependent on the system. As philosopher Miriam Rasch notes, the complete removal of friction leads to “standing still.” When algorithms anticipate every desire, the gap between desire and attainment, the very space where human will and character are forged, collapses.
3. We’ve Outsourced Our Lives to “Black Box” Systems We Don’t Understand
Kaczynski predicted that as systems grew more complex, people would willingly hand over decision-making to them. This is now the operational reality of our “Black Box” society. Opaque AI and machine learning algorithms make critical decisions in financial markets, hiring processes, criminal justice, and the distribution of information.
This opacity creates a profound power asymmetry. Individuals are subjected to judgments from systems whose logic is inaccessible, leading to a state of “learned helplessness” in which they can neither understand nor challenge the forces governing their lives. This loss of control is amplified by the rise of “emergent behaviors” in AI. Models trained only to predict the next word in a sentence have spontaneously developed the ability to write functioning computer code, solve logic puzzles, or translate between languages they were not explicitly taught. This suggests the technological system, what writer Kevin Kelly calls the “technium,” is beginning to exhibit an autonomy that supersedes human agency.
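A deliberately crude sketch of what “black box” means in practice. The weights and threshold below are made up, and no real hiring vendor’s logic is represented; the structural point is that the subject receives only a verdict while the reasons stay sealed inside a private function.

```python
def _opaque_model(features: dict) -> float:
    """Stand-in for a proprietary model: the weights here are arbitrary,
    but in production they would be learned and equally inscrutable."""
    weights = {"years_experience": 0.3, "keyword_score": 1.2, "gap_months": -0.8}
    return sum(weights.get(k, 0.0) * v for k, v in features.items())

def screen_applicant(features: dict) -> str:
    # The applicant sees only the verdict; there is no API for 'why'.
    score = _opaque_model(features)
    return "advance" if score > 2.0 else "reject"

print(screen_applicant({"years_experience": 5, "keyword_score": 1.0, "gap_months": 1}))
# -> a verdict with no explanation, and no channel through which to contest it
```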
This reliance on opaque systems also gives rise to “AI hallucinations”—instances where models confidently present false information as fact, threatening the foundation of a shared reality. In a friction-averse society conditioned by convenience, the path of least resistance is to accept the machine’s truth without question. We have begun to outsource our very epistemology to a probabilistic algorithm. Kaczynski’s warning on this point was direct:
“The system does not and cannot exist to satisfy human needs. Instead, it is human behavior that has to be modified to fit the needs of the system.”
4. Surveillance Capitalism Is Programming Our Behavior
Kaczynski wrote about the danger of “oversocialization,” a state in which an individual is so conditioned by societal norms that they lose the capacity for independent thought. This 1995 concept finds its modern-day equivalent in social scientist Shoshana Zuboff’s theory of “Surveillance Capitalism.”
Zuboff describes how tech platforms accumulate “instrumentarian power”—the ability to shape human behavior for profit. Unlike totalitarian power, which seeks to control the soul through terror, instrumentarian power seeks to control behavior through the subtle manipulation of the digital environment. This system operates by extracting our “behavioral surplus” (the data about our actions) and using it to create “prediction products” that are sold to those who want to influence what we do next.
The social media feed is the primary tool for this. It is not a neutral stream of information but a sophisticated “behavior modification environment.” By constantly adjusting rewards (likes, notifications) and content, algorithms steer users toward behaviors that maximize engagement and profitability. Crucially, the user feels autonomous, believing they are making their own choices, while those very choices are being statistically shaped and guided by the platform’s architecture.
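As a schematic, a feed ranker can be written in a few lines. The field names and scores below are invented, and production systems are vastly more elaborate, but the objective is the same: order content by predicted engagement, not by chronology or the user’s stated preferences.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_engagement: float  # model output: probability of a like/comment/share
    timestamp: int

def rank_feed(posts: list[Post]) -> list[Post]:
    # The feed is not chronological: it is sorted by what the model predicts
    # will keep the user engaged, i.e. by what profits the platform.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("friend", predicted_engagement=0.02, timestamp=300),   # recent, but 'boring'
    Post("outrage_bait", predicted_engagement=0.41, timestamp=100),
    Post("influencer", predicted_engagement=0.17, timestamp=200),
])
for post in feed:
    print(post.author, post.predicted_engagement)
```

The user experiences the output as “my feed”; the sort key belongs to the platform.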
5. The Gig Economy Is the Ultimate Human-as-Machine Workplace
For a stark, real-world example of Kaczynski’s fears realized, look no further than the gig economy. In environments like Uber, DoorDash, and Amazon warehouses, human managers have been replaced by algorithmic ones.
These algorithms use “gamified” control mechanisms to drive ruthless efficiency. Amazon warehouse workers, for example, play games to earn virtual points for productivity, a playful veneer that masks a system of constant monitoring. The algorithm tracks every movement, every second of “time off task,” and automatically issues warnings or terminations if performance drops. This system, sketched in code after the list below, perfectly embodies Kaczynski’s warnings:
Humans are reduced to mere components in a machine.
Their behavior is constantly monitored and adjusted by an opaque system.
The “autonomy” to choose when to work is an illusion, as the algorithm manipulates incentives to compel labor precisely when and where the system needs it.
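Here is a sketch of the control logic. The thresholds are invented for illustration; public reporting on “time off task” systems describes automated write-ups along these general lines, but the numbers here are not Amazon’s.

```python
WARNING_THRESHOLD_SECONDS = 300      # hypothetical thresholds, invented for this sketch
TERMINATION_THRESHOLD_SECONDS = 1800

def evaluate_worker(time_off_task_seconds: int) -> str:
    # No human reviews the decision before it is issued; the 'manager' is a threshold.
    if time_off_task_seconds >= TERMINATION_THRESHOLD_SECONDS:
        return "AUTO-TERMINATE: productivity below system requirements"
    if time_off_task_seconds >= WARNING_THRESHOLD_SECONDS:
        return "AUTO-WARNING: time off task exceeds allowance"
    return "OK"

for idle in (120, 600, 2000):
    print(idle, "->", evaluate_worker(idle))
```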
Conclusion: The Comfortable Cage
Theodore Kaczynski’s violent actions were criminal and abhorrent. But his diagnosis of the technological condition has proven uncomfortably accurate. He foresaw a world in which we would trade freedom for convenience, channel our deepest psychological drives into trivial surrogate activities, and become utterly dependent on black box systems we could no longer control.
This has culminated in an “industrial feedback loop.” The system observes human behavior (Surveillance), adjusts the digital environment to modify that behavior (Algorithmic Curation), and then observes the result to refine its models (Machine Learning). In this closed loop, we are no longer masters of our tools; we are components within the machine.
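Written out as runnable pseudocode, the loop is short. Everything here is a toy (the class, the update rules, the simulated user), but the control structure is the argument: with each pass, the platform’s model moves toward the user, and the user’s taste drifts toward what the platform serves.

```python
import random

class ToyPlatform:
    """Minimal closed loop: observe -> curate -> learn (a toy, not a real system)."""

    def __init__(self):
        self.taste_estimate = 0.2   # the model's current belief about the user's taste

    def curate(self) -> float:
        # Algorithmic Curation: serve content matching the current belief.
        return self.taste_estimate

    def learn(self, observed_taste: float) -> None:
        # Machine Learning: refine the belief toward the observed behavior.
        self.taste_estimate += 0.5 * (observed_taste - self.taste_estimate)

def simulated_user(served: float, taste: float) -> tuple[float, float]:
    # Surveillance sees a noisy behavioral signal; meanwhile the user's own
    # taste drifts slightly toward what is served: behavior modification in one line.
    new_taste = taste + 0.1 * (served - taste)
    observed = new_taste + random.uniform(-0.02, 0.02)
    return observed, new_taste

platform, taste = ToyPlatform(), 0.8
for step in range(8):
    served = platform.curate()                        # adjust the environment
    observed, taste = simulated_user(served, taste)   # observe the behavior
    platform.learn(observed)                          # refine the model
    print(f"step {step}: served={served:.2f}  user taste={taste:.2f}")
```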
The cage is comfortable, the food is plentiful, but the door is locked, and we have forgotten that we ever held the key.