Douglas Rushkoff’s “Team Human” opens with a stark observation: “Autonomous technologies, runaway markets, and weaponized media seem to have overturned civil society, paralyzing our ability to think constructively, connect meaningfully, or act purposefully.”
This isn’t hyperbole. The systems we interact with daily—social platforms, algorithms, automated interfaces—are systematically reengineering human behavior to be more mechanical, predictable, and profitable. Understanding this process is crucial for any man committed to living deliberately and maintaining his humanity in an increasingly inhuman digital landscape.
The Mechanization of Human Life
Rushkoff identifies a pervasive pattern he calls “mechanomorphism”—the process by which we unconsciously adopt machine qualities as ideals. “When autonomous technologies appear to be calling all the shots,” he writes, “it’s only logical for humans to conclude that if we can’t beat them, we may as well join them.”
This manifests everywhere. We talk about “optimizing” ourselves. We measure our worth through metrics and data points. We strive to be “efficient” and “productive” as if we were industrial equipment. We multitask compulsively, trying to operate like computers with parallel processors, despite abundant evidence that human brains don’t work that way.
The digital environment doesn’t just offer tools—it imposes values. And the core values embedded in most digital systems are mechanical ones: efficiency over meaning, quantification over quality, speed over depth, predictability over spontaneity.
How Algorithms Shape Behavior
The most insidious aspect of digital life is how algorithms subtly train us to behave in ways that make us easier to predict, manipulate, and monetize.
Rushkoff explains that algorithms don’t care about your actual life or wellbeing. They care about behavioral patterns that correlate with profitable outcomes: “Algorithms don’t engage with us humans directly. They engage with the data we leave in our wake to make assumptions about who we are and how we will behave. Then they push us to behave more consistently with what they have determined to be our statistically most probable selves.”
This creates a feedback loop. The algorithm observes your behavior, predicts your future behavior based on that data, and then feeds you content designed to make you behave more predictably. Over time, you start conforming to your algorithmic profile—not because it reflects your true self, but because the system rewards predictability and punishes deviation.
The result? You become less yourself and more like what the algorithm expects you to be. Your edges get smoothed off. Your anomalous behaviors—the weird interests, the curious explorations, the surprising choices—get suppressed in favor of more predictable patterns.
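To see the loop in miniature, consider the deliberately crude simulation below. It is only an illustration, not anything from Rushkoff's book: the topic names, the squared weighting, and the 5 percent threshold are all invented. A "recommender" serves content in proportion to the profile it has built of a user, the user mostly engages with whatever is served, and that engagement feeds straight back into the profile.

```python
import random
from collections import Counter

# A deliberately crude model of the feedback loop described above.
# Assumptions (invented for illustration): six topics, a recommender that
# weights topics by the square of past engagement, and a user who clicks
# whatever is served. None of this comes from the book.

TOPICS = ["history", "music", "sports", "science", "cooking", "poetry"]

def topics_with_share(profile, threshold=0.05):
    """How many topics still receive more than `threshold` of total engagement."""
    total = sum(profile.values())
    return sum(1 for count in profile.values() if count / total > threshold)

random.seed(7)

# The recommender's profile of the user starts out flat: one unit of
# "observed interest" per topic.
profile = Counter({topic: 1 for topic in TOPICS})

for week in range(1, 11):
    # Rank-and-amplify: topics the profile already favors are served
    # disproportionately often (squaring the weights mimics that amplification).
    weights = [profile[topic] ** 2 for topic in TOPICS]
    served = random.choices(TOPICS, weights=weights, k=20)

    # The user mostly engages with what is put in front of them,
    # so the profile reinforces its own prediction.
    for topic in served:
        profile[topic] += 1

    print(f"week {week:2d}: topics above a 5% share = {topics_with_share(profile)}")
```

Run it and the count of topics that still get meaningful attention tends to shrink week by week. Nothing forces the narrowing; it falls out of the loop itself, which is exactly Rushkoff's point about being pushed toward our "statistically most probable selves."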
The Loss of Human Complexity
“Algorithms use our past behavior to lump us into statistical groups and then limit the range of choices we make moving forward,” Rushkoff notes. In his telling, prediction captures most of what we do, but not all of it; a stubborn remainder, roughly 20 percent of our behavior, stays anomalous, and that remainder is precisely what the systems work to erase: “We are using algorithms to eliminate that 20 percent: the anomalous behaviors that keep people unpredictable, weird, and diverse.”
This matters more than it might seem. Human vitality comes from unpredictability, creativity, and the capacity for genuine novelty. A life lived entirely within algorithmic expectations is a life that never surprises itself, never grows in unforeseen directions, never encounters the unexpected that makes existence interesting.
For men specifically, this flattening is particularly damaging. Masculine development requires confronting uncertainty, making difficult choices without clear data, and developing judgment in ambiguous situations. Algorithms offer none of this—they offer certainty, prediction, and the illusion that all questions have data-driven answers.
The Metrics That Control You
Digital life teaches us to measure ourselves by metrics designed to benefit platforms, not people. Likes, follows, engagement rates, productivity scores, step counts, efficiency metrics—none of these capture what makes a life meaningful, but all of them shape how we experience ourselves.
Rushkoff observes: “The things we learn from one another are helpful with the logistics of mutual survival, but the process of learning itself—the sense of connection, rapport, and camaraderie we develop while communicating—may be the greater prize.”
Yet digital metrics ignore this entirely. They can count interactions but not quality of connection. They can measure activity but not meaning. They can track consumption but not satisfaction.
When you let these metrics define success, you're optimizing for someone else's agenda. The platform profits from your engagement time and the data you generate. But your actual life, the texture of your days, the depth of your relationships, your sense of purpose and aliveness, may be deteriorating while your metrics look great.
Recovering Human Modes of Being
Fighting algorithmic dehumanization requires deliberately cultivating capacities that machines can’t replicate or understand.
Embrace paradox and ambiguity. Machines require binary choices—yes or no, 1 or 0. Human consciousness thrives in the space between certainties. “Team Human has the ability to tolerate and even embrace ambiguity,” Rushkoff writes. “The stuff that makes our thinking and behavior messy, confusing, or anomalous is both our greatest strength and our greatest defense against the deadening certainty of machine logic.”
Practice sitting with questions that don’t have clear answers. Get comfortable with “both/and” instead of “either/or.” Resist the pressure to have definitive opinions on everything immediately.
Prioritize unmeasurable experiences. Do things that can’t be quantified or optimized. Have conversations that go nowhere in particular. Take walks without tracking your steps. Create something without measuring its engagement or reach. Let experiences be complete in themselves, not data points toward some metric.
Cultivate depth over breadth. Algorithms reward constant shallow engagement. You reward yourself by going deep. Read books that require sustained attention. Build relationships that develop slowly over years. Develop expertise in something that takes decades to master. Resist the pressure to sample everything lightly and instead commit fully to a few things.
Practice analog living. Spend time in purely physical spaces doing physical things. The body knows modes of being that the digital environment can’t accommodate. Cook a meal from scratch. Build something with your hands. Have a face-to-face conversation. Garden. Play music. The point isn’t to reject digital entirely—it’s to maintain connection to non-digital modes of existence.
Make genuinely weird choices. Do things that won’t fit your algorithmic profile. Explore interests that don’t make sense given your demographic. Make choices that surprise yourself. The goal isn’t to be contrarian for its own sake—it’s to maintain your capacity for genuine spontaneity and self-determination.

The Attention Economy’s Assault on Presence
Perhaps the most profound loss in digital life is presence itself—the capacity to be fully here, now, with what’s actually happening rather than half-present while monitoring multiple feeds and anticipating notifications.
Rushkoff describes the shift: “Going online went from an active choice to a constant state of being. The net was strapped to our bodies in the form of smartphones and wearables that can ping or vibrate us to attention with notifications and updates, headlines and sports scores, social media messages and random comments.”
This state of perpetual availability and interruption makes sustained presence nearly impossible. You're never fully anywhere; you're always partially in digital space, monitoring, checking, ready to respond. This fragments consciousness and undermines the direct experience of life.
Recovery requires deliberate practice. Set boundaries around when and where you’re digitally available. Create spaces and times that are completely phone-free and notification-free. Practice being fully present in one thing at a time—a conversation, a meal, a task—without the pull of digital elsewhere.
Reclaiming Autonomy
The core issue isn’t technology—it’s whether you’re using technology according to your own values and purposes, or whether technology is using you according to others’ agendas.
“Human beings can intervene in the machine,” Rushkoff insists. “That’s not a refusal to accept progress. It’s simply a refusal to accept any particular outcome as inevitable.”
This intervention requires consciousness and will. You must actively decide how you want to live, what matters to you, what you’re willing to sacrifice your life energy for. Then you must shape your relationship with technology to serve those priorities, rather than letting technology’s embedded priorities shape you.
This isn’t easy. The systems are designed to be sticky, compelling, and difficult to resist. But the alternative—drifting into a progressively more mechanized, quantified, algorithmically determined existence—is far worse.
Taking Action
This week, identify one way your life has become more mechanical and less human because of digital habits. Then take concrete action to reclaim that aspect of your humanity:
- If you’ve started thinking in terms of “optimizing” everything, pick one area of life where you’ll consciously choose meaning over efficiency
- If you’re constantly measuring yourself against metrics, identify which metrics actually matter to you (not to platforms) and ignore the rest
- If you’ve lost the capacity to be bored or sit with nothing happening, practice doing nothing for 15 minutes daily without reaching for your phone
- If your decisions have become overly data-driven, make one choice based entirely on intuition or feeling this week
- If you can’t remember the last time you did something truly anomalous and weird, do something this week that makes no sense for your demographic profile
The point isn’t to reject helpful technology. It’s to prevent technology from remaking you in its image.
As Rushkoff reminds us: “Our values—ideals such as love, connection, justice, distributed prosperity—are not abstract. We have simply lost our capacity to relate to them.”
Reclaim that capacity. Be human, not a data point pretending to be human.