
When the Boss Is an Algorithm: What Happens When Performance Management Loses Its Human Element

  • Feb 24
  • 5 min read

Stephen Normandin, a 63-year-old army veteran, had been an Amazon delivery driver with a near-perfect record. Every delivery completed. Never late. Never a cancelled shift. Then one day, he was fired - not by a manager, not after a conversation, but by an automated system that generated a termination notice without any human input.

He told Bloomberg he was baffled: "I depend on this job to survive. This just doesn't make any sense."

His story is not an edge case. It's a preview.


[Image: digital illustration of a person in profile with glasses, their face traced with circuit patterns, suggesting a blend of human and technology.]

The Quiet Spread of Algorithmic Management


Algorithmic management - where AI systems take over the traditional functions of a manager, including assigning tasks, monitoring performance, and triggering discipline or dismissal - is no longer confined to gig economy platforms. It began with Uber drivers and food delivery couriers. It matured in Amazon's warehouse network. And it is now moving into call centers, healthcare, logistics, banking, and increasingly, white-collar work.

The appeal to organizations is obvious. Algorithms don't have favorites. They don't take breaks. They track everything, instantly, at scale. Amazon's internal system - documented in leaked legal filings - automatically monitors every worker's productivity rate and generates warnings or terminations "without input from supervisors." At some facilities, roughly 300 workers were let go in a single year through this automated process alone.


For operations leaders, the efficiency argument is compelling. For HR leaders, it should also be deeply uncomfortable.


What the Research Actually Shows


A growing body of organizational psychology research is catching up to what workers have been saying for years: algorithmic management is not a neutral efficiency tool. Its effects on human beings at work are significant, measurable, and frequently negative.


A 2024 study in Humanities and Social Sciences Communications found that algorithmic management consistently erodes job autonomy and increases psychological strain. A large-scale cross-sectional study of logistics workers in Sweden — published in the International Archives of Occupational and Environmental Health — found that workers with higher exposure to algorithmic control were more than twice as likely to report psychological distress, and significantly more likely to experience occupational accidents, musculoskeletal pain, and sleep disturbances.


Read that again: the workers most monitored by algorithms were more than twice as likely to report psychological distress. This is not an HR abstraction. It is a measurable health outcome.

Research from Finland, drawing on interviews with food delivery couriers, found that the constant threat of algorithmic penalty - being bumped to lower shifts, ranked publicly against peers, or deactivated - created a permanent state of low-grade anxiety. Workers described skipping proper breaks and ignoring physical warning signs because falling behind on the algorithm's clock felt more immediately threatening than the ache in their knees.


Meanwhile, a 2024 study in Frontiers in Artificial Intelligence identified the core psychological cost of algorithmic management: the systematic removal of what organizational psychologists call "action regulation opportunities" - the small moments of judgment, discretion, and autonomy that make work feel meaningful and dignified rather than mechanical.


The Performance Paradox


Here is where it gets counterintuitive for leaders who see algorithmic management purely as a productivity play: the research suggests it may be undermining the very performance it is designed to optimize.

A 2024 study in Humanities and Social Sciences Communications examined how algorithmic management affects employee creative and adaptive performance — the kind of judgment-based, flexible thinking that organizations increasingly say they need. The findings were clear: high algorithmic control was negatively associated with both. When people feel constantly surveilled and evaluated against rigid metrics, they stop taking initiative. They optimize for the measurement, not the outcome.


This is sometimes called "teaching to the test" in education. In management, it produces workers who are extraordinarily focused on the numbers the algorithm can see - and entirely disengaged from everything it can't.

A warehouse worker packs faster. A call center agent closes tickets more quickly. And both do so in ways that quietly degrade quality, erode customer relationships, and hollow out the kind of institutional knowledge that doesn't show up on a dashboard.


The Trust Problem No Dashboard Can Fix


There is a deeper issue underneath the performance data, and it is one that HR leaders ignore at significant cost: algorithmic management fundamentally changes the psychological relationship between an employee and their employer.


Work, at its best, is a relationship. Humans extend discretionary effort - going beyond what's strictly required - when they feel trusted, seen, and treated as whole people rather than productivity units. Research in organizational behavior has consistently shown that this discretionary effort is where the real competitive advantage in knowledge work lives.


Algorithmic management sends a very clear signal about that relationship. It says: we don't trust you enough to let a human make these decisions. The effect on employee trust is predictable. When workers feel they are being monitored to the minute, evaluated by a system they cannot appeal to, and potentially fired by software, they reciprocate with exactly the level of engagement the algorithm demands - no more.


A researcher studying Amazon's warehouse operations described workers' relationship with the system as one of constant, exhausting compliance: performing not for pride in their work or loyalty to a team, but simply to avoid triggering an automated penalty. That is a fragile foundation for any organization's workforce, and a particularly alarming one for companies now rolling algorithmic tools into professional and knowledge work.


This Is Coming for White-Collar Work Too


It is tempting for managers in knowledge-intensive industries to read stories about warehouse workers and delivery drivers as concerning but distant. They shouldn't.

The same logic that justified constant productivity monitoring in fulfillment centers is now being applied to software developers tracked by keystroke monitoring tools, to salespeople scored in real time by conversation AI, to customer service managers whose teams are ranked on dashboards updated every 15 minutes. The infrastructure is the same. The psychological effects are the same.


The question for HR and business leaders is not whether algorithmic management will reach their organizations. It has, or it will. The question is whether they will deploy it thoughtlessly - chasing short-term efficiency while eroding trust, creativity, and sustainable performance - or whether they will treat it as what it actually is: a fundamental redesign of the employment relationship that demands careful, intentional governance.


What Leaders Should Do Differently


The research is not arguing that performance data is bad, or that technology has no role in management. It is arguing that replacing human judgment entirely is a different thing - and a dangerous one.

The organizations navigating this well share a common approach: they use algorithmic tools for visibility, not verdicts. Dashboards flag patterns for human managers to investigate and address. They do not automate consequential decisions about people's livelihoods. Feedback is delivered by a person who can explain it, contextualize it, and hear a response.


They also recognize that the way performance is measured shapes the culture around it. If your system can only see speed and volume, that's what people will optimize for. If you want judgment, creativity, and care - the things that actually differentiate organizations - those have to be part of what gets noticed and valued, by humans.


Stephen Normandin's story ends where too many like it do: he was eventually reinstated, after intervention. But he had already lost income, lost confidence in the system, and lost something harder to quantify - the sense that his years of reliable, careful work actually meant something to the people he worked for.

No algorithm tracked that loss. None of them do.


© 2025 by People in Focus. All rights reserved.
