The Milgram experiment is often explained as proof that people are disturbingly obedient to authority. But when the experiment is explained more fully, a different conclusion emerges — one about responsibility, not obedience.
Most people remember it like this:
Ordinary participants were instructed by an authority figure to administer electric shocks to another person. The study measured how far individuals would go in obeying authority, even when those actions conflicted with personal conscience. Conclusion: people are disturbingly obedient.
That reading is convenient. It lets us believe the problem is weakness, naivety, or a moral flaw we’d like to think we don’t have.
That’s not the only thing Milgram showed—and arguably not the most important.
Milgram didn’t expose bad character. He exposed a quiet, dangerous truth: we will hand off responsibility whenever we can—and organizations quietly normalize this transfer.
The Question The Milgram Experiment Was Really Asking
Participants didn’t ask, “Is this right?”
They asked, “Am I responsible for this?”
The moment the answer felt like no, the shocks continued.
“I was just following instructions” wasn’t a rationalization after the fact.
It was the operating logic during the decision.
Milgram later described this as the “agentic state”—a psychological shift where people stop seeing themselves as responsible for their actions.
This is the part we miss when we talk about ethics at work.
Most harm doesn’t come from malicious intent.
It comes from outsourcing judgment to authority—and feeling relieved when we do.
Why Organizations Make Milgram Easier
Modern workplaces are structurally designed to reduce personal accountability.
Responsibility is distributed upward.
Execution is distributed downward.
And moral weight evaporates somewhere in the middle.
You hear it in phrases like:
- "This came from leadership."
- "Legal signed off."
- "We're aligned at the exec level."
- "I don't agree, but it's not my call."
These statements don’t describe powerlessness.
They describe psychological cover.
The more layers between you and the outcome, the easier it becomes to stop asking hard questions.
Distance Isn’t Neutral — It’s Protective
One of Milgram’s most telling findings had nothing to do with authority.
When participants could see or hear the person being shocked, obedience dropped sharply.
When the victim was distant, abstract, or invisible, obedience increased.
Now look at how work happens today:
- Layoffs announced by email
- Decisions justified by metrics
- Harm reduced to dashboards, decks, and KPIs
- Hiring decisions delegated to AI
- Consequences absorbed by people you'll never meet
Distance doesn’t just reduce empathy.
It protects careers.
The Escalation Problem
No participant jumped straight to the highest voltage.
They moved in small, incremental steps.
15 volts at a time.
That’s how ethical erosion works at work, too:
- "Just this once"
- "Just for the quarter"
- "Just until things stabilize"
- "This isn't ideal, but it's temporary"
No single step feels decisive.
But looking back, the path is unmistakable.
No one wakes up unethical.
They wake up compliant.
Why High Performers Are the Most at Risk
Here’s the uncomfortable part.
The people most likely to comply aren’t disengaged employees or bad actors.
They’re:
- Reliable
- Conscientious
- Invested in their careers
- Trusted by leadership
High performers believe in systems.
They’re rewarded for execution.
They’ve learned that friction has consequences.
So when authority signals “this is handled,” they stand down—not because they don’t care, but because they’ve been trained to prioritize alignment over dissent.
Organizations don’t run on obedience.
They run on people who don’t want to be the problem.
The Modern Milgram Experiment Explained
Milgram’s experiment wasn’t only notable for how many people complied.
It was equally notable for those who didn’t.
In the original study, approximately 65% of participants administered the highest voltage level when instructed by an authority figure. But a significant minority—roughly one-third of participants—refused to continue, even under pressure. Their reasoning wasn’t complex. They didn’t debate the science or the setup. They simply rejected the idea that responsibility could be transferred to someone else.
The decision was still theirs.
That belief mattered more than authority, instructions, or consequences. This is the part of Milgram that’s rarely discussed—and most relevant to work.
Where Responsibility Actually Lives
People with clear principles don’t deliberate endlessly under pressure. They’ve already decided what they’re accountable for. And when others believe those principles are real—that you will act on them—authority loses much of its force.
This is why principles aren’t abstract values. They’re strategic constraints. They reduce ambiguity, limit escalation, and make certain actions non-negotiable before the moment arrives. In every system, the people who refuse to relinquish responsibility quietly reshape how power operates around them.
The real lesson of the Milgram Experiment isn’t about obedience.
It’s about where responsibility is allowed to land.
If you want to make better decisions under pressure, don’t wait for the moment to test yourself. Clarify your principles in advance. Make them visible. Make them credible. That’s the foundation of all worthwhile strategy.
Clarify the Principles That Guide Your Decisions
Under pressure, most people default to the system around them.
The professionals who shape outcomes are different—they’re clear about what they’re accountable for before the moment arrives.
Our structured self-discovery tools help you surface your principles, understand how you make decisions, and make that judgment visible and credible at work—so responsibility stays where it belongs.