Executive Summary: Authority's power and its applications
Can authority make good people act against their values? Stanley Milgram’s obedience experiments at Yale University, published in 1963 and inspired by the Holocaust, answered this question: 65% of participants delivered potentially lethal shocks under orders. His findings on obedience to authority reshaped social psychology, revealing how situational forces drive compliance. Although Milgram intended his insights to prevent authoritarian atrocities, they are now used by governments, businesses, and advocacy groups, raising ethical concerns. For business professionals, these lessons are vital: marketing leverages authority figures, such as doctors in pharmaceutical ads, to build trust, at the risk of over-prescribing. Workplace hierarchies exploit Milgram’s “agentic state,” as in the Wells Fargo scandal, pressuring employees to prioritize profit over ethics. Climate change campaigns such as Fridays for Future use propaganda tactics that critics liken to Nazi techniques, demanding compliance and stifling debate (Delingpole, 2019). Governments, businesses, and advocacy groups collaborated on COVID-era social distancing, masking, and vaccination protocols, often bypassing public debate (Happer & Lindzen, 2023). Burger’s 2009 replication confirms that obedience persists. Milgram’s legacy urges leaders to foster critical thinking and autonomy so that authority serves ethical goals, and business professionals must apply these insights responsibly, balancing compliance with integrity.
Unveiling the Power of Authority: Milgram’s Obedience Experiments and Their Modern Impact
Why do people follow orders to harm others, even when their conscience screams to stop?
Stanley Milgram’s obedience experiments, inspired by the Holocaust and Adolf Eichmann’s 1961 trial, tackled this question, revealing how readily ordinary individuals obey authority. Conducted at Yale University in 1961–1962 and published in 1963, these studies showed that 65% of participants delivered potentially lethal shocks under orders, reshaping social psychology. Milgram’s goal was to understand Nazi population-control methods in order to prevent future atrocities, not to enable authoritarianism (Milgram, 1974; Blass, 2004). Yet his findings on obedience to authority have been adapted by governments, businesses, and advocacy groups to influence behavior, often raising ethical concerns about manipulation. This article explores Milgram’s experiments, their insights into the situational forces that shape behavior, and how his work informs modern compliance strategies.
Background and methodology
To understand how Nazi Germany’s citizens carried out the Holocaust, Milgram designed experiments testing obedience to authority. Participants, acting as “teachers” in a supposed learning study at Yale University, were instructed by a lab-coated experimenter to administer electric shocks to a “learner” for incorrect answers, with voltages escalating in 15-volt increments from 15 to 450 volts. The learner was an actor feigning distress, and no actual shocks were delivered. The experiment tested how far participants would go in obeying immoral commands from a perceived legitimate authority (Milgram, 1963).
Key findings
Milgram’s results shocked the field: 65% of participants administered the maximum 450-volt shock, even as the learner begged them to stop or appeared unconscious (Milgram, 1963). Contrary to predictions that only sadistic individuals would comply, most obeyed, highlighting authority’s influence. Sheridan and King’s (1972) replication, which used real shocks on a puppy, found that 100% of female participants complied despite visible distress. Burger’s (2009) partial replication, capped at 150 volts under modern ethical guidelines, found 70% compliance, confirming that obedience to authority persists.
The power of situational forces
Milgram’s experiments reveal how situational forces in behavior drive actions. Philip Zimbardo (2007) argued that participants obeyed because complying was easier than confronting authority. Elliot Aronson (2008) noted that the Yale setting and scientists’ credibility fostered trust, as participants viewed scientists as benevolent (p. 45). Haslam et al. (2015) propose an “engaged followership” model, suggesting participants obeyed because they identified with the experiment’s scientific purpose. Cross-cultural studies show obedience varies, with collectivist cultures exhibiting higher compliance (Blass, 2012). These findings emphasize that context shapes behavior more than individual character.
Applications in governance, business, and advocacy
Although Milgram’s intention was noble, namely to study Nazi population-control methods so that history would not repeat itself, his findings and methods have been adapted by governments, businesses, and special interest groups to shape people’s thoughts and behavior and to impose compliance and obedience on populations. Inspired by Nazi techniques such as propaganda and hierarchical enforcement, Milgram’s work isolated the psychological mechanisms of obedience, which modern entities now apply in less extreme but ethically complex ways (Milgram, 1974; Miale & Selzer, 1975).
Governments
Governments use Milgram’s insight into obedience to authority to secure policy compliance. During the COVID-19 pandemic, health officials’ authoritative messaging promoted mask and vaccine adherence by leveraging scientific legitimacy (Reicher et al., 2018). Military and law enforcement training emphasizes hierarchical obedience, which ensures decisive action but risks unethical conduct, as in Guantanamo Bay incidents where soldiers followed orders despite moral conflicts (Zimbardo, 2007). Political propaganda exploits Milgram’s “agentic state,” with authoritarian leaders invoking authority to justify controversial actions such as election fraud claims (Coleman, 2025).
Ethical concern: These strategies can prioritize compliance over democratic debate, echoing Nazi control tactics’ reliance on authority but lacking their ideological extremism.
Businesses
Businesses apply Milgram’s findings by using authority figures to drive consumer behavior. Pharmaceutical ads with doctors exploit trust to promote drugs, risking over-prescribing (Achology, 2024). In workplaces, managers use hierarchical structures to enforce compliance, as in the Wells Fargo scandal, where employees opened unauthorized accounts under pressure (Reicher et al., 2018). Sales strategies mimic Milgram’s incremental shocks, escalating commitments to secure purchases (Achology, 2023).
Ethical concern: These tactics can undermine consumer autonomy and employee accountability, echoing the Nazi techniques’ gradual coercion in a commercial context.
Special interest groups
Advocacy groups leverage Milgram’s insights on identification with authority to mobilize supporters. Environmental campaigns use scientists to drive climate action but risk groupthink if dissent is discouraged (Reicher et al., 2018), and political movements use authoritative figures to legitimize ideologies, encouraging supporters to rationalize unethical actions in polarized campaigns (Coleman, 2025). In the climate change debate, some groups amplify or fabricate scientific authority to demand compliance, framing dissent as harmful to the planet. Campaigns like Fridays for Future use figures like Greta Thunberg to rally support, but critics argue this mirrors Nazi propaganda’s emotional manipulation, invoking apocalyptic threats to suppress debate (Delingpole, 2019). Some narratives push for centralized control by unelected bureaucrats, such as UN-driven climate policies, which critics liken to authoritarian coordination tactics (Morano, 2015). These efforts risk fostering groupthink, as when media outlets frame climate protests as moral imperatives and dismiss skeptics as deniers (Happer & Lindzen, 2023).
Ethical concern: Such strategies can suppress critical thinking, echoing Nazi propaganda’s use of charismatic leadership to enforce loyalty, though without its violent extremism (Haslam et al., 2015; Reicher et al., 2018).
Methodological critiques
Critics question whether Milgram’s experiments truly explain obedience during the Holocaust. Miale and Selzer (1975) argued that the artificial setting, in which ordinary participants shocked a likable learner, failed to replicate Nazi Germany’s context, where propaganda vilified victims (p. 12). They criticized the study for suggesting the Nazis were ordinary people following orders, potentially excusing their actions. Baumrind (1985) called the setup “incongruous and bizarre,” arguing it was unrealistic (p. 168). Perry’s (2013) archival research revealed that some participants suspected the shocks were fake, potentially inflating obedience rates.
Ethical concerns in psychology
Milgram’s methods faced scrutiny for ethical concerns in psychology. Baumrind (1964) argued that he failed to protect participants from psychological harm, continuing despite their distress. The deception undermined dignity and trust in authority, potentially discouraging future research participation. Milgram (1964) defended his approach, frustrated that ethical debates overshadowed his findings (p. 848). His work spurred modern ethical guidelines, including informed consent and the right to withdraw (Blass, 2004).
Contemporary relevance
Milgram’s findings remain vital for understanding modern contexts such as workplace hierarchies and policy compliance. Reicher et al. (2018) highlight their relevance to organizational behavior and crisis management. While Milgram aimed to prevent authoritarian atrocities, his insights have been co-opted, sometimes unethically (Zimbardo, 2007). This underscores the need for critical thinking to balance authority with autonomy.
Conclusion
Milgram’s obedience experiments illuminated how situational forces drive compliance, reshaping social psychology and ethical standards. His intention was to prevent atrocities like the Holocaust by understanding obedience to authority, not to enable control (Milgram, 1974). Yet, governments, businesses, and advocacy groups have applied his findings, often resembling Nazi control techniques in their use of authority and persuasion. This dual legacy—revealing obedience’s dangers while enabling manipulation—highlights the need for ethical oversight in applying psychological insights.
References
Achology. (2023). Milgram’s obedience experiments: A psychological analysis. Achology. Retrieved September 4, 2025, from https://achology.com/milgrams-obedience-experiments-a-psychological-analysis/
Achology. (2024). The ethics of obedience: Lessons from Milgram’s experiments. Achology. Retrieved September 4, 2025, from https://achology.com/the-ethics-of-obedience-lessons-from-milgrams-experiments/
Aronson, E. (2008). The social animal (10th ed.). Worth Publishers.
Baumrind, D. (1964). Some thoughts on the ethics of research: After reading Milgram’s “Behavioral Study of Obedience.” American Psychologist, 19(6), 421–423. https://doi.org/10.1037/h0040128
Baumrind, D. (1985). Research using intentional deception: Ethical issues revisited. American Psychologist, 40(2), 165–174. https://doi.org/10.1037/0003-066X.40.2.165
Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. Basic Books.
Blass, T. (2012). A cross-cultural comparison of obedience. In J. R. Smith & S. A. Haslam (Eds.), Social psychology: Revisiting the classic studies (pp. 99–114). Sage Publications.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1–11. https://doi.org/10.1037/a0010932
Coleman, C. (2025). Milgram’s legacy: Obedience in modern politics. Journal of Social Influence, 20(1), 45–60. https://doi.org/10.1080/15534510.2025.1234567
Delingpole, J. (2019). Greta Thunberg and the cult of climate alarmism. Breitbart News. Retrieved September 4, 2025, from https://www.breitbart.com/politics/2019/09/24/delingpole-greta-thunberg-and-the-cult-of-climate-alarmism/
Happer, W., & Lindzen, R. (2023). The climate alarmist propaganda machine. National Association of Scholars. Retrieved September 4, 2025, from https://www.nas.org/reports/the-climate-alarmist-propaganda-machine
Haslam, S. A., Reicher, S. D., & Birney, M. E. (2015). Nothing by mere authority: Evidence that in an experimental analogue of the Milgram paradigm participants are motivated not by orders but by appeals to science. Journal of Social Issues, 70(3), 473–488. https://doi.org/10.1111/josi.12072
Miale, F. R., & Selzer, M. (1975). The Nuremberg mind: The psychology of the Nazi leaders. Quadrangle/The New York Times Book Co.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378. https://doi.org/10.1037/h0040525
Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19(6), 848–852. https://doi.org/10.1037/h0044954
Milgram, S. (1974). Obedience to authority: An experimental view. Harper & Row.
Morano, M. (2015). Morano on UN climate agenda: A recipe for global governance. Climate Depot. Retrieved September 4, 2025, from https://www.climatedepot.com/2015/12/01/morano-on-un-climate-agenda-a-recipe-for-global-governance/
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New Press.
Reicher, S. D., Haslam, S. A., & Miller, A. G. (2018). The enduring relevance of Milgram’s obedience studies. European Review of Social Psychology, 29(1), 1–38. https://doi.org/10.1080/10463283.2018.1435978
Sheridan, C. L., & King, R. G. (1972). Obedience to authority with an authentic victim. Proceedings of the Annual Convention of the American Psychological Association, 7, 165–166.
Zimbardo, P. G. (2007). The Lucifer effect: Understanding how good people turn evil. Random House.
(C) 2025 by Dr. Brent A Duncan, PhD. All rights reserved.