Hatim Rahman is an assistant professor of management and organizations at the Kellogg School of Management, Northwestern University, in Illinois.
You're probably being tested right now. Organizations conduct countless tests online, trying to figure out how to keep our eyes glued to the screen, convince us to buy a new product, or elicit a reaction to the latest news. But they often do so without letting us know, and this leads to unintended, sometimes negative, consequences.
In a recent study, my colleagues and I examined how a digital job platform's experiments can determine your next job, your salary, and your visibility to potential employers. Such experiments are widespread and are often conducted without workers' consent or awareness.
In 2022, a separate study, reported by The New York Times, found that the professional networking platform LinkedIn had run experiments on millions of users without their knowledge. The authors claimed that these tests had a direct impact on users' professional lives, with some people facing fewer opportunities to connect with potential employers.
Uber has also experimented with how fares are set, which several drivers have told media outlets has reduced their earnings. Experiments by social media platforms have contributed to the polarization of online content and the growth of “echo chambers”, according to research published in the journal Nature. Google constantly experiments with its search results, a practice that German academics have found can push unwanted websites to the top.
The problem is not experimentation per se, which can help companies make data-driven decisions. The problem is that most organizations have no internal or external mechanisms to ensure that their experiments clearly benefit their users as well as themselves.
Countries also lack strong regulatory frameworks to govern how organizations use online experiments and to address the indirect impacts they can have. Without guardrails, the consequences of unregulated experimentation could be disastrous for everyone.
In our study, workers who found themselves unwitting lab rats expressed paranoia, frustration, and contempt at having their livelihoods subjected to experimentation without their knowledge or consent. The consequences rippled outward, affecting their income and well-being.
Some declined to offer ideas on how to improve the platform. Others stopped believing that any change on it was genuine, and instead sought to limit their participation online.
The impact of unchecked online experimentation is likely to become more widespread and more visible.
US regulators have accused Amazon of using experiments to raise product prices, stifle competition and increase fees. Fraudsters use digital experiments to exploit older and vulnerable people.
Now generative AI tools are reducing the cost of producing content for digital experiments. Some organizations are even deploying technology that may allow them to sample our brain waves.
This deepening embedding of experimentation represents what we call the “experimental hand”: it can shape the lives of workers, users, customers and society in ways that are poorly understood but profoundly consequential. Even with the best intentions, without multiple checks and balances the current culture of experimentation could prove disastrous for people and society.
But we don't have to accept a Black Mirror future where our every move, interaction, and thought is subject to exploitative experimentation. Organizations and policymakers would be wise to learn from the mistakes scientists made half a century ago.
The infamous 1971 Stanford Prison Experiment, in which Stanford psychology professor Philip Zimbardo randomly assigned participants to the role of prisoner or guard, quickly descended into guards subjecting prisoners to horrific psychological abuse.
Despite witnessing these consequences, Zimbardo did not stop the experiment. It was Christina Maslach, a doctoral student who had come to help conduct interviews, who voiced strong objections and helped bring it to an end.
The lack of oversight of the experiment's design and implementation hastened the adoption of Institutional Review Boards (IRBs) at universities. Their purpose is to ensure that every experiment involving human subjects is conducted ethically and lawfully, including by obtaining subjects' informed consent and allowing them to opt out.
For institutional review boards to work beyond academia, organizational leaders must ensure they include independent experts with diverse expertise who can enforce the highest ethical standards.
But this is not enough. Facebook's infamous 2012 experiment, which altered the mix of positive and negative posts in users' feeds to measure their reactions, was approved by Cornell University's IRB. The social media platform claimed that its users' agreement to its terms of service constituted informed consent.
We also need collective accountability to ensure organizations run ethically robust experiments. Users themselves are often the closest and best-informed people to provide input, and a diverse group of them should have a voice in the design of any experiment.
If organizations are unwilling to respond to user demands, those vulnerable to experimentation can create their own platforms to keep one another informed. For example, workers on the crowdsourcing platform Amazon Mechanical Turk, known as MTurk, founded Turkopticon, a collective system for rating employers, after MTurk refused to provide them with such ratings.
We should not need another Zimbardo experiment to spur organizations and governments to build guardrails for ethical experimentation. Organizations should not simply wait for regulators to act. Maslach did not wait, and neither should we.
This article was co-written with Tim Weiss, associate professor of innovation and entrepreneurship at Imperial College London, and Arvind Karunakaran, associate professor of management science and engineering at Stanford University.