By Mihnea Moldoveanu and Martin Reeves
One should be especially careful in using the words “reality,” “actually,” etc., since these words very often lead to statements [without any empirical content].—Werner Heisenberg
As recently as a decade ago, most business decisions were based on very limited data. More recently, an explosion of data, a dramatic decline in the cost of processing power, and advances in machine learning have created the expectation that we will be able to capitalize on a data windfall to revolutionize most aspects of business. The data (from the Latin datum, a thing that is given) on which machine learning and analytics are built are taken as given and incontrovertible. But what managers, data scientists, and social scientists think of as data is in fact not given. It is the outcome of a process of measurement—an interaction between an observer, a technique or apparatus, and a context.
As every seasoned executive knows, asking for data to inform an important decision can set in motion a process in which facts and figures are colored, filtered, obscured, deleted, shaded, clipped, and even fabricated. In practice, the result depends on whom you ask (observer dependence), how you ask (frame dependence), and when and under what conditions you ask (context dependence). Equally, the process by which we ask people to report preferences, emotions, and perceptions can interfere with the underlying state, to the point where inquiries can create, rather than report, the states they refer to. “Are you happy?” triggers a complex set of considerations about self and others that makes it overly simplistic to interpret “yes” as “s/he is happy.” Research suggests that individual dispositions and propensities are actually second-person specific (“happy toward whom?”; “signaling happiness in the presence of whom?”).
Business does not have a clear and cogent way of dealing with these limitations of data and measurement; they are usually ascribed to error and noise. But these limitations are precisely the grist of quantum mechanics—one of the most successful predictive theories humans have developed to describe the world. Quantum mechanics has also produced a model of measurement that deploys concepts like indeterminacy, superposition, entanglement, and observer dependence with great precision. This model can be applied in other contexts, to phenomena that do not occur on the space-time scales of quantum mechanics.
Indeterminacy refers to the impossibility of jointly measuring certain pairs of variables, such as the momentum and position of a particle, with high accuracy. When we survey social and organizational interactions that create measurements we call data, we find many examples of such Heisenberg-complementary variable pairs.
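For reference, the physical version of this limit can be stated compactly. The inequalities below are standard textbook quantum mechanics, included here only as background rather than as part of the authors’ argument: the first is Heisenberg’s relation for position and momentum, the second is the general (Robertson) form for any two observables $A$ and $B$.

\[
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2},
\qquad
\Delta A \,\Delta B \;\geq\; \frac{1}{2}\,\bigl|\langle [A, B] \rangle\bigr|.
\]

The bound on the right is nonzero exactly when the two observables fail to commute, that is, when the order in which they are measured changes the result; this is the formal meaning of a complementary pair.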
Say you are trying to measure, at the same time, both the motivational force of a person’s emotional state and her awareness of that state. If a person’s awareness of being in a state (“excited”) impacts the motivational force of the state (because, for instance, the process of answering a question or filling out an instrument changes the intensity of the state), or vice versa, then we cannot measure both variables accurately at once. And if both variables are relevant to the person’s propensity to act in a certain way in a given situation, then we face a forced trade-off: accuracy about one comes at the expense of accuracy about the other.
Quantum mechanics describes physical systems as being in superpositions of states (spin-up and spin-down), rather than discrete states (spin-up or spin-down). Moreover, when we measure the state of one entity (an electron), our choice of what to measure (say, spin along one axis rather than another) can impact the measured state of other entities (other electrons), even if the second measurement is performed so far away that no signal traveling at or below the speed of light could connect the two events. That’s what physicists mean by “entanglement.”
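In standard notation (again textbook quantum mechanics, included for orientation rather than drawn from the article), a single spin in superposition and a maximally entangled pair of spins can be written as:

\[
|\psi\rangle \;=\; \alpha\,|\!\uparrow\rangle \;+\; \beta\,|\!\downarrow\rangle,
\quad |\alpha|^{2} + |\beta|^{2} = 1,
\qquad
|\Phi^{+}\rangle \;=\; \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\uparrow\rangle + |\!\downarrow\downarrow\rangle\bigr).
\]

Measuring the single spin yields “up” with probability $|\alpha|^{2}$ and “down” with probability $|\beta|^{2}$; measuring either member of the entangled pair fixes what a measurement of the other will show, however far apart the two particles are.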
Data-generating processes (producing “measurements”) in organizations can exhibit similar patterns.
Superposition. Humans frequently experience radical ambivalence with respect to internal states and dispositions to act. Research suggests that ambivalence is in fact not “uncertainty about how you really feel”—which could be resolved by clever observation and inquiry—but a superposition of dispositional, motivational, or emotional states. Measurements will collapse this superposition into a single state corresponding, say, to one motivation or another. Whatever we end up declaring to be data—the stuff we feed to predictive algorithms and take as axiomatic—depends on the measurement process we used to “collapse the superposition.”
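To make the collapse intuition concrete, here is a minimal, purely illustrative simulation, written for this piece rather than taken from the authors’ work; the amplitudes, question frames, and bias parameters are all invented assumptions. The same ambivalent respondent produces systematically different “data” depending on which instrument performs the measurement.

```python
import random

# Toy model of an "ambivalent" respondent: a superposition of two motivations.
# The amplitudes are illustrative assumptions, not measured quantities.
AMPLITUDES = {"intrinsic": 0.8, "extrinsic": 0.6}  # 0.8**2 + 0.6**2 == 1.0

# Each question frame tilts the measurement before the answer "collapses".
# These bias values are hypothetical parameters chosen only for illustration.
FRAME_BIAS = {
    "growth-framed survey": {"intrinsic": +0.15},
    "bonus-framed survey": {"intrinsic": -0.15},
}


def measure(frame: str) -> str:
    """Collapse the superposition into a single reported motivation."""
    p_intrinsic = AMPLITUDES["intrinsic"] ** 2 + FRAME_BIAS[frame]["intrinsic"]
    return "intrinsic" if random.random() < p_intrinsic else "extrinsic"


def survey(frame: str, n: int = 1000) -> float:
    """Share of answers that collapse to 'intrinsic' under a given frame."""
    return sum(measure(frame) == "intrinsic" for _ in range(n)) / n


if __name__ == "__main__":
    random.seed(7)
    for frame in FRAME_BIAS:
        print(f"{frame}: {survey(frame):.0%} report intrinsic motivation")
```

The point is not the particular numbers but that the dataset handed to any downstream analytics already encodes the choice of instrument; two teams running different frames would “discover” different workforces.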
Nonlocality. When we try to create composite estimates of the variables that matter to the success of an organization (like “heed” or “openness”) by measuring individual-level variables (like “attention span” and “reward sensitivity”), the responses we get likely reflect nonlocal interactions, even if information flow is strictly classical in nature. Managers behave and respond in ways that depend on underlying “epistemic networks,” which reflect what they think others in their network think, what they think others think they think, and so forth. They often give answers and produce behaviors that shape and adapt to the perceived social context.
As every CEO knows, initiating a “transformation process” is not just hard but treacherous work: the information you need to do the right thing will be highly dependent on the perceptions, incentives, and behaviors of those on whom you rely to provide that information. The dimensions and rubrics chosen for questionnaires and for interview and focus-group scoring sheets will generate different response patterns depending on the wording, the sequence of questions, their grammatical complexity, and their perceived intent and purpose—as well as the perceived emotional temperature (active/passive, positive/negative, dominant/submissive) of the instrument we use to inquire.
We labor under the comfortingly simple but ultimately dysfunctional illusion of “classical information” and “classical measurement” when we deal with human and organizational phenomena. Insights from quantum epistemology raise powerful doubts about the given-ness of data. But businesspeople need more than reasonable doubt: they need insights and action prompts. How can we leverage “quantum effects” in human organizations? The “quantum epistemology of social phenomena” is in its infancy, but it can already provide a battery of new questions for those who want to understand and shape the process of measurement.
Contextualizing the “Given.” The examples above demonstrate how important it is to specify the process by which data is generated, and, in particular:
Calibrating the Importance of Quantum Effects. These effects will be more critical in some contexts than others, leading us to ask:
Separating the Separable. The examples also show how important it is to consider the pairwise interactions among different measurements, which suggests that we should also ask:
Something is real if it is real in its consequences, argued the sociologist W. I. Thomas in the early 20th century. Every strategist has a little Thomas floating around in her mind: she knows that even talking about measuring a variable that impacts a decision with economic consequences will itself have economic consequences. These considerations are rarely applied to measurement processes—leading us to ask:
Taking Superpositions Seriously. Just as quantum computation harnesses phenomena of superposition to generate useful work, taking the radical indeterminacy of affective states seriously can generate useful strategies for intervening in organizations, by asking:
In order to create value in the fluid and dynamic organizations and business ecosystems of today, we need to better understand how people’s interactions themselves create value. We need to reinvent the measurement tools we use to understand organizations, and quantum mechanics provides a promising basis to achieve this.
Originally published by Scientific American
Managing Director & Senior Partner, Chairman of the BCG Henderson Institute
San Francisco - Bay Area