Campbell’s Law: Why your metric will be gamed

The more any quantitative social indicator is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.

–Donald Campbell, 1979

Campbell, originally an experimental psychologist and trained in experimental method as was customary in his field, soon realized that true experiments could not be done in any of the social sciences because no one would let social scientists treat human beings the way laboratory scientists treated rats and other experimental animals. You couldn’t manipulate people that way because they were free enough to reinterpret the conditions of any experiment and because the institutions where experiments were done were sensitive to the public relations, if not always the moral, issues involved.

[…]

An experimenter might choose a condition for the social program to be tested, but the subjects of the experiments–organizations and the people responsible for them–inevitably and quickly understood how the numbers their actions piled up could be used in ways that might help or hurt their interests. And so, just as routinely, they did their best to make sure that the numbers came out the way that gave the best outcome for them, manipulating them in ways their organizational positions and knowledge made available to them. Who knew better how to do that? And that’s been a robust finding. It’s what people in organizations do, if they can (and usually they can).

–Howard S. Becker, Writing for Social Scientists (find in a library)

Evicted: Matthew Desmond’s Pulitzer Prize-Winning Ethnography of Tenants, Landlords, and Eviction in an American City

I’ve always felt that my first duty as an ethnographer was to make sure my work did not harm those who invited me into their lives. But this can be a complicated and delicate matter because it is not always obvious at first what does harm.

With all the talk of data science, big data, and computational modeling, it’s increasingly important to highlight exceptional examples of rigorous research that uses other methods; those methods are no less important in our quest to better understand human social systems.

Perhaps the best book I read in 2018 was Matthew Desmond’s Pulitzer Prize-winning ethnographic study of tenants and landlords, Evicted: Poverty and Profit in the American City (find in a library).

In “About This Project,” Desmond details how his ethnographic study ultimately led to a mixed-methods research inquiry. He describes designing a survey, the Milwaukee Area Renters Study (MARS), to assess the experience of renters in the Milwaukee rental market, and notes that his measurement (i.e., survey) items were greatly influenced by what he learned during his ethnography, which, in my experience, is a critical feature of good survey research: qualitative inquiry driving quantitative measurement (and vice versa). He notes, for example, that simply asking, “Have you ever been evicted?” is likely to undercount evictions, since “eviction” connotes sheriffs and courts for many respondents; a better measurement item would assess the loss of a rental home due to nonpayment or other reasons.
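
To make that wording point concrete, here is a minimal sketch in Python, using hypothetical field names and toy responses rather than the actual MARS questionnaire, of how a narrow “Have you ever been evicted?” item undercounts relative to a broader battery about losing one’s home:

```python
# Illustrative sketch only: hypothetical field names and toy responses,
# not the actual MARS questionnaire.

def narrow_eviction(response: dict) -> bool:
    """Counts a move only if the respondent themselves labels it an 'eviction'."""
    return response.get("self_reported_eviction", False)

def broad_forced_move(response: dict) -> bool:
    """Counts any involuntary loss of a rental home, formal or informal."""
    return any([
        response.get("formal_eviction", False),         # court order, sheriff
        response.get("landlord_told_to_leave", False),  # informal eviction
        response.get("moved_to_avoid_eviction", False),
        response.get("building_condemned", False),
    ])

# Toy respondents: only the first would answer "yes" to the narrow question.
responses = [
    {"self_reported_eviction": True, "formal_eviction": True},
    {"landlord_told_to_leave": True},
    {"moved_to_avoid_eviction": True},
    {},  # voluntary move, counted by neither definition
]

print(sum(map(narrow_eviction, responses)))    # 1
print(sum(map(broad_forced_move, responses)))  # 3 -- the narrow item undercounts
```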

The multiple methods and different data sources used in this book informed one another in important ways. I began this project with a set of questions to pursue, but lines of inquiry flexed and waned as my fieldwork progressed. Some would not have sprung to mind had I never set foot in the field. But it was only after analyzing court records and survey data that I was able to see the bigger picture, grasping the magnitude of eviction in poor neighborhoods, identifying disparities, and cataloguing consequences of displacement. My quantitative endeavors also allowed me to assess how representative my observations were. Whenever possible, I subjected my ground-level observations to a kind of statistical check, which determined whether what I was seeing on the ground was also detectable within a larger population.
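
As a rough illustration of what such a statistical check can look like, the sketch below, with placeholder counts rather than any figures from the book, compares a pattern observed in a handful of fieldwork cases against a confidence interval estimated from a larger survey sample:

```python
# Minimal sketch of checking a field observation against survey data.
# All counts below are hypothetical placeholders, not figures from the book.
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Point estimate and normal-approximation 95% confidence interval for a proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# Suppose 6 of 10 fieldwork households showed the pattern of interest,
# and the survey recorded it for 580 of 1,000 respondents.
field_rate = 6 / 10
p, lo, hi = proportion_ci(successes=580, n=1000)

print(f"survey estimate: {p:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
print("detectable in the larger population and consistent with the fieldwork?",
      lo > 0 and lo <= field_rate <= hi)
```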

Desmond also highlights the importance, when conducting qualitative research, of verifying information whenever possible through alternative sources, in particular official records such as those collected by social services and the courts.

I analyzed two years’ worth of nuisance property citations from the Milwaukee Police Department; obtained records of more than a million 911 calls in Milwaukee; and collected rent rolls, legal transcripts, public property records, school files, and psychological evaluations.

The two surveys that Desmond designed following his fieldwork both achieved very respectable response rates: 84 percent for the MARS survey and 66 percent for the Milwaukee Eviction Court Study.

Desmond is also candid, in multiple places, about his own involvement in the events he was studying. He describes two instances in particular: paying for the rental of a U-Haul truck, and lending a mother money to purchase a stove and refrigerator ahead of an anticipated visit from Child Protective Services. He also explains that he occasionally provided transportation for individuals looking for housing.

Researchers, particularly those working in field settings (which includes organization scientists), rarely seem to do as good a job as Desmond at examining the potential biases introduced by the researcher’s mere presence. In survey methodology training, we’re explicitly taught how the presentation of measurement items can affect response data: whether the survey is incentivized and, if so, what type of incentive is used (over-incentivizing participation, for example, generally reduces data quality); whether surveys are administered electronically, on paper, or by a field interviewer; and even the colors and fonts used to present items to respondents.

In light of what we know about survey measurement, it’s a tall order to disentangle and fully understand the bias introduced by a researcher doing ethnographic fieldwork, so I was pleased that Desmond took it on in Evicted, and did so in an accessible and highly engaging way (in “About This Project”).

Quantitative social scientists could learn a great deal from our colleagues with more experience using qualitative methods and inquiry.

Desmond practices open science and promotes re-use of his data:

I have made all survey data publicly available through the Harvard Dataverse Network.

And he urges future researchers to test whether his extensive findings replicate in other geographic areas:

That said, it is ultimately up to future researchers to determine whether what I found in Milwaukee is true in other places. A thousand questions remain unanswered. We need a robust sociology of housing that reaches beyond a narrow focus on policy and public housing. We need a new sociology of displacement that documents the prevalence, causes, and consequences of eviction. And perhaps most important, we need a committed sociology of inequality that includes a serious study of exploitation and extractive markets.

Yet Desmond questions what, in the context of a human socio-economic system like the one connecting landlords and tenants, we actually mean when we talk of “generalizing” findings or replicating them elsewhere:

Still, I wonder sometimes what we are asking when we ask if findings apply elsewhere. Is it that we really believe that something could happen in Pittsburgh but never in Albuquerque, in Memphis but never in Dubuque? The weight of the evidence is in the other direction, especially when it comes to problems as big and as widespread as urban poverty and unaffordable housing. This study took place in the heart of a major American city, not in an isolated Polish village or a brambly Montana town or on the moon.

Finally, Desmond describes the power of storytelling in conveying research:

Ethnographers shrink themselves in the field but enlarge themselves on the page because first-person accounts convey experience—and experience, authority.

While the product of Matthew Desmond’s extensive ethnographic fieldwork, follow-up research, and synthesis stands on its own and should be read by every social scientist in the U.S., I cannot do better than to close with Desmond’s own words at the end of his methodological documentation, revealing the intense interplay between researcher and subjects in any ethnography:

The harder feat for any fieldworker is not getting in; it’s leaving. And the more difficult ethical dilemma is not how to respond when asked to help but how to respond when you are given so much. I have been blessed by countless acts of generosity from the people I met in Milwaukee. Each one reminds me how gracefully they refuse to be reduced to their hardships. Poverty has not prevailed against their deep humanity.

I highly recommend Matthew Desmond’s Evicted: Poverty and Profit in the American City (find in a library).

Measurement: Validity, Reliability, Accuracy (The Basics)

Validity. Data have validity if they accurately measure the phenomenon they are supposed to represent.

Reliability. Data have reliability if similar results would be produced if the same measurement or procedure were performed multiple times on the same population.

Accuracy. Data are accurate if estimates from the data do not widely deviate from the true population value.

So basic, but so important.

From the National Science Foundation – Science & Engineering Indicators 2018 Methodology.
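
A toy simulation, with made-up values only, can make the difference between reliability and accuracy concrete; validity is harder to reduce to arithmetic, since it turns on whether the instrument captures the intended concept at all:

```python
# Toy illustration of reliability and accuracy using simulated data only.
import random
random.seed(0)

TRUE_VALUE = 0.30    # the population quantity the measurement is supposed to capture
SAMPLE_SIZE = 2000

def measure_once() -> float:
    """One noisy, survey-style estimate of the population proportion."""
    return sum(random.random() < TRUE_VALUE for _ in range(SAMPLE_SIZE)) / SAMPLE_SIZE

estimates = [measure_once() for _ in range(20)]

# Reliability: repeating the same procedure on the same population yields similar results.
spread = max(estimates) - min(estimates)

# Accuracy: estimates do not deviate widely from the true population value.
bias = sum(estimates) / len(estimates) - TRUE_VALUE

print(f"range across 20 repeated measurements: {spread:.3f}")
print(f"average deviation from the true value: {bias:+.4f}")
```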
