The COVID-19 pandemic has brought out some ugly truths about modern-day ageism. A combination of the virus’s properties, an overwhelmed health care system and systemic neglect has taken a brutal toll on the elderly.

Unfortunately, it could get worse — if countries effectively automate ageism by allowing what has happened so far to dictate future decisions about care.

This crisis has highlighted a shocking lack of concern about older people. As one columnist at the British newspaper The Daily Telegraph opined: “COVID-19 might even prove mildly beneficial in the long term by disproportionately culling elderly dependents.” In the United States, nursing homes are particularly vulnerable because the people who work there are so poorly paid that they must hold down multiple jobs, increasing the risk that they will spread the virus. The death count at such facilities is at least 7,000, and more deaths are certainly coming from states such as Florida that have been slow to respond and report.

Older people also lose out when medical personnel, inundated by coronavirus patients, must make difficult decisions about rationing care. In Italy, for example, hospitals had to refuse care to older patients — a practice that undoubtedly increased the mortality rate in that age group.

In short, it’s fair to say that the death rate among the elderly is probably higher than it would be if only physiology were at play. Now consider what will happen if data scientists try to take this experience, bake it into predictive algorithms and apply them in places where the pandemic is still on the rise, or where it flares up as countries attempt to reopen.

It’s possible they’ll recognize that they lack the information needed to build reliable algorithms. The data available are too deeply flawed to calculate overall mortality rates, let alone rates by age. It’s hard even to use other proxy data, such as internet searches for “fever,” to get a sense of the overall infection rate, because lockdowns have so radically changed people’s behavior that we’re all staying home, drinking tea, watching Netflix and googling symptoms. Basically, we’re acting sick.

But I wouldn’t count on humility. Researchers have become far too accustomed to imagining that if they collect enough data — even if it’s incomplete or biased — the sheer volume will provide a more or less comprehensive view. It’s something that they’ve gotten pretty good at — for example, inferring your political party by looking at which articles you repost on Facebook. The flawed data we have will be seen as better than nothing.

The resulting models will lack critical context and nuance. They won’t account for the likelihood of the patient surviving if they’d been given better treatment. Causation will be lost, creating a denuded description of the past — which, in turn, will skew against treating old people in the future.
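To see how this plays out, consider a minimal sketch with entirely hypothetical numbers: if older patients were rationed out of treatment during the crisis, a model that simply learns survival rates from the records will attribute the resulting deaths to age, not to the withheld care.

```python
import random

random.seed(0)

# Hypothetical synthetic records: suppose older patients were far less
# likely to receive full treatment, and treatment strongly affects survival.
# All rates below are illustrative assumptions, not real COVID-19 figures.
def simulate_patient(age_group):
    treated = random.random() < (0.9 if age_group == "young" else 0.3)  # rationing
    base = 0.85 if age_group == "young" else 0.70   # physiology alone
    p_survive = base if treated else base - 0.40    # untreated patients fare worse
    return treated, random.random() < p_survive

records = [(g, *simulate_patient(g)) for g in ["young", "old"] for _ in range(10000)]

# A naive "model" that just learns observed survival rates by age group
# bakes the rationing into its predictions, with no notion of causation.
for g in ["young", "old"]:
    outcomes = [survived for grp, _, survived in records if grp == g]
    print(g, round(sum(outcomes) / len(outcomes), 2))
```

In this toy setup the observed survival gap between the groups is more than twice the gap that physiology alone would produce, yet the naive model treats the whole gap as a fact about age.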

Suppose such a model were used to decide where to send ventilators. The scarce life-saving equipment would go to places where the model saw high percentages of people likely to benefit. This might improperly tip the balance away from hospitals that serve large elderly populations — for example, those near The Villages in Florida — on the grounds that those patients will just die anyway.

Although I’m focusing on older people, the same could apply to any number of disadvantaged groups, such as African Americans, fat people and prisoners. And it wouldn’t be new. A recent study, for example, found that a widely used health care algorithm allocated inadequate resources to black Americans because it relied on data from a history of discrimination, in which less money was spent on black patients than on white patients with the same level of need.
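The mechanism behind that study can be sketched in a few lines (the names and numbers here are hypothetical, not the study’s data): the algorithm ranks patients by past health care spending as a proxy for health need, so patients from a historically under-resourced group score low even when their need is identical.

```python
# Hypothetical patients: (name, true_need, past_spending).
# Pairs have equal need, but unequal historical spending on their care.
patients = [
    ("patient_a", 8, 9000),   # historically well-resourced group
    ("patient_b", 8, 5400),   # historically under-resourced group
    ("patient_c", 5, 6000),
    ("patient_d", 5, 3600),
]

# "Algorithm": give the two slots in a care program to the biggest past spenders.
by_spending = sorted(patients, key=lambda p: p[2], reverse=True)
enrolled = {name for name, _, _ in by_spending[:2]}

print(sorted(enrolled))
```

Here patient_b, whose need equals patient_a’s, is passed over in favor of patient_c, whose need is lower — the past underspending is laundered into a seemingly neutral ranking.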

In the scramble to model COVID-19 spread and fatalities, data scientists — and the officials who use their models — would do better to recognize and admit what they cannot do, rather than jury-rig something that could end up doing more harm than good.


Cathy O’Neil is a Bloomberg Opinion columnist. She is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of “Weapons of Math Destruction.”
