Contact-Tracing Apps: What’s Needed to Be an Effective Public Health Tool

Jane Bambauer and Brian Ray recently wrote that “Covid-19 Apps Are Terrible—They Didn’t Have to Be.” They argue that a false sense of priorities—most notably, a “fetishized notion of individual privacy”—kept contact-tracing apps from working well and protecting us. But I’d argue the problem is not that simple.

Bambauer and Ray have much of the big picture right. They’re correct that once the pandemic ends, policymakers around the world will need to revisit what was done right and what was done wrong. As Bambauer and Ray note, policymakers will need to carefully consider trade-offs between individual privacy and public health. (I have more to say on this topic in a book coming out in April.) But the authors miss the fundamental reasons why contact-tracing apps have failed to take hold in the United States and in Europe, and they end up scapegoating privacy protections as the reason the apps failed.

Let’s start with South Korea, which is where Bambauer and Ray’s article begins. They note that South Korea avoided lockdowns and claim that the country’s use of cell site location information (CSLI), closed-circuit television (CCTV), credit card data and old-fashioned interviews enabled better contact tracing than the U.S. managed. That’s correct—but it’s only part of the reason why digital tools for contact tracing have been more effective in South Korea than in the U.S. or Europe.

South Korea’s first reported case of the coronavirus was recorded the same day the first reported case cropped up in the United States. South Korea’s use of contact tracing worked well. It also led to some serious invasions of privacy. Initial efforts, for example, involved public websites detailing the movement patterns of infected individuals. Later on, the government walked back some of the more privacy-invasive measures. The U.S., by contrast, did not draw from all those different streams of information; contact-tracing apps and human tracers were left to do the work without CSLI, CCTV and credit card data. But digital support of contact tracing was not the only difference between South Korea and the U.S.

From the start, South Korea took the coronavirus seriously. This was in part because of its experience with MERS, in which a single ill person infected 28 others and caused illness in four hospitals, leading to a total of 185 infections. Coronavirus testing was ramped up quickly (far faster than in the U.S.), as was contact tracing. Post-MERS, the South Korean government changed regulations so that in the event of a new infectious disease, diagnostic testing equipment could be approved rapidly. The nation also changed its laws. While the 2011 Personal Information Protection Act bans the collection, use and disclosure of personal information without consent, the 2015 Infectious Disease Control and Prevention Act allows South Korea’s Ministry of Health and Welfare to quickly access CSLI and credit card records during an infectious outbreak.

A nondigital aspect of South Korea’s response was also quite important: People in the country exhibited a willingness to isolate, avoid crowds and wear masks. Early on in the pandemic, months before the medical establishment fully understood that the coronavirus spreads from person to person through airborne transmission, 50 percent of South Koreans reported postponing or canceling social events, 42 percent avoided crowded places and 63 percent wore masks outside the home. Even today, after hundreds of thousands of Americans have died from the illness, there isn’t widespread adoption of similar behavior in the U.S. 

Bambauer and Ray point to South Korea’s response, one that steamrolled over potential privacy protections, and blame Google, Apple and privacy advocates for creating a situation in which the apps didn’t provide sufficient information to public health authorities. But the argument has some problems.

For one, the authors get some facts wrong. They say the two companies have reputational problems because of their “aggressive and well-documented use of personal information.” As a general matter, Google does have a privacy problem (full disclosure: I worked at Google as a senior staff privacy analyst in 2013-2014); but Apple largely does not (a source cited by the authors gives Apple an A+ on privacy). 

But more importantly, the authors’ broader critique fails to tackle the most significant problem for contact-tracing apps. The apps were always playing catch-up against the backdrop of the health equity issues that the pandemic sharply exposed.

First of all, contact tracing works based on trust. Whether it is contact tracing for HIV/AIDS in the United States in the 1980s or 2010s, or Ebola in Monrovia in 2015, establishing trust is critical for contact tracing to be effective. Contact tracers do their jobs by asking infected patients what they need, how they can be helped, whether they can safely isolate, and so on. Contact tracers will arrange for someone to get tested, to have social workers visit, or to receive food. Apps don’t provide those services. Establishing that personal connection and offering to provide help builds trust—and that enables contact tracing, which is privacy invasive, to work.

The U.S. is well positioned to have significant trust issues, particularly among disenfranchised communities. Racist actions by the government—including redlining neighborhoods and placing polluting facilities near Black neighborhoods—and by the …