Over the next few days, public health officials had to trace more than 7,000 people who had recently visited nightclubs in the same district. The task would have been impossible with conventional, manual contact tracing, and it was a perfect opportunity to use the technology-enabled track-and-trace model that South Korea had adopted with enthusiasm.
There was one complication, though: Itaewon is home to many LGBTQ-friendly nightclubs. News reports dwelled on that detail and included the age, gender, location, and movements of the Covid-19 patient who had been there, all too predictably fueling a surge of anti-gay rhetoric on social media. Members of South Korea's LGBTQ community now fear being forcibly outed, and some 3,000 people who may have been exposed in Itaewon have not come forward for testing.
Over the last several years, our lab has been developing tools, including mobile apps, analytics platforms, and novel diagnostics, that let individuals report symptoms, confirm diagnoses, and share epidemiological data in real time. Our experiences have shown us the power of these technologies, as well as their dangers, even in low-stakes contexts.
This was just a simulation, of course, but it shows how technology designed to help us work together can instead drive us apart.
Outbreaks expose and amplify the cracks in trust among us. They are a crucible, a chaotic and unpredictable environment in which an insidious threat can cause leaders and citizens alike to revert to basic instincts for self-protection. Mistrust, paranoia, and suspicion can intensify, perverse incentives can take hold, and blame and stigma can thrive, creating a toxic and dangerous culture.
The death of George Floyd at the hands of police, and the consequent protests across the country, have laid bare the discriminatory profiling and brutality that many underrepresented groups faced long before Covid-19. It is reasonable to fear that information gathered from disease-tracking technologies could be misused to target and harm the same vulnerable groups. Afraid of data being weaponized against them, vulnerable communities might opt out of using outbreak surveillance technologies altogether, further increasing their risk of infection.
Sharing outbreak data saves lives. But its potential to do so can be realized only if the global community ensures that contact tracing technologies are developed under principles that protect and serve the most vulnerable among us. Ideally, these principles would build on those already laid out to ensure privacy in Bluetooth contact tracing, such as the Data Rights for Exposure Notification and the Contact Tracing Joint Statement, while expanding to cover broader uses of data, such as self-reporting of symptoms and risk prediction from aggregated health data.
Among the key principles:
- Participation should be completely voluntary, with the option to stop at any time.
- Trust is essential. No one should fear that participation will lead to being tracked, deported, or worse.
- No one should be punished in any way or stigmatized for behavioral information reported through an app.
- No one’s data should be bought or sold by others.
- And no one who wants to participate should be left out, either for lack of access to technology or fear of consequences.
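To make these principles concrete, it helps to see how decentralized Bluetooth exposure notification keeps identity off any central server. The sketch below is a deliberately simplified illustration, not the actual protocol used by any deployed system (real designs such as the Google–Apple Exposure Notification framework use dedicated key-derivation and encryption steps); the `Phone` class and its methods are hypothetical names chosen for this example. The core idea it demonstrates is real, though: each device broadcasts short-lived pseudonymous identifiers derived from a secret key that never leaves the device, and exposure matching happens locally against keys that diagnosed users choose to publish.

```python
import hashlib
import os


class Phone:
    """Toy model of a device in a decentralized exposure-notification scheme.

    The daily key stays on the device. Nearby phones only ever see
    rotating pseudonymous identifiers, and matching against the keys of
    diagnosed users is done locally, so no authority learns who met whom.
    """

    def __init__(self):
        self.daily_key = os.urandom(16)  # secret; never transmitted
        self.heard = set()               # identifiers observed nearby

    def rolling_id(self, interval: int) -> bytes:
        # A fresh pseudonymous identifier for each broadcast interval,
        # derived from the secret key; unlinkable without that key.
        data = self.daily_key + interval.to_bytes(4, "big")
        return hashlib.sha256(data).digest()[:16]

    def hear(self, identifier: bytes) -> None:
        # Record an identifier broadcast by a nearby phone.
        self.heard.add(identifier)

    def check_exposure(self, published_keys, intervals=range(144)) -> bool:
        # Re-derive identifiers from keys voluntarily published by
        # diagnosed users and compare them against local observations.
        for key in published_keys:
            for i in intervals:
                data = key + i.to_bytes(4, "big")
                rid = hashlib.sha256(data).digest()[:16]
                if rid in self.heard:
                    return True
        return False


# Two phones spend time near each other; a third does not.
alice, bob, stranger = Phone(), Phone(), Phone()
alice.hear(bob.rolling_id(10))

# Bob tests positive and voluntarily publishes his daily key.
# Only Alice, who actually encountered him, gets an exposure match.
print(alice.check_exposure([bob.daily_key]))     # True
print(stranger.check_exposure([bob.daily_key]))  # False
```

Participation maps directly onto the principles above: broadcasting, listening, and publishing a key after diagnosis are each separate, voluntary steps, and opting out at any point simply means that step never happens.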
Pardis Sabeti is a professor of organismic and evolutionary biology and immunology and infectious diseases at Harvard University and a researcher at the Broad Institute of MIT and Harvard. Andres Colubri is a computational scientist in the Sabeti lab and an incoming assistant professor of bioinformatics and integrative biology at the University of Massachusetts Medical School.
Contact tracing technology must protect people from discrimination as well as disease