MIT Technology Review, USA: How do the Apple-Google contact tracing apps work?
- When you enable exposure notifications, your phone starts using Bluetooth to constantly scan for nearby phones doing the same thing. (This happens in the background, and it’s designed not to use much extra battery.)
- When two phones connect, they swap anonymous ID codes. Your phone records how long you spend around the other device and guesses how far away you are, based on a mixture of factors such as how the phone is oriented and how strong the signal from the other handset is.
- If you test positive for covid-19, your health department will ask if you’d like to notify people you may have exposed. If you agree, they’ll give you a code to enter into the app. This code authorizes your phone to send its ID codes—still anonymous—to a central server, which is managed by your state or national health authority.
- Meanwhile, your phone periodically checks the server for new IDs that have been associated with positive tests and cross-references them against the ones it’s collected over the past two weeks.
- If your phone thinks it’s been within six feet of flagged devices for at least 15 minutes in a day, you’ll get an alert that you may have been exposed, including information about what to do next. (A rough sketch of this matching step follows this list.)
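Here is a rough, non-authoritative sketch of that matching step in Python. The six-foot and 15-minute thresholds come from the description above; the `Encounter` class, the function name, and the field names are illustrative assumptions, not the actual Apple-Google framework, which runs this logic entirely on the device.

```python
# Illustrative sketch of the on-device exposure check described above.
# Data structures and names are assumptions for demonstration only.

from dataclasses import dataclass

@dataclass
class Encounter:
    anon_id: str            # rotating anonymous ID received over Bluetooth
    minutes: float          # how long the two phones stayed near each other
    est_distance_ft: float  # rough distance guess from signal strength, etc.

def check_exposure(encounters, positive_ids,
                   max_distance_ft=6.0, min_minutes=15.0):
    """Cross-reference locally stored encounters (roughly the past two weeks)
    against anonymous IDs the health authority has flagged as positive."""
    # Add up close-contact time with any flagged device.
    risky_minutes = sum(
        e.minutes for e in encounters
        if e.anon_id in positive_ids and e.est_distance_ft <= max_distance_ft
    )
    return risky_minutes >= min_minutes

# Example: two short encounters with one flagged device add up to an alert.
log = [
    Encounter("abc123", minutes=10, est_distance_ft=4.5),
    Encounter("abc123", minutes=7, est_distance_ft=5.0),
    Encounter("zzz999", minutes=40, est_distance_ft=2.0),
]
flagged = {"abc123"}  # IDs downloaded from the state or national server
if check_exposure(log, flagged):
    print("You may have been exposed - here's what to do next.")
```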
What does effective contact tracing look like?
- Effective contact tracing, whether it’s done by a human or by an app, is a three-pronged process: identify who has the virus, identify who those people have spent time with, and convince those contacts to stay home.
- Access to testing has remained a fundamental problem—apps can’t work if users don’t get tested for covid-19. And if people do get tests, they need to trust their governments (or tech companies) enough to enter positive results into the app. Finally, everyone who gets an exposure notification needs to follow the advice and actually isolate.
How do contact tracing apps deal with privacy?
- Health departments have struggled to build trust around contact tracing. A recent Pew survey found that 40% of Americans are unlikely to even talk with manual contact tracers. And despite many layers of anonymity, exposure notification apps have earned significant criticism over privacy concerns. They’ve been called out by Amnesty International, consumer protection groups, and even 39 US attorneys general.
- Health departments can use privacy-preserving technology from Google and Apple and still ask users to send them a phone number if they get an exposure notification. While the feature is entirely voluntary—the apps still work if users don’t add their numbers—many governments don’t ask, in an effort to make people feel more secure about privacy.
- This focus on privacy means certain trade-offs. If people were willing to talk to contact tracers after getting an exposure notification, they could help public health experts understand the spread of disease.
Are contact tracing apps working?
- There’s evidence that apps can help by breaking transmission chains and preventing new cases, even without tons of users. They may be useful as part of a “Swiss cheese” model: even though every approach has holes, stacking lots of them together can make a solid barrier. But it’s unclear how much exposure notifications do to change people’s behavior, particularly since it’s difficult to track how many people get exposure notifications and later test positive.
- Many experts are anxiously following the progress of Ireland’s app, which is actively used by more than a third of the adult population. Between mid-July and mid-October, users uploaded 3,000 positive results, representing around 11% of confirmed cases. In October, Ireland became the first country in Europe to reimpose a nationwide lockdown. (The country’s rate of new cases per capita dropped almost immediately, and is now a sixth of America’s rate.)
- Unfortunately, the promise of a smartphone solution conflicts with one of the harshest realities of the pandemic: marginalized groups around the world are contracting and dying of covid-19 at rates far higher than people with greater socioeconomic power. People in these groups are also less likely to be tested in the first place. Smartphone apps may not be as helpful in such communities, particularly if members have good reasons to distrust the government.
What comes next?
- While many countries now have national apps, there hasn’t been a federal effort in the US—which happens to be the world’s coronavirus hot spot. Instead, health departments in individual American states have been forced to create a patchwork of apps.
- Statewide exposure notifications may finally be picking up steam. In September, Google and Apple started letting health agencies in the US offer exposure notifications without building their own apps. The tool, called Exposure Notifications Express, is baked into operating systems from iOS 13.7 on. That means iPhone users can just turn notifications on in the settings menu. Google, meanwhile, has a ready-made app that it customizes for each state.
- One major roadblock has been a fragmented system for managing the IDs, or “keys,” associated with positive tests. Users weren’t getting notifications from people who were on other states’ apps. In August, the Association of Public Health Laboratories built a communal server that makes it much easier for apps to talk to one another and send keys across state lines (sketched below). So far Washington, DC, and 12 states—mostly on the East Coast—have launched apps using this system, and four more have pilot programs.
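To make the cross-state piece concrete, here is a hypothetical Python sketch of how a state app might pull flagged keys from one shared server and check them against locally recorded IDs. The URL, file format, and field names are invented placeholders, not the actual interface of the communal server described above.

```python
# Hypothetical sketch of pulling flagged keys from a shared national server
# so that matches work across state lines. Endpoint and format are invented.

import json
import urllib.request

NATIONAL_KEY_SERVER = "https://example.org/exposure-keys/latest.json"  # placeholder

def fetch_flagged_keys(url=NATIONAL_KEY_SERVER):
    """Download the latest batch of anonymous IDs tied to positive tests."""
    with urllib.request.urlopen(url) as resp:
        batch = json.load(resp)
    # Keys uploaded through any participating state's app land in one pool,
    # so a phone only has to check a single source.
    return set(batch.get("keys", []))

def refresh_local_matches(local_encounter_ids, url=NATIONAL_KEY_SERVER):
    """Return which locally seen IDs appear in the shared positive-key pool."""
    flagged = fetch_flagged_keys(url)
    return local_encounter_ids & flagged
```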
Copyright 2020 Technology Review, Inc.
Distributed by Tribune Content Agency, LLC