New COVID-19-based contact-tracing and symptom-screening apps are not yet ready for prime time. While the Apple/Google initiative and others seem promising, they currently present too many enterprise challenges around privacy, compliance, employee acceptance and overall effectiveness. This report outlines the challenges with each, along with more secure ways enterprises can ensure the safety and health of employees and those they come into contact with.
Now that organizations are beginning to think about bringing remote employees back to the office, the security team for a municipality is seeing applications rolled out for COVID-19 contact tracing and symptom screening, and would like guidance on how best to assess them. Specifically, the team asks:
- How safe are such apps in general?
- If an app/workplace program is storing people’s names and temperatures, does this fall under the Health Insurance Portability and Accountability Act (HIPAA)?
- Should we consider using such apps or build our own?
Challenges of Contact Tracing
Part of re-entry planning involves controversial methods, such as fever scanners and employee-tracing apps to try to detect infected employees and inform/protect others from possible exposure. One key problem is contact tracing, which seeks to identify the set of people potentially exposed to someone with the virus. How can we achieve this goal without simply building a massive central database that tracks everyone’s whereabouts at all times (a solution that has obvious risks and drawbacks)?
Other big challenges include the collection of protected health information (PHI), potential invasion of privacy and the stigma that could come with workers testing positive for COVID-19 and possibly being treated differently by coworkers, regardless of whether they were treated and recovered.
Contact-Tracing Apps on Mobile Devices
There are many different proposals for Bluetooth-based proximity tracking apps on mobile devices, but at a high level, they all begin with a similar approach. The app broadcasts a unique identifier over Bluetooth that other, nearby phones can detect. To protect privacy, many proposals, including the Apple/Google method, rotate each phone’s identifier frequently to limit the risk of third-party tracking.
When two users of the app come near each other, both apps estimate the distance between them using Bluetooth signal strength. If the apps estimate the users are less than six feet (or two meters) apart for a sufficient period of time, the apps exchange identifiers, and each app logs an encounter with the other’s identifier. The users’ location is not necessary, because the application need only know if the users are sufficiently close together to create a risk of infection.
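The broadcast-rotate-and-log flow described above can be sketched in a few lines of Python. This is a simplified illustration, not any real app's code: the class name, the RSSI threshold standing in for the six-foot distance estimate, and the minimum-contact duration are all assumptions chosen for readability.

```python
import secrets
import time

ROTATION_INTERVAL = 15 * 60      # rotate the broadcast identifier every 15 minutes
PROXIMITY_RSSI_THRESHOLD = -65   # assumed signal-strength cutoff approximating ~2 meters
MIN_CONTACT_SECONDS = 10 * 60    # assumed minimum exposure time before logging an encounter

class ProximityApp:
    """Illustrative model of a Bluetooth proximity-tracking app."""

    def __init__(self):
        self.identifier = secrets.token_hex(16)
        self.last_rotation = time.time()
        self.encounters = []     # (peer_identifier, timestamp) log kept on the device
        self.pending = {}        # peer_identifier -> time first seen within range

    def maybe_rotate(self, now):
        # Rotating the identifier limits the risk of third-party tracking.
        if now - self.last_rotation >= ROTATION_INTERval if False else ROTATION_INTERVAL:
            self.identifier = secrets.token_hex(16)
            self.last_rotation = now

    def on_advertisement(self, peer_identifier, rssi, now):
        # A stronger (less negative) RSSI roughly means a closer device.
        if rssi < PROXIMITY_RSSI_THRESHOLD:
            self.pending.pop(peer_identifier, None)  # peer moved out of range
            return
        first_seen = self.pending.setdefault(peer_identifier, now)
        if now - first_seen >= MIN_CONTACT_SECONDS:
            # Sustained close contact: record the peer's identifier locally.
            self.encounters.append((peer_identifier, now))
            del self.pending[peer_identifier]
```

Note that only identifiers and timestamps are logged; as the text observes, location is never needed, because proximity and duration are all the risk model requires.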
If a user informs the app they are infected with COVID-19, the app then notifies other users who logged encounters with them of their potential exposure. This is where the different app designs significantly diverge.
Some apps rely on one or more central authorities with privileged access to information about users’ devices. For example, TraceTogether, developed for the government of Singapore, requires all users to share their contact information with the app’s administrators. In this model, the authority keeps a database that maps app identifiers to contact information. When users test positive, their app uploads a list of all the identifiers they came into contact with over the past two weeks. The central authority looks up those identifiers in its database and uses phone numbers or email addresses to reach out to other users who may have been exposed. Unfortunately, this puts a lot of private user information into the hands of the government.
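The centralized model can be summarized in a short sketch. This is a deliberately simplified illustration of the design pattern, not TraceTogether's actual protocol; the class and method names are hypothetical.

```python
class CentralAuthority:
    """Illustrative model of a centralized contact-tracing registry."""

    def __init__(self):
        self.registry = {}   # app identifier -> user contact info (phone or email)

    def register(self, identifier, contact_info):
        # Every user must hand over contact details at enrollment --
        # this mapping is the privacy cost of the centralized design.
        self.registry[identifier] = contact_info

    def report_infection(self, contact_identifiers):
        # An infected user's app uploads the identifiers it logged over the
        # past two weeks; the authority resolves them to real people to notify.
        return [self.registry[i] for i in contact_identifiers if i in self.registry]
```

The key point the sketch makes visible: the authority holds the identifier-to-person mapping, so it can re-identify anyone whose identifier appears in an uploaded contact log.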
This model creates unacceptable risks of pervasive tracking of individuals’ associations and should not be employed by other public health entities. IANS believes these applications are not ready for production and could bring legal ramifications surrounding privacy violations and misinformation.
The Apple/Google contact-tracing application uses a different approach (see Figure 1).
According to the app’s developers, the system takes a number of steps to prevent people from being identified, even after they’ve shared their data. While the app regularly sends information out over Bluetooth, it broadcasts an anonymous key rather than a static identity, and those keys cycle every 15 minutes to preserve privacy. Even once a person shares that they’ve been infected, the app will only share keys from the specific period in which they were likely contagious.
Crucially, there is no centrally accessible master list of matched phones, whether they belong to users who test positive or not. That’s because the phones themselves are performing the cryptographic calculations required to protect privacy. The central servers only maintain the database of shared keys, rather than the interactions between those keys.
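The decentralized matching described above can be sketched as follows. This is an assumption-laden simplification: the real Apple/Google protocol derives rolling identifiers from temporary keys using HKDF and AES, while plain HMAC stands in here, and all names are illustrative.

```python
import hashlib
import hmac
import secrets

def rolling_identifier(daily_key, interval):
    # Derive the identifier broadcast during one 15-minute interval from
    # that day's key. (HMAC is a stand-in for the real key-derivation steps.)
    return hmac.new(daily_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

class Phone:
    """Illustrative model of on-device exposure matching."""

    def __init__(self):
        self.daily_key = secrets.token_bytes(16)
        self.heard = set()    # identifiers observed from nearby phones

    def broadcast(self, interval):
        return rolling_identifier(self.daily_key, interval)

    def check_exposure(self, published_keys, intervals):
        # The server only publishes the daily keys of users who reported
        # infection. Each phone re-derives the identifiers those keys would
        # have produced and compares them against what it heard -- the
        # matching never leaves the device.
        for key in published_keys:
            for interval in intervals:
                if rolling_identifier(key, interval) in self.heard:
                    return True
        return False
```

Because the server sees only shared keys and never the encounter logs, no central party learns which phones were near which others.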
However, several potential weaknesses have been identified with the application:
- In crowded areas, it could flag people in adjacent rooms who aren’t actually sharing space, making people worry unnecessarily.
- It may not capture the nuance of how long someone was exposed – working next to an infected person all day, for example, will expose you to a much greater viral load than walking by them on the street.
- It depends on people having apps in the short term and up-to-date smartphones in the long term, which could mean it’s less effective in areas with lower connectivity.
It’s also a relatively new program, and Apple and Google are still talking to public health authorities and other stakeholders about how to run it. It doesn’t help that Google’s past work on Project Nightingale, a health data aggregation and search project with the Ascension healthcare system, has already created many concerns around Google’s attitude toward privacy and healthcare regulations.
Bluetooth Apps Could Have Limited Usefulness
Bluetooth-based apps probably can’t replace old-fashioned methods of contact tracing, which involve interviewing infected people about where they’ve been and who they’ve spent time with, but they could offer a high-tech supplement using a device that billions of people already own.
The only way to make it work, however, is with informed, voluntary and opt-in consent. That is the fundamental requirement for any application that tracks a user’s interactions with others in the physical world. Moreover, people who choose to use the app and then learn they are ill must also be able to choose whether to share a log of their contacts.
Governments must not require the use of any proximity application, nor should there be informal pressure to use the app in exchange for access to government services. Similarly, private parties must not require the app’s use to access physical spaces or obtain other benefits.
Fever scans present a different challenge. Organizations considering them may want to have a medical professional perform them manually and confidentially, without storing the data. This may help reduce compliance issues, as well as employee pushback and any stigma associated with the results.
It’s too soon to consider using any of these new COVID-19-based applications, at least until we have more answers about privacy, accuracy and data storage. The apps all claim to use strong encryption and store little PHI, but to date there is no real-world data weighing their risks and costs against their benefits. Another aspect to be aware of is that end users can simply shut off the app at any given time (see Figure 2).
Cisco is rolling out its own app to help with the employee screening process, but it has yet to be used in production.
Any views or opinions presented in this document are solely those of the Faculty and do not necessarily represent the views and opinions of IANS. Although reasonable efforts will be made to ensure the completeness and accuracy of the information contained in our written reports, no liability can be accepted by IANS or our Faculty members for the results of any actions taken by the client in connection with such information, opinions, or advice.