Governments across the world are doing everything they can to fight COVID-19. This includes increasing hospital capacity and procuring protective equipment and ventilators, but also using data collected through mobile phones. Used correctly, large-scale location data might help monitor the effectiveness of lockdowns and quarantines, while mobile phone apps collecting close-proximity data could significantly increase our ability to trace the contacts of people who have tested positive.
Several countries, such as Belgium, the UK and the USA, are considering using mobility data from telecom providers or tech giants, while Israel has already (controversially) started. Smartphone apps collecting close-proximity data have been trialed over the last couple of years at DTU in Denmark and at MIT in Boston, and are being deployed by the Government of Singapore.
Many people have reached out to ask whether the data shared with and by governments is anonymous, and whether it could be used effectively without enabling mass surveillance. We thought we’d share our response:
In short: this is a crisis situation and every day counts. However, proven solutions exist to help share data broadly without enabling mass surveillance.
Location data from mobile phones, collected by telcos and by apps on your phone, is highly sensitive and, despite claims of anonymity, easy to re-identify. Our research showed that knowing 4 places and times where someone was is enough to uniquely identify them 95% of the time, and that adding noise doesn’t fundamentally help.
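As a toy illustration of this unicity effect (a deliberate simplification, not the study’s actual methodology or dataset, and all numbers are made up), the sketch below builds synthetic location traces and shows that a handful of known (place, time) points typically matches a single user:

```python
import random

# Toy illustration of re-identification: each user's trace is a set of
# (antenna cell, hour) points; knowing 4 of someone's points is usually
# enough to single them out of the whole dataset.
random.seed(0)

N_USERS, N_POINTS = 10_000, 50
N_CELLS, N_HOURS = 200, 24 * 7  # antenna cells x hours in a week

# Synthetic traces: user -> set of (cell, hour) spatio-temporal points.
traces = {
    u: {(random.randrange(N_CELLS), random.randrange(N_HOURS))
        for _ in range(N_POINTS)}
    for u in range(N_USERS)
}

def matching_users(known_points):
    """All users whose trace contains every known point."""
    return [u for u, t in traces.items() if known_points <= t]

# An adversary learns 4 points of user 0's trace (e.g. home, work,
# a shop, a gym) and searches the "anonymous" dataset for them.
known = set(random.sample(sorted(traces[0]), 4))
candidates = matching_users(known)
print(len(candidates))  # 1: user 0 is uniquely re-identified
```

The point of the sketch is that no names or identifiers are needed: the spatio-temporal points themselves act as a fingerprint.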
This data has however long been used to model and fight the spread of diseases, such as malaria in Kenya or dengue in Pakistan, and to help first responders after natural catastrophes such as for the 2015 Nepal earthquake.
In 2018, recognizing the need to find ways to safely use this data, including in times of crisis, we put together a working group of privacy experts, telecommunication researchers, and practitioners. Together, we agreed on four broad models for using mobility data in privacy-conscious ways. Since then, these models have been used in practice by NGOs like Flowminder (the remote access model) and the OPAL project (the query-based model) for mobile phone data, while centers like CASD have pioneered data access control mechanisms. These models exist and can be used today to enable mobile phone data to be used while preserving privacy.
Contact tracing, which records close proximity between people using Bluetooth, WiFi, or GPS data, could help efficiently notify people that they were earlier in contact with someone now diagnosed with coronavirus and should self-isolate. The typical design for such an app is to have the phone detect nearby phones, e.g. through Bluetooth as in Copenhagen and Boston, or to record GPS positions, with phones sending the data to a centralised server which then warns users when a case is confirmed. While potentially very effective, this also enables the collection of an extremely large amount of sensitive data. In a thought experiment two years ago, we estimated that 1% of London installing a malicious app could allow a digital attacker to track half of the London population, potentially without their consent.
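A minimal sketch of that centralised design (all names and the interface are hypothetical, for illustration only) makes the privacy cost concrete: the server sees every proximity event of every user in the clear.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical centralised contact-tracing server: phones report who
# they saw and when; a confirmed case triggers notifications. Note how
# much the server learns: the full proximity graph of all its users.

class CentralServer:
    def __init__(self):
        self.encounters = defaultdict(list)  # user -> [(other, time)]

    def report_encounter(self, a, b, when):
        # Each proximity event is stored in the clear, for both users.
        self.encounters[a].append((b, when))
        self.encounters[b].append((a, when))

    def confirm_case(self, user, now, window_days=14):
        # Everyone seen within the window should be notified.
        cutoff = now - timedelta(days=window_days)
        return {other for other, when in self.encounters[user]
                if when >= cutoff}

server = CentralServer()
now = datetime(2020, 3, 20)
server.report_encounter("alice", "bob", now - timedelta(days=3))
server.report_encounter("alice", "carol", now - timedelta(days=20))
to_notify = server.confirm_case("alice", now)
print(to_notify)  # {'bob'}: carol's contact is outside the 14-day window
```

The notification logic itself is trivial; the issue is that nothing in this design prevents the operator from reusing the stored proximity graph for anything else.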
Here, we think proven cryptographic techniques could help. Using technology such as oblivious transfer, the app could check whether a person its user encountered in the last 14 days has been diagnosed positive, without the centralised server seeing personal data. These techniques have already been used at scale. For instance, the secure messaging service Signal relies extensively on encryption, secure hashing, and other advanced cryptographic techniques to ensure that its servers learn nothing of their users’ messages or calls, while Google’s Private Join and Compute tool allows two companies to find shared customers without learning each other’s full customer list.
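To give a flavour of how matching without disclosure can work, here is a toy Diffie-Hellman-style private set intersection, a technique related to but distinct from oblivious transfer. This is a didactic simplification with illustrative parameters, not what Signal or Google deploy; real systems use vetted protocols and much larger, carefully chosen groups.

```python
import hashlib
import secrets

# Toy DH-style private set intersection: each party blinds hashed items
# with a secret exponent. Because modular exponentiation commutes,
# doubly-blinded values are equal exactly when the underlying items
# match, so neither side ever sees the other's raw items.

P = 2**127 - 1  # a Mersenne prime; illustration only, far too small for real use

def h(item):
    """Hash an item into the multiplicative group mod P (nonzero)."""
    d = hashlib.sha256(item.encode()).digest()
    return int.from_bytes(d, "big") % (P - 1) + 1

def blind(values, key):
    return {pow(v, key, P) for v in values}

# Each party picks a private random exponent.
key_app = secrets.randbelow(P - 2) + 1
key_server = secrets.randbelow(P - 2) + 1

phone_contacts = {"token-ab12", "token-cd34", "token-ef56"}
diagnosed_tokens = {"token-cd34", "token-9999"}

# Round 1: each side blinds its own hashed items and sends them over.
phone_blinded = blind({h(x) for x in phone_contacts}, key_app)
server_blinded = blind({h(x) for x in diagnosed_tokens}, key_server)

# Round 2: each side blinds the values it received with its own key.
phone_double = blind(server_blinded, key_app)
server_double = blind(phone_blinded, key_server)

# Matching items yield identical doubly-blinded values; nothing else leaks.
n_matches = len(phone_double & server_double)
print(n_matches)  # 1: one encountered contact was diagnosed
```

The app thus learns only how many (or which) of its stored encounter tokens correspond to diagnosed cases, and the server learns nothing about the encounters that did not match.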
Strong technical solutions can help but will not be sufficient to protect what we need the most in times like these: trust. We need strong technical solutions but also transparency and oversight. Why and how the data will be collected and used needs to be clearly communicated and explained. Maybe we are comfortable sharing our data to help governments study the spread of COVID-19; but maybe less so if this data is then surreptitiously used to crack down on individuals not respecting quarantines, or kept and used for unrelated purposes. Oversight by data protection authorities and experts will similarly be essential to ensure trust and proportionality.
Fighting the coronavirus is a major challenge for our societies, one that will require us to leverage every tool and technology at our disposal. But this doesn’t have to mean mass surveillance. Good solutions exist; let’s use them!