Coronavirus Has Shown That Our Data is More Vulnerable Than Ever

Technology has helped preserve some normalcy during the COVID-19 pandemic, allowing us to stay connected with each other and to work from home. But violations of privacy by various service providers have shown that we need to reimagine how our data is used by industry and government.

Editor’s Note: This article is published in partnership with Aapti Institute.

As the world adjusts to a new normal following COVID-19, social contracts – or the frameworks by which societies, governments and nations are organized – are being renegotiated. Countries are now forced to rethink all aspects of people’s lives, from employment to childcare. The British government, for example, has called for a fairer tax system after COVID-19.

The absence of protective policies for vulnerable and poor people is becoming apparent around the world. As a result, people are left to navigate life without health insurance or decent housing, even as the pandemic threatens their lives and livelihoods.

Social contracts in many aspects of life will need to be re-thought – from better supporting the informal economy and migrant labourers, to coping with mass global unemployment. 

For many years, we have been re-negotiating the social contract that governs our data-driven societies. COVID-19 has changed the nature of our relationship with technology. As people stay home and work remotely, digital technology has adapted and people have become more dependent on it. As a result, privacy concerns loom over everything, from contact tracing applications to video conferencing. The video conferencing app Zoom, for example, has seen the rise of ‘Zoombombers’ – hackers who join online meetings to harass participants with racist or graphic taunts. Several countries have since raised concerns over the protection of personal data while using the app.

All this is because countries and societies have long failed to properly address the privacy and data rights of people, even as data gets commodified and citizens are exploited. Many service providers are gathering more information about an individual than is necessary. Smart speakers are listening to conversations and mobile devices are silently tracking the location of their users. Individual information is also sold or repurposed without the user’s consent through a variety of activities. Some of these, such as junk emails and unsolicited telemarketing, are irritants. But others – like doxxing – are harmful to the individual.

With COVID-19 locking down the world, these issues have manifested themselves in an emergency, and the space to negotiate with the government and technology companies has diminished further. While people are currently worried about data collection through contact tracing apps, advertisers have been tracking location data to influence consumers for years. Don't be surprised if Zoom sends (or sells) your data to Facebook without your knowledge.

In many parts of the world, cities and countries have been putting citizens' fates in the hands of algorithms for years now. Privacy concerns around such initiatives become graver when people are not able to control how their data is processed. In the U.S., for instance, after concerns arose over the need to release prisoners for the sake of social distancing, the government began using the controversial PATTERN risk assessment to determine whose release should be prioritized. But this risk assessment tool relies on sensitive data about prisoners and has been shown to be biased.

Privacy violations during COVID-19 have shown that we need to reimagine how our data is used – both during emergencies and in normalcy. Social contracts for data can – and should – look different, especially for sensitive personal information such as health data. Health data regulations across the world have been loosened during COVID-19 to ensure adequate medical treatment. We need to ensure transparency in how this data is used during emergencies – and demand clarity on the limits of these relaxations.

Once the COVID-19 emergency subsides, social contracts on personal data should reprioritize long-term interests over short-term pursuits and build technology with these interests in mind. It's not sufficient to simply rethink the ownership and handling of data; countries must also allow people to exercise real freedom in their interactions with industry (without ads) and with the government (without surveillance). And in all domains, citizens must be free from behavioural manipulation.

If our data is being commodified and sold, data governance structures must distribute the benefits derived from this data equitably across all people and societies. We must also address the data rights of communities and individuals that are most vulnerable.

Technology helps us preserve some normalcy during a pandemic, allowing us to stay connected with each other and to work from home. Individuals will often choose their own health and daily living comforts over data privacy. But we must expect more out of our governance systems. Policies should ensure that data is handled safely and responsibly, during and after global crises. These measures will impact us for many years to come. We must redesign our current data economy with the protection of individual rights in mind.

Aditi Ramesh is a Research Associate at the Aapti Institute. She graduated from the University of Southern California in Los Angeles with a degree in Economics/Mathematics. Her work at Aapti covers policy issues surrounding the data economy and privacy, and she is currently looking at models for data sharing.