COVID-19(84) — Technology as a way out of pandemic or a way into surveillance capitalism

Branka Panic
Dec 4, 2020

This article is a narrative cover for a video prepared for Build Peace 2020: "Social Justice & Pandemic in the Digital Age". Build Peace is a yearly gathering of peacebuilders exploring emerging challenges to peace in a digital era, and peacebuilding innovations to address those challenges. This year Build Peace was a virtual conference held on November 6–8, organized by Build Up and the Bertha Centre for Social Innovation and Entrepreneurship in South Africa. As part of a "Surveillance Capitalism" sub-theme, the conference raised these questions: How is the rise of surveillance capitalism affecting other conflict dynamics? How has the pandemic interacted with surveillance capitalism? How does this affect power dynamics related to socio-economic justice? What initiatives are utilizing technologies and other social innovations to challenge securitization and defend privacy?

I was thrilled to join the conference on behalf of AI for Peace, with a short talk titled "COVID-19(84) — Technology as a way out of pandemic or a way into surveillance capitalism". The following is a narrative for the video recording you can see at the end of the article, with some additional arguments that were not included in the video due to time constraints.

Hello Build Peace Community,

As we are meeting under the conference's surveillance capitalism sub-theme, I want to start by quoting Shoshana Zuboff, author of the seminal work The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Zuboff describes surveillance capitalism as a "new economic order that claims human experience as free raw material for hidden commercial practices of extraction, prediction, and sales" and as a "rogue mutation of capitalism marked by concentrations of wealth, knowledge, and power unprecedented in human history".

I want us to keep this in mind as I reflect throughout this talk on technology as a potential way out of the crisis, or a way toward more surveillance capitalism. We have to acknowledge that this year's Build Peace Conference is happening in the midst of the pandemic, a crisis Zuboff views through the metaphor of a stress test for our democracies. Today I will discuss this stress test, especially in relation to new technologies.

This stress test has revealed our societies' vulnerabilities in so many ways: social and economic inequalities, racism, discrimination, poverty, disinformation and hate speech, institutional weaknesses, and a general lack of trust in institutions. COVID-19 is creating not only a health emergency but also an economic and social one. This brings unprecedented risks to peaceful, just, and inclusive societies. Let me show you how.

At the beginning of the pandemic, for example, security forces enforcing the lockdown in parts of Nigeria killed more people than the coronavirus itself. Activists across the world expressed concern over the excessive use of force around lockdowns and curfews, as well as growing manifestations of authoritarianism, including limitations on the media, civic space, and free expression. Look at the US, where Americans grappling with the rapidly spreading coronavirus purchased more guns in the first month of the pandemic than at any other point since the FBI began collecting data over 20 years ago. Xenophobia is on the rise globally; in Australia, a quarter of human rights complaints are related to COVID-19. Stigma and hate speech are rising, and misinformation is running out of control.

And since we are virtually in South Africa now, let's look more closely at how the pandemic has caused substantial damage to human lives and the economy there, worsening inequalities and hitting low-income earners in informal economies especially hard. A remarkable finding was that income-related health inequality in the COVID-19 period was about six times that measured in 2017. The pandemic has especially aggravated the existing vulnerabilities of asylum-seekers, refugees, and undocumented migrants in South Africa.

Technology somehow got to the very front of the pandemic debate, with claims ranging from how technology is going to save us to how it is creating a dystopian future. There are some good things on the horizon: robots used for delivery of food or medicine, or for disinfection; 3D printing of ventilator parts; artificial intelligence used in drug discovery. Just a couple of days ago, MIT researchers announced that their AI model can identify asymptomatic COVID-19 carriers just by the sound of someone's cough. Social media platforms provided a vital way for people to stay connected while in isolation; daily video calls doubled in March, reaching levels normally witnessed only on New Year's Eve. At the same time, Zoom, Microsoft Teams, and Skype have enabled many of us to continue our work while protecting personal and public health. In the technology and AI fields we saw an incredible amount of cooperation: universities, civil society organizations, and other actors started researching, publishing, and sharing, applying machine learning and NLP to learn more quickly about what is happening to our societies in the pandemic.

Without any doubt, those are all positive applications. But there are also a number of concerning uses of tech. Let's look at one example closely related to the topic of surveillance capitalism: the numerous apps that many countries and private companies started developing at the beginning of the pandemic. At some point, everybody started thinking about some kind of app that would tell people whether they could go out, or tell those who are infected and those who were in contact with them to stay home. And not only apps, but patrol robots and drones, CCTV cameras, facial recognition, and big-data analytics for contact tracing and social control. These things may be tools for protection, but they are also instruments of fear and control. Once something is out there, it is very hard to roll it back, and both companies and authoritarian governments are well aware of this and use it to their benefit. When we look back at surveillance capitalism, we see that it actually originated in another crisis: the 9/11 terrorist attacks and the fight against terrorism that followed, powered by information provided by tech platforms. If governments supported mass digital surveillance to power their anti-terror programs, we need to be extremely conscious of similar practices being introduced through the door of this pandemic crisis. In the context of this crisis, surveillance capitalism could, on one hand, offer part of the solution, while on the other exploit the crisis to expand its influence and power.

What we at AI for Peace did at the beginning of the pandemic was to make sure technology utilization is rooted in community needs. Following this goal, we started the AI Policy pandemic challenge together with our partners at Omdena, with the idea of helping policymakers understand the social and economic impact of the pandemic on the world's most vulnerable populations and make better evidence-based and data-driven decisions. With the majority of the world's population in some form of lockdown due to COVID-19, the loss of income can be disastrous for those already on the economic margins. A global community of 50+ data scientists and policy and field experts came together for a 10-week collaboration to evaluate the direct and indirect economic and social impacts of lockdown measures on vulnerable communities. How did we define vulnerability? As losing access to healthcare, losing access to employment, losing income, and experiencing domestic violence.

Using technology in a crisis is not new. Even at the policy level, data-driven or evidence-based policies are not new; for decades we have been trying to bring this field closer to evidence-based policymaking. What is new at this specific moment, as this crisis unfolds, is the amount of data we are collecting and storing, the processing power of computers, and our capability to collect data in real time. That, in a way, has brought us from data-driven policy to something we may very soon call "big-data-driven policies". There is a big opportunity to use this data, but with it comes big responsibility. This is what AI for Peace is specifically engaged in: bringing an ethical perspective and asking sometimes very difficult questions about fairness, accountability, security, and privacy; bringing a human-centered and human rights approach; and keeping a human in the loop when doing AI projects. Many other organizations are trying to harness the power of data in this crisis in community-centered rather than tech-centered ways: monitoring crime and corruption related to the coronavirus, tracking violence against civilians and protests, supporting vulnerable populations impacted by global cyber threats, tracking gender-based violence during lockdowns, and many more.

While, as I said, there are many positive applications of technology, we must be careful not to be pulled into rogue surveillance. Our biggest threat at this point is basically our own ignorance. As long as we are informed, we will be able to push back. In the way we pushed back against Zoom's anti-privacy practices at the beginning of the pandemic, and against facial recognition technology in policing throughout the US, we have the opportunity to turn this around with the power of people and community organizing. I want to circle back to Shoshana Zuboff, who reminds us that the digital century is still young, surveillance capitalism is young, but democracy is old: "I put my money on democracy," says Zuboff. And we do too. And for democracy we need informed citizens, so thank you once again to Build Up and the Bertha Centre for gathering us at this forum and helping us keep our community informed and ready to push back against surveillance capitalism even in the pandemic crisis. Thank you.

— — — — — — — —

Three amazing projects were presented along with AI for Peace, and I highly recommend them all:

Mona Ibrahim, Policy Engagement Officer, University of Oxford (11:00–21:13)
Joel Gabri, Senior Peace & Technology Officer, Peace Direct (21:13–30:59)
Ona Wang, Researcher on Conflict and Justice (31:00–41:51)

I also highly recommend the keynote speech on “Surveillance Capitalism in South Africa”, by Thami Nkosi, National Coordinator at Right 2 Know.

Branka Panic

Exploring intersections of exponential technologies, peacebuilding, and human rights. Founder and Executive Director of AI for Peace. https://www.aiforpeace.org