AI and Digital Ethics: ‘There Could Not Be More Overlap’

Author: ISACA Now
Date Published: 7 July 2022

Editor’s note: Carissa Véliz, a keynote speaker at ISACA Conference Europe 2022, to take place 19-21 October in Rome, Italy, and virtually, is an award-winning author, an Associate Professor at the Institute for Ethics in AI, and a Fellow at Hertford College, University of Oxford. Her work focuses on digital ethics, with an emphasis on privacy and AI ethics, as well as practical ethics, political philosophy and public policy. Véliz recently visited with ISACA Now to share her perspectives on AI, digital ethics and more. The following is an edited transcript:

ISACA Now: What do you consider to be the most pressing digital ethics issues that companies need to focus on?
We need to think more carefully about how to design AI and digital tech with ethics in mind from the very start. Ethics can’t be an add-on. That’s the only way in which we will be able to create technology that supports what we value. The alternative to ethical tech is unethical tech because ethics doesn’t happen by chance. Ethics happens by design.

ISACA Now: How much overlap is there between digital ethics concerns and advancements in AI?
There could not be more overlap. Digital ethics invites us to think about the kind of AI that we want to advance. Do we want to advance racist and sexist AI? AI that supports authoritarian regimes? No. We need to advance the right kind of AI, and for that, we need better ethics and governance.

ISACA Now: What are some of the socioeconomic considerations that people tend to overlook when it comes to privacy?
One consideration is that privacy shouldn’t be a luxury for the well-off, but a right afforded to everyone – among other reasons, because if we want equality of opportunity, we need privacy. That means that privacy should be the default setting. And that already involves a paradigm shift.

ISACA Now: You believe personal data should be regulated as if it were a toxic substance – can you elaborate on that?
For the past few decades we have thought about personal data mainly as an asset. But we are starting to learn that personal data is as much a liability as it is an asset. It shares that trait with toxic substances like asbestos: they can be useful, but also very dangerous, and we can’t simply ignore the danger. Every personal data point is a potential hack, leak or lawsuit. When companies collect or store more personal data than they strictly need, they become their own biggest risk.

ISACA Now: How confident are you that governments are capable of sufficiently dealing with some of these data privacy and ethics challenges?
We need governments to change things, but we also need so much more than that. We need the right corporate culture, too. And we need to bring in more voices to the debate: from tech workers to users and journalists and academics of different fields. It’s going to take a collective effort. And we can do it. Other generations did it with their big industries. This is the task of our generation.