Digital ethics: The good and bad of tech
As ever-smarter technology proliferates, so too does the potential for its misuse. From a flawed algorithm that may have led to the wrongful deportation of millions of people in the U.K. to Cambridge Analytica's harvesting of 87 million users' data without their consent, a number of high-profile incidents show how unchecked use of technology can lead to negative consequences.
That's why, now more than ever, digital ethics matter.
"What digital ethics asks us to do is to take a harder look, before we create the technology, at … those potential impacts, throughout the design, the creation, the implementation, the deployment, and the ongoing management of it," says Jen Rodvold, head of digital ethics and tech for good at IT consultancy Sopra Steria.
But it's not just about the technology, she points out: "Digital ethics should be a part of all decision-making because there's a huge amount of both potential risk and opportunity. This is about achieving organizational goals more broadly."
In this episode of Technology Untangled, Rodvold joins Dave Strong, pre-sales director for hybrid IT at Hewlett Packard Enterprise, and Mick Jackson, CEO of WildHearts Group, in a discussion on the imperative of digital ethics and why doing the right thing is good for business.
Edited excerpts from the podcast, hosted by Michael Bird, follow:
Michael Bird: Digital ethics is a field of study that looks at how technology is shaping and will shape our political, social, and moral existence.
Dave Strong: Over the last three years, we've seen a real explosion in emerging technologies, such as artificial intelligence, and the amount of data that's out there and what we're doing with data to bring insight.
Bird: Normally on Technology Untangled we'd kick off with a little bit of history, but the story of ethics is, at its core, the story of innovation itself―from the Internet, the smartphone, and social media to AI, automation, and data analytics.
Strong: If we wind all the way back and we look at the rise of the Internet, [we see] that actually the Internet has come from a defense background; it's come from a government background. And in the Cold War, that type of technology was used to control nuclear weapons and to also model what would happen in a nuclear war.
As we roll forward to today, as we see the explosion of social media and how we interact from a digital ethics perspective, we need to be thinking about how much data we're giving away.
If I like the application, I download it, I click it, I use it. Rarely do I think, "What have I just signed up to [and] the data that I've just given away?" And we're living in that type of very immediate consumption-driven culture. We forget about that to a certain extent.
We've seen artificial intelligence and the technology accelerate at such a pace, and there are lots of quotes that say that the pace of change has never been so quick and it will never be this slow again. Now is the right time for us to really think as organizations as to what we should do.
Growing focus on digital ethics
Jen Rodvold: Technology is absolutely everywhere in our lives, and so the impacts are greater [and] our relationship to it and our understanding of it is greater.
Bird: Now, for Jen, it's no surprise that we're hearing more and more about the ethics of technology right now.
Rodvold: There are quite a few reasons that we're seeing the bigger focus right now, and we absolutely are. We've been watching this in my practice just over the last two years, seeing the focus grow and the attention grow on the ethical consequences of digital technology.
I mean, for four or five years, maybe a little bit more, there's been a lot of talk and academic research around particularly the ethics of AI, and that for most of us seems probably abstract and too far in the future to worry about in our day-to-day lives. But what's happened, I think since really 2016 and then more in 2018, is I've seen some really bad examples play out in public.
Bird: Ah, yes, 2016 to 2018. Now let's cast our minds back to some of their best bits.
In May 2017, it was revealed that a flawed algorithm may have led the U.K. government to wrongly deport millions of people.
In September 2017, Facebook was accused of silencing reports of ethnic cleansing in Myanmar.
In March 2018, a self-driving Uber car killed a pedestrian in Arizona.
In April 2018, Google employees protested against the company's involvement in the U.S. government's Project Maven, also known as the Algorithmic Warfare Cross-Functional Team.
In July 2018, the ACLU demonstrated that Amazon's facial recognition technology, used by local law enforcement, incorrectly matched 28 sitting members of Congress to criminal mugshots.
In September 2018, a security breach exposed 50 million Facebook users' data.
And who could forget the big one?
A wake-up call
Strong: I remember I was on my way to a customer meeting in London, and I walked past this building and there were a lot of press outside. Later in the day, I was watching the news and realized I'd walked past the Cambridge Analytica headquarters in London.
Bird: Cambridge Analytica, the scandal that frankly shocked the world. Although perhaps, in hindsight, we should have seen it coming.
Strong: What brought it to light ... they had harvested over 87 million Facebook users' accounts through an external app, and it came from a personality quiz. But that type of data harvesting wasn't new; [U.S. President Barack] Obama's campaigns had pioneered the use of data, and targeted data, to give different messages to individuals. I just think Cambridge Analytica took it one step too far. They took data that people didn't know was being harvested and manipulated it to a point where contradictory information was given to me. That's unethical use of data at its highest.
Personally, what amazed and almost humbled me was the ease with which it was done, the high-profile political activities that were associated with it, and the platforms that were involved. And I think, for me personally, it made me really reflect as a person in the technology industry. We [must] really think about everything [we] do, in everything we move forward with, to make sure that we are protecting the ethical side of data.
And it has to be at the center and at the heart of everything we do.
Bird: That was in 2016, but the well-documented polarization of society via social media has continued. However, the Cambridge Analytica scandal did have one good side effect: We've started thinking a lot more about the value and power of our own personal data and how we feel about organizations that abuse that power.
Fake news, anyone?
Rodvold: Facebook is consistently in the news because of issues with fake news and the impact that's having on democracy and elections. So I think it's now just really coming into the public consciousness in a different way, when before it was a bit more abstract. It's having a real impact on our lives, and elections are a good example of this, as is people's understanding that there are issues with facts in the media we consume every day ... and, interestingly, more awareness in the public around how technology and technology companies use data, which we haven't really been paying attention to collectively for the last decade.
Strong: There is a balance. There is the great stuff we can do, and then there are the things that have come to the press and certainly caught the limelight, focusing people on what we shouldn't be doing with these technologies.
Bird: The power and innovations of AI and the way we use data are at the crux of the big ethical quandaries of the past few years―firstly, because they're increasingly visible in our everyday lives, and secondly, because they have given us capabilities beyond our wildest dreams. Think about social media newsfeeds, online grocery shopping, ordering a takeaway from an app, blurring your background on a conference call, or picking what to watch next on Netflix―all of these rely on AI and personal data in one way or another.
Strong: I don't think we've ever been in a position as we are now to have real actionable insight that can make such a positive impact on our society. Data can do some real life-changing things, and the healthcare industry is a great example of that.
There's a hospital in Norway that has been working to reduce the mortality rates of high-risk heart and liver operations. The way they've done that is they've taken very traditional 2D MRI scans, they've taken lots of data, and they've brought all of this data together. And then they've used other exciting emerging technology, such as augmented reality, to create 3D images of hearts and livers. And that has allowed them, on really complicated operations, to use that data globally: You get surgeons from around the world to collaborate and figure out how to operate.
Bird: AI has the potential to transform our lives for the better. So why all of these unintended consequences? Why do we so often get it wrong?
The human element
Rodvold: I think it's easy to forget that humans create technology. They aren't created in a vacuum. And I think most technology tools that we use in everyday life have been created with good intentions. But what digital ethics asks us to do is to take a harder look, before we create technology, at the potential negative consequences, and then make sure that we're maintaining that focus on those potential impacts throughout the design, the creation, the implementation, the deployment, and the ongoing management of it.
But it's not inherently good or bad. That's something that humans have to bring to it themselves.
Bird: As we saw in the first episode on AI, things like bias in algorithms come from human error and the way that we code our programs. Even autonomous cars are operating under parameters designed by humans, and herein lies the problem.
Tech is fundamentally neutral, but new tech has unforeseen consequences, and willful ignorance isn't going to cut it. We need to get better at seeing potential flaws, designing them out, and protecting individuals. But how on earth do we do that?
Strong: There are lots of examples out there of where we certainly need more control, and then we come onto the whole complex subject that is GDPR. How do we protect that data? The GDPR goes some way to protect an individual, but we just can't rely on that alone.
I think for me―and I look at grassroots, I look at right at home―we should really be educating people on how to protect themselves, [whether] at school, in the workplace, or in your home environment.
Rodvold: Thoughtful regulation is a good thing; there's no problem with that. But I guess it's incumbent upon industry to think about the things that we could do now to make regulation unnecessary.
Strong: I think we're at a crossroads. On one side, we want to push forward with technology and the exciting new opportunities it brings. On the other, we have the legal ramifications and the legal issues around how to protect people's data. Are we taking a step back, as organizations or as countries, and actually thinking about the implications of that technology?
Are we looking to see how quickly we can implement that fantastic new emerging technology without thinking about the long-term consequences? And I think that's a real challenge for us as a global community: to think about how we balance that pace of change with thought about the implications for human life.
Rodvold: There is no way to eliminate risk completely, of course, and things are going to go wrong. That's what happens―life is complex and technology is especially complex. But I think what we're asking organizations to do and to think about is actually just building the corporate infrastructure and expertise to help you better anticipate the potential consequences.
I think what's been missing, quite simply, in the past decade is that technology is often created, procured, and implemented in a vacuum, often by people with a pure technology background who don't necessarily think about history, the humanities, and social consequences.
Bird: Jen says there's very little difference in terms of ethical responsibility, whether we're creating tech or procuring it.
All in this together
Rodvold: In the future, more average organizations, even if they aren't creating technologies themselves, will be using so much technology that they will need to take an approach similar to that of larger, technology-creating organizations. So eventually it is about equipping all of us in business, in the public sector, and in society with a level of awareness around the ethics of technology and how to use data ethically.
For organizations that are creating technology, I think it is about making sure that they are establishing really robust digital ethics strategies and the expertise—both at the very technical and micro level but also throughout the organization—of digital ethics issues. And that's everything from data and privacy and transparency to also thinking about how technology is driving social issues like displacement and changing the world of work and skills in the organization and in society, [including] issues of bias, fairness, equality, and accessibility.
Bird: Digital ethics needs to be at the heart of our organizations because quite simply, our use of tech is unavoidable and it has a massive impact on the world of work and society.
As Jen alluded to, this is more than just designing tech for good; it's about how tech fits into the wider spectrum of ethical practice in our organizations. It's about who you are as an organization and what you believe in. And I know just the guy to talk to.
Mick Jackson: As a working-class kid growing up, I was raised to believe I could do anything if I worked hard enough, and I got into university and I worked away. It's getting increasingly harder; the levels of inequality in our society are alarming. And I don't know how long society thinks it can keep going with the vast majority of people with their faces pressed up against the glass, you know, wondering how they can get a slice of that life that others enjoy. That's not sustainable; it's morally wrong and it's a sociological time bomb.
Bird: This is Mick Jackson, the founder and chief executive of the WildHearts Group.
A new corporate mandate
Jackson: WildHearts Group is an organization that I founded with the purpose of using business as a force for good and around using the spirit of entrepreneurship and innovation to address head on the ... alarmingly increasing issues that the world faces.
We launch companies and use the profits and the activities of those companies to further our social mission. Business has the talent, the resources, the reach, the influence, the networks, the innovation—it has everything the world needs to address the issues that we face. And my passion, through some of our corporate clients, who are very significant, is not only that they buy their business supplies and products, document storage, entrepreneurship training, and all these services that we sell to fund the mission, but that they reimagine their role, their reach, and the resources they have to make an impact in the world.
Bird: The WildHearts Group has a plethora of wide-ranging social missions, from running the largest bank for rural female micro-entrepreneurs in Malawi to funding banking for the poor in Haiti.
Jackson: In the U.K., America, and Europe, we manifest our spirit of empowerment by providing all kids who want it with free world-class entrepreneurial education through a program called Micro Tyco, which has been a bit of a phenomenon. Over 50,000 people have taken part across 40 countries. It's taught as summer schools at Yale and Cambridge. It's used by some of the world's top companies to train their graduates and senior leaders. They pay for it. Kids get it free.
A further manifestation is in South Africa, where we're now the largest purchaser and distributor of reusable sanitary pads. Now, from the outside looking in, you could say, "What on earth has banking got to do with entrepreneurship education, which has got to do with sanitary pads?" It would appear to be a very eclectic portfolio! But the spirit running through it all is: What does that person need most to take agency in their life?
I found out and was horrified that a third of girls in South Africa, by way of example, drop out of school because they can't manage their periods. We developed a reusable sanitary pad that's extremely well designed, really well made, [and] can last for a year easily. And so the price of a cup of coffee provides the girl with the pad she needs to stay in school for a full year.
Let me tell you the implications of not educating girls. It's so extreme that the United Nations said the closest thing we've got to a silver bullet in addressing the scourges that face the world's poor and the world in general is ensuring we educate girls. An educated girl will have an average of two kids; an uneducated girl has an average of eight. That's your population explosion. An uneducated girl is more likely to be a child bride; she's more likely to be trafficked; she's three times more likely to contract HIV. Her child has half the chance of surviving past [the age of] 5. It goes on and on and on.
And all we had to do was give girls the same chance to get an education as boys [have]. Really? And all it takes is a sanitary pad to do it. Come on, man, was that not obvious? Really. So, when I see the so-called demigods of business pontificating on all the things they do, I mean, you know something, there's a lot more you could do if you applied your mind to it.
Tech as the great leveler
Bird: Business for good is about reimagining the resources and the role organizations play in the world around them. For the WildHearts Group, tech can be wielded as a great leveler, helping it expand its programs across borders.
Jackson: [In] our program Micro Tyco ... 50,000 kids across over 30 countries have taken part, and the challenge was, how do we scale this? How do we reach more kids? How do we help more kids? And the answer was tech. We have recently launched an initiative for mental health and well-being as part of our schools program. In our first session, we were reaching hundreds of people through webinars, whereas in the past, with our old way of thinking, we'd have thought people would have to go in person to schools.
The way of democratizing access, the way of giving kids access to some of the most inspiring business leaders and some of the most cutting-edge knowledge to inform them of how they can fulfill their potential and have a career of contribution, can now be delivered remotely, so you can scale it exponentially. Now, there are significant issues in ensuring that all kids have access to tech and access to a screen, and we are looking at how tech can reach those kids.
Bird: Ethical business and digital ethics are fused. It's about tech for good, but more crucially, it's about how organizations can be a force for good in addressing bias, inequality, fairness, and accessibility.
It sounds like a tall order, but according to Mick, it doesn't take anything more than reevaluating the resources you already have.
Jackson: [In South] Africa, [we own] a document storage business that has contracts with the South African government, among other large corporates. The vans for that organization, called TTW, crisscross South Africa, passing some of the most remote areas of the country because they have to fulfill their contracts.
The genius was we said, "Well, if the vans are crossing the savannah and passing these little schools where these little girls are dropping out, why don't we use the vans to deliver the sanitary pads?" The infrastructure cost of that alone would be millions of dollars a year. All we did was reimagine what we were already doing.
Business for good is good for business
Jackson: Could you imagine if the geniuses in the tech companies say, "Wait a minute, see this thing that we walk past every day or work on every day? Do you know it can be repurposed?" And it's not that you're turning your back on your business. You're enhancing your business, because your clients love it, your current teams get so engaged with what the company stands for and the values of the leaders, and the best talent wants to join you. So business for good is good for business. And it's essential for the well-being of the planet and for humanity. And it's also, I believe, essential for the well-being, sustainability, and relevance of corporates and brands.
Bird: Mick has hit the nail on the head there, because digital technology is woven into our lives. Its ethics are inextricably fused to everything we do in our organizations.
Jackson: There's so much talk about disruption in environment and disruption in tech and disruption in politics. I wrote an article about a subject we call compassionate disruption, and that is the attitudes and the demands that people now have towards business have changed so much that the executives who are not aware of it are literally not going to be relevant. They're going to be like the people chain smoking in their office. You go, "You do know we don't do that anymore? We've moved on." That's how antiquated that will appear. However, this isn't some burdensome legislation or some new thing that they have to do.
This is an amazing thing, too. The more enlightened leader brings more of themselves to the table: They bring their values. They don't have to say, "I care about my kids; it's keeping me awake at night that my daughter won't come out of her room and she's really struggling. But I switch that off when I go to work because I can't think [about it]." What if you could address that at work? The wonderful revelation is that you'll feel much, much better about what you're doing on a daily basis, your team will be more engaged, they'll respect the leadership and want to be part of that team, and so will your customers and clients. It actually enhances your core business if you get this right, but it has to be authentic and it has to be right for you. And it has to be within the space where you have a sphere of expertise.
That's why it's so pertinent to tech, because tech has the ability to address the fundamental problems that, ironically, some aspects of tech have caused. It's incredibly exciting.
Bird: A digital ethics strategy isn't just a "nice to have." It's crucial for staff retention, for engaging customers, and at the end of the day, for profits.
Strong: Right back in 2015, in what I think is the first one I can remember that really made front-page news for a long time, TalkTalk had 157,000 accounts breached. You look at the impact this has on organizations: It is long term. It has an absolute impact on the credibility of the organization and the image of the brand portrayed to us as consumers.
High performers practice responsible tech
Bird: Data breaches and unethical practices impact an organization's bottom line and trash their reputation. But getting it right can have a positive effect.
Rodvold: Organizations that get ethics right outperform those that don't on almost all metrics. There are too many studies to name—for example, there's one from Ethisphere reporting that companies with strong ethical credentials had 14.4% higher market capitalization.
Then there's employee engagement. Everyone wants to attract the best talent. Even in the face of this economic crisis, there's still a skills gap, and being purpose driven and ethical is proven to be a major differentiator in the employment market. It can help identify new markets. It can engage your customers in different ways. Customers as well are now choosing ethical organizations more than unethical ones. The list goes on and on and on.
Bird: So how can we get a slice of that sweet, sweet digital ethics pie? Well, it starts, first and foremost, with a culture of responsibility.
Rodvold: Digital ethics should be pervasive throughout the organization. It's not about a single department. Empower people to make the right decisions, raise the right questions when they don't think those decisions are being made, maintain a culture of open dialogue, because this is new when you're thinking about new technologies in particular.
Strong: Critical to any organization's progress is, how do they evaluate emerging technology?
It's about a balance for me. You've got to assess really quickly: Can they make a difference to your organization? Can they make an impact in the market you're operating in? Are they an ethical fit? And I think you've got to be able to move on quickly and dispense with them if you don't see a fit.
Rodvold: Another thing that gets thrown around in the digital ethics world is thinking about what would happen if this technology appeared in an episode of "Black Mirror," which I think is both cute and potentially quite useful.
So you get people together to think about what is the worst thing that could happen and what is the best thing that can happen. And then how do you align this technology and those things to what your organization wants to achieve, including your culture and values?
Work in progress
Strong: Digital ethics is a relatively new area of focus, so there aren't many set methodologies or frameworks that people can use. But there are lots of organizations that are defining standards, working practices, and what policies could look like. I know there are lots of independent organizations that are really putting their best foot forward in shaping some of these standards globally, whether it's the AI for Good Foundation, techUK, or the Ada Lovelace Institute.
Bird: Because digital ethics is such a new and expanding branch of ethics, there aren't lots of frameworks in place for organizations to refer to. So perhaps it's easier to ask, "What does good actually look like?"
Rodvold: We have just been working with a fantastic organization, Harrow Council in North London. They came to us to look at how they could use citizen data in a more sophisticated way to drive better outcomes for citizens in their digital services. They wanted to drive a better experience on their web portal, for example, and help citizens better find and access the services they were looking for. They wanted to use citizen data to help citizens find information that the council knows is relevant for them but that they might not have originally been seeking out.
They wanted to do really, really interesting things like use predictive analytics to anticipate what a citizen might need. For example, if a citizen is in a vulnerable situation, could the council actually provide positive interventions to help before the situation gets more serious? So a really, really interesting project. And they were absolutely committed from the beginning to taking an ethical approach to the entire project and, indeed, to all of the digital services within the council.
So we helped them to create a digital ethics strategy. We took them through a very collaborative journey of understanding exactly what it was that they wanted to achieve and why, [and] what the culture and values in their organization were that would support the digital ethics strategy or make it more difficult to embrace. We talked with Harrow citizens about their expectations and views of digital technology, both generally and specifically regarding the council's services. And we brought all of this together and co-created with them a digital ethics strategy and framework that helps them now understand exactly what they do and what they don't do, because those are their ethical parameters.
And then there's the strategy for implementing digital ethics across the council and the roadmap for doing that, and also critically measuring success. So I think that's something that gets lost a lot because ethics is still a little bit seen as something that's doing some nice stuff around the edges. But if this is going to be embedded in organizations, we really need to make sure that we understand what outcomes we're expecting.
Bird: Jen helps organizations to develop dynamic digital ethics strategies that are built to last. So I asked her for some top tips to kickstart digital ethics in any organization.
A digital ethics road map
Rodvold: Digital ethics should be a part of all decision-making because there's a huge amount of both potential risk and opportunity. It's not just about the technology. This is about achieving organizational goals more broadly. It's not just about what the technology will achieve as a single goal in isolation, such as automating a call center. [You need to look at] how that automation of the call center might lead to workforce displacement or might make services less accessible to certain users or customers, and then the impact those consequences have on other organizational goals, which might include a commitment to social mobility or diversity and inclusion.
This is not about what the IT department is doing or the creators of technology. This is about how digital supports wider corporate ambitions. And then that should help you to find your framework: what you do and what you don't do.
I've touched on the importance of making sure you can measure success. So again, don't create a digital ethics policy that sits on a shelf somewhere and is never looked at. Understand what it is you want to achieve through a digital ethics strategy, create the right objectives and KPIs, and measure them over time. Make sure you've got the right expertise, as with any kind of strategic initiative. This needs to be started from the top and empowered from the bottom.
Bird: The implications of tech on our lives, on our organizations, and on society at large are so wide-ranging that it feels like we're just scratching the surface.
The key takeaway here is that we think about those big questions and discuss them as individuals and as organizations.
Strong: If I'm going to recommend anything, I go back to Life 3.0. Great read. It was actually the book that got me really into all of this understanding and focusing on what digital ethics really means for me as a person as well, and how I deal with digital ethics and what that means wider in terms of our society.
Rodvold: It is about risk mitigation but it's also about opportunity. So if you're not acting ethically and using digital ethics, then you're going to miss opportunities, you're going to not be thinking about the potential different and new users or customers you could bring on board if you did things differently or the trust that you could build with stakeholders differently and the differentiation you can bring to your brand through creating that trust and acting ethically.
Bird: The future, of course, is yet to be written. But what is clear is that digital ethics will be absolutely at the heart of everything we do. It has to be.
Rodvold: Digital ethics absolutely is an innovation enabler. It's not just about putting the guardrails on and thinking about risk all the time. It's actually thinking about what could we achieve if we thought about technology differently.
Strong: I'm an incredibly positive person, and for me, I truly believe that if we have the right controls, we have the right policies, we have the right procedures, and that we deal at the top level of government, the top level of countries, the top level of organizations, we will absolutely do the right thing with artificial intelligence and data. We will continue to improve the way we live and work.
Jackson: We're hearing more and more about some of the alarming consequences of the misuse of tech, from the polarization of society to mental health. But I have great faith that the liberating effects of tech, the emancipating and leveling effects, are far, far greater than the negative. We just have to have that discussion and awareness, if you will.
Bird: So with the digital ethics can of worms well and truly opened—yeah, sorry about that—we come to the close of the first series of Technology Untangled.
A big thanks to today's guests, Dave Strong, Jen Rodvold, and Mick Jackson. You can find more about today's episode in the show notes, and be sure to hit subscribe in your podcast app to access all of our first series and to be the first to get new episodes when they drop.
Today's episode was written and produced by Isobel Pollard and hosted by me, Michael Bird, with sound design and editing by Alex Bennett and production support from Harry Morton and Alex Podmore. Technology Untangled is a Lower Street production for Hewlett Packard Enterprise in the U.K. and Ireland.
Thank you so much for tuning in and we'll see you next time.
This article/content was written by the individual writer identified and does not necessarily reflect the view of Hewlett Packard Enterprise Company.