
Ethical technology: From purpose to practice



A rising, and collective, responsibility

In many respects, it's a sign of progress that ethical considerations are no longer an occasional concern, but core to the way we do business. The COVID-19 pandemic, rising demands for social justice, and the widening digital divide and consequent denial of opportunity for some segments of the population have all put ethics on the agenda of enterprises globally in a way that's virtually unprecedented. Even better, firms are taking a multitude of positive steps in response, from pledging support to social causes to announcing steps to foster diversity within their workforces.


Most business leaders would acknowledge that technology has ethical implications, but despite technology becoming more and more central to what enterprises do, it's not always clear how to approach and apply technology in an ethical way. "Technologists have, for a long time, been operating with a utopian mindset," says Rebecca Parsons, Chief Technology Officer at Thoughtworks. "The assumption is technology can solve the world's problems, and there's no bad technology, it's just sometimes put to bad uses."


The truth is that to produce positive ethical outcomes and minimize risks, technology has to be managed and monitored as actively as any other aspect of the business - perhaps even more so. This issue of Perspectives will explore the specific strategies and frameworks that can put technology-embracing enterprises on sounder ethical footing.

Why it matters

With many businesses still in survival mode, it's easy to conclude that ethical technology doesn't need to be a priority, or can be lumped in with other 'soft' aspects of corporate social responsibility. But there are multiple reasons why it's become business-critical and could have a massive impact on an enterprise's ability to build value in the future.


First, what's meant by 'technology' in the business context has changed radically. A few decades ago, when it was largely limited to accounts or payroll systems, "it either worked right or it didn't," says Parsons. "There was very little that could go wrong from an ethical perspective. If the software was functioning properly and there was no fraud involved, there really weren't any ethical implications. It was easier in many ways to know whether something was working as intended."


Contrast that with today, when technology is embedded in sensitive areas like healthcare, criminal justice and access to financial services. "These are all areas where the ethical impact of getting something wrong is far greater," Parsons says. "In some cases, it's even difficult to define what constitutes the right answer."


Second, consumer awareness of, and sensitivity to, ethical issues is arguably at an all-time peak - and many are willing to vote with their wallets on a company's ethical performance. One recent survey of consumers in the US, for example, found a significant majority (68%) see sustainability as important when making a purchase, and that 49% are willing to pay more for sustainable products.


Technology-driven ethical lapses, like Apple's credit scoring system apparently making sexist credit decisions, or unintended racial bias in an algorithm used by healthcare provider UnitedHealth, can quickly spiral into full-blown scandals, endangering relationships with customers and regulators and pressuring the bottom line. Recent disclosures to investors by Microsoft and Google have warned of the potential havoc 'bad' AI could wreak on their reputations.

Explore this topic in our Social Impact report


"There's real evidence as to why companies should do the right thing over and above the risk factor," says Laura Paterson, Principal Consultant at Thoughtworks. "There are tangible benefits to having an ethical technology approach and being purpose-led."


A major consideration is that how a company uses technology is likely to have a direct effect on its ability to attract and retain future talent. Research shows millennial and Generation Z talent aspire to work for ethical enterprises and are highly concerned about the consequences of the adoption of technologies like AI. A study by responsible technology think tank Doteveryone of tech workers in the UK found 28% had seen decisions made about technology that they believed could have negative ethical consequences - and that 18% of those ended up leaving their jobs as a result.

Proportion of tech workers who've experienced decisions that could lead to negative consequences for people and society

Source: Doteveryone

For companies failing to address technology's ethical ramifications, "if you look at the social movements, the zeitgeist of the moment, there's not only a huge reputational risk externally, there's a huge risk internally with employees," notes Chad Wathington, Chief Strategy Officer at Thoughtworks. "In tech companies, we're seeing a wave of activism in the workforce. Workers are aware of the power of corporate interests in modern democracies, and are prepared to organize to influence corporations to act politically, and ethically, on their behalf."

Common blind spots

Any application of technology can have ethical effects, but there are two key areas where these implications are especially likely to be significant and direct, and, therefore, merit close attention - AI and the use of customer data. Both are seeing massive adoption by businesses, and are playing a greater role in decisions and strategies that were once the exclusive domain of humans.


Awareness of AI bias - systems making questionable decisions due to bad data or assumptions introduced, consciously or unconsciously, by their designers - is growing as the technology takes over more and more business functions. One recent poll of IT decision-makers found an extremely high proportion - 94% in the US and 86% in the UK - were planning to boost investment in addressing AI bias over the next year.


However, many organizations' efforts to address the issue are either nascent or misdirected - and complicated by the fact that AI bias is, in many respects, an invisible enemy, prone to finding its way into systems constructed with the best intentions. Developers of UnitedHealth's algorithm, for example, attempted to eliminate bias by not including race data in their models but effectively re-introduced it through a 'back door' by segmenting patients on the basis of their healthcare costs, which varied according to ethnic group.
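
The mechanics of that 'back door' are easy to reproduce. The sketch below uses entirely synthetic data - the group, need and spend variables are invented, and it is not the UnitedHealth model - but it shows how a proxy correlated with a protected attribute carries the bias straight back into a model that never sees that attribute:

```python
# A minimal synthetic sketch - not the actual UnitedHealth model - showing how a
# proxy variable can reintroduce bias that was "removed" by excluding a protected
# attribute. The group/need/spend variables below are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical protected attribute (0/1) - deliberately excluded from the model.
group = rng.integers(0, 2, n)

# True health need is identical across groups...
need = rng.normal(50, 10, n)

# ...but historical healthcare spend (the proxy) is systematically lower for
# group 1, e.g. because of unequal access to care.
spend = need * np.where(group == 1, 0.7, 1.0) + rng.normal(0, 5, n)

# The model never sees `group` - only the proxy.
model = LinearRegression().fit(spend.reshape(-1, 1), need)
predicted_need = model.predict(spend.reshape(-1, 1))

# Despite identical true need, predicted need (and hence priority for care)
# differs sharply by group: the bias came back in through the proxy.
print("Mean predicted need, group 0:", round(predicted_need[group == 0].mean(), 1))
print("Mean predicted need, group 1:", round(predicted_need[group == 1].mean(), 1))
```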


It is hard for companies to be vigilant, but one way of raising awareness about ethical risks is to look to artists who critique tech, such as the British artist Karen Palmer. Karen's work dealing with systemic bias and AI, incubated and developed by Thoughtworks Arts, has been exhibited, and will be highlighted in an upcoming augmented reality app which showcases pioneering artists grappling with the impacts of new technologies.


As another example of what can go wrong, Parsons cites the case of a research hospital that employed AI to decide whether patients should be admitted to the ICU after a particular procedure, and only realized later that standard protocols had left them with an incomplete data set that didn't take asthmatics into account. Seemingly minor omissions or distortions such as these can be highly dangerous because they are amplified as the system does its work.


"When you talk about reinforcement learning, the whole point is to detect the patterns that existed in the data in the past," she explains. "And if that data is coming from a system that's biased in any way, that bias is not only going to be manifested in the patterns that emerge, but those patterns are going to be reinforced."
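
How that reinforcement plays out can be shown with a deliberately simplified simulation. Everything in the sketch below - the segments, the counts and the decision rule - is invented for illustration; the point is simply that when a system's decisions determine where new data gets collected, an initial skew in the historical record grows rather than washes out:

```python
# A deliberately simplified simulation of the reinforcement dynamic described
# above. The segments, counts and decision rule are all invented.
import numpy as np

rng = np.random.default_rng(7)

# In reality, both segments generate positive outcomes at exactly the same rate.
TRUE_WEEKLY_POSITIVES = {"segment_A": 100, "segment_B": 100}

# But the historical record starts slightly skewed toward segment_A.
recorded_positives = {"segment_A": 120, "segment_B": 100}

for week in range(1, 11):
    # The "model": direct attention (offers, screening, patrols...) to the segment
    # the data says performs best.
    target = max(recorded_positives, key=recorded_positives.get)

    # Outcomes are only observed where attention went, so only that segment's
    # history grows - the other segment never gets the chance to catch up.
    recorded_positives[target] += rng.poisson(TRUE_WEEKLY_POSITIVES[target])

    print(f"week {week:2d}: {recorded_positives}")
```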


As data has become the new lifeblood of business, changing consumer expectations and regulations like the EU's General Data Protection Regulation (GDPR) have made companies more conscious of how they gather, use and retain information - a good thing, as many customers are willing to take action against firms that don't appear to take data policies seriously.

The 'Privacy Actives' Segment

Source: Cisco.com

However, "we still have a long way to go," says Wathington. "The ways we capture mass sums of data about people online and across all their devices to build profiles hasn't really slowed down. As much as some players in the space have tried to make tracking harder, it's an arms race, and the means of monitoring are getting more and more sophisticated."


The reality is there's often a business imperative to deploy technology in an unethical way. Incorporating behavioral design or subliminal messaging to 'hook' a customer on a game or app can, for example, be the 'right' thing to do in the pursuit of profit or shareholder value. "By defining revenue as your measurement of success, that is what you'll focus on," says Paterson. "And as long as that's seen as the indicator of success by society - and investors - it will be difficult for organizations to escape from that mindset."


That said, perceptions are shifting and progressive organizations are increasingly looking to measure value in other ways. "There's a myth that corporations only need to care about shareholder value, which was a theory advanced by Milton Friedman and other Chicago School economists," notes Wathington. "Yet if you look at the laws around incorporation, most allow for balancing concerns and different constituencies - shareholders, employees, the local communities in which companies operate, customers and competitors. It's also in your interests to be thinking about all those other touchpoints."


Ultimately, Parsons says, every business has to grapple individually with these questions - "Is it better to make more profit or be fair? And to what extent? How much profit are you willing to give up to increase your level of fairness? What's the balance point? There isn't necessarily a right answer that applies everywhere, so it's important for every organization to have that discussion - to define what their stance is, what they will or won't do, and where to draw the line."

The elements of ethical tech

The complexity and breadth of considerations most enterprises face in applying technology mean ethical technology has to be a consistent organizational focus, rather than a one-off initiative or list of principles posted on a wall. By considering the elements of ethical technology, organizations can develop a comprehensive approach across multiple fronts, from product development to the way leaders interact with their teams.




The elements of ethical tech

Source: Thoughtworks

Diversity (Make sure there's a range of viewpoints in the room):

The ethical issues or implications of the products a company builds can only be fully thought through when they're examined from different viewpoints - and that requires the participation of diverse teams.


"It's very difficult for people to actually think and view a problem from another person's perspective," says Parsons. "We try our best - in fact it's one of the principles in our own social change manifesto, trying to view the world through the eyes of the oppressed - but we don't always succeed. It's much easier if you've got somebody in the room who can represent that perspective because it's their lived experience."


Ideally, diversity should expand beyond the lines of gender, background, ethnicity or sexual orientation to functional areas. "You certainly need people from design to be represented because they're the stewards of how the customer interacts with the technology," notes Wathington. "But also, someone from finance because they might need to examine or balance the profit motive. And legal, compliance and security, so those processes aren't a gate-check at the end, but built in from the start with the right concerns in mind."


Diversity and inclusivity also need to be reassessed as products develop, since seemingly welcome advancements can have negative consequences for access and affordability. "We started with paper forms and moved to online, then mobile, and now we're looking at ways of interacting with technology that go beyond that," Paterson says. "The challenge is that as you move along the continuum you're potentially losing your ability to reach all users. If you develop a product feature for Alexa, what does that mean for people who can't hear or speak? Diversity has to include the intersectionality of people who have the technology and people who can or can't use it."


Inquiry (Ask the tough questions, systematically):

Connected to diversity, as Parsons points out, "unless the right group of people are asking the right questions, you won't get the answers that accurately reflect the ethical implications of what you're building - especially for groups that aren't represented."


To ensure the 'right questions' are raised, it can help to employ formal tools and frameworks that guide teams through structured processes of inquiry - and a number of methods have been designed and fine-tuned for precisely that purpose. (See 'The ethical tech toolkit' below.)


According to Wathington, these exercises shouldn't be viewed as a chore but "part of a holistic approach to design and customer experience" - a welcome opportunity to flag trouble spots that could come back to haunt the enterprise later, and to introduce appropriate checks and balances.

The ethical tech toolkit

What is it?

  • Tools to shape strategy & values of a company and its products
  • Includes checklist of risk zones/future scenarios and instructions on applying these in a workshop context

Why/when/how to use it?

  • To prepare for a project (in any phase) to highlight concerns and illustrate negative future scenarios
  • To better understand risks of existing products/solutions

What is it?

  • Full set of workshop materials with guide and cue cards
  • Foundation for a structured session to explore intended/unintended consequences of a product or feature

Why/when/how to use it?

  • Product vision/ideation/roadmap development stage
  • Can also be employed as a retrospective or each time a new feature is introduced

What is it?

  • Deck of cards with provocative questions designed to help creators envision unexpected ethical outcomes
  • Can be used to drive brainstorming sessions

Why/when/how to use it?

  • In the early stages of product ideation to expand thinking/dialogue on impacts
  • In the design process to flag possible negative consequences

What is it?

  • Visualization tools for the exploration of AI/ML data
  • Helps highlight distortions in training and validation data sets (see the sketch after this toolkit)

Why/when/how to use it?

  • When creating data sets to train AI/ML algorithms
  • Can also be used to visualize other data sets

What is it?

  • Risk-based approach to designing secure software
  • Brings teams together to brainstorm threats before they materialize

Why/when/how to use it?

  • Should be conducted for every product iteration
  • Participants should include business analysts, product managers, and security team to raise awareness and get various risk perspectives
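
As a concrete illustration of the data-exploration entry in the toolkit above, the sketch below shows the kind of check such tools support: comparing how subgroups are represented in a training set against a reference population and flagging large gaps before a model is trained. It is not any specific tool from the toolkit, and the column name, toy data and threshold are assumptions:

```python
# A minimal sketch - not any specific tool from the toolkit above - of a
# representation check on training data. Column name, toy data and the 10-point
# threshold are assumptions.
import pandas as pd

def representation_gaps(train: pd.DataFrame, reference: pd.DataFrame,
                        column: str, threshold_pct: float = 10.0) -> pd.DataFrame:
    """Return subgroups whose share of the training data differs from the
    reference population by more than `threshold_pct` percentage points."""
    train_share = train[column].value_counts(normalize=True) * 100
    ref_share = reference[column].value_counts(normalize=True) * 100
    report = pd.concat([train_share.rename("train_pct"),
                        ref_share.rename("reference_pct")], axis=1).fillna(0)
    report["gap_pct"] = report["train_pct"] - report["reference_pct"]
    return report[report["gap_pct"].abs() > threshold_pct]

# Toy example: patients with a prior asthma diagnosis are under-represented in the
# training set relative to the population the model will actually be used on.
train = pd.DataFrame({"condition": ["none"] * 95 + ["asthma"] * 5})
reference = pd.DataFrame({"condition": ["none"] * 80 + ["asthma"] * 20})
print(representation_gaps(train, reference, "condition"))
```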

Constituencies (Make sure multiple stakeholders are considered):

Ultimately, product builders won't be able to answer all ethical questions or consider all impacts themselves. For any innovation with potentially game-changing consequences for society or the environment, there should be an effort to secure a broader consensus on what's being created. "When you start asking questions like what self-driving cars should do when faced with the dilemma of prioritizing the life of a driver or a pedestrian, it's no longer something a programmer or some group of analysts sitting around a room should decide," says Parsons. "These are questions that society as a whole needs to start tackling, and deciding what the right ethical response should be."


When rolling out a technology, companies should consider limitations or knock-on effects that may only apply to specific groups, such as children, the disadvantaged or the elderly. "It's about understanding all your possible users, how they experience your customer journey and use technology to interact with you," says Wathington. "There are still huge gaps there. In mobile, a lot of developers still go for the iPhone first because it's the more expensive mobile platform. Some companies don't care about whether the product is more expensive to consume, or a worse experience, for those who are poor. Companies have to ask themselves whether they're just marketing to the affluent - and whether that's okay."


Pointing to how organizations can sometimes neglect this process, Paterson gives the example of the collaborative tools - Zoom, G Suite and the like - that have proven crucial to enabling work to continue throughout pandemic-induced lockdowns. At many organizations these were adopted without much thought to varying levels of access.


"It's increased the digital divide in many respects because no one really stopped to ask whether people could access these services," she says. "I was astonished to find that even within our own organization there were some people who didn't have broadband, and we had to work out a way to provide it. We had blind spots in our awareness of the type of access people have, and also that quality of access can be an issue. Maybe someone's connection isn't great. Maybe they've got five people in the house trying to do video calls at once. It's important to understand technological decisions won't impact all people the same way."


Methodology (Formalize ethical processes where feasible):

While ethical guidelines are difficult to set in stone, it's important for there to be a basic reckoning of what the organization aspires to be and stand for, to form a 'north star' that can be used to guide technology decisions.


"The first step is being clear about your mission, and it's very seldom about technology - it's always more than that," Paterson explains. "The next is defining values so everyone knows the parameters within which they're working and making decisions. Subsequent to that is creating the channels for communication, and opening up the diversity of opinion."


Once ethical standards and goals are set, they can be formalized and inculcated, with the establishment of frameworks or guidelines for specific processes, like the early stages of product development, or the use and retention of customer data. Rather than rigid codes of conduct - which can be difficult to enforce, or even allow the company's leadership to effectively 'wash their hands' of ethical responsibility by pushing the burden onto frontline workers - these should include "definitions for developers which are based on your mission and principles, showing what these look like in practice," Paterson says.


There are opportunities that spring from this process. For one, as Parsons points out, formalizing some aspects of ethical decision-making paves the way to apply technology to the cause. "Once you have a definition of what constitutes good, many of these things can be automated," she says. "There are well established tools for monitoring things like data theft, and well-understood approaches for looking at various kinds of code vulnerabilities."
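
What that automation can look like in practice is sketched below: a fairness check that runs alongside a team's ordinary automated tests and fails the build when outcomes diverge too far between groups. The metric, the 0.2 threshold and the toy data are assumptions chosen for illustration, not a prescribed standard:

```python
# A hedged sketch of automating "a definition of good": a fairness gate that can
# run with a team's normal test suite. Metric, threshold and data are assumptions.
import numpy as np

def approval_rate_gap(predictions: np.ndarray, groups: np.ndarray) -> float:
    """Largest difference in positive-prediction rate between any two groups."""
    rates = [predictions[groups == g].mean() for g in np.unique(groups)]
    return float(max(rates) - min(rates))

def test_model_meets_fairness_threshold():
    # In a real pipeline these would come from the model under test and a held-out
    # evaluation set; here they are toy values that just pass the gate.
    predictions = np.array([1, 1, 0, 1, 0, 1, 0, 1, 0, 0])
    groups = np.array(["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"])
    gap = approval_rate_gap(predictions, groups)
    assert gap <= 0.2, f"Approval-rate gap {gap:.2f} exceeds the agreed 0.2 threshold"
```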


Second, it positions the enterprise to evaluate the technology vendors and partners it chooses to work with, and hold them to similar standards. The practice of what's become known as ethical procurement is, according to Wathington, creating virtuous circles.


"This has multiple facets for a company like us - whether the software we're buying is operating in an ethical way, and whether the vendor itself is," he says. "A lot of companies are adopting tech from other vendors, so it becomes a two-way, mutually reinforcing, process. The buyer starts to think about the ethics of what they're purchasing, and the company building the software starts to think about the ethics of what they're making."

Capability (Strive to constantly improve the organization and its products):

An ethical technology approach has to be honed by the organization like any other skill. It's a focus the leadership can and should introduce and encourage, but one that also has to spring from the ground up, since employees on the front lines of product development and end-user interaction will in many cases be those confronting ethical choices directly.


"Without effective communication, ethics won't become part of the general ethos," Wathington says. "But what you're trying to do with communication is effect a change in people's hearts and minds so that they own it, and it's not you as a leader trying to enact a change on them. You want to encourage people to innovate, to continue thinking about what they can do, and make their own contributions."


Internal structures at some companies prevent this. "Most organizations still aren't set up to be able to have the flow of information, of feedback," says Paterson. "As a technologist who's influencing some externally-facing, high-stakes system, you're probably best positioned to know where the vulnerable spots are. If you're not being asked or listened to, the company doesn't know what to fix, and you're also not able to highlight opportunities."


In this regard, being ethical has much in common with being effective. "There's a lot of parallels between agile ways of working and ethical ways of working," Paterson notes. The same feedback loops that drive effective product development and regular, incremental improvement can be used to support better ethical choices.


For companies seeking to build these capabilities, Wathington has one main recommendation: start small, perhaps with a core group working on a single product, to establish the mechanisms, then learn from the experience and scale up.


"Once you've got critical mass, you get to institutional knowledge, where you don't need the same 10-15 people to lead everything because you've worked through it," he says. "You have the stories, templates and frameworks that enable you to handle whatever's important in your context."


Reasons for optimism

Recent headlines around topics from data privacy to contact tracing and autonomous vehicles show technology is likely to remain an ethical minefield. Nonetheless, at times quietly and behind the scenes, a number of encouraging trends are taking shape. Entire ecosystems of frameworks and solutions are emerging around danger spots like AI and data security. More off-the-shelf models and solutions are becoming available for enterprises aiming to make ethics an integral part of the development of products and interaction with customers.


Perhaps most importantly, technologists themselves are increasingly determined to weigh the ethical consequences of what they do - and course-correct where necessary.


"The fact that these conversations are happening at all is hopeful," Parsons says. "There are more people who are saying: We as technologists have to take responsibility for the choices that we make and the products that we develop; we can't just be order-takers. If someone asks us to build something that we don't think is right, we have a responsibility to push back."


"There's a whole burgeoning of awareness around many of these issues that is really positive," agrees Wathington. "I've met a lot of excited technologists who want to do better, and who have passion in areas like the climate or accessibility. There's a broadening recognition that we need to address these considerations by default - which is amazing, because 10 years ago most companies wouldn't even think these are things you should talk about."


Another reason for hope is the growing awareness that ethical and business performance are not parallel or competing priorities, but intimately connected. As Paterson points out, companies that make significant progress on ethics are also more responsive. By practicing openness and transparency, and prioritizing long-term relationships above short-term gain, they're more connected to customers and other stakeholders, and better positioned to anticipate market and regulatory trends. Being ethical, in other words, means being ahead of the curve.


"Technology creates complex problems, increasing in complexity with every global progression," Paterson says. "You can't predict exactly what's going to happen next, and you also can't resolve your approach to all these complex issues in advance. But if you put values and principles in place you can start to make good decisions regardless. It's one of the only ways organizations can future-proof themselves."

