iTWire asked a number of IT executives for their best thoughts on ethical behaviour in IT. Here’s what they had to say. First, this is the specific question we posed to everyone, deliberately offered without expansion or elaboration: “We regularly talk about ethics in AI – it’s a very hot topic right now. But surely it should be just as hot a topic in the rest of our industry. Where does the IT industry show great ethical behaviour, and where do we fall down (a little)?”
Broadly, those who responded attempted to identify behaviours that might be considered either positive or negative. In addition, a few somewhat misinterpreted the question and confined their responses to the AI domain, even though that was provided merely as a grounding statement. Their comments are included where appropriate.
To open, it would seem that Jason Duerden, Managing Director, BlackBerry Spark ANZ said it best when he noted that, “I often think about the moment in the original Jurassic Park film when Jeff Goldblum’s character says, “Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should.” As an industry, I think we need to keep this in mind – assess both good and bad use cases, bake in security by design, and develop ethical standards that help us safeguard the future. Manufacturers, developers and vendors should also own the responsibility for their technology and we should have an open and collaborative culture that allows us to call out, and learn from, failures.”
So, with that in mind, let’s start with a few ‘scene-setting’ comments from some of our contributors.
Rick Mayhew, VP & GM Unisys APAC says, “The far-reaching impact of technical innovation, and its capacity to be used for good or evil, means that technology cannot exist in a vacuum, without responsibility. The intended purpose of its application, consent of those who are expected to use it or be subjected to it, and changes to what is culturally acceptable need to be considered, and regularly revisited.
“The technology itself is not the issue. The issue is how it is used and by whom. For example, high speed video streaming enables us to work and collaborate remotely via video conferencing – a huge benefit in this COVID-19 WFH age, but it is not acceptable for this same technology to be used to share offensive and illegal content.”
In a similar vein, Daniel Markuson, a digital privacy expert at NordVPN observes that, “Keeping society safe comes at a cost. In this case, the technology in question is still very much work in progress. That may bring on increased surveillance, violations of the right to protest, and prosecution of innocent people. So the decision by the major companies puts pressure on other players to re-evaluate their aim to push the technology further.”
Rob Byrne, Field Strategist at One Identity adds, “Experience shows that we cannot trust to chance or self-regulation to ensure that corporations will behave ethically. Government regulation and laws are the main way we can ensure that we all comply with agreed ethical norms. Imposing financial penalties at the right points in the production and deployment chain shifts the balance to avoid a culture that considers hot-fix, clean-up, and damage limitation as acceptable operating modes.”
Garrett O’Hara, Principal Technical Consultant at Mimecast, points to the ‘people issues’ as being critically important. “Like any sector, it largely comes down to the leadership in an organisation and how the culture of ethics flows through management into the individual contributors. Beyond culture, the structures for incentives come into play. Incentives exist to motivate a particular behaviour, or more accurately to motivate behaviours that lead to an outcome (e.g. revenue). Pressure to achieve an outcome can lead to incentives that inadvertently conflict with an organisation’s stated cultural norms or values. It’s great to have a motivational poster on a wall telling employees to do the right thing, but if the pressure to perform is large enough, people may take shortcuts or veer into ‘Artful Dodger’ territory.”
“When we talk about ethics in IT, artificial intelligence so frequently comes up because people either don’t understand it or fear its potential,” adds Jason Duerden. “This doesn’t just apply to AI, but to all product development. As we continue to invest in research and development, we should always consider the alternate functions a technology may be able to perform, outside of what we intend it to perform. Innovation is already outpacing security and ethical frameworks, which leaves a huge window for exploitation.”
And from there, Mayhew asks, “But should the IT industry set its own rules on what is/isn’t acceptable ethically, or is that the role of government and the end-user industry authorities? We can’t cop out and say, “Hey, we just like to build cool stuff – it’s not our responsibility how you use it”. The problem is, technology evolves and is consumed much faster than governments and authorities can roll out legislation or compliance. Just like medical science, we must consider the wider implications. This requires diverse teams to examine the ethical issues from a wide variety of viewpoints, continual review, and the preparedness for competitive companies, industries and governments to work together and agree acceptable parameters.”
Blaise Porter, director, responsible business, Fujitsu Australia, argues that an ethical approach means acting ahead of ethical questions. It’s preferable to act in advance of the potential for harm than to have gone too far and have to pull back. Achieving this means having a clearer understanding of the unintended consequences, which in turn means being able to think broadly and incorporate different perspectives. Greater diversity in the ICT industry will be essential in helping the sector get to grips with these consequences.
To summarise, Byrne asks a couple of succinct questions:
- Are we fulfilling our duty of care in protecting sensitive data, preventing malicious tampering, and doing our best to ensure the availability of critical services? The numbers unfortunately imply that we are not.
- Are we securing digital identities, investing sufficiently in quality and security testing, and taking on board the fact that recommending a reboot of your pacemaker or your car’s braking system may not be an acceptable level of risk to impose on our consumers – or rather, our fellow citizens?
Of course, there are some positives, and our assembled throng identified them easily.
Helping each other
Garrett O’Hara noted that “The vast majority of security and IT leaders I meet are good people who want to do good work and do the right thing. This shows up in their concern for peers in other organisations when things aren’t going well, for example during a breach. I’ve heard many stories of how an organisation will help out another through the loaning of talent and resources or providing office space with connectivity.”
In a similar space, O’Hara adds, “The overt collaboration between different vendor organisations in security is heartening – through technical integrations and educating the wider population. Sure there’s a commercial outcome, but that outcome only exists if it delivers something valuable to a customer. And the collaboration can only be successful if approached ethically and with trust.”
Owning up to mistakes and breaches
O’Hara continued, “A clear sign of ethics is honest and regular communication when a breach occurs. The best PR during an incident is transparent and goes with the truth. There have been great examples of this in Australia, including the way the Red Cross handled a breach several years ago.”
Similarly, Markuson observed that “three major players in facial recognition technology made bold decisions to put the brakes on its development or drop out altogether. IBM, Amazon, and Microsoft cite a concern that law enforcement may be misusing the technology and violating human rights.”
“A level of forced transparency and a need to be ethical has emerged because of the internet – forums, review sites, third party analysts. Doing the wrong thing will be amplified” said Garrett O’Hara. “Certainly, the expectation from buyers now is of partnerships rather than ‘sell and move on’. This is pushing organisations to consider the long term reputational impacts of making a sale at the expense of burning bridges. Honest and ethical behaviour has become a competitive advantage; it’s hard to fake that.”
SaaS – a pretty good truth serum
“The emergence of SaaS helps keep the vendor landscape honest,” opines O’Hara. “Gone are the days of a slick-haired salesperson promising the world to a potential customer over a steak dinner because they could shoehorn in approximate functionality through sloppy dev work.
“The IT industry has a few different functional silos that I suspect appear at different points along the ethics spectrum – sales and marketing vs security operations vs governance vs systems engineering vs management. And there is disparity even between companies operating in the same areas.”
Duerden adds, “From my perspective, there are two key areas where ethical discussions are needed in the IT industry. One relates to product development, where malicious actors and fear are the main drivers of ethical frameworks. The other relates to the culture of the IT industry, where complacency and greed can often get in the way of ‘doing the right thing’.”
Garrett O’Hara then describes the typical flow of events:
“Yes of course, Ms Customer, we have the functionality to finagle your widgetry – we’ve been doing that for years!”
“Eh, hello dev team. I’ve promised we can finagle widgets. I wasn’t sure what a widget was or how to finagle but you guys are amazing. How about you work that magic and get it sort of working?”
O’Hara summarises, “With SaaS, what is sold is what is sold with no forking of code – coupled with reams of online reviews of the reality and quality of an offering.”
“The flip side of transparency during a breach is trying to pretend a breach hasn’t happened, or that customers have not been affected, then backtracking as the truth finally emerges (as it always will),” warns O’Hara.
A lack of grace in competition
O’Hara adds, “As an industry – again as in most industries – how different vendor organisations deal with the competition can be unpleasant. Instead of focusing on the partnership with a customer, and the way they can solve a customer’s pain, you can sometimes see the technology equivalent of a US politician’s attack ad. It’s an age-old sales tactic that’s up there with, “One owner and only driven to church on Sundays!” Customers really should – and probably do – see through it by now.”
Mistakes leading to discrimination
Markuson draws from the facial recognition field. “A study by the National Institute of Standards and Technology found that AI misidentifies women and people of colour 10 to 100 times more often than white men. This raises concerns about the potential for discrimination and biased prosecution based on gender and ethnicity.”
Pat Devlin, Director, South Pacific (ANZ), Aruba, a HPE Company observed that “AI has enhanced our ability to do good. However, this same technology can enhance the ability of humanity to do bad things also. The tech industry is a natural collector of data – we accumulate and store as much data as we can. Good data helps us understand and improve our products. It helps us solve complex problems faster than ever. It allows us to profile our customers and build products that meet very specific, individualised needs. The same technology that could analyse behaviour on social media and help a patient prevent a manic episode could also be used to advertise high risk investments and costly consumer goods to take predatory advantage of the same condition. The technology that knows about your age, race and religion could be used to isolate and persecute minority groups.”
Jason Duerden outlines a number of situations where a single-minded focus on an outcome has masked all the less-desirable outcomes. “Technologies like robots, drones and supercomputers, while often created with good intentions like driving efficiency and enabling deep learning, carry enormous power and potential for unintended or unacceptable outcomes. It’s easy to become so enamoured with cool, shiny, new tech that we simply assume the commercial and societal benefits outweigh the associated risks.
“Intel, for example, created its chip processors with consideration for computer innovation and the lowest barrier to operational efficiency to drive corporate revenue. The Meltdown and Spectre vulnerabilities were discovered years later in the processors, highlighting the fact that within all technologies are unintended functions that may not have been considered or acknowledged during development.
“The fundamental aim of technology is to enable improvements to productivity, security, and user experience. However, there is a common perception that vendors are primarily sales-driven, which can be at odds with this larger purpose. This perception might prevent customer organisations from establishing a meaningful partnership – in which salespeople can do their best work – meaning the uplift of efficiency, cyber resiliency, and experience is hampered.
“This is where ethics comes into it. Sales teams that really want to shift the needle and help organisations need to leave the pushy tactics at the door. It’s easy to slip into the doom and gloom narrative, and hope that spreading fear, uncertainty and doubt will land a sale. But the truth is that constructive, collaborative, and hopeful conversations about innovation are what will really make a difference – and what will help vendors establish trust and lasting partnerships.”
What can we do?
“There are several things that individuals can do as leaders, to help their organisation build a reputation of goodwill alongside the good reputation that will naturally be generated by quality solutions,” says Jason Duerden:
- Carefully consider every team hire, and ensure you are bringing on people who are the right cultural fit and really want to make a difference. There will always be sales targets and an element of financial motivation to drive the ongoing success of the business – but employees should also be encouraged and supported in building authentic relationships and a reputation for caring about what they do.
- Ensure that CSR initiatives don’t just tick a box but contribute to the ongoing improvement of the industry. For example, investing in and supporting the next generation of IT professionals – looking to the future rather than just trying to capitalise on the current market.
- Remember that money doesn’t always come first – particularly in times of crisis. As battles are increasingly fought online, security professionals will be our next cohort of superheroes and we should start stepping up to the plate. Sometimes you just have to roll up your sleeves with no signed agreement or financial commitment, because it is the right thing to do.
Mayhew adds, “The good news is that we have made great strides in some areas such as greater transparency and permission about how data is used in technology – although this has largely been driven by public demand. The Facebook-Cambridge Analytica data breach and other similar scandals over recent years put the spotlight on the privacy implications of how personal data is collected, used, stored and shared. Today consumers expect transparency about how their data is used and have the right to withhold or grant consent to access it. And if they grant consent they expect that data to be secured. Breach the consumer’s trust and they will simply change who they do business with or what data they allow you access in the future.
“However, the IT industry still has a way to go even in this area. We’ve all been asked to click on long, complicated terms and conditions in tiny print when downloading an app or a software update – most of us don’t take the time to review and therefore have no idea what we’ve actually agreed to. True transparency would be honest plain language T&Cs so that the consumer can make an informed choice about what they are signing up to.
“Ultimately, it requires creating a culture focused on continued understanding and testing of the ethical and moral boundaries of IT – across the creators, vendors, customers and users. We must embed this critical thinking in the education curriculum – from primary school upwards – to create a culture that is prepared to prod, question, push back and in some cases critically analyse technology, until ethical issues are investigated and addressed – all as a normal part of doing business.”
“Civic education is key to establishing an ethical culture,” continues Rob Byrne, “and should be extended to cover IT ethics from an early age, including training on how to protect your own privacy online and the very real impact of online abuse and bullying.”
In a simple conclusion, Byrne asks whether human beings are “our consumers, or our fellow citizens”. The difference is important.