
Log 8: When Buzzwords Kill Ethics

17. Jan 2022

In the tech industry, we love using buzzwords like lean, user-centred design, and usability, but these tools can actually be a hindrance to ethical design solutions.

Quick insight

By Peter Svarre

This article is published in connection with The Digital Ethics Compass project.

Why has the tech industry suddenly been hit by an ethical backlash? Why is it precisely the tech industry that, time and time again, is portrayed as the proverbial clumsy bull in a china shop?

Maybe it has to do with the buzzwords that we surround ourselves with and praise in the tech industry. Let us examine three of them.

  • Lean/agile development
  • User-centred design
  • Usability

Lean or agile development has possibly been one of the most important shifts in digital product development in the past twenty years. Before, we had heavy and bureaucratic waterfall models. Today, we have agile development teams that do not try to build a chrome-plated fantasy structure but instead work in a flexible and inquisitive manner towards a goal that is constantly changing on the basis of new data and user involvement.

The problem with lean, however, is that we often end up too absorbed in the details. We stare intensely at the data and focus on the next sprint, which means we can quickly lose sight of the big picture and the end goal. And ethics is precisely about being able to consider your solution and design as part of the big picture. What consequences will my design have when it is used by 100,000 people? How will our solution look in five years if we are really successful? And how do we affect people who are not users but who are still impacted by the solution? These are the kinds of questions that are rarely raised during a two- or three-week sprint, because people are focused on improving their KPIs by a few percentage points.

Another lean development concept, ‘pivoting’, may also be ethically problematic: the idea of constant pivoting removes the fundamental focus on the purpose of running a business and shifts attention to the simpler question of ‘How can we make money?’ There is nothing wrong with making a profit, but if it becomes the only purpose of running a business, ethics are often disregarded.

It can be hard to argue against user-centred design. Who would disagree with the notion that a design should be centred around the users who will be using the product? But here, too, there can be ethical problems.

First of all, user-centred design often results in a very one-sided focus on the users of the product rather than the people who are impacted by it in other ways. Think of the people who are killed by self-driving cars, or the residents of a street in Barcelona that has been turned into an Airbnb hotel, killing off all of the local flavour.

Another problem with user-centred design is that the user group is often defined too narrowly – frequently based on the demographic composition of the development team. Philips Hue was, granted, designed using user-centred design processes, but the result is still a solution where the lights in the home are turned off automatically when the user (typically the man of the house) leaves home. Even if the wife and children are still there.

If you are going to design ethical digital solutions, the design team quite simply needs an ethical and socially conscious awareness that goes beyond the classic, narrow definition of ‘a user’.

Usability is a necessity when designing usable digital solutions. When In2Media’s design team back in the day designed MobilePay for Danske Bank, they systematically reduced the number of clicks needed to transfer money from 23 to 7. Removing the friction from money transfers made MobilePay the most successful app in Danish history.

A huge amount of design work is about removing friction and making things easier, but the more friction we remove from interfaces, the more control and autonomy we take away from users. We remove friction by making choices for users so that they can do what they need to do more quickly and easily. That makes sense in the vast majority of cases, but sometimes we make the wrong choices on behalf of our users – and thereby unethical ones.

Think about Facebook, which has built its entire multi-billion-dollar business around the algorithmically controlled newsfeed. Facebook makes thousands of choices per minute on our behalf, which allows it to show us content that is interesting, appealing and enticing. However, it also takes away some of our autonomy and our ability to select what content we want to view. Facebook’s newsfeed is not necessarily unethical, but there are some extremely important ethical choices behind its design. Facebook could have designed it in a thousand other ways, but it chose this precise design, and that has both ethical and societal consequences.

If you would like to learn more about the problems associated with buzzwords, you can listen to Tristan Harris from the Center for Humane Technology in the presentation ‘What’s Now San Francisco with Tristan Harris’.


