Log 3: Can We Put Digital Ethics on a Scale?
14 Dec 2021
In The Digital Ethics Compass project, we have developed an ethics scale that helps companies understand when a design is illegal, when it is unethical, and when everything is great.
By Peter Svarre
This article is published in connection with The Digital Ethics Compass project.
Ethics can be a difficult subject. On the one hand, we can all agree that it is unethical to kill, but democratic states with a strong rule of law still go to war and kill thousands of people. We also know that it is unethical to lie, but we still lie almost daily to our closest family and best friends. The line between ethical and unethical can sometimes be blurry. This also applies when we talk about ethics in a digital context.
In The Digital Ethics Compass project, we are working towards making companies better at designing proper, ethical, and socially conscious digital products and interfaces. One of our first experiences in this project was discovering that it can be difficult for companies to even grasp what is ethical and what is unethical. There are no checklists or specific ethics formulas that can help digital designers make more ethical decisions.
A first step towards changing the mindset of companies is therefore to create an overview of ethical and unethical solutions in the context of digital design challenges. To that end, we have created a scale that shows different types of digital designs and digital business models, with the most unethical approaches at the top of the scale and the properly ethical approaches at the bottom.
As you can see, the scale runs from the wildly illegal at the top, through a grey zone in the middle, to the fully legal at the bottom. It also shows how ethics and law sometimes have a complicated relationship with each other: it is not always clear what is a matter of following the law and what is a matter of simply behaving properly. For the same reason, when developing this model we interviewed Mette Saabye Maaløe, an attorney working for the Danish Consumer Ombudsman. Mette helped clarify when digital design is not just unethical, but actually at odds with the legislation governing marketing and consumer rights and with the Danish Penal Code.
The model is by no means exhaustive. We are still developing it, and many more examples could likely be added. We are open to additions and ideas.
The digital ethics scale:
Illegal ↓
- Digital fraud. Deliberately misleading users with illegal means (the so-called ‘Nigerian letters’, where senders email people pretending to be distant relatives who need money, or claim to hold an unknown inheritance for the recipient).
- Illegal data collection. When data is collected or sold to third parties in violation of current legislation (though on a global level, legislation differs greatly).
- Spam. Illegal mass marketing, but the products are real (i.e. these are not scams).
- Subscription traps. When digital solutions entice people to subscribe to a service that has no value.
- Semi-spam. Same as above, but where customers have unknowingly been enticed to submit their email address (pre-checked sign-ups, etc.).
- Malicious algorithms. When algorithms manipulate people into doing things that are clearly not to their advantage (hidden price differentiation, predatory marketing, etc.).
- Dark patterns. When digital design is knowingly used to force users into actions they would not otherwise have taken if they were able to see through the design. A dark pattern makes people carry out actions that may have negative financial, social or human consequences for the user.
- Customer clubs. When people are enticed to sign up for a club even though the company is really just selling products. This is legal if sufficient information is given, but illegal if people are manipulated into signing up.
Legal but perhaps unethical ↓
- Malicious nudging. When users are manipulated into doing things or buying things that are clearly not to their own advantage.
- Grey patterns. When digital design is knowingly used to force users into actions they would not otherwise have taken if they were able to see through the design. A grey pattern (our term) is a dark pattern whose consequences are not serious.
- Design with collateral damage. Design that perfectly hits its target group but has knock-on consequences for other users (AirBnB).
- Unsuitable design. “Edge case” (stress case) users are not taken into account, so they receive lower-quality treatment or service, or are left feeling marginalised.
Predominantly ethical ↓
- Commercially oriented algorithms. Algorithms are used to make people do things or buy more products (but products that are still relevant to the user).
- Commercially oriented nudging. When users are nudged to buy more products.
- Benign algorithms. When algorithms help users make decisions that are to their own benefit.
- Benign nudging. Users are nudged towards doing things that are to their own benefit.
- Rule-based personalisation. Simple interaction patterns that personalise an interface so that the user does not have full control but still understands how the personalisation takes place (and can turn it off).
- Empowering UI. The user has full control of all interactions and understands the consequences of all of his/her actions (an ideal setup that is never fully realised in the real world).
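The difference between a grey pattern such as a pre-checked sign-up and the empowering, opt-in end of the scale often comes down to a single default value. Here is a minimal, hypothetical sketch of why that default matters; the function and the numbers are illustrative assumptions, not data from the project:

```python
def newsletter_signups(prechecked: bool, active_opt_ins: int,
                       passive_users: int) -> int:
    """Count how many 'consents' a sign-up form produces.

    With a pre-checked box (a grey pattern on the scale above), every
    passive user who never notices the checkbox is counted as having
    consented. With an unchecked box (closer to empowering UI), only
    users who actively tick it count.
    """
    # Hypothetical model: passive users keep whatever default they are given.
    return active_opt_ins + (passive_users if prechecked else 0)

# Illustrative numbers: 50 users actively tick the box; 950 never touch it.
print(newsletter_signups(True, 50, 950))   # pre-checked default: 1000 "consents"
print(newsletter_signups(False, 50, 950))  # opt-in default: 50 genuine consents
```

The design is identical in both cases except for one boolean, which is why a pre-checked default sits in the grey zone of the scale while the explicit opt-in sits at the ethical end.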