The Digital Ethics Compass:
Behavioral Design

Humans are not always rational. We make decisions based on emotions, which can be manipulated through behavioral design, for example by nudging. At best, behavioral design is used to help people make wise decisions; at worst, it can be used to manipulate people in directions that are harmful to them. It is your ethical choice whether you want to use behavioral design to help or to manipulate.

01. Does your design play with negative emotions?

Get smarter – what is it?

Studies show that most people will go further to avoid a loss than to achieve an equivalent gain. If you want people to do something, it has proven more effective to scare them with something negative than to motivate them with something positive, a tactic that sellers, insurance companies, and baby equipment retailers have used for years.

In the digital world, motivation through fear becomes even more effective because, combined with user data, it can make fear messages increasingly personalized. There is, of course, an ethical balance. It is fair game when a pension company informs people that they are not saving enough for their retirement, but if they combine this message with an image of an old lady eating cat food, you start moving beyond the ethical limits of digital behavioral design.

Recommendations

  • Think about whether you use fear, uncertainty, and doubt as motivation in your design. 
  • Be especially careful with fear messages based on knowledge about the user. 
  • Think about whether you can turn negative communication positive. 
  • Be aware that scarcity messages such as “only one product left” may be false.

The bad example

Many hotel booking websites display messages telling the user that x number of people have already booked and that you need to hurry to get the price shown. Users fear that the offer and the dream trip will disappear or that the price will double any minute. This practice is in itself unethical because it uses fear and insecurity to motivate users, but it can also be outright illegal if the stated deadlines and booking numbers are not truthful. Many booking sites have come under scrutiny from competition authorities because of this practice.

The good example

Tobi is a Danish startup that helps parents invest their children’s savings. Their message is that it is far more effective to invest a child’s savings than to leave them in a regular bank account. They use comparisons to show what this difference can mean when the children turn 18, but they do not use fear in these messages. They use concrete examples, but they do not tell stories about how horrible it will be to be 18 years old without child savings in a world where a two-bedroom apartment in Copenhagen will cost 10 million Danish kroner (or well over a million pounds or euros).

02. Do you deliberately make it difficult for users to find or understand information or functionality?

Get smarter – what is it? 

Digital design is most often about creating interfaces that are user-friendly and useful. But companies can have an interest in making users do things that are not to their benefit: continuing to subscribe to a service they do not use, buying more products than necessary, or saying yes to conditions that are not in their best interest.

In these cases, design can turn into manipulative design, also known as a dark pattern: cunning design tricks that get users to do things that are not to their advantage. Often this involves hiding important information or highlighting other information using colors, animations, font sizes, or other graphic tools.

The ethical guideline must be that you always design based on your users’ interests. There is, of course, a balance to be found, but you should always avoid deliberate manipulation, where you can say with certainty that the design benefits the company at the expense of the user.

Recommendations

  • When designing, you should not only think about usability but also about user needs. 
  • Pay special attention to children and other vulnerable audiences who are particularly easy to manipulate. 
  • Consider how you would feel about using your own digital design. 
  • User-test all the flows that are not in the company’s interest (unsubscribing, returning products, etc.).

The bad example

Do you have an Amazon account? Try deleting the account!

If it seems tricky, you can find a manual here: kortlink.dk/2agfy 

Hint: You need more than ten clicks, and it is actually not possible without contacting someone from Amazon.

The good example

The vast majority of subscription-based digital services offer a free trial period with the condition that you provide your credit card information so that the subscription can start automatically at the end of the trial period. The result, of course, is that many users forget to unsubscribe and therefore remain stuck as paying customers, even though they may not have wanted to. However, there are exceptions – namely the Danish newspaper Information and the cycling training platform Zwift. Both of these offer a free trial period where you do not have to provide your card information. 

Both stand out from the competitors and appear more ethical and more attractive to customers.

03. Do you exploit your users’ inability to concentrate to your own advantage?

Get smarter – what is it?

Users can easily read long texts and familiarize themselves with complex issues online. But this requires that they are in the right mindset, such as when they are listening to a podcast or reading a long magazine article.

However, if the user is in the middle of a transaction, such as buying products or downloading apps, you cannot expect them to have the capacity to digest complex issues about cookies, tracking, use of data, profiling, and so on. In other words, you should not assume that users have understood your data policy or your subscription rules just because they clicked the accept button. You may have received a legal acceptance, but ethically, you have not received an actual endorsement.

Recommendations

  • Avoid long and complex texts as much as possible. 
  • Try to split information into smaller chunks and present it when relevant. 
  • If you need acceptance from the users, ask for it in the context where it is suitable. 
  • Consider whether you could make your service less complex. 
  • Use skilled copywriters to write this kind of text.

The bad example

The University of Copenhagen has a privacy policy that is certainly not transparent
to the general public. For example, it states that the University of Copenhagen:

“(…) registers and processes personal data based on Article 6 of the GDPR. The processing of sensitive personal data in research projects is covered by (…)”. One must have a legal background and be familiar with the various articles of the GDPR to understand how the University of Copenhagen processes data.

The good example

The DuckDuckGo search engine declares that the platform does not store, share, or use personal data about its users.

They sum up their privacy policy with the phrase: “We don’t collect or share personal information. That’s our privacy policy in a nutshell.” Next, it is explained in human language what data is collected and how. It is still a lengthy document but purposely made to be as understandable as possible. The example also shows that data-ethical companies typically find it easier to formulate comprehensible privacy documents simply because they have less to hide and more to be proud of concerning their customers.

04. Do you manipulate actions by taking advantage of people’s need to be social?

Get smarter – what is it?

People are social, and most of us constantly look for social recognition from our surroundings. It is an urge that can be exploited for better or worse. On the one hand, our need for social recognition means that we help each other with advice and guidance on social media. On the other hand, this very urge can be why teens spend thousands of dollars on digital skins that make them more popular in Fortnite.

In the digital world, companies can exploit our urge to be socially accepted to such an extreme that the consequences can be digital dependence or over-consumption of digital products or services. Ethically, it is okay to use social design techniques to motivate people to take certain actions, but it is crucial to keep your users’ deeper interests in mind. Think about whether your social interfaces make people happier or unhappier.

Recommendations

  • Beware of using social actions as a currency that users must invest to achieve something else. Social actions should be a goal in themselves. 
  • Consider how social designs can develop when a lot of users use your service. Does the service change character when everyone uses it? 
  • Embed stop blocks in your social designs so that people do not get carried away. For example, set restrictions on purchases or time consumption. Keep in mind that social interaction can be very addictive.
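
A stop block of the kind recommended above can be as simple as a per-session limit on time and spending that is checked before each new action. The sketch below is a minimal illustration in Python; the class name, method names, and limits are hypothetical, not taken from any real platform.

```python
import time


class SessionGuard:
    """Minimal stop-block sketch: caps session time and in-session spending.

    All names and default limits here are illustrative assumptions,
    not code from any real product.
    """

    def __init__(self, max_seconds=3600, max_spend=50.0):
        self.started = time.monotonic()  # session start time
        self.spent = 0.0                 # money spent this session
        self.max_seconds = max_seconds
        self.max_spend = max_spend

    def allow_action(self, cost=0.0):
        """Return True if the action may proceed, False if a stop block applies."""
        if time.monotonic() - self.started > self.max_seconds:
            return False  # time limit reached: prompt the user to take a break
        if self.spent + cost > self.max_spend:
            return False  # spending cap reached: block further purchases
        self.spent += cost
        return True
```

In a real design the `False` branches would trigger a friendly interruption (a break reminder, a confirmation dialog) rather than a hard block; the point is that the check runs before the purchase or the next round of interaction, not after.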

The bad example

When Pokémon Go launched in 2016, it was a fairly innocent augmented reality game about collecting digital Pokémon monsters in the real world. The game took on a social character as people began to share the locations of the best Pokémon, and big cities were suddenly overrun by swarms of distracted Pokémon hunters who stared down at their screens and trampled everything and everyone in their way.

This was not planned but a consequence of the game becoming socially addictive and lacking stop blocks that could prevent large numbers of people from gathering.

The good example

Instagram is, if anything, a service that benefits from the urge for social recognition. But in 2020, after much criticism, Instagram admitted that things could get out of hand for some users, who became heavily dependent on reaping as many likes as possible. To reduce this reliance on social recognition, Instagram removed the ability to see how many likes photos have received.

A relatively small initiative on a large platform like Instagram but a positive example of a small step in a more ethical design direction.

05. Are you trying to create addiction to your product with cheap tricks?

Get smarter – what is it?

Many companies want their products used as much as possible, especially if the products are financed through advertisements. It is therefore tempting to use nudging, social mechanisms, or data to get users hooked. That can mean excessive use of notifications, or features where people increase their social status by being more active on a platform.

Of course, it is ethically okay to design products to be so good that people become addicted. It becomes unethical when the addictive tricks solely contribute to creating addiction and do not make the product itself better. It can be a tricky ethical balance to strike!

Recommendations

  • Ask your users if they feel they are spending too much time or too much money on your service. 
  • Include stop blocks that prevent excessive dependence on your product. 
  • Think of vulnerable target groups such as children or gambling addicts who become more easily addicted.

The bad example

Snapchat has a feature called streaks, which motivates people to keep Snapchat conversations running for as long as possible. The longer the conversation runs back and forth, the better the streak and the more rewards the conversation partners receive in the form of funny emojis.

A classic example of a design that is not making the product better but solely serves the purpose of keeping people on the platform.

The good example

Netflix is a service that can be very addictive because you always get recommendations for relevant content based on your personal preferences. However, Netflix also had a feature that automatically started the next episode of a series once you had finished one. Here the users were nudged to get stuck and binge many episodes in a row.

It is an example of a design that does not give the user much extra value (it is not difficult to click on the start button), but in return, it can create an addiction where the user does not benefit. 

Netflix has since made this design element optional, so users can turn off autoplay and episodes no longer have to start automatically.

06. Do you validate or challenge your users?

Get smarter – what is it?

The customer is always right! Companies must always give customers what they want, and with artificial intelligence and algorithms, it becomes easier to figure out precisely what customers want. The problem is that you can end up knowing your customers so well that you never challenge their preferences but give them more of the same. 

Consider whether good customer service is about more than giving customers more of the same. Perhaps it also entails challenging customers and showing them surprising new products or information. It is a difficult balance where one must not become patronizing and manipulative. People want to choose freely, but most also have a strong desire to discover new opportunities and see new perspectives on the world.

Recommendations

  • Build algorithms that do not create echo chambers. Build algorithms that deliberately make random mistakes to expose users to less of the same. 
  • Consider your metrics: Make it a KPI to expose users to new content or new products. 
  • Combine – as far as possible – algorithmic recommendations with human recommendations. 
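
The first recommendation above, deliberately injecting randomness so users see less of the same, can be sketched as a simple exploration step layered on top of a relevance-ranked list. Everything below is a hypothetical illustration under assumed names (`recommend`, `explore_rate`), not code from any real recommender system.

```python
import random


def recommend(ranked_items, catalog, k=10, explore_rate=0.2, rng=None):
    """Fill a k-slot recommendation list mostly with top-ranked items,
    but reserve a share of slots for random picks from the wider catalog,
    so the user is regularly exposed to something outside their bubble.

    ranked_items: items sorted by predicted relevance, best first.
    catalog: the full pool of items to draw random suggestions from.
    explore_rate: fraction of slots given to random exploration.
    """
    rng = rng or random.Random()
    n_explore = max(1, int(k * explore_rate))  # always explore at least once
    n_exploit = k - n_explore
    picks = list(ranked_items[:n_exploit])
    # Draw random items the user was not about to be shown anyway.
    pool = [item for item in catalog if item not in picks]
    picks.extend(rng.sample(pool, min(n_explore, len(pool))))
    rng.shuffle(picks)  # avoid always burying the random picks at the bottom
    return picks
```

This is the same idea as epsilon-greedy exploration in bandit algorithms: most slots exploit what the model already knows the user likes, while a fixed fraction deliberately steps outside it. The exploration slots are also a natural place to plug in human-curated picks instead of random ones, as the last recommendation suggests.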

The bad example

If you observe young children using YouTube, you will notice how quickly they click on the videos the algorithm recommends to them. A child can quickly drift from an informative video about dinosaurs into a meaningless universe of cheaply produced cartoons designed for children to click on. The algorithms lure children into echo chambers of content whose sole purpose is to make them linger (and watch commercials).

In other words, YouTube is an unsuitable medium for children, which is bad for both children and YouTube.

The good example

Most music streaming services make use of algorithmic recommendations, which create personalized playlists for their users. Many of these tend to create musical echo chambers where the users’ musical tastes are rarely challenged. In this market, the Tidal service stands out because they have hired a small army of curators who make human-made playlists. Not all users like Tidal’s playlists, but the strategy gives Tidal an ethical profile that stands out significantly from the competition.

Unless otherwise stated, all content on this website is presented under the Creative Commons Attribution License.