
The Digital Ethics Compass

Digital products and services improve with data, so it is tempting to collect as much data as possible. But it is not legal to collect data that you do not need. And even within the bounds of the law, data collection can easily become very one-sided: the company reaps all the benefits while customers are left without knowledge of or control over their own data. It is your ethical choice whether you as a company will use data in a way that increases people's sense of control, or whether you will use data solely for your own benefit.

01. Are you collecting too many data points, and do you keep them for too long?

Get smarter – what is it?

When data is considered the new oil and artificial intelligence gets better with more data, it can be tempting to collect data without restraint. However, be aware that excessive data collection is illegal under the GDPR, and it is also unethical, impractical, and risky.

Large amounts of data increase the risk of data leaks, and they also make it more difficult for you to handle data on behalf of the user.

Finally, it is profoundly unethical to collect data that is not highly business-critical because it exposes users to unnecessary risks of privacy breaches and because it helps to strengthen data inequality between businesses and citizens.


  • Always try to remove a data point for users rather than adding a new one. 
  • Always ask yourself if you need this particular data point. 
  • Be sure to clean up old data that is no longer needed.
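The last point, cleaning up old data, can be automated as a scheduled retention job. The sketch below is illustrative only: the `user_events` table, its columns, and the one-year retention period are hypothetical, and the right period depends on your own business and legal requirements.

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION_DAYS = 365  # hypothetical policy: keep events for one year


def purge_old_data(conn: sqlite3.Connection) -> int:
    """Delete rows older than the retention period; return rows removed."""
    cutoff = (datetime.utcnow() - timedelta(days=RETENTION_DAYS)).isoformat()
    cur = conn.execute("DELETE FROM user_events WHERE created_at < ?", (cutoff,))
    conn.commit()
    return cur.rowcount


# Demonstration with an in-memory database:
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_events (id INTEGER, created_at TEXT)")
conn.execute("INSERT INTO user_events VALUES (1, '2010-01-01T00:00:00')")
conn.execute("INSERT INTO user_events VALUES (2, ?)",
             (datetime.utcnow().isoformat(),))
print(purge_old_data(conn))  # prints 1: only the stale 2010 row is removed
```

Running such a job on a schedule makes "clean up old data" a standing property of the system rather than something someone has to remember.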

The bad example

Danish high schools use the platform Lectio for administration and communication between teachers and students. The problem with Lectio is that the collected data is not deleted, which means teachers can go back in time and find information about grades and sick leave for students who have long since finished high school. The data is accessible to all teachers at a school without a password. The practice is probably a violation of the GDPR, but it is also unethical handling of data, because data is being stored that serves no purpose.

The good example

DuckDuckGo is an alternative to Google’s search engine that does not collect user data or track user searches on the web. DuckDuckGo does not even know how many users they have, as they do not track users. Because DuckDuckGo does not store information about users, users will only see ads related to their current search. When using DuckDuckGo, it is clear to the users that they aren’t being tracked across web pages and that the search engine is not collecting too much data about their movements online.

02. Do you anonymise your data?

Get smarter – what is it?

By default, you should always assume your data collection could be published online tomorrow. What would the consequences then be? Can individuals be identified in your data? Or have you ensured that the data is so anonymized that nothing can be disclosed about the individual? Anonymizing is not easy. It is not enough to remove names, addresses, or social security numbers because someone’s identity can often be deduced by combining or merging data with other publicly available data sets. 

You may not be collecting personal data at all, but be aware that seemingly harmless data can also help identify individuals. For example, dietary habits can be derived from users' recipe collections and used to infer chronic diseases.


  • Ensure that data about people and their identifiable information are securely separated. 
  • Consider whether you really need data about individual people or if aggregating data is sufficient. 
  • Invite independent experts to assess whether your anonymous data can be de-anonymized by merging data.
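The third point can be made concrete with a toy example: even with names removed, a handful of quasi-identifiers (ZIP code, birth date, sex) is often enough to link an "anonymous" record to a public register. All records below are invented for illustration.

```python
# An "anonymized" health table (names removed) and a public register.
# Joining on quasi-identifiers re-identifies the person.
health = [
    {"zip": "2100", "birth": "1985-03-07", "sex": "F", "diagnosis": "asthma"},
]
public_register = [
    {"name": "Jane Doe", "zip": "2100", "birth": "1985-03-07", "sex": "F"},
]


def link(health_rows, register_rows):
    """Re-identify health records by matching on quasi-identifiers."""
    keys = ("zip", "birth", "sex")
    matches = []
    for h in health_rows:
        for p in register_rows:
            if all(h[k] == p[k] for k in keys):
                matches.append({"name": p["name"], "diagnosis": h["diagnosis"]})
    return matches


print(link(health, public_register))
# → [{'name': 'Jane Doe', 'diagnosis': 'asthma'}]
```

This is exactly the kind of merge an independent expert should attempt against your "anonymous" data before you treat it as safe to share.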

The bad example

In 2017, the training app Strava published a world map showing aggregated data for all the cycling and running routes that users had uploaded to Strava – seemingly harmless, because all the data was anonymized. But in countries like Afghanistan there was very little activity outside a few select locations, namely US bases, where soldiers used the app while training. The map ended up revealing the precise locations of bases and the routes of soldiers who exercised on their own in the landscape. In other words, it was showing sensitive data!

The good example

Using the Strava training app to track and share bike rides seems innocent to most users. But few people consider that knowing where a bike ride starts and ends is also knowing where an expensive bike is parked. Valuable data for bike thieves! Strava has therefore chosen to obscure the exact points where a bike ride starts and ends, even though this slightly worsens the experience both for those who share a ride and for those who follow their friends' rides. The company has sacrificed a little user experience in exchange for giving users a much more secure app.

03. How do you store data?

Get smarter – what is it?

There is an ethical and legal duty to store data securely. Nevertheless, errors still occur when storing data, and the errors are often due to companies and organizations simply not thinking about storing sensitive data. Digital platforms often make it very easy to collect and store data. Therefore, data collection can also happen without control within an organization.

An example is the Danish company Medicals Nordic, which was responsible for testing corona patients. The company used WhatsApp for daily workflows, and when it suddenly had to test thousands of Danes, it continued using WhatsApp as the platform for sharing test results. The company was subsequently dismissed by the Danish regions. The example shows that it is often simple thoughtlessness that leads to insecure data handling.


  • Make sure that all employees are familiar with the fundamental rules regarding the use of data. 
  • Consider your use of online platforms where you do not have complete control over data (Facebook, WhatsApp, Dropbox, etc.). 
  • If you are not an expert in data storage, you should hire external experts to secure your solutions.

The bad example

Copenhagen Zoo has 140,000 annual cardholders. The log-in page for these cardholders had no restriction on log-in attempts, which made it easy for unauthorized people to guess their way to the cardholders' personal information, including card number, name, address, and email. The incident was reported as a breach of personal data security and was criticized by the Danish Data Protection Agency.

The zoo has since had its members change passwords and has introduced an "I am not a robot" check, which prevents automated programs from gaming the system. It has also introduced a feature that blocks access for one hour after three failed log-in attempts.
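A lockout rule like the zoo's can be sketched as a simple failure counter. This is an illustrative sketch, not Copenhagen Zoo's actual implementation; the thresholds mirror the ones described above, and a real system would persist the counters rather than keep them in memory.

```python
import time

MAX_ATTEMPTS = 3        # block after three failed attempts
LOCKOUT_SECONDS = 3600  # ...for one hour

failed = {}  # username -> (failure count, time of last failure)


def login_allowed(user, now=None):
    """Return False while the user is inside the lockout window."""
    now = time.time() if now is None else now
    count, last = failed.get(user, (0, 0.0))
    return not (count >= MAX_ATTEMPTS and now - last < LOCKOUT_SECONDS)


def record_failure(user, now=None):
    """Count a failed log-in attempt."""
    now = time.time() if now is None else now
    count, _ = failed.get(user, (0, 0.0))
    failed[user] = (count + 1, now)


def record_success(user):
    """A successful log-in resets the counter."""
    failed.pop(user, None)
```

Combined with a CAPTCHA, this turns an unlimited guessing game into one that is far too slow to be worthwhile for an attacker.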

The good example

Telegram is an online chat service similar to WhatsApp and Facebook Messenger. However, Telegram has a strong focus on encryption and privacy protection. Telegram has, among other things, a data-safe function called Secret Chats. When sending messages via Secret Chats, the chat is fully encrypted, and no data is stored centrally or is accessible by employees at Telegram. You also cannot forward Secret Chats, and you can even set messages to self-delete for both the sender and recipient after a certain period. Secret Chats are therefore safe, as long as you have your phone safely stored in your pocket.

04. Do you give people access to their own data?

Get smarter – what is it?

If people are to have control over their data, they also need access to their data. This means:

  • That you must ensure that all data you have collected about your users is visible to the user. 
  • That even though you collect data from many sources, you should make sure that the user can access their data from one place. 
  • That you have a responsibility to present data so that it is understandable to the user, even if the raw data is cryptic. 

You act ethically if you have a dedicated area in your digital solutions where people can access the data that pertains to them. It must be easy to find, and the data must be easy to see and understand. If data is used to create new data (for example, through profiling), then the new data must be just as clear and easily accessible. People also need to have control over their data: users should always be able to delete it, and where it makes sense, they should also be able to correct it to make it more accurate.


  • Ask your users how they would like to access their data. 
  • Test your solution to see if users can access and understand their data.
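As a sketch of the principle that raw and derived data should be accessible, and deletable, from one place, the following uses hypothetical in-memory stores; a real service would read from its actual databases and authenticate the requester first.

```python
import json

# Hypothetical stores: data the user provided, and data the
# company derived about them (profiling).
raw_data = {"u1": {"email": "user@example.com", "orders": [101, 102]}}
derived_data = {"u1": {"segment": "frequent buyer"}}


def export_user_data(user_id):
    """Gather everything held about one user into a single readable export."""
    record = {
        "provided_by_you": raw_data.get(user_id, {}),
        "derived_about_you": derived_data.get(user_id, {}),
    }
    return json.dumps(record, indent=2)


def delete_user_data(user_id):
    """Deletion covers derived profile data, not just the raw data."""
    raw_data.pop(user_id, None)
    derived_data.pop(user_id, None)
```

The key design choice is that the export and the deletion both cover the derived profile data, since that data belongs to the user just as much as the data they typed in themselves.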

The bad example

Most Danes use an online banking solution where you usually have full access to all data about your finances. However, banks often use financial data to profile their customers into different earnings segments, and very few banks disclose this profiling to their customers.

Banks also have data on how much they earn on each customer (the price of being a customer in the bank), but this information is also not readily available on online banking platforms. 

Banks are thus adept at providing raw data, but when it comes to aggregated data, which can be very valuable to their customers, their digital solutions are severely limited.

The good example

Following the Cambridge Analytica scandal, Facebook was criticized for collecting too much data about its users and doing so without their awareness. Facebook has since designed a comprehensive and user-friendly area on the platform where you can get an overview of your data, along with options to restrict data collection and delete personal data. But because Facebook's business model is built on data collection, the company often makes it difficult to find the places where you can restrict its access to data. In some cases, users are warned that they will lose functionality if they do not allow the data collection, which is both unethical and on the edge of EU law.

05. Have you obtained user permission to collect and process data?

Get smarter – what is it?

Most companies have understood that collecting data about their customers and users requires permission. The ethical problem is more often whether one has obtained real consent, where the user also understands what it entails. Think of the many cookie pop-ups that users encounter online today. How many users know what they are doing when they accept cookies?

You have an ethical obligation to obtain permission in a way that is understandable to your users. It often means that you have to communicate much more concisely and pedagogically. But remember that if it is difficult to explain why you are collecting data, then there may be a case for not collecting it in the first place.


  • Do not let lawyers write the texts for the users/customers alone. Let communicators write it. 
  • Remember that texts longer than 5–10 lines are rarely read all the way through. 
  • Consider having two documents: a formal/legal one and a document that is easy to read but perhaps not entirely legally correct. 
  • If you cannot get actual and fully informed permission from your users, do not collect data.

The bad example

Most Facebook users are aware that their data is used to target advertisements and other content. But only very few Facebook users understand how their data can be used and misused. A glaring example is the Russian-developed app Girls Around Me, which combined freely available data from Facebook and Foursquare into a stalker app where (typically) men could view nearby women, along with data about when they were last seen at a specific location. The app caused quite a stir when it launched and has since been removed from the App Store.

The good example

At the time of writing, the Danish telecom company TDC uses a cookie pop-up that gives users a choice between "All cookies" and "Only necessary cookies." With a single click, the user can deselect all cookies except the technical cookies necessary to make the site work. It is a user-friendly way to get acceptance for cookies. You even have the opportunity to dive deeper into the information and adjust your choice further (which, however, is probably only done by a few).

On the other hand, TDC has chosen to color the "All cookies" button an alluring blue, while the "Only necessary cookies" button is white on a white background. There is no doubt about what TDC wants from the user, and the company could have designed this more ethically.

06. Do you inform your users about how they are profiled?

Get smarter – what is it?

Artificial intelligence and algorithms are getting better and better at finding patterns in data, often patterns that are not immediately visible to the human eye. Companies and organizations can develop profiles of their customers and users that contain knowledge not even the users themselves have. Companies can determine people's creditworthiness and book preferences, as well as mental illnesses, sexual preferences, and political attitudes.

This creates obvious ethical challenges, concerning both inequality and human rights. Profiling can establish a high degree of inequality between the company that profiles and the person who (perhaps unknowingly) is subjected to the profiling.

Profiling can also conflict with fundamental rights, which are about not storing sensitive personal data or discriminating based on gender, race, sexuality, etc.


  • Be aware that database profiling can violate fundamental human rights. 
  • Do not create profiles that end up being personal data. 
  • Always inform your users and customers about how they are profiled. 
  • Make it easy for users to understand how you have created their profiles. 
  • Profile data is also the property of users, and they have the right to access and delete this type of data.

The bad example

Target is an American retailer that collects large amounts of data on people's buying behaviour and uses this data to profile its customers. The profiles are used to send tailored product offers. One approach is to flag customers who show signs of being pregnant in order to send suggestions for pregnancy and baby products. In 2012, this resulted in Target sending pregnancy-related offers to a young high school girl, even though neither she nor her parents were aware of her pregnancy. The story created problems in the family and made Target reconsider its use of profiling in marketing.

The good example

Facebook is rarely showcased as a particularly data-ethical company, but it is good at showing how the data it collects categorizes (profiles) the individual user into different areas of interest, which are used to target ads. You can find this information under the privacy settings on your Facebook profile. It is easy to find and user-friendly, and you can get an overview of the hundreds of interests linked to your profile.

Users can also choose to delete an interest and thereby opt out of personalized marketing within that subject area.


Unless otherwise stated, all content on this website is presented under the Creative Commons Attribution License.