Ethical data part 4 – Working with data: how to respect ethics?

Now that we have established what ethics are, how they are distinct from yet linked to legislation, and which components of ethics matter to the consumer, it’s time to ask ourselves the crucial question:

How does one start being ethical?

A good starting point is to look at how ethics are managed elsewhere. Doing so, we see that some professions have established codes of conduct:

  • Medicine has the famous Hippocratic oath (“First, do no harm”)
  • Lawyers take an oath on the day of their admission to the bar
  • Journalists have oaths too

Wouldn’t it be a good idea to have a similar code of ethics for all persons working with data?

The limitations of regulations

Regulations are a necessary framework, but they can’t be the only answer. As said before, technology advances quickly, while regulation moves slowly. In a way, we are constantly regulating yesterday’s technology. This leaves room for many abuses that technically comply with outdated regulations, and finding loopholes in current laws becomes a national sport.

So, of course, businesses, like individuals, must comply with laws, but this is not a positive approach. Companies that only have that goal in mind will end up doing the minimum required to meet the letter of the law. It’s not forward-thinking. As compliance follows regulation, which itself follows the social impact of technology, this amounts to fighting yesterday’s battles. A better way forward is to respect the societal consensus directly.

The role of associations

A business that annoys its customers risks losing them. Trade associations are aware of those risks and are therefore keen on self-regulating to avoid this. This happens at the worldwide or European level (IAB, DMA, IAE, IAA…) and also at the national level (BAM, UBA… in Belgium). These initiatives are valuable, as they are usually in place before legal regulations. But they still tend to follow rather than lead.

Being proactive: Code of Ethics

So how can we lead? How can we be proactive rather than reactive? Why not take inspiration from data giants? Microsoft has its own code of conduct, the AI Fairness Checklist. IBM has one too, the Trusting AI framework. They are not the only ones.

Why create such codes?

Well, simply because we shouldn’t have corporate lawyers defining our future for us. We should be responsible for the direction in which we take the data field. We must create a Code of Ethics that guides us proactively.

Surprisingly, this Code of Ethics doesn’t have to be complex. According to Professor Jagadish from the University of Michigan, whose work has been a huge inspiration for this series, the code could even be summarized in 2 simple articles:

  1. Do Not Surprise: Do not surprise the data subjects with what data you collect and how you share or use it. Ensure proper data disclosure, to build trust.
  2. Own the Outcomes: Our processes have societal impact. It is not enough to say that there is nothing technically “wrong” with them; if they lead to undesirable outcomes, we need to work to modify them. Ensure regular societal assessment, to meet consumers’ expectations.

Open a direct line with your customers

To be able to perform your societal assessment, you need to hear what your consumers have to say. They will have questions or will need explanations: give them a channel through which they can talk to you. Listen with empathy. Reply, explain, educate… and change if needed.

Focus on transparency and the ability to explain. Describe:

  1. Do Not Surprise
  • Who owns the data?
  • What will the data be used for?
  • What can be deduced from exposed data?

  2. Own the Outcomes
  • Is the data analysis valid?
  • Is the data analysis fair?
  • What are the societal consequences?
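As a minimal sketch of how these questions could be made concrete, the following hypothetical Python structure records the answers in a machine-readable disclosure. The `DataUseDisclosure` class, its field names and the example values are illustrative assumptions, not an existing standard or product feature.

```python
# A minimal, hypothetical sketch of a machine-readable data-use disclosure.
# Class and field names are illustrative assumptions, not an existing standard.
from dataclasses import dataclass, field


@dataclass
class DataUseDisclosure:
    # "Do Not Surprise": tell data subjects what is collected and why.
    data_owner: str                    # Who owns the data?
    purposes: list[str]                # What will the data be used for?
    possible_inferences: list[str]     # What can be deduced from exposed data?

    # "Own the Outcomes": record the societal assessment of the analysis.
    analysis_is_valid: bool            # Is the data analysis valid?
    analysis_is_fair: bool             # Is the data analysis fair?
    societal_consequences: list[str] = field(default_factory=list)


# Example: a disclosure a company could publish and discuss with its customers.
disclosure = DataUseDisclosure(
    data_owner="The data subject (customer)",
    purposes=["Personalised offers", "Fraud prevention"],
    possible_inferences=["Purchase habits", "Approximate income bracket"],
    analysis_is_valid=True,
    analysis_is_fair=True,
    societal_consequences=["Risk of excluding low-spending customers"],
)
print(disclosure)
```

Keeping such a record per data use, and sharing it over the channel described above, is one possible way to make the checklist auditable rather than a one-off exercise.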

In everything you do, remember that ethics matter. Working with data carries great power: the power to help, but also to harm. We need to care about how this power is used. We cannot hide behind the claim of neutral technology. Society as a whole will be better off if we voluntarily limit how this power is used.

Of course, ethical analysis is difficult. In a complex world, with many actors, it is often hard to see where problems come from and how to define the limits to the use of data. Ethics should be the framework we use to answer those questions, every time we use data.

We owe it to our customers, to the future of the marketing and business profession, and to society as a whole.

Why this extensive series on data ethics?

At Black Tiger Belgium, we are committed to building a new generation of data leaders.

Ethics, transparency, innovation and security are the 4 pillars of the Black Tiger DNA.

Ethics

Personal data touches the privacy of every individual in the most direct way, hence our obsession with data ethics. Black Tiger developed the Master Data Platform, the first end-to-end data quality platform with native GDPR compliance.

Transparency

From technology to customer relations and management, transparency is at the heart of everything we do at Black Tiger. Data traceability is a given and must be a central topic for our clients. This recurring and unavoidable theme also translates into no-code development, so that all end users can easily and fully understand the tools they use.

Innovation

Change is inevitable: we can either endure it or anticipate it. At Black Tiger Belgium, we are passionate about technological innovation, and we choose to anticipate it.

Security: your data is kept safe, local and well protected

Personal data is private data. It is therefore essential to approach security issues with the utmost seriousness. At Black Tiger Belgium, we don't compromise on security.

See also:
  • Blog 1 - An introduction to data ethics
  • Blog 2 - What are ethics anyway?
  • Blog 3 - Ethical opportunities