That’s a funny way of reacting.
No hobbies. No pet.
From the American point of view, let's say in the Facebook business model, that would be a huge infringement, as in, "As long as I book an ad and I pay for it, I can display my ad."
Very cool. That’s the learning out of Cambridge Analytica I suppose.
[laughs]
That leads right away into the last big topic we have today: ethics, the human in ethics, and bringing the human factor back into the world of technology. That's dear to your heart, if I may say.
I remember this one slide you showed me of your Prime Minister making fun of himself about being bald now. That's actually quite a good example of keeping the human factor within the digital, if you will. Why do you care so much about that? How did this come about, and what is your take on how to implement it?
That also requires a capacity for empathy, which is a very human thing. The question would be how you can implement this, call it a positive bias or whatever you like, within the technological framework. Because we seem to be lacking this.
Even though you can now demonstrate that societies have similar issues they care about, we are still in an age where divisiveness is running high and empathy is lacking. If you look at the European debates on refugees, it's tremendous. How do you see this factor? How can we bring empathy into technology?
Is that your prerogative for the ethics of the digital, to take such a methodological stance and say, "Look, that's how we propose it"? Are you looking 20 years into the future?
Thank you very much. That was very insightful. I’m excited.
Awesome. Can we take a photo?
That would be super lovely.