Coding Culture: My Favorite Question at SHAD 2019
Last week I had the pleasure of interviewing the great Dan Doiron live at SHAD 2019 at the University of New Brunswick. SHAD is an intensive, STEAM-focused program for exceptional youth from across the country, and their questions made it clear the distinction is deserved.
Half of my brain expected the questions to be purely technical: the traditional definition of smart. This, to me, was a worrisome thought: navigating 21st-century change will require the best technical talent we’ve ever produced, but it will also require navigating social issues on a scale we’ve never encountered.
My favorite question was about the very intersection we deal with here: the intersection of culture, business & technology.
It was related to an MIT study called the Moral Machine. The bright young man asking the question referred to the Trolley Problem: a seminal morality question that requires you to codify your morality in real time. A train is barreling down a track toward five unsuspecting victims, and you, the bystander, can save them by pulling a lever. The lever diverts the train onto track 2, where it will kill only one person. Do you pull the lever or let fate take its course? The analogy to self-driving cars is clear: when it comes time to choose between one life and another, what does the car choose? Of course, by car we actually mean algorithm. What does the algorithm choose? It depends on how we code culture.
The result of the MIT study was exactly what you’d expect: we are not the same, we have different values, and the way we code culture will also be different.
The real question was this: will we be able to code for a global culture that all of our algorithms will follow?
Dan kicked us off with the unexpected answer that he can foresee a day in the near future when your car responds to your personal morality; you will literally program your vehicle to code for your values. This is a fascinating answer that opens a Pandora’s box on responsibility, insurance, and how you and your car will interact with the fleet (post coming).
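To make Dan’s idea concrete, here is a minimal sketch of what “programming your vehicle to code for your values” might look like. Everything here is hypothetical: the `EthicsProfile` names, the weights, and the harm estimates are illustrative, not any real manufacturer’s API. The point is only that two owners with different values get different decisions from the same dilemma.

```python
from dataclasses import dataclass

# Hypothetical per-owner "ethics profile" the vehicle consults when
# every available action causes some harm. All names and numbers are
# illustrative.

@dataclass
class EthicsProfile:
    weight_occupant: float = 1.0    # how heavily the owner weighs harm to occupants
    weight_pedestrian: float = 1.0  # vs. harm to people outside the vehicle

def choose_action(profile: EthicsProfile, actions: dict) -> str:
    """Pick the action with the lowest weighted expected harm.

    `actions` maps an action name to (occupant_harm, pedestrian_harm),
    each an expected-harm estimate between 0 and 1.
    """
    def cost(harms):
        occupant, pedestrian = harms
        return (profile.weight_occupant * occupant
                + profile.weight_pedestrian * pedestrian)
    return min(actions, key=lambda name: cost(actions[name]))

# Two owners face the same dilemma with different values.
dilemma = {"swerve": (0.8, 0.1), "brake": (0.2, 0.6)}
selfless = EthicsProfile(weight_occupant=0.5, weight_pedestrian=2.0)
self_protective = EthicsProfile(weight_occupant=2.0, weight_pedestrian=0.5)

print(choose_action(selfless, dilemma))         # prefers "swerve"
print(choose_action(self_protective, dilemma))  # prefers "brake"
```

Same car, same street, different code: which is exactly why responsibility and insurance get so complicated.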
When it was my turn I was more blunt than I intended to be. Will we be able to code for a global culture that all of our algorithms will follow?
No. It’s not possible.
We’re too different.
Different is not a bad thing: it’s a quantitative measure, not a qualitative one. Before the internet, we basically thought we were all the same and that we wanted the same things. Yes, we do have some things in common, love of our children being one. But we very clearly do not want the same things, and our morality and cultural values sit on an infinite spectrum. This is what you’re seeing in the social media hysteria: we thought we were the same before we could have a global conversation, and now that we can, our morality is being put into code and deployed as algorithms. The election of Donald Trump, in hindsight, is not surprising in many ways. The fear & loathing that came out on the campaign trail was always there; it just wasn’t broadcast on Twitter.
Before the internet, the coasts, the Midwest, and the South thought they were all basically the same. The Americans, the Europeans, and the Chinese thought they basically had the same views on data and surveillance. India, Nigeria, and Canada thought they basically had the same views on economic growth and energy use.
The reality is that we don’t, or at least we don’t yet.
What will likely happen is one of two things:
Tech companies will domesticate, and the domestic algorithms will reflect the morality of the state or company responsible. For example, Twitter was once a platform, similar to common carriers like Bell. Then they took sides. Now they’re not a platform; they’re more like a political party or a social movement. Twitter’s algorithms now reflect the left-leaning morality of its coders, and people reporting on science have been de-platformed because their reporting goes against the social morality of those writing the code. Whether that’s for better or for worse depends entirely on your view of the world.
How will this affect multi-national companies? Future post!
We will attempt to codify a global morality. The UN will hold meetings that deal with global algorithms, and it will go well or it won’t. We may be forced to see each other as different but to respect each other nonetheless. Do Chinese citizens truly feel comfortable living in a surveillance state? For those who don’t, will they be issued visas to immigrate to a country that shares their morality? Will people be able to opt out, or will they be forced to live with what the majority votes into power?
How we code culture will be one of the most fascinating social experiments we’ve ever experienced.