The past year has been especially prolific in scary headlines about the implications of “emerging” technologies such as AI and robotics. I’ve been reading about the topic, attending talks and thinking it through.
So I decided to break my long blogging silence to share some thoughts with you:
#1 – the right discussions aren’t happening yet
Media posts about the implications of technological development tend to focus on job losses and a gloomy future. We cannot deny that the introduction of new technology, especially technology as disruptive as what we are now developing, has a societal impact, as this amusing story about the introduction of cars in the Victorian era (taken from Don Tapscott‘s new book “Blockchain Revolution”) shows:
But we should promote more open discussion around these topics and fewer “it’s-the-end-of-the-world” clickbait posts.
If the development and introduction of many of these technologies is inevitable, and I believe it is, then we should start discussing the practicalities surrounding their existence. As Prof. Andrew McAfee stated during a discussion panel at Web Summit 2016 in Lisbon, fearfully named “Farewell the age of human supremacy” (more on that fear in a while):
“We should not leave ethical decisions to technology; we should decide ourselves and then embed those principles in technology.”
Thus, we should both discuss what AI and machine learning mean for society and consider how to develop practical ethical guidelines based on concrete scenarios and experiences.
#2 – we’re confused about what is new technology
“Our unique algorithm” is the new black! Virtually every startup now claims to have a “unique algorithm” as the secret sauce that powers its solution.
But most of what we label as new or emergent… isn’t. Artificial Intelligence (AI) research has been around since the 1950s, and machine learning (as a separate research field) since the 1990s. What is new is access to computing power that was unthinkable some 10 years ago, and the democratization of these technologies.
I believe it matters to know the difference, to understand the context of technology development, in order to truly grasp why, for some technologies, now seems to be the time to flourish.
As another speaker at the Web Summit (whose name I shamefully failed to note amidst the attention tsunami that is an event as big as that) stated, we are at times too quick to dismiss new technology without properly considering it:
“We do a disservice to the discussion about technology when we rapidly dismiss tech in the hype cycle instead of discussing how to make sense about it.”
#3 – our fears are our demons
Do you know why you can’t find wooden houses in Portugal? Do you know the children’s tale of the three little pigs? In the tale, both the straw and wooden houses are easily destroyed, so the only way to stay safe is to build your house with bricks.
I’m sure other factors influence the lack of wooden construction, but we should not underestimate the role of the stories we hear and tell, and of our basic instincts, when it comes to making sense of new things.
In a terrific interview for The Guardian, Genevieve Bell, anthropologist in residence at Intel, explains how fear dominates our reactions:
“Western culture has some anxieties about what happens when humans try to bring something to life, whether it’s the Judeo-Christian stories of the golem or James Cameron’s The Terminator. […] What we are seeing now isn’t an anxiety about artificial intelligence per se, it’s about what it says about us. That if you can make something like us, where does it leave us?”
Again, this comes back to the importance of discussing these topics widely and openly. We need to shake the demons out of our heads and make sense of technology and change. But for that, we need to involve not only technologists but also the rest of society.
#4 – change by design
I’m a firm believer that we learn a lot by doing, by experimenting. It helps us understand how things are made and how they work, and it also builds empathy for the work and effort of others. The fact that some companies are involving the community of techies and curious minds in advancing research and making sense of possible use cases for their technology stack, as Amazon is doing with Alexa, is important.
It is up to us to envision how we will incorporate these technologies into our products, services, businesses and societies, and also to design the grounding principles we will embed in them. We should not forget that there is still a human responsibility for what we feed to the machines and how we tailor them. Poorly designed products, such as the controversial Microsoft AI chatbot, or poor data quality will negatively influence the outcome.
As Genevieve Bell pointed out:
“We are building the engines, so what we build into them is what they will be. The question is not will AI rise up and kill us, rather, will we give it the tools to do so?”
And quoting Andrew McAfee again:
“We have powerful tools to shape our destiny but it is up to us to shape our destiny, not technology.”
Wasn’t this always the case?