It helps to think of technology as clothing. In the past we killed animals and wore their skins to protect ourselves from the elements. As our knowledge and skills evolved so did our needs. Just like with clothes, we use technology not only for practical things but also for social signalling.
As with almost every kind of technology we’ve used in the past, from the spinning jenny and its Luddite detractors to coffee and refrigeration and those who resisted them, the ‘new’ both excites us and frightens us. It presents us with opportunities and threatens to upend the ‘traditional’ way of doing things.
Smart algorithms and artificial intelligence are no different. To say that we have a choice in whether or not we use this technology is sophistry. Our world is made of data. We use data the same way we once used social cues and behavioural signals: to understand what people do and try to infer why.
Much of that activity takes place, and gets discussed, within the boundaries of analysis done for marketing purposes, but a lot of it spills over into activities we’ve always engaged in: navigating the present, projecting into the future and, in the process, understanding the world.
As the world becomes more abstract, representing it in ever more refined layers requires us to actively build ontologies: to define entities and the dependencies between them. In that context even the ‘simple’ art of pistol shooting becomes an extended mental activity, requiring embodied cognition in order for us to apply our sensemaking skills to what we observe.
To illustrate the point, think what happens to our thinking when we take something as simple as a tree and move from “wood, green leaves, floats in water” to the forest (and its implications for the ecosystem), to the classification of particular types of tree and their effect on the environment, to ontologies of tree types and their interactions with each other, to data reference points that make sense when viewed via drone from the sky or via satellite from space.
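The layering described above can be sketched in code. This is a minimal, illustrative sketch only: the class and relation names are hypothetical, not drawn from any real ontology standard, and a serious effort would use an established framework such as RDF/OWL.

```python
# A minimal sketch of an "ontology" as named entities plus explicit,
# labelled dependencies between them. All names are illustrative.

class Entity:
    def __init__(self, name, properties=None):
        self.name = name
        self.properties = properties or {}
        self.depends_on = []  # (relation, other_entity) pairs

    def link(self, other, relation):
        # Record a directed, labelled dependency on another entity.
        self.depends_on.append((relation, other))

# Each layer of abstraction adds entities and relations:
tree = Entity("tree", {"material": "wood", "leaves": "green", "floats": True})
forest = Entity("forest")
ecosystem = Entity("ecosystem")

forest.link(tree, "composed_of")
ecosystem.link(forest, "shaped_by")

# Walking the links reconstructs the chain of abstraction.
for relation, target in ecosystem.depends_on:
    print(f"ecosystem --{relation}--> {target.name}")
# prints: ecosystem --shaped_by--> forest
```

The point of the sketch is the shift it embodies: the tree stops being a bundle of sense impressions and becomes a node whose meaning comes from its position in a web of defined relations.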
In both cases, the AI and the mechanical digger, we need to retool our cognitive arsenal. In the case of the digger we need to learn how to operate it, when and where, what type to go for and when that particular type should be used. Intuitive as this may seem right now, that is only because we, as modern humans, understand mechanical technology better than our ancestors did.
We don’t exhibit quite the same refinement in our knowledge, however, when it comes to artificial intelligence. And we should. Not just by designing artificial intelligence tools that are both smarter and more transparent to us so we can query them better, but also by being smarter ourselves.
A person from the Middle Ages would look at an excavator as a machine magically animated by some kind of demon, powered by an unworldly propulsion system. The person operating it would clearly be prioritized for a one-to-one meeting with an exorcist.
This doesn’t happen any more because our understanding of the world is now several layers of abstraction denser than our medieval cousins’. But just as learning to use an excavator is an application of higher-order abstraction that enables us to massively multiply our puny muscle power, so the use of an AI massively multiplies our relatively puny brain power (and notice I use the word “relatively” here because, in truth, some tasks our brain performs are beyond even the most advanced computing at the moment).
No simple answer is going to successfully deal with a complex problem. For that we must first upgrade ourselves and our approach to the issues we face. And that starts with the way we organize our thinking, and then with the way we use the tools at our disposal.