To Make Analogies Is to Be Human
11-12-2021
John Pavlus interviews Melanie Mitchell, an AI scientist at the Santa Fe Institute. Mitchell is convening a series of interdisciplinary workshops “examining how biological evolution, collective behavior (like that of social insects such as ants) and a physical body all contribute to intelligence.” But beyond these insights, Mitchell is above all concerned with the way human intelligence depends on making analogies. She argues that analogy-making is central to cognition because it underlies the human capacity for abstract thinking.
It’s a fundamental mechanism of thought that will help AI get to where we want it to be. Some people say that being able to predict the future is what’s key for AI, or being able to have common sense, or the ability to retrieve memories that are useful in a current situation. But in each of these things, analogy is very central.
For example, we want self-driving cars, but one of the problems is that if they face some situation that’s just slightly different from what they’ve been trained on, they don’t know what to do. How do we humans know what to do in situations we haven’t encountered before? Well, we use analogies to previous experience. And that’s something that we’re going to need these AI systems in the real world to be able to do, too.
One reason people haven’t studied it as much is that they haven’t recognized its essential importance to cognition. Focusing on logic and programming in the rules for behavior — that’s the way early AI worked. More recently, people have focused on learning from lots and lots of examples, assuming that you’ll be able to generalize to things you haven’t seen before using just the statistics of what you’ve already learned. They hoped the abilities to generalize and abstract would kind of come out of the statistics, but it hasn’t worked as well as people had hoped.
You can show a deep neural network millions of pictures of bridges, for example, and it can probably recognize a new picture of a bridge over a river or something. But it can never abstract the notion of “bridge” to, say, our concept of bridging the gender gap. These networks, it turns out, don’t learn how to abstract. There’s something missing. And people are only now grappling with that.
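The limitation Mitchell describes can be sketched with a toy (and purely illustrative, not from the interview) example: a similarity-based learner that classifies new inputs only by how close they are to its training data. In distribution it works; far out of distribution it still answers confidently, because it has nothing but the statistics of what it has already seen — no abstract concept to fall back on. The names and data here are invented for illustration.

```python
# Toy similarity-based learner: a 1-nearest-neighbor classifier over 2D points.
# It has no abstractions, only stored examples and a distance metric.

def nearest_label(train, query):
    """Return the label of the training point closest to `query`
    (squared Euclidean distance)."""
    return min(
        train,
        key=lambda p: (p[0] - query[0]) ** 2 + (p[1] - query[1]) ** 2,
    )[2]

# Hypothetical training data: two tight clusters of examples.
train = [
    (0.0, 0.0, "bridge"), (0.1, 0.2, "bridge"),
    (5.0, 5.0, "not-bridge"), (5.1, 4.9, "not-bridge"),
]

# In-distribution query, near the "bridge" cluster: answered sensibly.
print(nearest_label(train, (0.05, 0.1)))    # -> bridge

# Far out-of-distribution query, unlike anything in training:
# the learner still produces a confident answer, with no way to
# recognize that the input is outside its experience.
print(nearest_label(train, (100.0, 100.0)))  # -> not-bridge
```

A human facing the second query would notice it resembles nothing seen before and reason by analogy to other experience; the statistical learner cannot even represent that the situation is novel.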