What real bodies can show artificial minds
Published Date: 3/15/2024
Source: axios.com

A fundamental facet of intelligence found across the entire animal kingdom is beginning to be unraveled using AI, neuroimaging and other tools.

Why it matters: Language, reasoning and other abstract skills tend to get the most credit for human intelligence. But gaining knowledge of how the world works by walking, crawling, swimming or flying through it is an important building block of all animal intelligence.

  • Some AI researchers think this "embodied cognition" is a necessary ingredient to achieve advanced AI.
  • Others have argued that, to deliver a sophisticated intelligent machine, it's enough to scale up the large language models that underpin ChatGPT and other generative AI tools with more data and more computational power.

How it works: Teams of neuroscientists, anatomists and machine learning researchers around the world are building detailed virtual models of rodents, flies, and human infants.

  • Researchers from Google DeepMind and HHMI's Janelia Research Campus have built a virtual fruit fly by combining an anatomical model of the fruit fly skeleton, simulations of the physics a fly experiences (fluid dynamics, adhesion, gravity, etc.) and an artificial neural network trained on fly behaviors.
  • The virtual fly's behavior is compared with that of a real fly, and the model is updated until the two match.
  • The virtual fly walks, flies and crawls upside down like its biological counterpart, the researchers say. Members of the team previously built a virtual rodent.
  • Researchers at EPFL in Lausanne, Switzerland, also published a virtual fly model late last year.
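The compare-and-update loop described above can be sketched as a simple system-identification exercise. Everything below is a toy stand-in, not the researchers' actual pipeline: a point mass falling with drag plays the role of the fly's body physics, and the model's drag parameter is fit so the simulated behavior matches the "observed" behavior.

```python
import numpy as np

def simulate(drag, steps=50, dt=0.05, g=9.81):
    """Toy body model: a point mass falling with linear drag.
    A stand-in for the fly simulation's physics (gravity, fluid drag)."""
    v, trajectory = 0.0, []
    for _ in range(steps):
        v += (g - drag * v) * dt
        trajectory.append(v)
    return np.array(trajectory)

# The "real fly" behavior the virtual model should reproduce.
true_drag = 2.0
observed = simulate(true_drag)

# Compare candidate virtual models against the observed behavior and
# keep the parameters that minimize the mismatch.
candidates = np.linspace(0.1, 5.0, 50)
errors = [np.mean((simulate(d) - observed) ** 2) for d in candidates]
best = candidates[int(np.argmin(errors))]
print(f"best-fit drag: {best:.2f}")  # prints: best-fit drag: 2.00
```

The real models replace the hand-tuned parameter sweep with an artificial neural network trained on recorded fly behaviors, but the logic is the same: simulate, compare to the animal, update.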

The goal is to understand "how the body mediates between the brain and the world," says Srinivas Turaga, a neuroscientist at Janelia and co-author of the preprint paper about the virtual fly, which was posted today.

  • Eventually, these models might be combined with diagrams of how neurons in the brain are connected with one another — "connectomes" — to try to understand how a network of neurons gives rise to a particular behavior.
  • "The body and the nervous system evolved together," Turaga says. "And so intelligence, in some sense, isn't just in the brain. There's also mechanical intelligence" that helps animals move.

The intrigue: Embodied cognition also helps animals understand how the world works — by experiencing it.

  • "There's an argument to be made that biological systems learn from interacting with the world," says Jochen Triesch of the Frankfurt Institute for Advanced Studies.
  • Doing so allows animals to learn about the physics of the world — causality, gravity and other relationships and forces — and, critically, to see the consequences of their actions, he says.

Triesch and his colleagues are interested in human cognitive development and have developed MIMo, a virtual human model with the body of an 18-month-old child, complete with five-fingered hands. Its virtual body senses its surroundings through binocular vision, proprioception and a full-body virtual skin.

  • MIMo isn't as detailed as the fruit fly model, but that makes it much faster to simulate, Triesch says. There is a tradeoff between the level of realism and the computation required, and the MIMo researchers believe the critical part of their model is the touch-sensitive skin rather than the exact body shape.
  • "We're trying to do a serious attack of the problem of consciousness and how it develops," Triesch says. And while the idea that human cognitive development relies on embodied interactions with the world isn't new, the tools to study it — including AI and cheap computation — are now more readily accessible.
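The sensing setup described above can be sketched as a multimodal observation dictionary. Every name and size here is an illustrative assumption, not MIMo's actual interface; the point is the realism-versus-compute tradeoff Triesch describes: coarse eye images and a modest joint count keep simulation fast, while the touch-sensitive skin channel is preserved.

```python
import numpy as np

def make_observation(touch_sensors=192, n_joints=20, seed=0):
    """Hypothetical observation for an infant-like embodied agent,
    loosely inspired by MIMo's described senses. Shapes are illustrative."""
    rng = np.random.default_rng(seed)
    return {
        # Two low-resolution eye images: realism traded for speed.
        "vision_left":  rng.random((64, 64, 3)),
        "vision_right": rng.random((64, 64, 3)),
        # Joint angles and velocities for a simplified infant body.
        "proprioception": rng.random(2 * n_joints),
        # Full-body touch skin: the channel the researchers call critical.
        "touch": rng.random(touch_sensors),
    }

obs = make_observation()
print({name: sensor.shape for name, sensor in obs.items()})
```

Raising the image resolution or the number of touch sensors makes the body more realistic but each simulated step more expensive — the tradeoff the MIMo team resolves in favor of a fast body with a detailed skin.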

The big picture: A group of prominent AI researchers last year advocated for an "embodied Turing test" to shift the focus away from AI mastering games and language, skills that are "well-developed or uniquely human," and toward "those capabilities — inherited from over 500 million years of evolution — that are shared with all animals."

  • "Most machine learning systems today learn by basically passively absorbing large data sets, whether it is video or images or captioned images," Triesch says.
  • Learning through interaction with the world is something "really essential that most of the machine learning community is right now completely missing."

It's an open question — and debate — whether information about the brain-body relationship gleaned from neuroscience studies can be used to teach machines to work in the physical world.

  • "In AI, that's the hardest problem yet," says Aran Nayebi, a postdoctoral researcher at MIT who works at the intersection of AI and neuroscience to try to reverse-engineer neural circuits, including the visual system in mice.
  • Yet across the phylogenetic tree — spanning hundreds of millions of years, from small animals to large ones — the brain has solved the problem of working with the body to complete complex tasks, he says.