Feelings seem like the Achilles' heel of artificial intelligence. Emotional AI may still be a long way off, as scientists continue to debate the best way to achieve it.
In biology, homeostasis is the ability of a biological system to maintain its equilibrium despite external constraints in order to survive.
Take the human body as an organism, for example.
The body’s homeostasis allows it to resist changes in the external environment and to regulate and maintain its biological parameters. Body temperature, blood sugar levels, blood composition, and many other values in the body are constantly regulated through physiological mechanisms.
Now, researchers argue that imbuing artificial intelligence with homeostasis-based goals would help it become smarter and less dangerous to humans.
Emotional AI Needs a Sense of Homeostasis
John McCarthy, who coined the term “artificial intelligence,” defined it as “the science and engineering of making intelligent machines that have the ability to achieve goals like humans do.”
This definition sums up the current goal-maximization approach to AI, where intelligent agents have to work toward achieving preset goals.
That leaves us with the question: which goals, or more precisely, whose goals?
In nature, biological systems don’t maximize goals. Instead, driven by the principle of homeostasis, they work to keep vital variables from drifting away from their setpoints.
Living organisms, like humans, strive to achieve and then maintain homeostasis through operant conditioning rather than maximizing any single quantity. This resembles cybernetic control, in which closed systems, both natural and artificial, automatically self-regulate by detecting changes and correcting them through feedback.
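This kind of closed-loop self-regulation can be sketched in a few lines. The following is a minimal illustration, not anything from the researchers' work: a feedback step measures the deviation of a regulated variable from its setpoint and applies a proportional correction, the same basic loop a thermostat or body-temperature regulation uses. The variable names and numbers are assumptions chosen for the example.

```python
# A minimal sketch of homeostatic (cybernetic) self-regulation:
# a closed loop nudges a regulated variable back toward its setpoint
# whenever feedback detects a deviation. Names and values here are
# illustrative assumptions.

def regulate(value, setpoint, gain=0.5):
    """One feedback step: correct the variable in proportion to the error."""
    error = setpoint - value
    return value + gain * error

# Simulate body-temperature regulation after an external disturbance.
setpoint = 37.0
temperature = 39.5  # perturbed away from the setpoint

for _ in range(10):
    temperature = regulate(temperature, setpoint)

print(round(temperature, 3))  # back near 37.0
```

Each pass through the loop halves the remaining deviation, so the system settles back toward its setpoint without any external goal being maximized.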

Two neuroscientists propose a way to instill homeostasis into machines to make them more emotional.
Kingson Man and Antonio Damasio from the University of Southern California think artificial emotional intelligence, or emotional AI, needs the ability to sense danger for its survival.
The authors write in a new paper published in Nature Machine Intelligence:
“In a dynamic and unpredictable world, an intelligent agent should hold its own meta-goal of self-preservation, like living organisms whose survival relies on homeostasis: the regulation of body states aimed at maintaining conditions compatible with life. In organisms capable of mental states, feelings are a mental expression of the state of life in the body and play a critical role in regulating behavior.”
In other words, it’s about allowing AI systems to develop feelings that would guide their behaviors.
Homeostasis-based emotional AI would be “safer” than goal-oriented AI because, at the very least, it would be far more constrained. An intelligent agent couldn’t maximize one goal to the detriment of others, because doing so would upset its homeostatic balance.
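To make the contrast concrete, here is a hedged sketch (my illustration, not the authors' formulation) of a homeostatic objective: the agent is scored by how close all of its vital variables stay to their setpoints, so pushing one variable to an extreme at the expense of the others lowers the overall score rather than raising it.

```python
# Illustrative homeostatic objective: penalize squared deviation of
# every vital variable from its setpoint. All names and setpoint
# values are assumptions made up for this example.

SETPOINTS = {"energy": 0.5, "temperature": 0.5, "integrity": 0.5}

def homeostatic_score(state):
    """Higher is better; any deviation from any setpoint costs points."""
    return -sum((state[k] - SETPOINTS[k]) ** 2 for k in SETPOINTS)

# An agent keeping everything near balance...
balanced = {"energy": 0.6, "temperature": 0.5, "integrity": 0.5}
# ...versus one that maximized "energy" at the expense of the rest.
maximizer = {"energy": 1.0, "temperature": 0.1, "integrity": 0.2}

print(homeostatic_score(balanced) > homeostatic_score(maximizer))  # True
```

Under this scoring, single-goal maximization is self-defeating: the runaway agent scores worse than the balanced one, which is the sense in which a homeostasis-based objective is more restricted.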
The advances being made in the soft robotics field make the idea of homeostasis-based emotional AI more approachable.
A “soft artificial skin” would give a robot a sense of touch, helping it identify dangers and develop self-preserving behavior. In short, the approach works by making machines more vulnerable.
“Rather than up-armoring or adding raw processing power to achieve resilience, we begin the design of these robots by, paradoxically, introducing a vulnerability,” write the researchers.