The relationship between artificial intelligence (AI) and human emotions, particularly in the context of our dietary preferences, is a fascinating and evolving field of research. While AI has made significant strides in recent years, it has predominantly focused on the analytical and logical aspects of human intelligence; emotional intelligence, which plays a crucial role in our daily lives, has largely been overlooked in AI development. A team of researchers at Penn State University has embarked on a pioneering effort to bridge this gap by building AI systems that incorporate emotional intelligence.
Understanding the Complexity of Human Behavior
Human behavior is an intricate interplay of physiological needs and psychological desires. Emotions often dictate our choices, and this complexity poses a challenge for AI systems, which typically rely on mathematical models and data for their decision making. Human behavior is observable but difficult to measure accurately, and replicating its emotional dimension in a robot remains an elusive goal.
The Role of Taste in Emotional Intelligence
One of the fascinating aspects of emotional intelligence in human behavior is our relationship with food. What we choose to eat is profoundly influenced by gustation: the process by which our sense of taste steers food choices according to flavor preference, as distinct from eating purely out of physiological hunger.
Imagine having access to a wide array of food choices. You’re unlikely to select something very bitter; instead, you’d opt for something sweeter. This preference showcases how our psychological state influences our food choices even when we’re not physically hungry.
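This interplay between physiological hunger and psychological preference can be pictured with a small toy model. To be clear, this sketch is ours, not the researchers': the scoring function, food attributes, and numbers below are all invented for illustration.

```python
# Toy model (not the Penn State circuit): score each food option as a weighted
# sum of a hunger-driven term and a preference term. All names and numbers are
# illustrative assumptions.

def choice_score(food, hunger_level, appetite_weight=1.0):
    """Score a food option; a higher score means it is more likely chosen."""
    physiological = hunger_level * food["calories"] / 100.0
    psychological = appetite_weight * food["palatability"]  # sweet > bitter
    return physiological + psychological

options = [
    {"name": "bitter greens", "calories": 50, "palatability": -0.5},
    {"name": "sweet fruit", "calories": 80, "palatability": 0.9},
]

# Even with zero hunger, the sweeter option wins on palatability alone.
best = max(options, key=lambda f: choice_score(f, hunger_level=0.0))
print(best["name"])  # sweet fruit
```

With `hunger_level=0.0` the physiological term vanishes, so the choice is driven entirely by the preference term, mirroring how we reach for something sweet even when we are not physically hungry.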
Replicating the Gustatory Experience in AI
The researchers at Penn State have developed a novel approach to mimic the human gustatory experience in AI. They’ve created an “electronic tongue” and an “electronic gustatory cortex” using 2D materials, which are incredibly thin materials composed of one to a few atoms.
The electronic “taste buds” in this system consist of tiny, graphene-based sensors called chemitransistors, capable of detecting gas or chemical molecules. The “gustatory cortex” incorporates memtransistors, transistors made from molybdenum disulfide that can remember past signals. This setup enables the creation of an “electronic gustatory cortex” that connects physiological “hunger neurons,” psychological “appetite neurons,” and a “feeding circuit.”
For example, when the device detects salt (sodium chloride), it senses sodium ions, allowing it to “taste” salt.
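The logic of that circuit can be sketched in software. The class below is a conceptual analogue only, not the researchers' hardware: a running "memory" variable stands in for the memtransistor's retained signal, and the decay rate, threshold, and class names are all invented for illustration.

```python
# Conceptual software analogue of the circuit described above. The "memory"
# term mimics a memtransistor retaining past signals; all parameters here
# are illustrative assumptions, not measured device values.

class GustatoryCircuit:
    def __init__(self, decay=0.5):
        self.appetite_memory = 0.0  # stand-in for a memtransistor's state
        self.decay = decay

    def sense(self, ion_concentration):
        """'Taste' a stimulus (e.g. sodium ions for salt) and update memory."""
        # Blend the new input with remembered past signals.
        self.appetite_memory = self.decay * self.appetite_memory + ion_concentration
        return self.appetite_memory

    def feeding_decision(self, hunger_signal, threshold=1.0):
        """Combine physiological hunger with psychological appetite."""
        drive = hunger_signal + self.appetite_memory
        return drive > threshold

circuit = GustatoryCircuit()
circuit.sense(0.4)   # first taste of salt
circuit.sense(0.4)   # repeated exposure builds up appetite memory
print(circuit.feeding_decision(hunger_signal=0.5))  # True
```

The point of the sketch is the coupling: the decision to "feed" depends on both a momentary hunger signal and an appetite state accumulated from past tastes, which is the role the memtransistor-based circuit plays in the hardware.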
Strengths of the 2D Materials
The two 2D materials, graphene and molybdenum disulfide, complement each other's strengths in forming this artificial gustatory system. Graphene serves as an excellent chemical sensor, while molybdenum disulfide functions as a semiconductor for circuitry and logic, critical for mimicking brain circuits.
Applications of the Technology
The potential applications of this robotic gustatory system are promising. They range from AI-generated diets based on emotional intelligence for weight management to personalized meal recommendations in restaurants. The researchers’ next goal is to expand the range of tastes the electronic tongue can recognize.
They aim to create arrays of graphene devices that mimic the approximately 10,000 taste receptors on our tongues, each slightly different from the others, allowing for the discrimination of subtle taste differences. Ultimately, they envision AI systems that can be trained to excel in tasks like wine tasting.
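One way to picture how an array of slightly different sensors could discriminate tastes is pattern matching across the array's responses. The sketch below is a hypothetical illustration: the response profiles, taste labels, and nearest-profile matching are our assumptions, whereas the real device would read chemitransistor conductances.

```python
# Illustrative sketch: each taste produces a characteristic response pattern
# across an array of (here, four) slightly different sensors. A reading is
# classified by finding the closest known profile. All values are made up.
import math

reference_profiles = {
    "salty":  [0.9, 0.2, 0.1, 0.3],
    "sweet":  [0.1, 0.8, 0.3, 0.2],
    "bitter": [0.2, 0.1, 0.9, 0.7],
}

def classify(reading):
    """Match a sensor-array reading to the closest known taste profile."""
    def distance(profile):
        return math.sqrt(sum((r - p) ** 2 for r, p in zip(reading, profile)))
    return min(reference_profiles, key=lambda t: distance(reference_profiles[t]))

# A noisy reading close to the "salty" pattern is still identified correctly.
print(classify([0.85, 0.25, 0.15, 0.28]))  # salty
```

Scaling this idea from four sensors to thousands of subtly varied ones is what would let such a system resolve fine taste differences, the same intuition behind mimicking the tongue's many receptors.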
Future Directions and Beyond Taste
Beyond enhancing the technology for tasting, the researchers plan to integrate the “tongue” and the “gustatory circuit” into a single chip, streamlining the system further. Additionally, they aspire to extend this concept of gustatory emotional intelligence to other human senses, such as vision, hearing, touch, and smell, contributing to the development of advanced emotional artificial intelligence systems.
While the circuits demonstrated in this research are relatively simple, the goal is to refine them further to closely replicate human behavior. As our understanding of the human brain advances, these technologies may become even more sophisticated.
Contributors and Funding
The study involved a team of researchers including Dipanjan Sen, Akshay Wali, and Harikrishnan Ravichandran, along with Saptarshi Das, Andrew Pannone, and Subir Ghosh. The research received support from the United States Army Research Office and the National Science Foundation's Early CAREER Award.
As we continue to explore the complex realm of human behavior and emotions, the integration of emotional intelligence into AI systems like this electronic tongue marks an exciting step toward creating AI that understands and interacts with us more like humans do.