Can AI mimic human emotions and intuition?


The human right brain is said to be responsible for emotions, and this essay discusses whether AI can mimic them. Drawing on the neuroscience of how emotions form and on advances in AI technology, we will explore whether AI can learn and express human emotions and intuition.

 

The human brain is composed of a right brain and a left brain. Most higher mental functions are performed by the two working together, but their roles differ: the left brain is responsible for verbal, mathematical, analytical, logical, and rational functions, while the right brain is responsible for non-verbal, spatial, intuitive, and emotional functions. According to Brian Christian, author of The Most Human Human (2011), the right brain is one of the key differences between humans and AI. He argues that the computational and logical functions of the human left brain can be performed by current computers, but that the emotional functions of the right brain are difficult for AI to perform. I disagree: if we can understand how emotions, moods, and intuition are formed in the brain, we can apply the same mechanisms to AI. I will discuss the possibility of AI mimicking right-brain functions by looking at emotion, the most representative right-brain function and the one most commonly claimed to be beyond AI.
First, the way emotions are generated and expressed in context resembles the way an AI algorithm works. Emotions are shaped by evolution and memory. Compared with other creatures, mammals have a more developed limbic system and cerebral cortex, the brain regions responsible for emotion, and the interaction of these two regions produces the complex responses we call emotions. In evolving to survive and reproduce, fleeing from predators and seeking mates, mammals developed in common the emotions of love, fear, anger, and sadness. This is evidenced by the fact that an untrained newborn smiles when it is content and cries when it is not. These unlearned emotions are called primary emotions, and they are the foundation on which further emotions are learned.
To illustrate how emotions are learned, take anger as an example. When a person faces a situation and feels anger, the stimulus triggers the release of hormones, especially adrenaline, and the sympathetic nervous system responds. This physiological response is memorized as “anger,” along with the situation that triggered it. In this way, the information that caused the anger is stored in the brain, and by analyzing that data we can determine which stimuli will trigger anger in the future.
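To make this data-driven picture concrete, here is a minimal sketch in Python of how remembered episodes could be used to predict which emotion a new stimulus will trigger. The situation features and stored episodes are invented for illustration; a real brain obviously encodes far richer data.

```python
from collections import Counter

# Hypothetical "episodic memory": each remembered situation is a set of
# features, paired with the physiological label the brain stored for it.
episodes = [
    ({"insult", "crowd"}, "anger"),
    ({"insult", "alone"}, "anger"),
    ({"gift", "friend"}, "joy"),
    ({"loud_noise", "dark"}, "fear"),
]

def predict_emotion(situation: set) -> str:
    """Score each stored emotion by how many features it shares
    with the new situation, and return the best match."""
    scores = Counter()
    for features, emotion in episodes:
        scores[emotion] += len(features & situation)
    emotion, score = scores.most_common(1)[0]
    return emotion if score > 0 else "neutral"

print(predict_emotion({"insult", "traffic"}))  # -> "anger"
```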
Another source of evidence is brain stimulation experiments. When electrical stimulation was applied to the part of a patient's brain responsible for laughter, the woman being stimulated was unable to suppress her laughter. She even found the doctors around her funny, telling them, “Doctors, you're so funny!” What this experiment shows is that emotions can be triggered by sending electrical signals to specific parts of the brain.
The way emotions form is much like the way algorithms work in computers and artificial intelligence. A computer stores data in a memory device, and a program sends a corresponding stimulus, a command, to the CPU; when that stimulus is present, the computer retrieves the matching response from memory and performs it. If we interpret human emotions as responses built on experience and memory, then an AI that receives a specific input and expresses the appropriate emotion, in a situation matching the stored data values for each emotion, is not doing something fundamentally different.
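The retrieval half of this analogy can be sketched even more simply: when a stimulus (command) arrives, the stored response is fetched from memory and performed. The stimulus codes and responses below are hypothetical placeholders.

```python
# Hypothetical "memory device": stimulus codes mapped to stored responses,
# analogous to a stimulus triggering a remembered emotional expression.
memory = {
    "stimulus_laughter_area": "laugh",
    "stimulus_threat": "fear_response",
    "stimulus_reward": "smile",
}

def respond(stimulus: str) -> str:
    # When the stimulus (command) is present, retrieve the matching
    # response from memory and perform it.
    return memory.get(stimulus, "no_response")

# Echoes the stimulation experiment above: trigger in, expression out.
print(respond("stimulus_laughter_area"))  # -> "laugh"
```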
One might object that primary emotions are the real difference between computers and humans in this process. Of course, AI did not evolve from primitive creatures the way mammals did, developing a limbic system and cerebral cortex along the way. But what matters is the existence of primary emotions, not the process by which they were formed. If an AI had primary emotions, even ones supplied through human input, its ability to collect and categorize information about a situation far faster than a human would outweigh the limitations of that input. Artificial synapses and deep learning are the technologies that make this possible.
Thoughts, emotions, moods, and memories are all regulated by neurotransmitters in the brain. The human brain is organized into structural and functional units called neurons, which release neurotransmitters at the ends of their axons. The junctions where one neuron connects to tens of thousands of others to send and receive neurotransmitters are called synapses. Neurotransmitters stored in a neuron's axon terminals are released when the neuron receives neural information in the form of an electrical signal, and the signal is transmitted to the next neuron across the synapse. When a neuron receives such a signal, voltage-gated sodium and potassium ion channels open to generate an action potential, which propagates along the cell and passes the signal on to its neighbors. This is, of course, a very fragmentary picture of the processes that regulate thought, emotion, mood, and memory, but it is the interactions among these individually simple neurons that add up to human thoughts and feelings.
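As a rough computational analogue of that signalling chain, here is a minimal leaky integrate-and-fire neuron in Python. The threshold and leak constants are arbitrary illustration values, not biological measurements.

```python
class LIFNeuron:
    """Minimal leaky integrate-and-fire neuron: inputs accumulate as a
    'membrane potential'; crossing the threshold fires a spike, loosely
    analogous to the action potential described above."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak  # fraction of potential retained each step

    def step(self, input_current: float) -> bool:
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0  # reset after the spike
            return True           # spike: signal passed across the synapse
        return False

neuron = LIFNeuron()
for t, current in enumerate([0.3, 0.4, 0.5, 0.1]):
    if neuron.step(current):
        print(f"spike at step {t}")  # -> spike at step 2
```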
Artificial synapses mimic the human neuron-synapse system. Hardware that mimics human neurons has made it possible to build artificial intelligence that reacts instantly to a situation. Previously, the one-to-one storage of inputs and memorized values imposed a systematic limit on the speed and variety of responses. Artificial synapses, which mimic the way a single neuron interacts with dozens of others, have not only overcome this problem but also driven further advances in deep learning. The result is hardware that resembles the human brain in both form and function.
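The contrast between one-to-one storage and synapse-like fan-out can be shown in a few lines: instead of one key retrieving one stored value, one "neuron" broadcasts its signal across many weighted connections at once. The connection names and weights here are arbitrary.

```python
# One-to-one storage: a single input retrieves a single stored value.
lookup = {"input_a": "response_a"}
print(lookup["input_a"])  # one input, one response

# Synapse-like fan-out: one neuron's output reaches many downstream
# neurons at once, each connection scaled by its own weight.
weights = {"neuron_1": 0.8, "neuron_2": 0.3, "neuron_3": 0.5}

def broadcast(signal: float) -> dict:
    return {target: signal * w for target, w in weights.items()}

print(broadcast(1.0))
# -> {'neuron_1': 0.8, 'neuron_2': 0.3, 'neuron_3': 0.5}
```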
The development of artificial synapses has in turn advanced deep learning. Deep learning is a technique for clustering or classifying objects and data, and at its core is classification through pattern analysis. The amount of information an AI takes in is huge, and analyzing and categorizing that much information requires more efficient hardware than traditional designs. This is where artificial neural networks, which evolved from artificial synapses, come in. Artificial neural networks harness dozens of CPUs (central processing units) and GPUs (graphics processing units) for fast, efficient categorization. While CPUs process instructions quickly, GPUs allow massive parallel processing, so astronomical amounts of data can be categorized in a short time. An AI that stores data on the situations that trigger emotions, categorizes them, and emits appropriate emotional expressions when faced with new situations is near at hand. Classifying and predicting through pattern analysis across a network of many processors is structurally and functionally very similar to the interactions between individual neurons in our brains, and between the limbic system and the cerebral cortex. In fact, in March, French and American scientists announced that they had developed a “memristor,” a solid-state artificial synapse for artificial brains that can learn on its own and even mimic the plasticity of human synapses.
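As a toy version of classification through pattern analysis, the sketch below uses a single-layer network in Python/NumPy to map a vector of situation features to emotion scores. The features, weights, and labels are all invented; a real deep network would have many layers and learn its weights from data.

```python
import numpy as np

# Situation features: [threat, loss, reward] (invented for illustration)
features = np.array([0.9, 0.1, 0.0])

# One weight row per emotion; a trained network would learn these values.
labels = ["fear", "sadness", "joy"]
W = np.array([
    [1.0, 0.2, 0.0],   # fear responds mainly to threat
    [0.1, 1.0, 0.0],   # sadness responds mainly to loss
    [0.0, 0.0, 1.0],   # joy responds mainly to reward
])

scores = W @ features                           # one matrix product: parallel-friendly
probs = np.exp(scores) / np.exp(scores).sum()   # softmax over emotions
print(labels[int(np.argmax(probs))])            # -> "fear"
```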
The counterargument is that human synapses do not just relay signals; they also regulate the strength of the signals they transmit, making their behavior highly variable and hard for simple hardware to match. In other words, can we really replace human biology with mechanical components? The memristor mentioned above suggests we can. A memristor uses a dielectric whose resistance changes with the voltage history applied across it, which gives it plasticity. Advances like this allow the development of hardware that has never existed before. It may not be identical to the human body down to its components, but it can be structurally and functionally equivalent.
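A memristor's defining property, that its effective resistance depends on the history of voltages applied to it, can be caricatured in a few lines of Python. The update rule and constants below are an illustrative abstraction, not a model of the actual solid-state device.

```python
class MemristorSynapse:
    """Toy synapse whose 'conductance' (signal strength) drifts with the
    history of voltage pulses applied across it -- a crude stand-in for
    synaptic plasticity."""

    def __init__(self, conductance=0.5, rate=0.1):
        self.conductance = conductance
        self.rate = rate

    def apply_pulse(self, voltage: float):
        # Positive pulses strengthen the connection, negative ones weaken
        # it, clipped to a physical range.
        self.conductance += self.rate * voltage
        self.conductance = min(1.0, max(0.0, self.conductance))

    def transmit(self, signal: float) -> float:
        return signal * self.conductance

syn = MemristorSynapse()
for v in [1.0, 1.0, -0.5]:   # a history of strengthening/weakening pulses
    syn.apply_pulse(v)
print(syn.transmit(1.0))     # -> about 0.65: strength reflects the history
```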
Emotions are created and expressed by drawing on existing data laid down by evolution and memory, much as a computer retrieves stored data. Meanwhile, artificial synapses that mimic human synapses have been developed, advancing deep learning techniques, and deep learning makes it possible to categorize situations quickly and trigger the appropriate response instantly. Combining these three points, we can conclude that AI has structural and functional units similar to the human brain's and can react to situations quickly and accurately, using vast amounts of data, to produce emotions. And while we have focused on emotions, the other right-brain functions, non-verbal, spatial, and intuitive, are formed through the same kind of process. Since all of these functions arise from interactions between synapses in the brain, the conclusion that AI can have emotions extends to the conclusion that AI can fulfill the roles of the right brain.
We have discussed how emotions, the most representative of the right brain's roles, were shaped by evolution and manifest in different situations; what artificial and human synapses have in common; and how artificial neural networks built on artificial synapses can categorize situations quickly enough to respond immediately. To summarize: emotions that arose for survival and reproduction remain in the mammalian brain, allowing us to learn which emotions are triggered in which situations, and this learning is also possible for AI. The development of artificial synapses that mimic those in the human brain has driven advances in deep learning, giving AI the ability to categorize vast amounts of data quickly and respond to situations immediately. Therefore, I maintain that AI can take on the role of the right brain.
The conclusion that AI can fulfill the role of the right brain is significant. If an artificial right brain is developed to assist or replace the human one, the most promising societal impact would be solutions for patients suffering from right-brain disorders. ADHD (attention deficit hyperactivity disorder), whose prevalence has been rising in recent years, is often described as a right-brain disorder, and AI could help alleviate its symptoms by compensating for deteriorated right-brain function, whether by having an AI that plays the role of a normal right brain teach the patient, or by implanting a chip carrying such an AI into the right brain. This could also open the door to treating or alleviating autism and non-verbal learning disabilities. In such cases, however, we would need to clarify the boundary between human and AI identity. In severe cases, where much of the brain is replaced with an AI brain, there will be an ongoing debate about whether the patient should be considered human or cyborg. And if an AI emerges that has both left- and right-brain functions and a body of its own, rather than merely assisting humans, it will be debated whether it constitutes a new species.

 

About the author

Blogger

I'm a blog writer. I like to write things that touch people's hearts. I want everyone who visits my blog to find happiness through my writing.
