The most shocking truth of human reality is that we are a form of biology, and as such we are driven by the physics that makes the chemistry happen. What can and does make us unpredictable in certain ways is how that chemistry can be altered; we’re talking about influences as subtle as thermal currents in a brain cell! Once we realize that the qualia of our experiences relate to our emotional reactions, we see how important this feature of our brains is. Giving a machine something similar, so that it makes decisions based on the emotional reactions it anticipates or has experienced, might give us pause about creating such a machine at all. After all, who wants a car that doesn’t feel like driving you to work on a particular morning?
So what’s the point of building an emotional machine? It goes to the heart of our objective: to build a machine that can relate to human beings. To relate to humans, the machine should have emotions, or at least mimic the signal patterns that correspond to emotions as we interact with the environment. The current software tools and applications that try to recognize human facial expressions and associate them with words that identify or describe emotions cannot relate to humans! They are no different from your PC, where you type on your keyboard and it responds, according to its programming, with a specific response. Some think that is all machines need to do, and that humans will then anthropomorphize how the machine responds. But such strategies quickly fall into the uncanny valley as the responses become very repetitive and, many times, incoherent.
Ultimately, emotional machines will be much more relatable to people, perhaps to a degree that is not comfortable: a flip of the uncanny valley, if you will, where the machine causes an adverse reaction precisely because it is so humanlike. On the other hand, they could be integrated into families or be guardians of the elderly, providing emotional support much as pets do. In another context, because most people live in cities, at least in first-world countries, human intimacy is problematic. Just look at the dating sites, where it is widely known that many of them use chatbots and employees to respond to lonely men. An emotional A.I. just might be the godsend that alleviates the anxiety and depression of loneliness…
Emotions are an enigma; no one has really set in stone what emotions are. There are certain chemical signatures, such as oxytocin, noradrenaline, endorphins, dopamine, vasopressin, and serotonin, that are associated with emotions, but fundamental questions about emotions remain unanswered. For one, an emotion doesn’t come about until after a stimulus is processed and interpreted, yet those very same chemical signatures exist in animals as well; since humanity evolved from ape-like animals, it would seem that the process exists in apes as well as in other mammals. Emotions are a matter of interpretation, and other animals probably do not interpret such neural signaling the way a human does. It gets more complicated still, since cultural influence and personal experience can also affect the interpretation of the neural signaling. So emotional experiences are not the same from person to person; there are differences, yet we all believe, or argue, that emotional interpretations are universal to all of humanity. To understand emotions we need to monitor neural activity on a connection-by-connection basis.
One method of doing so is optogenetics. The process is exotic and surprisingly effective. Optogenetics involves modifying the genetics of an animal so that its neurons not only output the chemical transmitters they need to function but also emit light as they fire! With that, one can build interfaces of fiber-optic probes into an animal’s brain and listen in on the neural activity. Not only that, but the neurons can be stimulated by light emitted from the probes as well. This will lead to a much more detailed understanding of the neural code of brains, and with it an understanding of emotions.
However, psychologists have done much work on the concept of emotions, including models that organize emotions into three tiers. There has also been work on how humans feel emotions; one such study demonstrated an almost universal body map of where we feel them. With that said, can we model emotions?
Plutchik’s wheel of emotions gives us a concept of tiered emotions: emotions have core origins, starting with 8 primary emotions that then extend into secondary and tertiary emotions. Others have argued, however, that Plutchik’s wheel doesn’t capture all human emotions. The model of Parrott, Shaver et al. is the one I decided to use.
Using the OODM Descriptor model we can actually model emotions! Because of OODM’s inheritance ability, the relationships between the tiers of emotions, and how each tier is derived, can be described.
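To give a rough feel for what that inheritance buys us, here is a minimal sketch. The post doesn’t show OODM’s actual Descriptor API, so the class names and layout below are my assumptions, using the Parrott/Shaver tiers as an example.

```python
# Hypothetical sketch only: OODM's real Descriptor API isn't shown here, so
# these class names and this layout are illustrative assumptions. The point
# is that inheritance itself encodes how each tier derives from the one above.

class Emotion:               # the "Base Class Emotion" holding common descriptors
    pass

class Love(Emotion):         # a primary emotion (Parrott/Shaver tier 1)
    pass

class Affection(Love):       # a secondary emotion derived from a primary
    pass

class Adoration(Affection):  # a tertiary emotion derived from a secondary
    pass

# Tier relationships fall out of the class hierarchy for free:
assert issubclass(Adoration, Love)
```

Every tertiary emotion is automatically a kind of its primary ancestor, which is exactly the derivation relationship the wheel expresses.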
The image above shows how emotions can be described with Descriptors. Note that only MicroDescriptors are used, and notice that inheritance is used as well.
The entire Emotions Wheel is structured into classes that all inherit from a “Base Class Emotion”. The base class contains the descriptors common to all emotions. The MicroDescriptors facial expression, secondary, none (I will explain this later), and bodymap are ProtoVector SubTypes whose attributes hold data that are actually vectors.
The Base Class Emotion contains the three basic MicroDescriptors for all emotions, as shown in the image above.
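In plain code, the base class’s vector-valued descriptors might look something like the sketch below. The field names and contents are assumptions; the post only tells us that the data behind these ProtoVector SubTypes are vectors.

```python
# Hedged sketch of the base class's MicroDescriptors. Field names and vector
# lengths are assumptions; the source only says their data are vectors.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class EmotionDescriptors:
    facial_expression: List[float] = field(default_factory=list)  # muscle activations
    bodymap: List[float] = field(default_factory=list)            # where the emotion is felt
    arousal: Optional[List[float]] = None  # the "none" MicroDescriptor: unset until described
```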
The MicroDescriptor “none” (do note that all MicroDescriptors are listed by their SubClass Types in this viewer) is actually the arousal Group ProtoVector shown in the image above. This group, along with the emotions, was created to model emotions, and the arousal state can be set to any of the listed subclass types. Each subclass type has vectors associated with it, so the state “none” means that no vector state has been set yet.
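One way to picture the arousal Group and its subclass types is the enum-style sketch below. The subtype names and vectors are invented for illustration; the viewer lists the actual subtypes, which aren’t named here.

```python
# Illustrative guess at the arousal Group ProtoVector: each subclass type
# carries an associated vector, and NONE means no vector state is set yet.
from enum import Enum

class ArousalSubType(Enum):
    NONE = None            # no vector state set yet
    CALM = (0.1, 0.2)      # invented subtype names and vectors
    ALERT = (0.6, 0.5)
    EXCITED = (0.9, 0.8)

state = ArousalSubType.NONE      # every emotion starts undescribed
state = ArousalSubType.EXCITED   # later set to any of the listed subclass types
print(state.value)               # -> the vector associated with that subtype
```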
Once all emotions have been described, we can build charts of the data, as shown in the image above. The list of all emotions is on the left-hand side. Select an emotion and the charts will describe its hierarchy, including any higher tiers the emotion is represented by. If you select one of those higher tiers on the right-hand side, the charts below describe the chemical-signature vectors, the Arousal and Valence vectors, and the facial-muscle activations associated with the emotion, as well as the body map of where humans feel it. You’ll also notice sliders on the lower left-hand side of the panel where the vectors can be adjusted.
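Continuing the sketches above, the panel’s behavior reduces to something like this in code; all of the values are made up, and a slider adjustment is just a vector write.

```python
# Continuing the earlier sketches: select an emotion, walk its tier hierarchy
# through the class hierarchy, and nudge a vector the way the sliders would.
adoration = EmotionDescriptors(
    facial_expression=[0.2, 0.7, 0.1],  # made-up activation values
    bodymap=[0.5, 0.5, 0.9],
)
tiers = [c.__name__ for c in Adoration.__mro__ if issubclass(c, Emotion)]
print(tiers)                 # -> ['Adoration', 'Affection', 'Love', 'Emotion']
adoration.bodymap[2] = 0.95  # the code equivalent of dragging one slider
```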
OODM proved adequate for modeling, or describing, emotions, so that an emotion is more than a word or a state: it is a set of concepts that give it meaning. Interpreting stimuli into an emotion would be handled by a separate algorithm, which could very well be a neural network!
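To make that last point concrete, here is a minimal sketch of what the separate interpretation step could look like. PyTorch, the layer sizes, and the feature counts are all my assumptions, not anything the model prescribes.

```python
# A small network mapping processed stimulus features onto emotion weights.
# Framework, dimensions, and feature meanings are assumptions for illustration.
import torch
import torch.nn as nn

interpreter = nn.Sequential(
    nn.Linear(64, 32),   # 64 stimulus features, e.g. facial-muscle activations
    nn.ReLU(),
    nn.Linear(32, 8),    # one output per primary emotion (8 matches Plutchik;
                         # swap in 6 for Parrott/Shaver's primary tier)
    nn.Softmax(dim=-1),  # interpretation as a distribution over emotions
)

stimulus = torch.rand(1, 64)             # stand-in for a processed stimulus
emotion_weights = interpreter(stimulus)  # -> weights over the primary emotions
```

The network’s output could then index into the Descriptor model above, so that interpretation and description stay cleanly separated.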