What is arousal? According to the APA dictionary of psychology:
1. a state of physiological activation or cortical responsiveness, associated with sensory stimulation and activation of fibers from the reticular activating system.
2. a state of excitement or energy expenditure linked to an emotion. Usually, arousal is closely related to a person’s appraisal of the significance of an event or to the physical intensity of a stimulus. Arousal can either facilitate or debilitate performance. See also catastrophe theory. —arouse vb.
The key component for arousal is the reticular activating system (RAS), which is responsible for alertness and focus in mammals. This is another feature of brain activity responsible for real-time adaptation to the environment. For a machine meant to be relatable to people, something RAS-like is also critical. Imagine how much more empathetic or anthropomorphic a machine becomes when it conveys something that all humans experience: feeling sleepy and tired, or feeling very active and full of energy!
Mimicking the RAS involves signalling or processing that captures things such as battery level, time of day, and feelings of exhaustion. These signals have to be integrated into the machine's information processing in such a way that they affect its choices and its interpretation of both external information and internal states.
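As a minimal sketch of that integration, the snippet below blends a few such signals into a single arousal value. The signal names, weights, and the blending formula are all illustrative assumptions, not part of the OODM framework:

```python
from dataclasses import dataclass

@dataclass
class ArousalSignals:
    """Hypothetical internal signals feeding an RAS-like mechanism."""
    battery_level: float   # 0.0 (empty) .. 1.0 (full)
    hour_of_day: int       # 0 .. 23
    activity_load: float   # 0.0 (rested) .. 1.0 (exhausted)

def arousal_level(s: ArousalSignals) -> float:
    """Blend the signals into one 0..1 arousal value.

    Daytime hours raise arousal; a low battery and heavy recent
    activity ("exhaustion") lower it. The weights are invented
    for illustration only.
    """
    daytime = 1.0 if 7 <= s.hour_of_day <= 22 else 0.3
    level = (0.5 * s.battery_level
             + 0.3 * daytime
             + 0.2 * (1.0 - s.activity_load))
    return max(0.0, min(1.0, level))
```

Downstream decision-making could then read this one value instead of each raw signal, the same way the RAS modulates alertness globally rather than per stimulus.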
Emotions are an enigma, and no one has really set in stone what they are. Certain chemical signatures, such as oxytocin, noradrenaline, endorphins, dopamine, vasopressin, and serotonin, are associated with emotions, but fundamental questions about emotions remain unanswered. For one, an emotion doesn't come about until after a stimulus is processed and interpreted, yet those very same chemical signatures exist in animals as well; since humanity evolved from ape-like animals, that process presumably exists in apes and other mammals too. Emotions are a matter of interpretation, and other animals probably do not interpret such neural signaling the way a human does. It gets more complicated still, since cultural influence and personal experience can also affect the interpretation of the neural signaling. So emotional experiences are not the same from person to person; there are differences, yet we all believe or argue that emotional interpretation is universal to all humanity. To understand emotions, we need to monitor neural activity on a connection-by-connection basis.
One method of doing so is optogenetics. This process is very exotic and surprisingly effective. Optogenetics involves genetically modifying an animal so that its neurons not only release the chemical transmitters they need to function but also emit light as they fire! With that in place, one can build interfaces of fiber-optic probes into an animal's brain and listen to the neural activity. Not only that, but the neurons can be affected by light emitted by the probes as well. This will lead to a much more detailed understanding of the neural code of brains, and with it an understanding of emotions.
However, psychologists have done much work on the concept of emotions, including models that organize emotions into three tiers. There has also been work on how humans feel emotions; one such study demonstrated an almost universal body map of where we feel them. With that said, can we model emotions?
Plutchik’s wheel of emotions gives us a concept of tiered emotions: emotions have core origins, starting with 8 primary emotions that then extend to secondary and tertiary emotions. However, others have argued that Plutchik’s wheel doesn’t capture all human emotions. The tiered model of Parrott, building on Shaver et al., is the one I decided to use.
Using the OODM Descriptor model we can actually model emotions! Because of OODM's inheritance ability, the relationships between the tiers of emotions, and how each tier is derived from the one above it, can be described.
The image above shows how emotions can be described with Descriptors. Note that only MicroDescriptors are used, and notice that inheritance is used as well.
The entire Emotions Wheel is structured into classes that all inherit from a base class, Emotion. The base class contains the common descriptors used for all emotions: the MicroDescriptors facial expression, secondary, none (I will explain this later), and bodymap. These ProtoVector SubTypes have attributes whose data are actually vectors.
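The inheritance structure described above can be sketched in plain Python classes. The class names, attributes, and the tier chain below are illustrative stand-ins for the OODM Descriptors, not the actual data in the viewer:

```python
class Emotion:
    """Base class: the descriptors shared by every emotion.

    The attribute names mirror the MicroDescriptors described in the
    article; their contents here are placeholder vectors.
    """
    def __init__(self, facial_expression=None, arousal=None, bodymap=None):
        self.facial_expression = facial_expression or []  # facial-muscle activations
        self.arousal = arousal    # stays None until an arousal state is set
        self.bodymap = bodymap or []                      # body-map vector

class Joy(Emotion):
    """A primary emotion: inherits the common descriptors."""

class Contentment(Joy):
    """A secondary emotion derived from Joy; the class chain mirrors the tiers."""

# The inheritance chain recovers the emotion's hierarchy:
tiers = [c.__name__ for c in Contentment.__mro__[:-1]]
# tiers == ["Contentment", "Joy", "Emotion"]
```

Walking the class ancestry gives back the tier structure for free, which is the same benefit OODM's inheritance provides: the derivation of a tertiary emotion from its secondary and primary parents is encoded once, in the class relationships.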
The base class Emotion contains the basic three MicroDescriptors for all emotions, as shown in the image above.
The MicroDescriptor “none” (do note that all MicroDescriptors are listed by their SubClass Types in this viewer) is actually the arousal Group ProtoVector shown in the image above. This group, along with the emotions, was created to model emotions, and the arousal state can be set to any of the listed subclass types. Each subclass type has vectors associated with it, so the state “none” means no vector state has been set yet.
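A small sketch of that idea: map each arousal subclass type to its vector, with “none” marking the unset state. The subtype names and vector values below are invented for illustration, not the actual arousal Group data:

```python
# Hypothetical arousal Group: each subclass type carries an associated
# vector; "none" means no vector state has been set yet.
AROUSAL_SUBTYPES = {
    "none":   None,               # unset state
    "sleepy": [0.1, 0.2],         # illustrative (arousal, valence) values
    "calm":   [0.3, 0.6],
    "alert":  [0.8, 0.7],
}

def arousal_vector(state: str):
    """Return the vector for an arousal subtype, or None for 'none'."""
    if state not in AROUSAL_SUBTYPES:
        raise ValueError(f"unknown arousal subtype: {state}")
    return AROUSAL_SUBTYPES[state]
```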
Once all emotions have been described, we can build charts of the data as shown in the image above. The list of all emotions is on the left-hand side. Select an emotion and the charts will describe its hierarchy, including any higher tiers the emotion derives from. If you select one of those higher tiers on the right-hand side, the charts below will describe the chemical-signature vectors, the Arousal and Valence vectors, and the facial-muscle activations associated with the emotion, as well as the body map of where humans feel it. You'll also notice sliders on the lower left-hand side of the panel, where the vectors can be adjusted.
OODM proved to be adequate for modeling, or describing, emotions, so that they are more than a word or a state: they become a set of concepts that give the emotion meaning. The task of interpreting a stimulus into an emotion would be handled by a separate algorithm, which could very well be a neural network!
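Even before reaching for a neural network, a crude version of that interpretation step can be sketched as a nearest-match lookup against the emotion vectors. Everything here is hypothetical: the emotion signatures are made-up numbers, and a real system would learn this mapping rather than hard-code it:

```python
import math

# Hypothetical chemical-signature vectors (e.g. dopamine, serotonin,
# noradrenaline levels); the values are invented for illustration.
SIGNATURES = {
    "joy":   [0.9, 0.8, 0.3],
    "fear":  [0.2, 0.1, 0.9],
    "anger": [0.4, 0.1, 0.8],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def interpret(stimulus_vector):
    """Map a stimulus-derived vector onto the closest emotion signature."""
    return max(SIGNATURES, key=lambda name: cosine(SIGNATURES[name], stimulus_vector))
```

A neural network would replace `interpret` with a learned function, but the contract stays the same: stimulus features in, one of the described emotions out, with the OODM Descriptors then supplying that emotion's full meaning (body map, facial expression, arousal, and valence).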