Just guessing, but I suspect Aaron is more interested in the actual cause and construction of emotion within a system (whether human or AI). That is a topic rarely touched, and a bit sensitive.
My two cents, from back in the 1980s:
Despite the films depicting an “emotion chip” that makes an android suddenly feel love, anger, depression, and so on, emotions do not come about by programming them in directly.
Emotion within a complex system is the result of overtaxing the system’s decision-making resources.
In a complex intelligent system, multitasking is the norm. Multitasking requires prioritizing efforts and resources, and choosing the best balance, regardless of the intelligence involved, is always a guess: a probability calculation.
In ye olde standard CPU, tasks are controlled by priority interrupt signals. Applications are not allowed to argue over which tasks get more CPU time. But if they were allowed to enter the realm of “I calculate that my task is more relevant than yours”, emotions would “naturally” emerge.
Emotions are the result of a type of competition between subtask priorities.
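That competition can be made concrete with a small sketch. The following toy Python model is my own hypothetical illustration of the idea, not any real scheduler API: each subtask has only its own local, noisy estimate of urgency, the subtasks bid against one another for priority, and a “contention” score measures how closely the losing bids pressed the winner. In this model, a near-tie, an unresolved competition, is the analogue of felt emotion.

```python
import random

class Subtask:
    """A subtask with only local, limited information about its own urgency."""
    def __init__(self, name, urgency):
        self.name = name
        self.urgency = urgency  # its private estimate of its own importance

    def bid(self):
        # Each bid is a guess: local urgency plus noise, since no
        # subtask can see the whole system's situation.
        return self.urgency + random.uniform(-0.2, 0.2)

def arbitrate(subtasks):
    """Let subtasks compete for priority.

    Returns the winning subtask and a contention score in [0, 1]:
    the ratio of the runner-up's bid to the winner's bid. Near 1.0
    means the competition was barely resolved ("emotion" in this
    toy model); near 0.0 means uncontested action.
    """
    bids = sorted(((task.bid(), task) for task in subtasks),
                  key=lambda pair: pair[0])
    runner_up_bid = bids[-2][0]
    top_bid, winner = bids[-1]
    contention = runner_up_bid / top_bid if top_bid > 0 else 0.0
    return winner, contention

random.seed(0)  # fixed seed so the run is repeatable
tasks = [Subtask("eat", 0.6), Subtask("flee", 0.65), Subtask("rest", 0.2)]
winner, contention = arbitrate(tasks)
print(winner.name, round(contention, 2))
```

With “eat” and “flee” so closely matched, contention comes out near 1.0; pit one urgent subtask against trivial ones and it drops sharply, which is the sketch’s version of acting without feeling.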
The operation of a complex mind is very similar to the operation of the US Congress (even with all of its flaws). What is called “emotion” in a mind is what is called “activism” in politics. As representatives of varied activist groups debate which version of which bill shall be passed to the Senate, they are choosing which “motion” shall emerge. When they pass the final House of Representatives version of a bill, they are urging the Senate to take an action. They are “emoting” the mind of government.
Across the world, the USA is known for being schizoid, because different political factions gain control (different motivations and system-emotions) and alter the attitudes the USA takes in foreign affairs.
The intelligence issue is simply that each subtask concern has limited, task-specific information to use when vying for priority attention. In Congress, those in favor of one way of doing things have a different set of information than those who prefer a different way. It is not merely an issue of being biased or “self-valuing”.
It is an issue of being intellectually limited and yet still responsible for competitively getting the task done.
And that is why the more intelligent and knowledgeable people are concerning how the world works, the less prone to emotion they are. When people can see why things must work this way or that, they lose the urge (the emotion) to attempt futile effort and are left with internal agreement as to the only sensible course of action. They become more rational. But the trick is that the inner mind must see how things work, not merely the conscious mind (the activist groups and representatives, not merely the Senate).
When the entire mind agrees on an action, emotion is not felt but merely put into action (the bill becomes law without contest). Only if asked does the conscious mind, through reflection, later deduce why it is doing what it is doing. For emotion to be felt, it must be opposed in some manner, perhaps merely by contest with the person’s physical situation. The truly “holy man” feels no emotion, for there is no competition within him, merely a continuum of resolved choices.
Fractured minds (the norm), usually due to medical/physiological corruption of the brain, are more susceptible to emotional swings and sways. Which emotion set is most active depends upon which faction of the mind has gained priority. Often this is seen as a personality shift: “bipolar”, “schizoid”, or even “schizophrenic”, depending on finer nuances. As discussed in another thread, such shifting is caused by the corruption and limitation of memory-vector associations and established pathways.
AI systems are subject to all of these same concerns.
Animals developed the condition of being emotional because their inherent task of dealing with nature in an effort to survive required more than their brains could handle. Stress leading to emotional swings in people is an obvious example of this effect; simplicity of life reduces the symptom. If you want to see emotion from an AI, simply give it an extremely complex, changing, and challenging task, such as “survive at all costs”, along with an ability to compete internally for subtask priority, a structure similar to the US Congress. And then stand back … far back. And don’t do it again.