
Bartlett School of Architecture, UCL


On the Relationship Between Affective Wearable and Social Perception

Computers have advantaged abilities for processing patterns; however, humans are superior in interpreting meaning in patterns. (RW Picard, 2000)

An affective wearable is a portable device that can recognise its wearer's affective patterns. Our EMO prototype is an affective wearable that recognises the wearer's external and internal physiological signals (valence and arousal) through facial expression and galvanic skin response, in order to explore human affective feelings. Beyond that, I imagine extending it to develop healthy social relationships, not only in human-human but also in human-computer interaction.

 

1. Affective Wearable 

Emotions are an intriguing topic in both psychology and neuroscience. Some assume that the subject of emotion produces so many different theories because it lacks testable hypotheses; still, many experts across fields keep exploring this appealing topic. Scientists have been trying to bring "emotion", or "affect", into computer science under the name "affective computing" (RW Picard, 1995): computing that relates to, or influences, emotions. The idea of bringing emotion, which some consider the opposite of rational thought, into the computer may seem unwise. However, building machines that recognise affect is practical: affective computers can advance not only art and entertainment but also the field of human health interaction.

Recognising physiological patterns is not new; many products already monitor heart rate, blood pressure, and so on, especially in medical applications that sense anxiety or stress and offer an easy way of monitoring daily activity, not only for those suffering anxiety attacks but also for healthy people who want to understand their physiological condition better. Combining these pattern-recognition tools with the outdoor environment and entertainment, however, affective wearables are a new approach, one that tries to make wearers more conscious of and interactive with their surroundings.


Figure 1: Affective jewellery with PIC chips on iRx boards, by Jennifer Healey and Grant Gould, which can transmit data wirelessly by infrared to a larger computer for analysis. (R. W. Picard et al., 1997)


There are many arguments for and against computers understanding human emotional states, but the most direct reason in favour is that we want to understand ourselves better, and that no one can read another's mind (yet). Sometimes we subconsciously ignore what is happening around us, yet it still affects our feelings. Imagine walking down a busy street full of traffic noise: our brain may have predicted that these annoying sounds would appear in the first place, but our inner body state did not. That would explain why we can mentally adapt to and ignore the sound while our body still sweats and our heart rate still rises. In such a case, we might want a computer to tell us how our body is actually reacting and remind us to relax. Even though we cannot measure the cognitive state of an emotion, we can at least measure its physical responses, such as facial expression, tone of speech, or heart rate.

Some may argue that the hardest problem is that different individuals can respond to the same emotional state with different physiological responses. Even so, it is fair to expect wearables, or pattern-recognition tools in particular, to understand the wearer's feeling states. All of these potential applications are worth exploring, and all require the recognition of "affect". The following section discusses four categories of affective computing, focusing on the ability of computers to express and perceive affect.

 

Figure 2: Prototype wearables experimenting with how biosensors can relate to sensory feeling, beyond simply reading facial expressions or heart rate.

 

Experiment 1:

Figure 3: A video of our experiment: wearing the wearable in the streets to show how people are affected by the environment.


1-1 Computers That Can Express or Perceive Affect

Figure 3: Four categories of affective computing along two dimensions: perceiving and expressing affect. (R. W. Picard, 1995)


Take a look at the diagram above, from Rosalind Picard's Affective Computing (1997). It presents four categories based on whether computers can perceive and/or express human affect. Today, almost all computers sit in category A, knowing nothing of human affective feelings; these computers are neither friendly nor personal. The idea of developing a machine in category B is to create a comfortable user experience: adding natural intonation to its voice, or an impression of happiness, can give users momentary pleasure while interacting with it. Such machines are easily found in entertainment, where they help people reduce stress; they are more like companion devices, such as a robot dog that makes sweet puppy sounds or dances around its user. Category C, however, is the main focus of this discussion: computers that can perceive users' affective feelings and advise them. This category also reduces the fears of those who do not trust the idea of emotional computers. These machines are more like assistants or teachers that give users better instruction when they sense negative affect, such as the affective wearable mentioned earlier that monitors the wearer's inner state. The final category holds the most user-friendly computers, which can communicate with humans efficiently in both directions. Interestingly, the Marvel Iron Man franchise gives a lively image of such an intelligent computer: J.A.R.V.I.S., the loyal artificial-intelligence butler that has assisted the Stark household, and especially the superhero Iron Man, for years. This computer not only perceives its owner's affect but is also set up to understand humour and communicate with its owner easily; in later series it is even improved to have feelings and love. However, the idea of making computers able to express themselves remains controversial.


2. Wearable Design

“Design is in everything we make, but it’s also between those things. It’s a mix of craft, science, storytelling, propaganda, and philosophy.” -Erik Adigard

Because access to our minds is limited, computers are outside observers with only a few ways to recognise human affect; we can only expect affective computers to recognise emotions as a third-person observer would, through signals such as facial expression, the most easily controlled of all the expressions (R. W. Picard, 2000). A recognised facial-expression pattern does not always represent a person's emotion, which involves interpretation and a cognitive state. Still, facial expressions are evident, and they remain the best way to observe how people try to communicate their feelings.

In our wearable design, we want to explore human affective feelings. Most people have difficulty recognising their feelings, let alone articulating them. Our wearable, EMO, is a device of category C from the previous chapter: an assistant device that perceives the wearer's affect. It is supported by two biosensors, a galvanic skin response sensor and a muscle sensor, attached to the neck and cheek to capture the two dimensions of human affect: arousal and valence.

Figure 4: A circumplex that maps 28 affective feeling states onto two dimensions. (James A. Russell, 1980)


The figure above shows the concept from James Russell (1980) of how we map affective feeling states onto the wearable. Galvanic skin response relates to arousal, ranging from high to low, excited to sleepy. For valence, we settled on the two facial muscles most related to positive and negative feelings: the corrugator supercilii (mostly related to negative valence) and the zygomaticus major (mostly related to positive valence). After cross-matching the signals (figure 3), we can map the wearer's affective feelings.
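As a thought experiment, the mapping can be sketched in a few lines of code. This is a minimal illustration assuming sensor readings already normalised to 0..1; the function name and scaling are hypothetical, not the actual EMO firmware.

```python
# Minimal sketch: mapping GSR and two facial-muscle sensors onto Russell's
# valence-arousal plane. Names, ranges, and scaling are illustrative
# assumptions, not the EMO implementation.

def read_affect(gsr, zygomaticus, corrugator):
    """Map normalised (0..1) sensor readings to a (valence, arousal) pair.

    gsr         -- galvanic skin response, a proxy for arousal
    zygomaticus -- zygomaticus major tension, a proxy for positive valence
    corrugator  -- corrugator supercilii tension, a proxy for negative valence
    """
    arousal = 2.0 * gsr - 1.0           # rescale 0..1 to -1..+1
    valence = zygomaticus - corrugator  # positive muscle minus negative muscle
    clamp = lambda x: max(-1.0, min(1.0, x))
    return clamp(valence), clamp(arousal)

# Example: a relaxed smile -- low GSR, active zygomaticus, idle corrugator
print(read_affect(gsr=0.3, zygomaticus=0.8, corrugator=0.1))  # approx. (0.7, -0.4)
```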


Figure 5: The two facial muscles most related to human feelings (after Henry Gray, 2011), and a diagram showing where each sensor is placed and the relationship between the two dimensions of affect.

 

Each sensor outputs a signal range that represents muscle tension or electrodermal value. Based on James Russell's affect model, I categorised seven emotions and scaled them against the GSR and muscle-sensor values. Referring to the American psychologist Paul Ekman's six basic emotions (anger, disgust, fear, happiness, sadness, and surprise), I replaced surprise with excitement and added another positive emotion, "delightful", to the category. (Surprise is more often a positive feeling, yet it is easily detected by the negative corrugator supercilii muscle.)
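To make the categorisation concrete, one plausible sketch is a nearest-centroid lookup on the valence-arousal plane; the centroid coordinates below are illustrative guesses, not calibrated EMO values.

```python
import math

# Sketch: label the seven categories by the nearest centroid on the
# valence-arousal plane. Coordinates are illustrative assumptions.
CENTROIDS = {
    "anger":      (-0.7,  0.7),
    "fear":       (-0.5,  0.9),
    "disgust":    (-0.8,  0.3),
    "sadness":    (-0.6, -0.6),
    "happiness":  ( 0.8,  0.3),
    "excited":    ( 0.6,  0.9),
    "delightful": ( 0.7, -0.3),
}

def classify(valence, arousal):
    """Return the emotion whose centroid lies closest to the measured point."""
    return min(CENTROIDS, key=lambda e: math.dist((valence, arousal), CENTROIDS[e]))

# Feeding in the (0.7, -0.4) reading from the earlier sketch:
print(classify(0.7, -0.4))  # 'delightful'
```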

3. Emotion Theory

 

To research such a complicated topic properly, it is necessary to define emotion, since the lack of a definition of emotion, and of agreed basic emotions, is an obstacle to computers recognising and synthesising it. Many scientists have researched people's emotions, trying to find out whether each emotion has a distinct pattern: an emotion fingerprint (Barrett, 2017). The concept holds that each feeling has a unique pattern, such as a frowning face representing sadness, and that regardless of age, sex, or culture the pattern should appear the same. However, one emotion may pattern much like another; people sometimes cannot distinguish between feeling anxious and feeling depressed. In reality, very little of this work has been found applicable to computers communicating with human emotion; in particular, scientists still cannot tell which patterns are the best indicators of people's emotions. Computers have advantaged abilities for processing patterns; however, humans are superior in interpreting meaning in patterns (RW Picard, 2000). There are studies showing that the expression of feeling plays a role in generating and regulating emotion, and that a facial expression can elicit emotion in the person making it (Izard, 1990; Ekman, 1993). Still, there is no guarantee that a face recognised as sad matches any actual affective state of sadness; conversely, if a joyful person suppresses all facial expression, recognition will also fail. That is because affective states are internal and involve cognitive thoughts as well as physical and bodily modulation: face, speech, body posture, and more (RW Picard, 2000). The following paragraphs briefly describe how emotion is generated in a person's innate physiology.


Figure 6: An alternative theory of how emotions are created: a diagram of the psychological process of creating an emotion according to the James-Lange theory.

 

3-1 Constructing Emotion

The constructionist hypothesis combines three sources of stimulation to create emotion: 1. information we perceive from the external world (the exteroceptive senses of light and vibration); 2. changing internal sensations from our body (the interoceptive senses); 3. mental representations stimulated by our prior experience (memory or category knowledge) (Barrett, 2006). Our brain makes predictions just before any sensory input arrives (DJ Heeger, 2017). From the brain's perspective, it sits in a dark box seeking the meaning of every vibration and ray of light; the body is its other source of sensory input. Before any stimulus arrives from the external world, the brain makes its best prediction of what is about to happen outside, using prior experience as a guide: which past experience best matches the present event and all the similar sensory input around us? These simulations are then compared with the incoming sensory input to verify whether the prediction was correct; if it was, it becomes part of our past experience.
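The loop can be caricatured in a few lines of code. This is a toy numerical sketch of the predict-compare-correct cycle described above, an assumed simplification rather than a model of the brain.

```python
# Toy sketch of the prediction loop: issue a prediction from prior
# experience, compare it with the sensory input, and fold the correction
# back into prior experience. Purely illustrative numbers.

def prediction_loop(prior, sensory_inputs, learning_rate=0.5):
    for observed in sensory_inputs:
        prediction = prior                 # simulate before sensing
        error = observed - prediction      # prediction error
        prior += learning_rate * error     # the correction becomes experience
        print(f"predicted {prediction:+.2f}, saw {observed:+.2f}, error {error:+.2f}")
    return prior

prediction_loop(prior=0.0, sensory_inputs=[1.0, 1.0, 0.2])
```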

 

"In every waking moment, your brain uses past experience, organised as concepts, to guide your actions and give your sensations meaning. When the concepts involved are emotion concepts, your brain constructs instances of emotion." (Barrett, 2017)


Figure 7: The structure of the prediction loop, showing how our brain simulates everything before it becomes our experience. (Barrett, 2017)


4. Social Perception – Emotion and Social Relationship

Neuroscience mostly explains how the brain works inside a single human body, where affect is just an overall pleasant or unpleasant feeling representing changes of inner body state. Social perception, by contrast, is the general interpretive process by which individuals are affected by others' behaviour and nonverbal communication, such as facial expression, tone of speech, and body language. When we read another's facial expression, a smile, a frowning brow, our brain is simultaneously simulating several mental concepts to reason about those patterns.

As these concepts are glued together in the brain, they form categorisations: for example, seeing someone frowning, we predict an event that represents the prototype of sadness. This is the categorisation of emotional facial expressions, and it happens because our brain constructs a summary interpretation of sadness that matches the situation. The emotional information is in our perception.

People are easily affected by others' feelings and emotions, gathering information from their physical appearance; for example, people are more likely to smile back when they see others doing the same. Research has consistently indicated that people tend to align with, or mimic, the facial gestures, vocalisations, postures, and other body language of those they perceive (Dimberg, 2000; Niedenthal, 2001). People in more intimate relationships may even synchronise physiological patterns such as heart rate (Feldman, 2011), breathing rhythm (Creaven, 2014; Van Puyvelde, 2015), and hormone levels. If this were brought into our wearable design, it would be fun to see people wear EMO outdoors or to a close friend's party and watch all the wearables' colours synchronise by automatically mimicking each other's facial responses, as sketched below.
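A playful sketch of that synchronisation idea, hypothetical and not implemented in the EMO prototype: each wearable nudges its display hue toward the mean hue of nearby wearables, a crude stand-in for emotional contagion. (Hue wraparound at 360° is ignored for simplicity.)

```python
# Hypothetical group-synchronisation sketch: every wearable drifts a
# fraction of the way toward the group's mean hue at each step.

def synchronise(hues, rate=0.2, steps=5):
    """hues: one display hue (0..360) per wearer."""
    for _ in range(steps):
        mean = sum(hues) / len(hues)
        hues = [h + rate * (mean - h) for h in hues]
    return [round(h, 1) for h in hues]

print(synchronise([20.0, 120.0, 220.0]))  # all three hues drift toward 120
```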

 

4-1 Online Embodiment and Offline Embodiment

There are several theories about evoking emotion through embodiment. A classical study by Duclos (1989) shows that when a person takes up bodily positions associated with fear, sadness, or anger, he or she may experience affect modulated by those postures. I am not writing to endorse whether body postures can elicit emotion or not; one thing widely agreed upon is that embodiment is strongly associated with the processing of emotion, both when people respond to a direct emotional expression and when they interpret the emotional meaning of symbols. To distinguish the two: online embodiment is stimulated directly by the presence of external objects; offline embodiment is stimulated by symbols of things not actually present (Niedenthal et al., 2005). For example, mimicking another's happy face is online embodiment; reading a word such as "happiness", or recalling a joyful experience, which requires cognitive processing, is offline embodiment. Our wearable focuses in general on people's online embodiment; in future, however, it would be worth considering a cognitively driven affective wearable as well.

The beneficial influence of computers understanding emotions is better decision-making, learning, and behaviour systems; a more humanlike ability makes computer-human interaction more convenient. As the previous paragraphs suggested, though, the aid in developing computers that know human cognitive emotion is the process of designing simulations: the concepts of "good" and "bad" need to be given to the system, as do the basic emotions. More importantly, the computer must represent affective states and reason about emotions according to their eliciting cognitive conditions: how should it react to situations made up of different events? In The Cognitive Structure of Emotions (1988), Ortony, Clore, and Collins constructed a model (OCC) that generates 22 emotion types by cognitive appraisal of differently valenced reactions (positive and negative). If an event's result is beneficial, it is categorised with a positive value; if harmful, with a negative value. For example, to know whether a person is feeling joy, the model needs a representation combining reality, expectedness, and the event the person is appraising at the time.
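A minimal sketch in the spirit of the OCC appraisal step, assuming a toy goal-based agent; the goal names and weights are invented for illustration and fall far short of the full 22-type taxonomy.

```python
# Toy OCC-style appraisal: score an event against an agent's goals.
# A positive score stands in for a beneficial reaction (e.g. joy),
# a negative score for a harmful one (e.g. distress).

def appraise(event_outcomes, goals):
    """event_outcomes: {goal_name: achieved?}; goals: {goal_name: importance}."""
    score = 0.0
    for goal, importance in goals.items():
        if goal in event_outcomes:
            score += importance if event_outcomes[goal] else -importance
    return score

# A person who values passing an exam appraises a passing result:
print(appraise({"pass_exam": True}, {"pass_exam": 0.9, "stay_healthy": 0.5}))  # 0.9 -> joy
```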

Elliott (1992) later modified the OCC model to generate emotions within social relationships, in a system called the Affective Reasoner. It focuses on three kinds of social relationship: friendship, animosity, and empathy. By analysing how events and objects are interpreted against an individual agent's preferences, it can also address how that individual responds to an affective state. For example, suppose one agent failed a test while another got the highest score in the class: the top-scoring agent may synthesise emotions based on the other's presumed preferences, in an effort to imagine how the other agent would feel.
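Continuing the toy appraisal above, the relationship type can be sketched as a sign applied to the other agent's presumed feeling; again, this is purely illustrative, and the Affective Reasoner itself is far richer.

```python
# Sketch of socially situated emotion: appraise the event as the other
# agent would, then flip the sign under animosity. Uses appraise() from
# the previous sketch.

RELATIONSHIP_SIGN = {"friendship": +1, "empathy": +1, "animosity": -1}

def social_emotion(event_outcomes, other_goals, relationship):
    other_feeling = appraise(event_outcomes, other_goals)
    return RELATIONSHIP_SIGN[relationship] * other_feeling

# The top scorer considers a classmate who failed the test:
print(social_emotion({"pass_exam": False}, {"pass_exam": 0.9}, "friendship"))  # -0.9, sorry-for
print(social_emotion({"pass_exam": False}, {"pass_exam": 0.9}, "animosity"))   # +0.9, gloating
```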

By adding both cognitive and physical mechanisms to computers, we would get a more comprehensive system for understanding human emotions, one that could beneficially assist humans in many fields. In our wearable, for example, the recognition-mapping tools combined with an emotion-synthesis system could visualise a person's emotion in both its physical and its cognitive interpretation.


Figure 8: Ortony, A., Clore, G. and Collins, A. (1988). The Cognitive Structure of Emotions.


5. Summary

"The remarkable capacity to share others' affective states and empathise with others is the key characteristic of many of humanity's modern achievements." (Prochazkova and Kret, 2017)

An affective wearable could help people understand and recognise stress, and provide feedback. As a more entertaining way of presenting the data, we could choose whether to show others our mood through different colours on the wearable. If we were happy to share our affective state, we could build stronger social relationships and deal with our issues together with friends; it could even produce a group light-painting effect (Figure 10) that lets each other know how positive the feelings are. Nowadays more and more people value putting emotion into the field of artificial intelligence. It remains a critical question how intelligence should associate with social interactions, yet improving human-computer interaction is still a practical goal. Computers need not function the way human beings do, but it is essential to achieve a functional intelligence with sensitivity toward humans (RW Picard, 2000).


Bibliography

  1. PsycholoGenie (2018). A Fundamental Explanation of Social Perception With Examples. [online] Available at: https://psychologenie.com/explanation-of-social-perception-with-examples [Accessed 21 Sep. 2018].
  2. Barrett, L. (2017). How Emotions Are Made. UK: Macmillan, pp. 29-102.
  3. Barsalou, L. (1999). Perceptual Symbol Systems. Behavioral and Brain Sciences, 22, 577-660. [pdf] Available at: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.4.5511&rep=rep1&type=pdf [Accessed 5 Jul. 2018].
  4. Bassili, J. (1979). Emotion Recognition: The Role of Facial Movement and the Relative Importance of Upper and Lower Areas of the Face. Journal of Personality and Social Psychology, 37. [pdf] [Accessed 6 Jul. 2018].
  5. Bubic, A., von Cramon, D. and Schubotz, R. (2010). Prediction, Cognition and the Brain. [ebook] Available at: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2904053/ [Accessed 21 Sep. 2018].
  6. De Silva, L. and Nakatsu, R. (1997). Facial Emotion Recognition Using Multi-Modal Information. Singapore: IEEE Xplore. [pdf] [Accessed 6 Jul. 2018].
  7. Elliott, C. (1992). The Affective Reasoner: A Process Model of Emotions in a Multi-Agent System. PhD thesis. [Accessed 21 Sep. 2018].
  8. Ghazali, E., Mutum, D. S. and Woon, M.-Y. (2018). Exploring Player Behavior and Motivations to Continue Playing Pokémon GO. Information Technology & People. Available at: https://doi.org/10.1108/ITP-07-2017-0216
  9. Genschow, O., Bossche, S., Cracco, E., Bardi, L., Rigoni, D. and Brass, M. (2017). Mimicry and Automatic Imitation Are Not Correlated. [ebook] Swiss National Science Foundation. Available at: https://osf.io/v3afy/ [Accessed 21 Sep. 2018].
  10. Heeger, D. (2017). Theory of Cortical Function. [ebook] Available at: https://doi.org/10.1073/pnas.1619788114 [Accessed 21 Sep. 2018].
  11. Lindquist, K. (2010). The Brain Basis of Emotion: A Meta-Analytic Review. Boston: eScholarship@BC. [pdf] [Accessed 6 Jul. 2018].
  12. Clore, G. L. and Ortony, A. (2014). Psychological Construction in the OCC Model of Emotion. [ebook] Available at: https://doi.org/10.1177/1754073913489751 [Accessed 21 Sep. 2018].
  13. Mueller, E. (1990). Daydreaming in Humans and Machines. Norwood: Ablex Pub. Corp.
  14. Niedenthal, P., Barsalou, L., Winkielman, P., Gruber, S. and Ric, F. (2005). Embodiment in Attitudes, Social Perception, and Emotion. Lawrence Erlbaum Associates, Inc. [pdf] Available at: https://doi.org/10.1207/s15327957pspr0903_1 [Accessed 21 Sep. 2018].
  15. New York University (2017, February 6). How Does the Brain Make Perceptual Predictions Over Time? There's a Theory for That. ScienceDaily. Retrieved 19 Sep. 2018 from www.sciencedaily.com/releases/2017/02/170206155947.htm
  16. Oosterwijk, S., Lindquist, K. A., Anderson, E., Dautoff, R., Moriguchi, Y. and Barrett, L. (2012). States of Mind: Emotions, Body Feelings, and Thoughts Share Distributed Neural Networks. [ebook] Available at: https://doi.org/10.1016/j.neuroimage.2012.05.079 [Accessed 21 Sep. 2018].
  17. Ortony, A., Clore, G. and Collins, A. (1988). The Cognitive Structure of Emotions. [ebook] Available at: http://www.jstor.org/stable/2074241 [Accessed 21 Sep. 2018].
  18. Prochazkova, E. and Kret, M. E. (2017). Connecting Minds and Sharing Emotions Through Mimicry: A Neurocognitive Model of Emotional Contagion. Netherlands. Available at: https://doi.org/10.1016/j.neubiorev.2017.05.013 [Accessed 20 Sep. 2018].
  19. Picard, R. W. (1995). Affective Computing. MIT Media Laboratory. [pdf] Available at: http://www.media.mit.edu/~picard/ [Accessed 20 Sep. 2018].
  20. Picard, R. W. (2000). Affective Computing. Cambridge, Mass.: MIT Press.
  21. Picard, R. W. and Healey, J. (1997). Affective Wearables. [online] Vismod.media.mit.edu. Available at: http://vismod.media.mit.edu/tech-reports/TR-467/index.html [Accessed 22 Sep. 2018].

Prototype Post Link: http://two.wordpress.test/lab-projects/emo
