
Bartlett School of Architecture, UCL


An investigation of bio-sensing technologies to infer states of affect

  • On September 7, 2017

The business world has an increasing interest in expanding biosensing technical knowledge for self-monitoring and self-tracking, placing research firmly within the domain of wearable computing, wearable sensors and biometrics. Moreover, a growing number of people use wearable devices and fitness trackers to quantify their fitness and share their physiological data with others via social networks and applications. Devices such as the Fitbit, Garmin and Beddit Smart have become effective and popular tools of the “Quantified Self” movement, recording health-related information such as step counts, sleep patterns and heart rate, and offering breathing training to combat stress (Stables, 2017). The “Quantified Self” movement draws on any kind of biological, physical, behavioural and environmental information to improve self-understanding and self-reflection, and is considered one of the most active trends in the world of big data (Shin and Biocca, 2017).

Self-tracking based on recording human information with biometric technologies not only promotes a person’s self-insight and positive behaviours, such as healthy living, energy conservation and mental health (Li, Medynskjy, Froehlich, & Larsen, 2012), but increasingly provides opportunities for improving interactions between humans and architecture, offering responsive, performative and adaptive design possibilities driven by a person’s bio-data. Biosensing can therefore be a valuable asset for extending interactive architectural design. For example, Rafael Lozano-Hemmer’s Pulse Room (2006) features one hundred to three hundred light bulbs flashing at the exact rhythm of a participant’s heart rate, and Marcus Lyall’s interactive laser and music composition (2015) is controlled by a person’s mind via an EEG headset.

Rafael Lozano-Hemmer’s Pulse Room

Similarly, Claudia Robles’s Skin installation (2012) detects a participant’s sweat data and turns their affective states into observable information such as sound and images. What these examples illustrate is that biosensing is not only a means of recording one’s own biofeedback, but can also generate unexpected connections with interactive architectural design, and research in this area shows room for future development.

“People are transformers, just like spaces are transformers on a meta-level as seen in relation to people” (Feireiss and Oosterhuis, 2006). In this sense, buildings may be able to act like instruments, continuously orchestrating themselves to changes in human physiological information. In an attempt to extend the applications of biosensing, the project covered in this report aims to build a responsive installation that performs human affective states. The primary question, “how can an installation’s performance in a space be directed by inputting physiological data from biosensing technologies?”, begins with a series of investigations into the relationship between biometrics and human affective states. The exploration and comparison of biometric technologies for tracking a person’s affective state makes participants more likely to engage with the surrounding space. From there, the project speculates: “is it possible to inform a responsive interactive installation by human affective states?”


The project “Bio-wings” covered in this thesis is a biometric interactive installation consisting of two parts: an input part, which collects participants’ bio-data and tracks their facial

Figure 1: The diagram of the process of the project

expressions via webcam to infer their states of affect, and an output part of responsive, performative and reconfigurable structures directed by changes in a person’s affective state. The aim of the final project is to build a large-scale installation able to interact with human physiological data. Inspired by the reconfigurable structures from Harvard (Wyss Institute, 2017) and the “aeroMorph” designed by the MIT Tangible Media Group (2016), the project explores the possibility of driving rigid structures with pneumatic actuators such as pouch motors, air muscles or air springs. The final kinetic structure of “Bio-wings” is thus a rigid, geometrical plywood framework controlled by soft inflatable actuators that interact with participants. These soft actuators are in turn driven by changes in participants’ bio-data or facial expressions, the latter tracked via webcam and Face OSC, a piece of software that collects data on the movement of key facial landmarks such as the eyebrows, eyes, nose, mouth and jaw.

This report focuses on the technical aspects of the project: the exploration and investigation of biometric technologies for tracking variations in people’s arousal, the relationship between states of affect and the physiological data collected by biometric sensors and Face OSC, and the performance of responsive structures as biofeedback changes. The human body is introduced as a bridge between the human inner milieu and the outside environment, and a series of experiments and prototypes for capturing human bio-data with different biometric sensors was designed and conducted as group work, in order to investigate the use of biosensing technologies to detect a person’s degree of arousal.

Research Methods

The research methods of this thesis consist of comparative research, literature research and analytical methods.

  • Analyse and compare biosensing technologies from previous discussions, focusing on applications that transform personal bio-data into different formats.
  • Investigate the relationship between affective states and biometric signals.
  • Experiment with various prototypes at different steps of the research and evaluate the advantages and disadvantages of each.

Affect state

A circle model of affect

Figure 1: Eight affect concepts in a circular order (Russell, 1980)

Affect is best explained as emotion in “Evidence of convergent validity on the dimensions of affect” (Russell, 1978). According to Russell’s theory, most psychologists describe affect as a set of dimensions, such as pleasure, excitement, distress and displeasure. More importantly, these dimensions of affect are interdependent rather than independent (Russell, 1980). The structure of affect, also known as the circle model of affect, represents ratings of affective states well (Russell, 1980). This two-dimensional bipolar circumplex model is characterized by two axes: valence (horizontal) and arousal (vertical). The horizontal axis refers to the degree of negative versus positive valence of an affective experience, and the vertical axis describes high versus low activation (Hoyt et al., 2015). Furthermore, people across cultures, species and ages are able to react to, interact with and interpret others’ emotions, based on a cognitive structure that can be used to interpret the verbal and nonverbal evidence of emotion, for example a subtle hint, an explicit expression of emotion, or facial expressions.
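Russell’s two axes can be read as a simple coordinate system. As an illustrative sketch (the quadrant labels are representative terms from the circumplex model, not an algorithm from the project), a (valence, arousal) pair can be mapped to a quadrant like this:

```python
def circumplex_quadrant(valence, arousal):
    """Map a (valence, arousal) pair, each in [-1, 1], to one of the four
    quadrants of Russell's circumplex model of affect.
    The quadrant names are illustrative terms, not project labels."""
    if valence >= 0 and arousal >= 0:
        return "excited"       # positive valence, high activation
    if valence < 0 and arousal >= 0:
        return "distressed"    # negative valence, high activation
    if valence < 0:
        return "depressed"     # negative valence, low activation
    return "relaxed"           # positive valence, low activation
```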

Symptoms of affect

A previous study has proposed that primary emotion is associated with the amygdala and anterior cingulate, key structures of emotional processing which react to external stimuli (Damasio, 2006). Moreover, according to William James’s hypothesis on the nature of emotion and feeling, if people immerse themselves in strong emotions and try to extract all the bodily symptoms of the feelings from their consciousness, it is hard to find anything related to their affective states except a cold and neutral state of intellectual perception (Damasio, 2006). James illustrated this with robust and vivid instances, asking what kind of emotion of fear would be left without the feeling of quickened heart-beats or of shallow breathing, of trembling lips or of weakened limbs (Damasio, 2006). Broadly, people can hardly interpret and perceive their affective states without nonverbal evidence of them: facial expressions, tone of voice, obvious actions, blushing, breathing and many other possible hints. Given this host of cues, the evidence of affective states (their various physiological symptoms) can be divided into two parts. One is facial expressions, one of the most powerful, natural and immediate means for human beings to communicate their emotions and intentions (Shan, Gong and McOwan, 2009); the other is bio-data whose fluctuation is difficult to track directly without biometric sensors, such as heart rate, skin moisture, respiration, muscle tension and even brainwaves.

Sensory inputs

Inspired by the circle model of affect and the physiological symptoms of affect, the project speculates that a combination of biometric signals and facial expressions can be used to infer a person’s


Figure 2: The diagram of the relationship between facial expressions and biometric signals

affective states. As shown in the diagram, tracking the movement of key points of facial expressions through a webcam and Face OSC is a direct and effective way to identify the valence of a participant’s affective experience. As a second step, activation is measured biometrically through a series of biometric sensors. The combination of these two approaches illuminates diverse affective states with physiological data. The reason for combining them is that people can smile without emotion; in other words, facial expressions can be faked, while biosensing cannot. The sensory inputs of the project therefore consist of two parts, facial expressions and biometric signals, although the research and investigation covered in this thesis focus more on the biometric work of tracking the degree of arousal.

The degree of arousal

Just as with the brightness of a hue or the loudness of a sound, it is crucial for arousal research to separate the intensity of a response from its other sensory characteristics. It has been pointed out in previous papers that the degree of arousal, also described as the intensity of response, appears to vary with a number of factors across a wide variety of biometric measurements, such as skin resistance, muscle tension and EEG (Duffy, 1957). Moreover, measurements of the degree of arousal differ markedly between individuals, owing to factors such as environmental influence, genetic difference, drugs and hormones. Nevertheless, the degree of arousal can be roughly classified across multiple physiological measures. Several studies show that arousal varies along a continuum from a low point of sleepiness to a high level of excitement in biometric terms, especially for skin conductance, muscle tension and EEG (Duffy, 1951). In addition, Lindsley found that sleep and emotion are related to certain changes in EEG (Mead and Stevens, 1952). Therefore, in the experiments, the degree of arousal is classified into three levels: low, mild and high arousal.


Biometric signals are among the most important parameters in the research field of emotion recognition. Owing to the complexity of, and limitations in, experiments analysing states of affect, evidence in this area is still meagre, leaving much of it in the category of speculation. Changes in human affective states can be identified indirectly through analysis of physiological data, and the most reliable way to track the degree of arousal is to extract bio-data through biometric sensors in real time. As biometric technologies develop and are studied in more depth, there is much to gain from this process. A series of experiments and tests was designed and implemented through group discussion according to the sensory inputs. In order to track the fluctuation of arousal, the experiments use three different kinds of biometric signals, skin conductance, heart rate and muscle tension, each detected by a different biometric sensor. Furthermore, to develop these explorations, it is crucial to create an algorithm that can determine the degree of arousal by analysing human physiological data and then steer the output part.
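Such an algorithm might, at its simplest, compare a window of sensor readings against a resting baseline and bin the relative deviation into the three arousal levels. The thresholds below are illustrative assumptions, not calibrated values from the experiments:

```python
def classify_arousal(samples, baseline, low_t=0.1, high_t=0.3):
    """Classify the degree of arousal from a window of biometric samples
    (e.g. skin conductance) by its relative deviation from a resting baseline.
    The threshold values are illustrative, not calibrated project values."""
    mean = sum(samples) / len(samples)
    deviation = abs(mean - baseline) / baseline
    if deviation < low_t:
        return "low"
    if deviation < high_t:
        return "mild"
    return "high"
```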

The experiment with sweat data


Figure 3(a): Prototype of Galvanic Skin Response sensor; Figure 3(b): The script of the audition

As shown in figure 3(a), the first experiment uses a Galvanic Skin Response sensor, a small and portable device, as a biofeedback interface to measure skin moisture. The Galvanic Skin Response (GSR) signal varies markedly with changes in skin moisture, and the hands and feet are the most efficient body regions at which to measure it. Essentially, when a person’s emotion varies, or obvious and strong facial expressions appear on their face, sympathetic activity increases, which simultaneously promotes sweat gland secretion; as a result, skin conductance rises (Cai, 2010). The investigation in this section focuses on variations in the GSR signal and introduces the experimental process of bio-data collection, data processing and feature extraction.
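Before feature extraction, raw GSR readings are typically smoothed to suppress sensor noise. A minimal sketch of such a pre-processing step, using a plain moving average (the window size is an assumption, not a parameter from the experiment):

```python
def moving_average(signal, window=5):
    """Smooth a stream of raw GSR readings with a simple moving average,
    a common first step before extracting features from skin-conductance data.
    Early samples are averaged over however many readings are available."""
    smoothed = []
    for i in range(len(signal)):
        start = max(0, i - window + 1)
        chunk = signal[start:i + 1]
        smoothed.append(sum(chunk) / len(chunk))
    return smoothed
```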

At the beginning of the experiment, the question of which triggers to use to stimulate fluctuations in arousal determines whether the collected physiological data is reliable and efficient, yet there is still no uniform standard. In the test of the Galvanic Skin Response sensor’s performance, the experiment begins by using sound as the trigger of arousal. What proved significant in guaranteeing the accuracy and efficiency of the physiological data collected by the sensor were the sounds generated by Max/MSP, a piece of software that can program audio with multiple pitches and tempos. As shown in the table below, the audio trigger consists of two forms of sound, a signal tone and a piano sound with the same pitch or tempo, for comparison. The experiments use the tempo and pitch of the sound as variables to test the fluctuation of skin conductance.

Figure 4: The table of auditions with different characteristics

F: Frequency (pitch) of audition; T: Tempo of the audition

In the test, the value of skin moisture was classified into three levels: low, mild and high. When the value of skin moisture reached the degree of high arousal, LED lights would light up. In order to distinguish differences in the physiological data, a series of single-variable tests using the Galvanic Skin Response sensor was conducted.

  • The first step: measurements in a relaxed state.
  • The second step: comparing the results of measurements with various pitches of signal tones.
  • The third step: comparing the results of tests with the same pitch of signal sounds and piano sounds.
  • The fourth step: comparing the results of tests with people of different ages and cultures.
  • The fifth step: comparing the results of tests with various tempos, as in the previous tests.

The research results show an obvious change in the fluctuation of skin conductance when the frequency of the signal sound changes (e.g. in the F 440 Hz, T 100 ms signal-sound test).



Figure 5: The wave of the data collected from the Galvanic Skin Response sensor

  • Wide differences between testers under the same stimulus conditions, such as the same frequency (440 Hz) and tempo (0.1 s) of signal sound.
  • Obvious fluctuation in the skin moisture data as the sound changes.
  • The data is easily affected by the external environment, such as temperature.

The experiment with heart rate

Heart rate variability, as a biometric signal, is fairly simple to collect and carries less noise than other physiological signals (Zhang and Liu, 1952). The physiological data of heart rate is recorded using the pulse sensor SEN-11574, a photoelectric reflective sensor for measuring heart rate that uses a green LED with a peak wavelength of 515 nanometres and a light sensor with a peak wavelength of 565 nanometres; these two similar peak wavelengths guarantee the sensitivity of the sensor. It has been explained in a previous paper that the measurement of heart rate relies on the fact that the transmittance of human bodily tissue changes as the heart pumps blood through the body (Ge, 2010). Rafael Lozano-Hemmer’s Pulse Room is an interactive installation featuring one hundred to three hundred clear light bulbs which flash at the exact rhythm of a participant’s heart rate: a pulse sensor placed at the side of the room acts as a biometric interface, and when a person touches it, his or her heart rate is recorded and sent to the first bulb in the grid. It is an ingenious use of heart rate in the realm of interactive architecture. Inspired by the Pulse Room, figure 6 shows a prototype that uses the pulse sensor to control the movement of a membrane: the heart rate collected by the sensor controls the angles of rotation of the servos which drive the membrane’s movement.
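The mapping from heart rate to servo rotation can be sketched as a linear scaling of beats per minute onto the servo’s angular range. The BPM bounds used here are assumptions for illustration, not values documented in the prototype:

```python
def bpm_to_servo_angle(bpm, bpm_min=50, bpm_max=120, angle_max=180):
    """Map a heart-rate reading in beats per minute onto a servo rotation
    angle, clamping to the expected BPM range. The range bounds are
    illustrative assumptions, not calibrated values from the project."""
    bpm = max(bpm_min, min(bpm_max, bpm))          # clamp out-of-range readings
    return (bpm - bpm_min) / (bpm_max - bpm_min) * angle_max
```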

Figure 6: The prototype of heart rate control

The experiment with muscle tension

Muscle tension is another parameter used to track states of affect (Duffy, 1957). Inspired by a project created by the Japanese designer Taeji Sawai (Sawai and Ishibashi, 2017), the prototype in figure 7 combines a muscle sensor and servo motors to track the movement of facial expressions. Measuring muscle activation via electric potential, known as electromyography (EMG), has long been used in medical and psychophysiological research (Hoehn-Saric et al., 1997). The muscle sensor used in this experiment is the MyoWare muscle sensor (AT-04-001), an Arduino-compatible, all-in-one EMG sensor. When people smile, the muscle group at the corners of the mouth begins to contract, which is detected by the sensor through electrodes placed over these muscles, giving a more accurate reaction. The rotation and speed of a servo are then driven by the bio-data collected by the sensor and sent to the computer: when the muscle group begins to contract, the servo turns clockwise at a very low speed, and when the muscle starts to relax, the servo turns counter-clockwise quickly.
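The contract/relax behaviour described above can be sketched as a simple threshold rule on the raw EMG reading. The threshold value and the two speed settings are illustrative assumptions, not parameters from the prototype:

```python
def servo_command(emg_value, threshold=300):
    """Translate a raw EMG reading into a (direction, speed) command,
    following the behaviour described in the text: contraction turns the
    servo clockwise slowly, relaxation turns it counter-clockwise quickly.
    The threshold is an assumed, uncalibrated value."""
    if emg_value > threshold:              # muscle group contracting
        return ("clockwise", "slow")
    return ("counter-clockwise", "fast")   # muscle group relaxing
```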

Although this is an efficient way to track the movement of facial expressions, it leaves several issues unsolved. Because people can smile without emotion, measuring muscle tension alone is inadequate evidence for tracking states of affect. A second problem found in the experiment is that the muscle sensor uses electrodes which must be placed on a person’s face, which limits the development of the project because of the constraint of distance: wherever a participant is in the space, they must stay within about one metre of the device.


Figure 7: The prototype of muscle tension

Comparison of three types of biometric sensors

Across these three experiments with different biometric sensors, the most effective method discovered was the Galvanic Skin Response sensor. The heart rate sensor proved to have its own limitations, including interference from ambient light during bio-data collection and the need to place the sensor where a strong pulse could be sensed, both of which could affect the accuracy of the results (Ge, 2010). The Galvanic Skin Response sensor, by contrast, uses electrodes to sense skin moisture, and no specific location is necessary, although the palm proved to be the most effective.

A limitation of the muscle sensor was that while movement could be picked up, there might be no corresponding indication of arousal fluctuation in the data, because physical indications from the muscles can be faked: a participant can smile with no emotion behind the action. With emotion being a key point of the ‘bio transformer’ project, the ability to ‘fake’ a facial expression is a setback. Though the muscle sensor recordings were of the same duration as the heart rate sensor’s, the heart rate sensor was found to be the most cost-efficient device, adding to its advantages.

The Galvanic Skin Response sensor is small in size and mid-range in price, but achieved the quickest response in a short amount of time, proving to be the most efficient of the three sensors. The muscle sensor proved more time-consuming and costly than the other two, and had some limitations in relation to emotional responses. The heart rate sensor was also small in scale, allowing the best portability and a smoother application on a participant’s body, but its light-sensitivity limitations place it in the middle of the three devices in terms of efficiency.

Integrated responsive design

Before combining the sensory inputs with the output part of kinetic responsive structures, the question of what kind of structure to use in the project had to be answered. A variety of projects in recent years have explored transformations of kinetic structures: for instance, an origami structure designed by IAAC (Institute for Advanced Architecture of Catalonia) students (designboom | architecture & design magazine, 2014), a squaring movable installation from USC (University of Southern California) (YouTube, 2017), the reconfigurable structures from the Harvard School of Engineering and Applied Sciences and the Wyss Institute (Wyss Institute, 2017), and aeroMorph, a foldable pneumatic structure from the Tangible Media Group at MIT (Massachusetts Institute of Technology). After comparing a number of experiments with diverse deformable structures, it was decided to use the more complex reconfigurable structure as the unit of the whole pavilion. Figure 8 shows one unit of the reconfigurable structure, which has three basic forms: the shrunken state (a), the middle state (b) and the expanded state (c). The deformation of the structure is driven by pouch motors stuck onto the surfaces and hinges of the structure; the hinges are made of rubber, which is soft and flexible enough for the rotation. The transformation of the structure unit depends on the duration of the inflation of the pouch motors.


(a) Shrunken state; (b) Middle state; (c) Expanded state. Figure 8: Three basic transformations of the unit of the structure


The question of how the sensory inputs could be connected to transformations of the structure unit prompted a group discussion. As shown in figure 10 (a) and (b), the pouch motor experiments showed that the angle of rotation of the hinges can be steered by the duration of the air pump’s inflation. Figure 10 (b) shows the durations that pouch motors need to reach various angles: for example, under 1 kPa of air pressure, a pouch motor needs 17.54 s to rotate 180 degrees.
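If the rotation rate is assumed to be roughly linear over time, the measured figure of 17.54 s for 180 degrees at 1 kPa gives a simple way to estimate the inflation duration for any target hinge angle. The linearity assumption is a simplification for this sketch:

```python
def inflation_duration(angle_deg, full_rotation_s=17.54, full_angle=180):
    """Estimate how long a pouch motor must be inflated (at ~1 kPa) to reach
    a target hinge angle. 17.54 s for 180 degrees is the figure measured in
    the experiment; assuming the rotation is linear in time is a
    simplification, not a documented property of the pouch motor."""
    return angle_deg / full_angle * full_rotation_s
```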

Put bluntly, the primary idea of the project is how a pavilion’s performance in the space can respond to human states of affect, which is similar to the notion of emotion recognition. The sensory inputs consist of two parts, the data of facial expressions and the data from biometric sensors, which are both sent to the computer to control the duration of the air pump’s inflation and hence the angle of rotation of the hinge. Because facial expressions can be faked while biosensing cannot, both the facial expression data and the biometric data are classified into three levels and arranged in a manner similar to the table below. Figure 11 (b) shows five transformations representing five different conditions: neutral state, fake smile, relaxing smile, fake laugh and laugh state. In other words, when people laugh with a high level of arousal, the pouch motor rotates 180 degrees and the structure unit reaches its maximum expansion, whereas the unit moves only slightly when people flash a fake smile.

Figure 11 (a): The rotation of angle; Figure 11 (b): The combination of sensory inputs and reconfigurable structure
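The pairing of expression and arousal level into the five conditions can be sketched as a lookup table. The exact pairings below are an assumption based on the description of the five conditions, not the project’s documented mapping:

```python
# Hypothetical mapping from (facial expression, arousal level) to the five
# response conditions named in the text; the pairings are an assumption.
CONDITIONS = {
    ("neutral", "low"):  "neutral state",
    ("smile",   "low"):  "fake smile",
    ("smile",   "high"): "relaxing smile",
    ("laugh",   "low"):  "fake laugh",
    ("laugh",   "high"): "laugh state",
}

def affect_condition(expression, arousal):
    """Return the structural response condition for an expression/arousal
    pair, defaulting to the neutral state for unrecognised combinations."""
    return CONDITIONS.get((expression, arousal), "neutral state")
```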

 Further development

There is still room to improve the “bio-transformer” project through future prototypes and further combination of bio-data and architecture. There is also growing scope for combining the sensory inputs and reconfigurable structures, as more transformative options become available: for example, the facial expression data could control the transformations of the structure unit while the change in biometric data steers the speed of motion by controlling the duration of inflation and deflation of the air pump. These two kinds of bio-data would be separate from each other, but responsible for different dimensions of the structure unit’s movement. The next stage of this project would be to plan and build a large-scale pavilion, in order to observe participants’ interactive responses within the architecture. That is why it is significant to explore other possible interactions between biofeedback and architecture, to push the notion of the kinetic structure as far as it can go. With more options available, an immersive environment could be created, from walls that move to resemble breathing on a large scale to light incorporated with our bodies’ responses. Receiving such a first-hand experience would be a unique opportunity for participants, since most architecture built today is stable and still. The ‘bio transformer’ project hopes to show participants that the architecture of the future could resemble a life-like building.


In conclusion, the experiments conducted were aimed at testing whether a biosensing installation could successfully interact with a person, combining architectural design with the sensory input of physiological data. We began with exploratory experiments to infer states of affect, achieved through observing various kinds of facial expression and monitoring other physical responses such as skin moisture, muscle tension and heart rate. The aim across these experiments was to find the most effective method of detecting a participant’s arousal. The next stage was categorizing the changes in the bio-data, from which three levels of arousal were created. Building on these three levels, five conditions in total were later generated by adding facial expression data, controlling parts of the prototype such as the duration of the air pumps. Changes in facial expression, such as laughing with a high level of arousal, altered the form of the structure, expanding it to its maximum capacity. This kind of motion could be developed further in other transformative structures, for instance by incorporating audio or using lighting effects to show a response to a physical indication.

Several limitations were discovered during the project. The bio-data collected could not be wholly representative of participants’ responses: given the limitations of the biometric sensing devices, the biometric data could not capture the full complexity of the human affective state, but rather captured various physical indicators, such as varied levels of biometric signals like changes in skin moisture. The algorithm could also be improved, as noise was discovered during the collection process which could reduce the accuracy of the physiological data; a new algorithm could be developed to filter out such noise. Finally, a structural limitation was found in the pouch motor, which could still be improved in terms of motion against friction and gravity; the pouch needs to be modified at a later date to increase efficiency.

As the development of biometric technologies continues, an increase in the application of biosensing in various domains is to be expected, especially in the realm of architecture, and with it a raised awareness of the relationship between the human body and the surrounding environment. In the future, we plan to create an empathetic environment by building a large-scale pavilion, potentially scaling up to a dozen or more units.

Works Cited:

Merrill, N. and Cheshire, C. (2016). Habits of the Heart(rate): Social Interpretation of Biosignals in Two Interaction Contexts. In: GROUP Supporting Group Work. Sanibel Island, Florida, USA: ACM New York, NY, USA ©2016, pp.31-38.

Feireiss, L. and Oosterhuis, K. (2006). GameSetandMatch II. Rotterdam: Episode Publ.

Shin, D. and Biocca, F. (2017). Health experience model of personal informatics: The case of a quantified self. Computers in Human Behavior, 69, pp.62-74.

Stables, J. (2017). Best fitness trackers 2017: Fitbit, Garmin, Misfit, Withings and more. [online] Wareable.

Lozano-Hemmer, R. (2006). Project “Pulse Room”. [online]

Lyall, M. (2015). On Your Wavelength – Marcus Lyall Ltd. [online]

MIT Tangible Media Group (2016). aeroMorph. [online]

Wyss Institute (2017). Reconfigurable Materials. [online]

Robles, C. (2012). claudiarobles. [online]

Li, I., Medynskjy, Y., Froehlich, J., & Larsen, J. (2012). Personal informatics in practice: Improving quality of life through data. In CHI 2012, may 5-10, 2012, austin, TX, USA.

Russell, J. (1978). Evidence of convergent validity on the dimensions of affect. Journal of Personality and Social Psychology, 36(10), pp.1152-1168.

Russell, J. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), pp.1161-1178.

Hoyt, L., Craske, M., Mineka, S. and Adam, E. (2015). Positive and Negative Affect and Arousal. Psychosomatic Medicine, 77(4), pp.392-401.

Damasio, A. (2006). Descartes’ error. London: Vintage Books.

Duffy, E. (1951). The concept of energy mobilization. Psychological Review, 58(1), pp.30-40.

Mead, L. and Stevens, S. (1952). Handbook of Experimental Psychology. The American Journal of Psychology, 65(1), p.117.

Cai, J. (2010). The research of emotion recognition based on galvanic skin response signals. ME Thesis, School of Electronic and Information Engineering, Southwest University.

Sawai, T. and Ishibashi, M. (2017). Taeji Sawai video clips. [online] Available at: [Accessed 12 Jul. 2017].


Ge, C. (2010). The research of Pulse Signal in Emotion Recognition. ME Thesis, School of Electronic and Information Engineering, Southwest University.



