The development of technology drives people apart and negatively reshapes interpersonal relationships. People no longer express emotion to others as they used to. Emotion is a feeling expressed physiologically and verbally; it can be understood as "an impulse that induces action," causing automatic behavioral reactions to environmental stimuli. Understanding emotion is a way to identify and learn about oneself and others. Meanwhile, human emotion has an inseparable connection with the human body, including bio-signals, gestures, body temperature, and other related aspects. Physiological signals, which are controlled by the autonomic nervous system, are spontaneous and cannot be consciously manipulated.
Meanwhile, affective computing plays an important role in developing systems for recognizing emotions. Through affective computing, invisible and nonverbal expressions can be decoded to infer the corresponding emotion, which helps people learn about others. In this project, we collect real-time electroencephalogram (EEG), muscle tension, heart rate, and body temperature data from the body. We then use these data as control parameters to manipulate the elements of a brush.
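As a minimal sketch of the idea of using bio-signals as brush control parameters, the snippet below normalizes raw sensor readings into the unit interval and maps them to brush elements. The sensor ranges, signal choices per parameter, and function names are illustrative assumptions, not the actual EmotiBrush implementation.

```python
# Hypothetical sketch: normalize raw sensor readings and map them to
# brush control parameters. Ranges and parameter names are assumptions
# for illustration, not the paper's actual mapping.

def normalize(value, low, high):
    """Clamp a raw reading into [low, high] and rescale to [0, 1]."""
    value = max(low, min(high, value))
    return (value - low) / (high - low)

def brush_parameters(eeg_alpha, muscle_tension, heart_rate, body_temp):
    """Map four bio-signals to four brush control parameters in [0, 1]."""
    return {
        # EEG alpha-band power (assumed range, arbitrary units)
        "color_hue": normalize(eeg_alpha, 0.0, 50.0),
        # EMG amplitude (assumed range in millivolts)
        "stroke_size": normalize(muscle_tension, 0.0, 5.0),
        # Heart rate in beats per minute
        "texture_density": normalize(heart_rate, 50.0, 120.0),
        # Skin temperature in degrees Celsius
        "value_brightness": normalize(body_temp, 30.0, 38.0),
    }
```

In a real-time loop, these normalized values would be recomputed each frame and fed to the VR brush renderer, so the stroke changes continuously with the painter's physiological state.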
We propose EmotiBrush in a VR environment, a powerful medium for giving the user a feeling of existing in a real world. The user is immersed in a space isolated from the external world by a head-mounted display. Additionally, prior work shows that emotional content increases the sense of presence in an Immersive Virtual Environment (IVE) and that, faced with the same content, the self-reported intensity of emotion is significantly greater in immersive than in non-immersive settings. Therefore, by collaborating in VR, the painter can form a deeper interpretation of and feeling for another user's emotion.
Design and Implementation
It is important to consider the composition of the brush itself, which includes color, size, texture, value, etc. We currently chose four variables that change automatically with the data collected from the user. We decided to monitor muscle tension, heart rate, brainwaves, and body temperature to create an emotion map, because these four parameters are relatively easy to detect and all of them are supported by a large body of research. Research shows, for example, that reliable measurement of individual differences in incidental facial muscle responses to emotional expressions is feasible.
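To illustrate what an emotion map built from these four signals might look like, the toy sketch below projects the normalized readings onto a two-dimensional valence-arousal plane. The weights and sign conventions are assumptions chosen for illustration only, not the model actually used in EmotiBrush.

```python
# Toy sketch of an "emotion map": project four normalized signals
# (each in [0, 1]) onto a valence-arousal plane in [-1, 1] x [-1, 1].
# The weights below are illustrative assumptions, not EmotiBrush's model.

def emotion_map(eeg, muscle, heart_rate, temperature):
    """Return (valence, arousal) for normalized signals in [0, 1]."""
    # Arousal: assumed to rise with muscle tension and heart rate.
    arousal = 2.0 * (0.5 * muscle + 0.5 * heart_rate) - 1.0
    # Valence: warmer skin and lower EEG activation assumed positive.
    valence = 2.0 * (0.5 * temperature + 0.5 * (1.0 - eeg)) - 1.0
    return valence, arousal
```

A point on this plane could then select the brush's color palette (valence) and stroke energy (arousal), giving collaborators a readable visual trace of each other's emotional state.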