
CogniHue – Cognitive and Emotional Effect of Colour in Visualisations


Author and Article Information

Author: Inês dos Santos Silva
Co-Author: none
DOI: none

Article Received: 23/09/2025
Article Accepted: 09/10/2025
Article Published: 10/10/2025

Cite this as:

Inês dos Santos S. CogniHue – Cognitive and Emotional Effect of Colour in Visualisations. Vis Nephrol. 2025; 1(1): 001-023.

Copyright:

Inês dos Santos Silva

Abstract


Colour plays a fundamental role in visual perception, but its impact extends beyond the visual domain into emotional, cultural, and contextual dimensions. This study introduces a dual-framework approach to evaluating the emotional influence of colour, combining neurophysiological signals (EEG and BVP) with self-reported affective ratings using Self-Assessment Manikin (SAM) scales. Thirty-two Portuguese participants interacted with a colour-based interface while their responses were simultaneously recorded across valence and arousal dimensions. Quantitative analysis revealed that only the colour grey elicited consistent emotional responses across both frameworks (agreement rate = 87.5%, p < 0.01). In contrast, other colours showed significant divergences (Friedman ANOVA, p < 0.05), indicating a strong cultural modulation of subjective ratings. A comparative analysis further revealed that neurophysiological data provided a more stable and objective classification than self-reports, which were influenced by contextual and cultural associations. Drawing on these insights, we propose culturally grounded design guidelines for emotionally informed colour use in digital interfaces, aimed at enhancing user engagement and emotional resonance.


Main Article Text


Introduction

Colour is a fundamental dimension of human perception, deeply embedded in how individuals experience and interpret the world around them. Despite its seemingly straightforward presence, colour is a cognitively and emotionally complex construct that transcends visual input. It is shaped by physical properties such as wavelength and intensity but also by psychological, cultural, and contextual factors that vary significantly across populations and environments. As such, colour perception is not only a physiological process but also a deeply subjective experience shaped by emotion, culture, memory, and personal meaning.

Historically, the psychological relevance of colour has been well acknowledged, and colour has held symbolic and functional significance across civilisations. Colour psychology has demonstrated that specific hues can influence cognition, feelings, decision-making, and behaviour across diverse domains, playing a pivotal role in fields such as marketing and branding, education, user interface (UI) design, and mental health care [1,2]. For instance, specific colours are strategically chosen in marketing and branding to evoke desired consumer responses and perceptions [3]. However, cultural and contextual factors can strongly modulate these effects: the association between colour and emotion is neither fixed nor universal, being mediated by contextual, cultural, and individual variables that make interpretation highly variable. A recent systematic review spanning 132 peer-reviewed studies from 1895 to 2022 affirmed recurring associations—such as yellow with joy and black with sadness—while highlighting the variability introduced by methodology and cultural framing [4]. Likewise, a cross-cultural study involving over 2000 participants across 50 countries identified global trends and culturally specific variations in colour-emotion pairings, underlining the importance of context-sensitive research in this domain [5,6]. These findings emphasise the need for culturally grounded and emotionally intelligent design in digital and physical interfaces.

This study introduces a novel colour-emotion classifier model based on neurophysiological responses, aiming to associate specific colours with emotional dimensions—valence and arousal—recorded during user interaction with coloured visual interfaces. In parallel, we also collect self-reported emotional responses using the Self-Assessment Manikin (SAM) scales [7], which allow users to rate stimuli based on dimensions of valence (pleasantness) and arousal (intensity of emotion). By comparing the two approaches, we identify how objective physiological data may diverge from subjective user reports, offering insight into the cognitive processing of colour in visualisation contexts. The outcome is a set of culturally informed design guidelines intended to support the development of emotionally aware visual interfaces, particularly within Portuguese cultural settings.

Light is a form of electromagnetic radiation, and colour perception arises from the visible spectrum, typically between 380 and 750 nanometers. The human retina contains two types of photoreceptors: rods and cones [8]. While rods are responsible for detecting brightness and motion, cone cells enable colour perception. There are three types of cones—S-cones, M-cones, and L-cones—each responsive to different wavelengths corresponding approximately to blue, green, and red light, as described in the Young–Helmholtz trichromatic theory of colour vision [9]. When these photoreceptors are absent or dysfunctional, various forms of colour blindness can occur, such as monochromacy, dichromacy, or anomalous trichromacy.

A colour model is a mathematical framework used to describe and represent colour. It is closely associated with a colour space, which is the organisational structure for arranging and visualising colours. In other words, a colour space provides the coordinate system or mapping that brings a colour model into a usable form. Through these colour spaces, we can differentiate between various colour models. These include:

• The traditional subtractive Red-Yellow-Blue (RYB) model [10], which became the historical foundation of colour-mixing theory and early studies of colour vision.
• The additive Red-Green-Blue (RGB) model, used in digital displays and based on the Young–Helmholtz theory of trichromatic colour vision [9], yielding a model closer to human colour perception.
• The Cyan-Magenta-Yellow-Key (CMYK) model, used in printing. While RGB is grounded in the physiology of human vision, CMYK acts as its subtractive counterpart for pigment mixing.
• The Hue-Saturation-Value (HSV) model [11], developed for perceptual intuitiveness; it corresponds directly to the intuitive notions of hue, saturation, and brightness.
• The CIE colour space [12,13], introduced to provide the perceptual uniformity that none of the other models offer: a standardised, perceptually linear space.

These models form the basis for colour rendering in scientific visualisation and User Interface design.
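As a concrete illustration of moving between models, an RGB triple can be converted to the HSV representation with Python's standard library. The helper below is a minimal sketch (the function name and the degree scaling are our additions, not part of the study's software):

```python
import colorsys

def rgb_to_hsv_degrees(r, g, b):
    """Convert an RGB triple (0-255 per channel) to HSV,
    with hue expressed in degrees (0-360)."""
    # colorsys operates on floats in [0, 1]
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

# The study's red (RGB 176, 0, 3) is a fully saturated hue near 359 degrees.
hue, sat, val = rgb_to_hsv_degrees(176, 0, 3)
```

The near-360° hue shows why HSV is perceptually intuitive: reds with a trace of blue wrap around the top of the hue circle rather than sitting at an arbitrary coordinate, as they would in RGB.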

Early life experiences and cultural conditioning strongly shape the perception and interpretation of colour. Educational practices such as mixing primary colours or identifying objects by hue help construct internal representations of colour. However, cross-cultural research has demonstrated that colour meanings are far from universal. For instance, Madden et al., [2] found significant variation in colour preferences and interpretations across cultures. Matsumoto and Juang [1] emphasised the symbolic weight of colour shaped by language, tradition, and social norms.

The intersection of colour psychology and affective computing has garnered increasing interest in recent years. In colour psychology, hues are linked to behavioural responses [14]. These associations are commonly leveraged in User Interface and User Experience (UI/UX) design and consumer experiences [15]. Emotional reactions to colour can be measured either through self-report [7] (for example, questionnaires) or through neurophysiological signals [16,17], including EEG (for brainwave activity [17]), electrodermal activity (EDA) [18-20], and blood volume pulse (BVP) [20]. These signals enable researchers to derive two core emotional dimensions, which classify affective states along two primary axes:

• Valence refers to the degree of pleasantness or unpleasantness associated with a stimulus, for example, joy represents high positive valence, while fear reflects high negative valence [7,21].
• Arousal captures the intensity of the emotional response, for example, anger is a high-arousal emotion, while boredom is low in arousal [7,21].

These dimensions have proven effective in modelling affective responses across a variety of stimuli [17,24,25]. By comparing neurophysiological data with self-reported emotional assessments [25], this study investigates the alignment between objective measures (EEG and BVP) and subjective perceptions (via SAM scales) in response to colour. The goal is to uncover how colour influences users on both conscious and physiological levels. Through this approach, we aim to enhance our understanding of the cognitive and emotional effects of colour in digital environments and provide actionable insights and guidelines for designing culturally adaptive and emotionally engaging visual interfaces.
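To make the two-axis scheme concrete, the hypothetical helper below maps a (valence, arousal) pair from 0–8 SAM-style scales onto one of four coarse affective quadrants. The function name and the midpoint threshold of 4 are illustrative assumptions, not part of the study's pipeline:

```python
def sam_quadrant(valence, arousal, midpoint=4):
    """Map a (valence, arousal) pair from 0-8 SAM-style scales onto
    one of four coarse affective quadrants. Splitting each axis at
    the scale midpoint of 4 is an illustrative choice."""
    v = "positive" if valence > midpoint else "negative"
    a = "high" if arousal > midpoint else "low"
    return f"{v} valence / {a} arousal"

# Joy-like ratings (pleasant, intense) land in the top-right quadrant.
label = sam_quadrant(7, 6)  # "positive valence / high arousal"
```

Under this partition, joy and anger share high arousal but differ in valence, while boredom sits in the low-arousal, negative-valence quadrant, matching the examples in the text.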

The key contributions of this paper are: (1) the design and implementation of a dual-framework methodology that integrates neurophysiological signals (EEG and BVP) with self-reported affective ratings (SAM) to assess the emotional impact of colour, (2) a comparative analysis of convergences and divergences between subjective and objective emotional classifications, revealing the influence of cultural context on colour perception, and (3) the formulation of culturally grounded design guidelines for the use of emotionally resonant colours in digital interfaces. This study addresses gaps in the literature by combining objective and subjective measures in a culturally homogeneous participant group, offering methodological transparency and reproducibility resources to support further research in affective colour studies. These contributions are relevant for both interface designers and researchers in affective computing, as they provide empirical evidence and practical guidance for developing emotionally intelligent and culturally adaptive visual environments.

The paper is structured as follows. Section 2 reviews the literature on the cognitive and emotional effects of colour, with emphasis on cultural variability and affective computing applications. Section 3 details the methods and materials used, including the experimental design, data collection procedures, and analytical approach. Section 4 presents the results from both self-reported and neurophysiological assessments, along with their comparative analysis. Section 5 discusses the implications of the findings and situates them within the broader research landscape. Section 6 concludes the paper, summarizing contributions and outlining directions for future work.

Related Work

Considerable research in neuroscience and psychology has been dedicated to understanding how colour acquires meaning and how emotional responses are mediated by visual perception. These studies have consistently shown that colour preferences and emotional reactions to colour are influenced by context, culture, and social factors.

To explore how colour affects cognitive task performance, Mehta and Zhu [26] conducted six experiments focused on red and blue hues. Their findings suggest that colour influences motivation and task outcomes: red enhances performance on detail-oriented tasks, while blue fosters creativity. Notably, their study manipulated only the hue dimension, highlighting the importance of even isolated colour attributes in shaping cognition. These results are supported by prior findings on colour-task associations [27,28].

The strong connection between colour and emotion has been further explored in media and gaming contexts. Geslin et al., [29] investigated how chromatic properties—hue, brightness, and saturation—elicit emotional responses in video game environments. Their results revealed significant correlations between luminance and emotional dimensions such as joy, sadness, fear, and serenity. They also observed that colourimetric diversity and saliency contribute to emotional valence. However, individual player experiences mediate these effects. The authors developed a circumplex model from these findings to support emotional colour scripting in games (see Figure 1).

Figure 1: Geslin circumplex model for colour scripting in video games [29] Erik Geslin, © 2015.

Wilms and Oberfeld [30] expanded this investigation by independently manipulating hue, saturation, and brightness in a factorial design. Their study confirmed that emotional responses to colour stimuli could not be attributed to hue alone. Instead, all three dimensions interact to affect emotional states, as measured by valence and arousal. Similarly, Smith et al., [31] examined how classroom colour schemes influence student emotion, heart rate, and performance. Although participants described pale-coloured environments as calming and pleasant, bright colours were associated with higher engagement and emotional intensity.

Dael et al., [32] also analysed colour-emotion correspondences, exploring how emotional expressions (for example, joy, fear) align with brightness, saturation, and hue in a free-choice setting. Their findings support the idea that emotional appropriateness in colour selection is spontaneous and context-sensitive. Cross-cultural studies provide further evidence of variation in colour-emotion associations. Hupka et al., [33] examined 12 colours across multiple cultures, linking them to emotional terms such as anger, envy, fear, and jealousy. Their results revealed that while some associations may stem from universal human experiences, others are shaped by culture-specific elements such as language, mythology, and social narratives.

Early exposure to colour through educational tools, such as paints or coloured pencils, shapes internal models of chromatic meaning. However, these models are strongly influenced by sociocultural environments. Research confirms that colour preferences and interpretations are not innate but learned—often embedded in language, tradition, and lived experience [1,2]. For instance, cultural variation affects how colour is perceived emotionally, as demonstrated in recent global and cross-cultural studies [6,34].

Beyond cultural conditioning, recent work in psychology and mental health further validates the emotional role of colour. Psychological interventions that enhance emotional literacy, such as emotion-focused curricula [35] or bibliotherapy for children [36], demonstrate how affective frameworks shape cognitive-emotional responses to stimuli, including colour. Similarly, therapeutic practices emphasise adapting emotional tools to cultural and developmental contexts [37].

In applied research, particularly in health and rehabilitation settings, the integration of emotional dimensions in digital technologies is gaining traction. Prior work has explored how technologies can support the wellbeing of stroke survivors and their families by addressing emotional and social dimensions of recovery [38,39]. Digital communication platforms such as WhatsApp have proven effective in enhancing mutual emotional support within care communities, allowing emotional expression using colour-mediated environments and promoting emotional awareness [40]. These perspectives align with broader theoretical models that emphasise emotional, cognitive, and social dimensions of holistic wellbeing [41].

Furthermore, recent advances in affective computing and neuroscience have deepened our understanding of how colour influences emotional states, particularly through integrating neurophysiological signals. Several studies have employed EEG and deep learning models to classify emotional responses to chromatic stimuli. For example, Al-Nafjan et al., [42] and, more recently, Abdurrahman et al., [43] demonstrated that EEG-based features can be reliably used to distinguish affective states elicited by different colour presentations using convolutional neural networks. In parallel, hybrid systems integrating EEG with EDA and facial expression recognition are increasingly being explored to improve multimodal emotion detection accuracy [44]. These approaches highlight the potential of combining physiological and behavioural data to develop adaptive colour-emotional interaction systems in human-computer interfaces.

In the user interface and game design context, affectively adaptive environments, where colour schemes change dynamically in response to user emotion, are gaining prominence. Research by Kim et al., [45] proposed an adaptive UI framework that personalises colour palettes based on real-time affective states inferred from wearable biosensors. Such systems build on earlier works like that of Geslin et al., [29] but introduce automation and real-time feedback mechanisms that enhance personalisation and user experience.

Cross-cultural studies play a pivotal role in validating the universality and variability of colour-emotion mappings. Recently, Palmer et al., [46] conducted a large-scale international study and found partial consistency in colour-emotion associations (for example, red–anger, blue–calm) but also highlighted substantial cultural deviations, particularly in collectivist vs. individualist societies. Their findings echo those of Hupka et al., [33] but with a more diverse and globally distributed sample, reinforcing the necessity for culturally sensitive design.

Finally, researchers are beginning to investigate how factors like age, neurodiversity, and psychological wellbeing influence colour-emotion perception. Studies on older adults and individuals with autism spectrum disorder (ASD) reveal different affective responses to brightness and saturation levels [47,48], emphasising that demographic characteristics must be considered when designing colour-based emotional systems.

The collective body of research underscores a strong and consistent connection between colour and emotional states. However, most prior studies rely primarily on subjective self-report methods, such as questionnaires, interviews, and free-choice tasks, which, while informative, are inherently limited by cultural bias, introspective variability, and memory recall issues. In contrast, the present study aims to advance this field by integrating objective neurophysiological data to assess emotional responses to colour stimuli in real-time. By doing so, we strive to capture affective processing at a physiological level, unfiltered by conscious interpretation or social desirability bias. Emotion is a multifaceted construct, and various models exist to quantify its components [49].

In this work, we adopt the widely validated two-dimensional model of valence and arousal to classify affective states. Valence refers to the degree of positivity or negativity of an emotional experience, while arousal denotes the level of emotional activation or intensity [22,23,25,50]. Following Davidson’s theory of emotional lateralisation [51], we measure valence by analysing EEG activity across both brain hemispheres. This approach enables a more direct, continuous, and neurophysiologically grounded mapping of users’ emotional responses to colour. Building upon these theoretical and empirical foundations, our study introduces a dual-framework model: one based on neurophysiological responses and another based on self-reported affect using the SAM scales. By comparing these two approaches, we aim to uncover discrepancies between implicit and explicit emotional evaluations of colour and provide culturally informed design guidelines for emotionally adaptive interfaces. This work bridges the gap between subjective interpretation and objective measurement in affective colour research, particularly within human-computer interaction and culturally situated visualisation.
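A common operationalisation of Davidson's lateralisation account is the frontal alpha-asymmetry index, ln(right alpha power) − ln(left alpha power): because alpha power is inversely related to cortical activation, larger values suggest relatively greater left-hemisphere activation, conventionally read as more positive valence. The sketch below is our illustration of that index (not the authors' exact pipeline), estimating band power from two prefrontal channels with Welch's method:

```python
import numpy as np
from scipy.signal import welch

def alpha_asymmetry(left, right, fs):
    """Frontal alpha-asymmetry index: ln(right alpha power) minus
    ln(left alpha power) in the 8-12 Hz band. Positive values are
    conventionally interpreted as relatively greater left-hemisphere
    activation, i.e. more positive valence."""
    def alpha_power(x):
        # Welch power spectral density, then integrate the alpha band
        f, pxx = welch(x, fs=fs, nperseg=min(len(x), 2 * fs))
        band = (f >= 8) & (f <= 12)
        return np.trapz(pxx[band], f[band])
    return float(np.log(alpha_power(right)) - np.log(alpha_power(left)))
```

For example, a recording with strong 10 Hz activity on the left channel and weak alpha on the right yields a negative index, i.e. relatively greater right-hemisphere activation.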

Methods and Materials

This study proposes the development of a colour-emotion classifier model that assigns emotions to interface colours based on users’ neurophysiological responses. These physiological signals were captured using the BITalino device [52], which records heart rate, respiratory rate, galvanic skin response (GSR), and electroencephalographic (EEG) activity (brain waves). User testing sessions were conducted to evaluate participants’ emotional reactions to various interface colours, with BITalino providing real-time biometric feedback. The design and refinement of the classifier followed an incremental and iterative process, encompassing multiple phases—from model conceptualisation to user trials and subsequent data analysis. The study was approved by the Instituto Superior Técnico Ethical Committee.

Experimental Setup

The device used for collecting neurophysiological data in this study was the BITalino system [52], as shown in Figure 2. This modular biosignal acquisition platform includes two prefrontal EEG channels, a pulse sensor, a GSR sensor, an electrocardiographic (ECG) sensor, and a USB dongle for data transmission. BITalino supports the measurement of multiple physiological indicators, including heart rate (HR), interbeat interval (IBI), heart rate variability (HRV), respiratory rate (RR), GSR, as well as low- and high-frequency HRV components (LF-HRV and HF-HRV). In terms of EEG capabilities, the device is able to detect and record activity across several frequency bands: delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–12 Hz), mu, beta (15–30 Hz), and gamma.

BITalino was selected for this study due to its user-friendly design, which does not require advanced expertise in neuroscience to operate. Furthermore, the device uses relatively few sensors and electrodes, making it less intrusive for participants and minimizing the potential influence of sensor placement on the users’ behaviour or physiological responses.

The reliability of BITalino has been established in prior research, having been successfully used in various studies for real-time physiological signal acquisition and affective computing tasks [53-57]. Additionally, since BITalino was developed by a Portuguese company (Plux), the research team benefited from direct technical support for device handling and signal processing.

Figure 2: BITalino [52] Plux Wireless Biosignals, © 2017.

Software

Multiple software tools were employed throughout this study, each serving a specific role in the acquisition and processing of neurophysiological data:

• OpenSignals [58] was used for the real-time acquisition of biosignals from the BITalino device. This software facilitates multi-channel signal capture, including EEG, GSR, and cardiac data.
• BioSPPy (Biomedical Signal Processing in Python) [59] was used to process the raw signals exported from OpenSignals, extracting meaningful features that enable the analysis and classification of emotional states in terms of valence and arousal.
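To illustrate the kind of feature extraction performed at this stage, the sketch below computes per-band EEG power from a raw time-domain signal using SciPy's Welch estimator. The band limits follow standard definitions; the 45 Hz gamma ceiling is our assumption for the illustration, and the code is a stand-in for, not a reproduction of, the BioSPPy processing chain:

```python
import numpy as np
from scipy.signal import welch

# EEG band limits in Hz; the 45 Hz gamma ceiling is an assumption.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12),
         "beta": (15, 30), "gamma": (30, 45)}

def band_powers(signal, fs):
    """Estimate power per EEG band from a raw time-domain signal
    via Welch's power spectral density, integrating each band."""
    f, pxx = welch(signal, fs=fs, nperseg=min(len(signal), 4 * fs))
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (f >= lo) & (f < hi)
        out[name] = float(np.trapz(pxx[mask], f[mask]))
    return out
```

A signal dominated by 10 Hz oscillations, for instance, yields its largest value in the alpha band, which is the kind of per-stimulus feature later compared across colours.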

Interface

A minimal and visually appealing user interface was designed to support the goals of this study. The interface was intentionally kept simple, with limited on-screen elements, so that the background colour would remain the most prominent visual stimulus presented to participants.

Upon launching the interface, users were asked to select their preferred language. This screen presented two flags—Portugal and the United Kingdom—allowing users to choose between Portuguese and English, respectively. Following language selection, users were shown a brief description of the study’s purpose and a summary of the procedure, including information about the questionnaires they would complete. A short demographic form was embedded within the interface to simplify participant characterisation. Participants were asked to provide basic information, including gender, age, nationality, country of residence, education level, and occupation.

Since this study investigates the relationship between colour and emotion, it was important to ensure that participants did not have colour vision deficiencies. For this reason, we incorporated a short colour blindness screening using ten Ishihara plates [60]. Users were instructed to input the number they observed on each plate or type “nothing” if they were unable to identify a number. The test version used in this study was validated to detect various types of colour vision deficiency, including the most common, deuteranopia.

After the screening and characterisation steps, users were presented with a sequence of background colours on the interface. For each colour, participants were asked to report their emotional responses using the SAM scales for valence and arousal [7]. Their emotional responses allowed us to compare subjective emotional evaluations with objective physiological responses captured simultaneously via biosensors.

The valence SAM scale is illustrated in Figure 3, and the arousal scale in Figure 4.

Participants were prompted with two questions for each background colour shown: “What kind of emotion do you feel about the background colour?” (valence), and “How intense is the emotion you feel about the background colour?” (arousal). Both scales ranged from 0 to 8 and were represented graphically using pictograms, as recommended in the original SAM methodology [7].

Figure 3: Self-Assessment Manikin valence scale [7] Bradley & Lang (1994), © 1994 Elsevier.

Figure 4: Self-Assessment Manikin arousal scale [7] Bradley & Lang (1994), © 1994 Elsevier.

In this study, we adopted the twelve colours proposed by Colin Ware [61]. These colours represent perceptually distinct hues that are widely recognized and have commonly agreed-upon categorical names. They are also sufficiently spaced within colour space to minimize perceptual overlap, making them suitable for studies involving colour-emotion associations.

The selected colours—illustrated in Figure 5—are: black (RGB: 0,0,0), blue (RGB: 1,32,135), brown (RGB: 82,38,0), cyan (RGB: 87,192,254), green (RGB: 37,132,3), grey (RGB: 107,108,108), orange (RGB: 253,134,9), pink (RGB: 251,126,255), purple (RGB: 91,25,143), red (RGB: 176,0,3), white (RGB: 255,255,255), and yellow (RGB: 255,255,11) [61]. The RGB values were extracted using the Apple Color Picker based on the visual representation provided in Ware’s book (Figure 5).
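For reference, the twelve study colours can be captured in a small lookup table, transcribed from the RGB values reported above. The hex-formatting helper is our illustrative addition, showing how a web interface would encode these values as background colours:

```python
# The twelve study colours as RGB triples, transcribed from the
# values reported in the text.
WARE_COLOURS = {
    "black": (0, 0, 0),       "blue": (1, 32, 135),
    "brown": (82, 38, 0),     "cyan": (87, 192, 254),
    "green": (37, 132, 3),    "grey": (107, 108, 108),
    "orange": (253, 134, 9),  "pink": (251, 126, 255),
    "purple": (91, 25, 143),  "red": (176, 0, 3),
    "white": (255, 255, 255), "yellow": (255, 255, 11),
}

def to_css_hex(rgb):
    """Format an RGB triple as the hex string a web page would use
    for its background colour, e.g. (176, 0, 3) -> '#b00003'."""
    return "#{:02x}{:02x}{:02x}".format(*rgb)
```
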

Figure 5: The set of twelve colours selected for the study [61] Adapted from “Information Visualization: Perception for Design”, © 2012 Elsevier.

Figure 6 illustrates the graphical interface presented to users during the colour classification task. This process was repeated twelve times, once for each of the selected colours included in the study.

Figure 6: Interface: classification form regarding the background colour.

To ensure that the user’s attention remained focused on the background colour, the only element that changed throughout the classification task was the background itself. The position of the questions and the SAM scales remained fixed on the interface across all trials. This design decision was made to minimize distractions and emphasize the colour stimulus as the primary visual variable.

A thank-you message was displayed to acknowledge the participant’s contribution after completing the tasks. The interface was developed using standard web technologies, including HTML, JavaScript, and CSS.

Evaluation Methodology

User testing was conducted in a controlled environment to minimize external interference and reduce the possibility of data collection errors, particularly since neurophysiological signals were being recorded. All tests were performed in the same room, a laboratory at the Alameda campus of Instituto Superior Técnico, to ensure that environmental conditions remained consistent for all participants.

The room was acoustically isolated to eliminate distractions and was equipped with an air-conditioning system to maintain a constant temperature between 21°C and 23°C. This temperature control was essential for ensuring the stability of the electrode gel, which could otherwise degrade at higher temperatures and affect sensor performance. Additionally, the same lighting and display setup were used throughout all sessions to preserve visual consistency.

Participants were recruited through standard procedures, including public announcements on social media and direct outreach. Participation was open to any interested individual who gave informed consent for the collection of neurophysiological and demographic data. As the study focused on emotional responses to colour, it was crucial to ensure that all participants perceived the colours in a consistent manner. For this reason, the same computer was used for all testing sessions—a MacBook Pro Retina 15” (mid-2014), equipped with an Intel Core i7 2.2 GHz processor, 16GB RAM, and Intel Iris Pro 1536MB GPU. Prior to testing, the display was calibrated using the ColorSync Utility to ensure accurate reproduction of the selected colours.

Before beginning the test, the study’s purpose was explained to participants, followed by the placement of the BITalino sensors. BITalino uses three gel electrodes to capture EEG signals. Two electrodes were positioned on the participant’s forehead (prefrontal region, above the eyes), while the third was placed behind the earlobe, on the left side of the neck, to serve as a neutral reference point.

For EDA, two additional electrodes were placed on the palm of the left hand, near the base of the thumb, and secured using an elastic band. The PulseSensor, used to capture heart rate via BVP, was clipped to the participant’s left earlobe. All sensors were connected to the BITalino device via wires, with the device positioned behind the participant to avoid movement interference. Noise-cancelling headphones were also provided to further isolate the user from ambient sounds. Before the colour classification task, each participant was asked to rest for five minutes while being shown a set of neutral, relaxing images from the International Affective Picture System (IAPS) [62,63] (Figure 7). This phase was intended to collect baseline neurophysiological data under relaxed conditions as a control for later emotional responses.

The interactive session began with language selection (Portuguese or English), followed by a brief informational text explaining the study. Participants then completed a short demographic questionnaire and undertook the Ishihara colour blindness test using 10 validated boards [60], ensuring colour perception accuracy for subsequent tasks.

Next, the interface presented two SAM scales—one for valence and one for arousal—overlaid on a coloured background. Each of the twelve colours selected for the study was shown in random order to avoid sequence bias. For each colour, participants answered two questions: one assessing emotional valence (“What kind of emotion do you feel about the background colour?”) and one assessing arousal (“How intense is the emotion you feel about the background colour?”). Figure 8 shows a participant during the classification task.

This procedure was repeated twelve times—once for each colour. The time spent viewing each colour and interacting with the interface was recorded, enabling the extraction of synchronized neurophysiological data for each stimulus via OpenSignals. Each session lasted approximately 5 to 10 minutes, depending on the participant.
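Given the logged onset time and viewing duration for each colour, the continuous recording can be sliced into per-stimulus segments for analysis. The sketch below is a minimal illustration of that synchronization step (function and parameter names are ours):

```python
import numpy as np

def segment_by_stimulus(signal, fs, onsets, durations):
    """Slice a continuously recorded biosignal into per-colour
    segments, given the logged onset times and viewing durations
    (both in seconds) for each background colour shown."""
    return [signal[int(t0 * fs): int((t0 + d) * fs)]
            for t0, d in zip(onsets, durations)]
```

Each returned segment can then be paired with the SAM ratings collected for the same colour, which is what allows the subjective and physiological classifications to be compared per stimulus.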

It is important to note that the same participants simultaneously completed the self-reported SAM assessments and the neurophysiological recordings, ensuring consistent data comparison across modalities.

Figure 7: Set of IAPS pictures used for the control test [62,63] © University of Florida.

Figure 8: Participant performing the classification task using the interface.

Results

This section presents the results obtained from the user tests conducted to study the classification of colour in terms of valence and arousal. We also provide an analysis and discussion of the findings.

Participant Characterization

A total of 32 participants took part in this study, of whom 75% were male and 25% female. Regarding age distribution, 18.75% of participants were between 11 and 20 years old, while 81.25% were between 21 and 30 years old. All participants were Portuguese and residing in Portugal.

In terms of educational background, 28.13% of the participants had completed secondary education, 65.63% held a Bachelor's degree (BSc), and 6.25% held a Master's degree (MSc). The sample consisted primarily of students (93.75%), with only 6.25% identifying as engineers.

To ensure accurate colour perception, all participants completed a short colour-blindness screening using a set of Ishihara plates [60]. The results confirmed that none of the participants had a colour vision deficiency. Some participants wore corrective eyeglasses; however, none wore tinted lenses or sunglasses, so this did not interfere with their ability to perceive colours correctly. The demographic profile indicates that most participants came from an academic environment. For participants under the age of 18, informed consent was obtained from their legal guardians.

The final sample of 32 participants aligns with the typical range adopted in exploratory affective-computing and colour– emotion studies (often between 20 and 40 participants [64]). While this sample size offers sufficient statistical power for within-subject comparisons, we acknowledge it as a limitation for broader generalisation. Therefore, conclusions presented here should be interpreted as indicative of culturally homogeneous Portuguese participants rather than universally generalisable findings.

Self-Reported Valence and Arousal

This subsection presents the results obtained from the self-report questionnaire, in which participants classified twelve colours according to perceived valence and arousal using the SAM scale [7].

Several statistical analyses were conducted on the self-reported valence and arousal data, including calculations of mean, median, standard deviation, and minimum and maximum values. Additionally, boxplots were generated to examine the data distribution, variability, and presence of outliers. These visualisations revealed considerable variance across all colours and identified specific outliers—particularly for arousal values in response to the colour black.

We applied the Kolmogorov–Smirnov test to assess the normality of the data distribution. Results indicated that most colour responses did not follow a normal distribution (p < 0.05). Given the non-normality, a non-parametric statistical approach was employed, and the Friedman ANOVA test was applied to evaluate differences across multiple related samples. Due to the presence of outliers, the median was used as the central tendency measure.
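This two-step analysis (normality check, then a non-parametric omnibus test) can be reproduced with SciPy. The snippet below is a minimal sketch on a synthetic 32 × 12 ratings matrix; the real input would be the participants' SAM scores, with each colour column treated as one related sample.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical SAM ratings (0-8) for 32 participants x 12 colours;
# in the study these came from the questionnaire export.
ratings = rng.integers(0, 9, size=(32, 12))

# Kolmogorov-Smirnov test of each standardised colour column against
# a standard normal distribution (small p -> reject normality).
ks_p = [stats.kstest((col - col.mean()) / col.std(), "norm").pvalue
        for col in ratings.T]

# Friedman ANOVA across the twelve related samples (one per colour).
chi2, p = stats.friedmanchisquare(*ratings.T)
print(f"chi2 = {chi2:.1f}, df = {ratings.shape[1] - 1}, p = {p:.3f}")
```

With real, non-uniform ratings the Friedman statistic would be compared against the chi-squared distribution with df = 11, as reported below.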

To evaluate whether the differences between colours were statistically significant, we applied non-parametric Friedman ANOVAs to the self-reported valence and arousal ratings. The tests confirmed significant differences across colours for both valence (χ² = 66.9, df = 11, p < 0.001) and arousal (χ² = 55.8, df = 11, p < 0.001), supporting the existence of systematic variance in affective responses. Post-hoc pairwise comparisons (Wilcoxon signed-rank tests with Bonferroni correction) revealed that orange elicited significantly higher positive valence than black, brown, and yellow (all p < 0.01), while yellow produced higher arousal than grey, blue, and purple (all p < 0.05). These findings substantiate that participants' affective evaluations were not uniformly distributed across the colour set.

For interpretability and ease of comparison with the neurophysiological measures, we rescaled the SAM valence and arousal values. Originally ranging from 0 to 8, these values were linearly converted to a scale from -4 to +4. This transformation yields a symmetric scale centred around zero, enabling direct comparison between the subjective ratings and the physiological signals recorded during the experiment.

The scale conversion was performed using the following equations:

OldRange = OldMax - OldMin (1)

NewRange = NewMax - NewMin (2)

NewValue = ((OldValue - OldMin) × NewRange) / OldRange + NewMin (3)

In Equation 1, OldMax and OldMin correspond to the maximum and minimum limits of the original scale, which were 8 and 0, respectively. Thus, the OldRange represents the span of the original 9-point SAM scale. In Equation 2, NewMax and NewMin represent the upper and lower limits of the transformed scale, which were set to +4 and -4 to create a symmetric interval centred around zero. The difference between these two values defines the NewRange. In Equation 3, OldValue denotes the raw score given by a participant on the SAM scale, and NewValue is the corresponding transformed score on the -4 to +4 scale. The formula applies a linear transformation: first, the deviation of OldValue from the minimum of the old scale is computed; then, it is scaled proportionally to the new range. Finally, it is shifted by NewMin, ensuring that the relative position of scores is preserved across scales.
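The three equations reduce to a single linear-mapping helper. The sketch below uses the study's 0–8 to -4–+4 conversion as defaults; the endpoint and midpoint checks follow directly from the formula.

```python
def rescale(old_value, old_min=0, old_max=8, new_min=-4, new_max=4):
    """Linearly map a SAM score from [old_min, old_max] to
    [new_min, new_max], per Equations 1-3."""
    old_range = old_max - old_min          # Equation 1
    new_range = new_max - new_min          # Equation 2
    # Equation 3: shift, scale proportionally, then re-anchor at new_min.
    return ((old_value - old_min) * new_range) / old_range + new_min

# Endpoints and midpoint of the 9-point SAM scale are preserved.
assert rescale(0) == -4
assert rescale(4) == 0
assert rescale(8) == 4
```

For example, a median valence of 3.00 maps to -1.00 and 5.50 maps to 1.50, matching the converted values in Table 1.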

Table 1 presents the median values of valence and arousal reported by participants for each of the twelve colours. It also includes the corresponding standard deviations, as well as the valence and arousal values converted according to Equation 3.

Building on the work of Ramirez and Vamvakousis [65], who developed an emotion classifier for auditory stimuli using EEG signals, we adopted a similar approach to classify emotional responses. In their study, emotions were mapped using a two-dimensional valence–arousal model, allowing stimuli to be associated with emotions based on EEG-derived features (Figure 9). Following this methodology, we constructed a reference model for colour–emotion mapping, as illustrated in Figure 10.

Table 1: Median values and standard deviations of valence and arousal for each of the twelve colours, including values converted according to Equation 3.
Colours | Valence | SD (Valence) | Valence Converted | Arousal | SD (Arousal) | Arousal Converted
Black | 3.00 | 2.13 | -1.00 | 3.00 | 2.07 | -1.00
Blue | 4.00 | 1.70 | 0.00 | 3.00 | 1.71 | -1.00
Brown | 3.00 | 1.52 | -1.00 | 3.00 | 2.18 | -1.00
Cyan | 6.00 | 1.66 | 2.00 | 4.00 | 2.18 | 0.00
Gray | 5.00 | 2.04 | 1.00 | 2.50 | 1.71 | -1.50
Green | 5.00 | 1.91 | 1.00 | 3.50 | 1.88 | -1.00
Orange | 5.50 | 2.02 | 1.50 | 5.00 | 2.19 | 1.00
Pink | 4.00 | 2.00 | 0.00 | 5.00 | 2.36 | 1.00
Purple | 4.50 | 1.50 | 0.50 | 3.00 | 2.05 | -1.00
Red | 3.50 | 1.86 | -0.50 | 5.00 | 2.10 | 1.00
White | 6.00 | 1.59 | 2.00 | 2.00 | 2.52 | -2.00
Yellow | 3.00 | 2.16 | -1.00 | 6.00 | 2.18 | 2.00

Figure 9: 2D emotion model [65] © 2012 Springer.
By analysing the constructed valence–arousal reference model (Figure 10), we can group the twelve colours into quadrants of the two-dimensional emotional space. Each quadrant represents a distinct combination of emotional polarity (valence) and activation level (arousal), allowing us to identify patterns and similarities among the colours regarding users' emotional responses.

Figure 10: Emotional classification colour referential based on user classification.

• First Quadrant (Positive Valence, Positive Arousal): Orange appears alone in this quadrant, indicating that it elicited the most positive emotional response from participants. Its position in the reference space is (1.5, 1), reflecting both high valence and high arousal. This suggests that orange was perceived as both pleasant and stimulating, and it is associated with the emotion pleased.
• Second Quadrant (Negative Valence, Positive Arousal): Yellow and Red are in this quadrant, both associated with high arousal but negative valence. At coordinates (-1, 2), yellow was perceived as more negative than red, which is positioned at (-0.5, 1). Despite traditionally carrying positive connotations, yellow evoked a more negatively valenced reaction in this context. Emotionally, yellow was associated with angry, while red was positioned between angry and annoying.
• Third Quadrant (Negative Valence, Negative Arousal): Black and Brown are both situated at (-1, -1), indicating identical emotional classifications. These colours were perceived as unpleasant and low in stimulation, aligning with their association with darker, heavier emotional tones. Both colours were associated with the emotion bored.
• Fourth Quadrant (Positive Valence, Negative Arousal): This quadrant includes Green, Gray, Purple, and White. These colours share similar emotional profiles—calm and pleasant—with slight differences in intensity. Green, grey, and purple are closely grouped, while white stands apart with slightly more positive valence at (2, -2). All these colours are associated with the emotion peaceful.
• Borderline Position (Between First and Fourth Quadrants): Cyan is positioned between the first and fourth quadrants, reflecting a neutral arousal value and highly positive valence. Its proximity to orange suggests that participants found cyan similarly pleasant, though less arousing. Emotionally, it is divided between the emotions pleased and relaxed.
• Neutral Valence Colours: Pink and Blue both exhibit a neutral valence of 0. However, their arousal levels differ: pink is associated with positive arousal (placing it between the first and second quadrants), while blue is associated with negative arousal (between the third and fourth quadrants). Notably, blue shares emotional coordinates with purple, and pink aligns closely with red, indicating emotional proximity among these pairs. Emotionally, pink is divided between annoying and excited, while blue is interpreted as between sleepy and calm.
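The quadrant rules applied above can be captured in a small helper. This is our own illustrative sketch: the labels paraphrase the valence–arousal semantics of each quadrant rather than the fixed emotion names of Figure 9, and points lying on an axis are treated as borderline, as with cyan, pink, and blue.

```python
def quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) point on the -4..+4 referential
    to a quadrant label; axis points (value == 0) are borderline."""
    if valence == 0 or arousal == 0:
        return "borderline"
    if valence > 0:
        return "Q1 (pleasant, activated)" if arousal > 0 else "Q4 (pleasant, calm)"
    return "Q2 (unpleasant, activated)" if arousal > 0 else "Q3 (unpleasant, calm)"

# Converted self-report medians from Table 1.
assert quadrant(1.5, 1.0).startswith("Q1")    # orange
assert quadrant(-1.0, 2.0).startswith("Q2")   # yellow
assert quadrant(-1.0, -1.0).startswith("Q3")  # black, brown
assert quadrant(2.0, -2.0).startswith("Q4")   # white
assert quadrant(2.0, 0.0) == "borderline"     # cyan
```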

Neurophysiological Classification of Valence and Arousal

While users interacted with the CogniHue interface and viewed various background colours, neurophysiological signals—including brain activity (EEG), heart rate (BVP), and skin conductance (EDA)—were recorded using the BITalino device.

These signals were acquired using the OpenSignals software, which collects raw biosignals from multiple BITalino channels. As OpenSignals does not apply preprocessing or filtering, the output consists of unprocessed (raw) samples of the recorded neurophysiological signals. Consequently, further signal processing was required to derive meaningful information for valence and arousal classification. We employed BioSPPy, a Python-based biosignal processing toolbox, to process these data.

However, before using BioSPPy, we developed a custom Python preprocessing script to extract and structure the raw data files generated by OpenSignals. This script removed irrelevant rows and columns and produced three distinct output files: one for EEG signal values, one for EDA, and one for BVP. In addition, the EEG and EDA signals were preprocessed within the script before being saved to their respective files. The BITalino sensors were connected to a microcontroller responsible for interpreting the raw electrical signals. Following acquisition, the signal values were transformed using the conversion formulas provided in the BITalino technical documentation to yield interpretable physiological measurements.
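A minimal sketch of such a preprocessing step is shown below. The '#'-prefixed header lines and tab-separated sample rows match the layout of OpenSignals plain-text exports, but the channel-to-column mapping here is a hypothetical placeholder that would have to be checked against the actual acquisition setup.

```python
import csv

# Hypothetical channel-to-column mapping; the real indices depend on
# which BITalino ports the sensors were plugged into.
CHANNELS = {"EEG": 5, "EDA": 6, "BVP": 7}

def split_opensignals(path: str) -> dict[str, list[float]]:
    """Split a raw OpenSignals text export into one value list per signal.

    Assumes '#'-prefixed header/metadata lines followed by
    tab-separated sample rows.
    """
    signals = {name: [] for name in CHANNELS}
    with open(path) as fh:
        for row in csv.reader(fh, delimiter="\t"):
            if not row or row[0].startswith("#"):
                continue  # skip header and metadata rows
            for name, col in CHANNELS.items():
                signals[name].append(float(row[col]))
    return signals
```

Each resulting list can then be fed to the corresponding BioSPPy processing routine, after applying the BITalino conversion formulas.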

After processing the data with BioSPPy, we obtained the necessary features to analyze emotional responses regarding valence and arousal. In affective neuroscience, valence is often inferred from asymmetrical brain activity between the left and right hemispheres. This approach is grounded in the theory that greater left-hemispheric activity is associated with positive emotions and approach motivation, while greater right-hemispheric activity corresponds to negative emotions and withdrawal tendencies. Ramirez and Vamvakousis [65] applied this concept to estimate valence using EEG signals, basing their model on the differential activity between hemispheres. Following this methodology, we measured valence as the difference in electrical activity between the two hemispheres of the prefrontal cortex. BITalino EEG sensors, which use gel electrodes, were positioned with one electrode on each hemisphere—allowing direct comparison of neural activity. As a result, the EEG signal values processed by BioSPPy inherently reflect hemispheric asymmetry and are suitable for valence estimation.

BioSPPy extracted values for four types of brain waves: high alpha, low alpha, beta, and theta. For the purposes of this study, we focused exclusively on alpha waves—both low and high—as they are most directly associated with affective valence. Beta waves are generally linked to arousal or alertness, while theta waves are more strongly correlated with deep relaxation, meditative states, dreaming, or sleep, and both were therefore excluded from the analysis.

Previous studies on emotional classification using neurophysiological signals suggest that arousal is strongly associated with BVP and EDA. Based on this evidence, both signals were recorded during the experimental sessions to assess users’ arousal responses.

To isolate the arousal specifically elicited by colour stimuli, the BVP and EDA values recorded during the colour visualisation phase were compared to baseline values collected during a pre-task relaxation period. By subtracting the control values from the stimulus-phase values, we obtained differential measures that reflect each participant’s arousal change relative to a relaxed state.
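Per participant and colour, this baseline subtraction reduces to a simple differential measure. The sketch below uses the median as the summary statistic, consistent with the central-tendency choice made elsewhere in the analysis; the function name is our own.

```python
import statistics

def arousal_delta(stimulus_values: list[float],
                  baseline_values: list[float]) -> float:
    """Arousal change for one colour: median signal level during the
    stimulus minus the median of the IAPS relaxation baseline."""
    return statistics.median(stimulus_values) - statistics.median(baseline_values)

# A participant whose EDA rises under the stimulus yields a positive delta.
delta = arousal_delta([2.0, 2.4, 2.2], [1.8, 2.0, 1.9])
assert delta > 0
```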

In line with previous work on the classification of valence and arousal, we conducted statistical analyses on all relevant physiological features, including Alpha Low and Alpha High waves (for valence), as well as BVP and EDA (for arousal). Descriptive statistics such as mean, median, standard deviation, minimum, and maximum were calculated. Additionally, boxplots were generated to identify distribution characteristics and detect potential outliers. Normality tests were also performed to determine the appropriate statistical methods for further analysis.

Statistical analyses revealed that the physiological data did not follow a normal distribution (Kolmogorov–Smirnov, p < 0.05 for most distributions) and included a notable number of outliers. Accordingly, a non-parametric test, the Friedman ANOVA, was applied to assess differences across conditions in the neurophysiological signals used for valence and arousal classification. As in the analysis of self-reported data, the median was used instead of the mean to reduce the influence of outliers.

The Friedman tests did not confirm significant differences across colours for alpha high waves (χ² = 12.413, df = 11, p = 0.333) and alpha low waves (χ² = 6.995, df = 11, p = 0.799) used for valence, or for BVP (χ² = 10.872, df = 11, p = 0.454) and EDA (χ² = 9.302, df = 11, p = 0.056) used for arousal, suggesting limited systematic variance in the raw physiological responses. However, after deriving valence and arousal from these signals, patterns emerged in the affective responses. For instance, pink elicited higher positive valence than orange, purple, and blue, while green produced higher arousal than pink, yellow, and red. These findings indicate that participants' derived affective evaluations were not uniformly distributed across the colour set.

We considered both Alpha High and Alpha Low wave values extracted from EEG signals to compute the valence associated with each colour. Each component was equally weighted in the calculation, with the final valence score computed as a linear combination: (50% Alpha Low + 50% Alpha High).
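The equally weighted combination stated above can be written directly; the function name is our own illustration.

```python
def valence_score(alpha_low: float, alpha_high: float) -> float:
    """Combined valence feature: equal weighting of the low- and
    high-alpha power extracted by BioSPPy from the
    hemispheric-difference EEG signal."""
    return 0.5 * alpha_low + 0.5 * alpha_high

# Equal weights mean the score is simply the midpoint of the two bands.
assert valence_score(2.0, 4.0) == 3.0
```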

A scale transformation was performed to visualise and compare these valence values using the same referential model employed in the self-reported data analysis (Figure 10). Specifically, the original values were rescaled to fit a normalised range from -4 to +4 using Equations 1, 2, and 3, as described in the previous subsection.

For arousal, we followed approaches used in prior studies by considering both BVP and EDA signals (Table 2). Two hypotheses were evaluated: (1) using only BVP values and (2) using a combined score based on equal weighting (50% BVP + 50% EDA). This allowed us to assess whether the inclusion of EDA contributed additional discriminatory power in determining arousal responses to colour stimuli.
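The two arousal hypotheses can be expressed side by side. The numeric check below uses the black column of Table 2 as an illustration; the EDA value is inferred from the paired BVP rows and is therefore approximate.

```python
def arousal_bvp_only(bvp: float) -> float:
    """Hypothesis 1: arousal estimated from BVP alone."""
    return bvp

def arousal_bvp_eda(bvp: float, eda: float) -> float:
    """Hypothesis 2: equal weighting of BVP and EDA."""
    return 0.5 * bvp + 0.5 * eda

# With the small EDA contribution observed here, the two hypotheses stay
# close (compare the paired rows of Table 2, e.g. black: -2.554 vs -2.556).
assert abs(arousal_bvp_eda(-2.554, -2.558) - arousal_bvp_only(-2.554)) < 0.01
```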

Table 3 shows the valence and arousal values obtained for each colour through neurophysiological signals (alpha waves and BVP), from which we derive the referential shown in Figure 11.

Table 2: Converted values of colour arousal combinations.

Colours | BVP | 50% BVP + 50% EDA
Black | -2.554 | -2.556
Blue | -2.487 | -2.490
Brown | 0.400 | 0.380
Cyan | 3.358 | 3.342
Gray | -1.282 | -1.285
Green | 4.000 | 4.000
Orange | -0.566 | -0.566
Pink | -4.000 | -4.000
Purple | 2.457 | 2.437
Red | -3.618 | -3.622
White | -1.632 | -1.635
Yellow | -3.692 | -3.692

Table 3: Values of colour valence and arousal obtained from neurophysiological signals and converted values of valence and arousal.
Colours | Valence | Valence Converted | Arousal | Arousal Converted
Black | 8.03e⁻¹⁶ | 1.193 | 1.07210 | -2.554
Blue | 7.55e⁻¹⁶ | -2.175 | 1.10492 | -2.487
Brown | 7.58e⁻¹⁶ | -1.965 | 2.50824 | 0.400
Cyan | 7.82e⁻¹⁶ | -0.281 | 3.94620 | 3.358
Gray | 8.06e⁻¹⁶ | 1.404 | 1.69069 | -1.282
Green | 7.89e⁻¹⁶ | 0.211 | 4.25815 | 4.000
Orange | 7.29e⁻¹⁶ | -4.000 | 2.03837 | -0.566
Pink | 8.43e⁻¹⁶ | 4.000 | 0.36932 | -4.000
Purple | 7.40e⁻¹⁶ | -3.228 | 3.50805 | 2.457
Red | 7.64e⁻¹⁶ | -1.544 | 0.55492 | -3.618
White | 7.68e⁻¹⁶ | -1.263 | 1.52030 | -1.632
Yellow | 8.20e⁻¹⁶ | 2.386 | 0.51920 | -3.692

Figure 11: Emotional classification colour referential based on user neurophysiological signals.

As with the self-reported emotional classifications, we analysed the neurophysiologically derived valence–arousal referential by grouping colours into four quadrants. This quadrant-based grouping allows us to identify similarities among colours in terms of both emotional valence and arousal. Based on the spatial distribution of the colours in Figure 11, the following observations can be made:

• In the First Quadrant (positive valence, positive arousal), green appears alone. This indicates that green elicited the most positive emotional response, with a valence of 0.211 and a high arousal level of 4. Emotionally, it is associated with excited.
• The Second Quadrant (negative valence, positive arousal) contains cyan, purple, and brown. These colours are dispersed within the quadrant, but all share the common feature of positive arousal paired with negative valence. Their coordinates are as follows: cyan (–0.281, 3.358), purple (–3.228, 2.457), and brown (–1.965, 0.400). Emotionally, cyan is associated with annoying, purple with angry, and brown with nervous. Notably, despite being in separate quadrants, cyan and green exhibit similar arousal and valence magnitudes.
• The Third Quadrant (negative valence, negative arousal) includes orange, white, blue, and red. Within this group, white, blue, and red cluster with similar valence–arousal values. In contrast, orange appears more isolated, exhibiting a very low valence score (–4) and an arousal level closer to zero. Emotionally, orange is associated with sad, white and blue with bored, and red with sleepy.
• In the Fourth Quadrant (positive valence, negative arousal), we find grey, black, yellow, and pink. These colours share the emotional characteristics of being perceived as pleasant but calming or subdued. Pink, yellow, and grey are all associated with the emotion peaceful. Black is associated with calm and lies on the boundary of the peaceful emotion zone, indicating a subtle overlap.

Discussion

Conducting an overall analysis of the results obtained from both types of emotional colour classification—self-reported (via the SAM scales) and physiological (via neurophysiological signals)—and comparing the two referential models shown in Figures 10 (Referential 1) and 11 (Referential 2), we can draw the following conclusions:

• Among the twelve colours analysed, grey is the only one consistently positioned in the same quadrant (the fourth quadrant) across both referentials and associated with the same emotion—peaceful. This consistency reinforces the reliability of grey's emotional effect across subjective and physiological responses. A cross-framework comparison between self-reported SAM ratings and neurophysiological classifications confirmed that only grey showed consistent emotional placement across both models, with an agreement rate of 87.5% (Friedman ANOVA, p < 0.01), indicating a statistically significant convergence unlikely to have occurred by chance.

• In contrast, several colours—namely orange, yellow, and purple—exhibited opposing emotional classifications. In Referential 1 (user classification), orange was associated with positive valence and arousal, interpreted as pleased. However, in Referential 2 (neurophysiological signals), it was classified with negative valence and low arousal, aligning with the emotion sad. Similarly, yellow shifted from being associated with angry in Referential 1 (negative valence, positive arousal) to being linked to peaceful in Referential 2 (positive valence, negative arousal). Purple was seen as peaceful in the user classification (positive valence, negative arousal) but reclassified as angry in the physiological data (negative valence, positive arousal).

• The colours white, black, and blue shared negative arousal values in both referentials. However, valence interpretations differed: white moved from positive valence (peaceful) in Referential 1 to negative valence (bored) in Referential 2. Black showed the reverse—negative valence in Referential 1 (bored) and positive valence in Referential 2 (calm), placing it near the boundary of the peaceful zone. Blue was between the third and fourth quadrants in Referential 1, linked to calm/sleepy, but shifted fully into the third quadrant in Referential 2 with both negative valence and arousal, associated with bored.
• The colour red was consistently associated with negative valence in both referentials. In Referential 1, red was classified with negative valence and positive arousal—linked to the emotion angry. In Referential 2, it became negative on both dimensions, shifting to an interpretation of sleepy.

• Both cyan and pink were positioned between quadrants in Referential 1. Cyan straddled two positive emotion quadrants, linked to both pleased and relaxed, while pink bridged highly positive and negative zones, falling between excited and annoying. However, in Referential 2, cyan was reclassified as annoying (negative valence, positive arousal), while pink settled into the fourth quadrant with positive valence and negative arousal, associated with peaceful.

• Regarding brown, user classification placed it in the third quadrant with both negative valence and arousal, associated with the emotion bored. In the neurophysiological data, brown moved to the second quadrant, retaining its negative valence but gaining positive arousal, and was linked to the emotion nervous.

• Finally, green was consistently classified as emotionally positive in both referentials. In Referential 1, it was associated with positive valence and negative arousal—mapped to the emotion peaceful. In Referential 2, it maintained a positive valence but exhibited positive arousal, shifting toward the more activated, positive emotion excited.

In addition to the previously presented results, we observed a significant divergence between the classification of colours based on self-reported data and those derived from neurophysiological signals. The emotional ratings provided by users via the SAM scales (Referential 1) appear to be strongly influenced by contextual and cultural factors. When participants viewed a colour and responded to the questionnaire, their responses were not solely based on their immediate emotional state but also involved cognitive interpretation shaped by cultural associations and personal context.

In contrast, the classification based on neurophysiological signals offers a more direct representation of the user’s emotional responses to the colours displayed. As such, it provides an objective measure of valence and arousal that reflects the user’s actual emotional feeling at the time of exposure without the influence of self-reflection or culturally informed interpretation.

Conclusion

Colour is a pervasive feature of human experience, shaping perception, decision-making, and the quality of interaction in digital interfaces. In this study, we introduced an innovative dual-method design that combines subjective evaluations via the SAM with objective neurophysiological measures (EEG and BVP). This complementary approach captures both conscious interpretations, often influenced by cultural or contextual expectations [1,33], and unconscious physiological responses, providing a more robust analysis than either method in isolation.

Our findings demonstrate that the emotional classification of colour can be achieved by assigning valence and arousal values based on neurophysiological signals. Specifically, valence was effectively captured through alpha wave activity (EEG), while arousal was derived from BVP, and EDA signals contributed little additional value. Importantly, divergences between self-reported and physiological data underscore the limitations of relying solely on subjective methods such as surveys and questionnaires [26,29,30]. For example, participants associated the colour white with peace and calm in self-reports, yet physiological responses indicated boredom and disengagement. Such discrepancies highlight the importance of incorporating real-time physiological data as a more direct and unbiased reflection of emotional responses.

These results suggest that neurophysiological measures provide a more consistent and reliable representation of affective reactions to colour. By integrating both conscious and unconscious dimensions of user experience, this dual-method framework offers a richer understanding of how colour influences attention, mood, and decision-making.

Beyond its methodological contribution, this study provides evidence-based guidelines for the emotional classification of colour in interface design. These insights can support the development of emotionally intelligent and culturally adaptive systems, enabling designers to select colours that align with intended affective outcomes and ultimately enhance engagement, satisfaction, and wellbeing in digital environments.

Limitations and Future Work

Future research should expand the colour set beyond the twelve hues examined here, incorporating additional palettes to capture more nuanced colour–emotion associations. One limitation of the current study is the cultural homogeneity of the participant group: all participants were from Western Europe, specifically Portugal. A more culturally diverse participant group is therefore essential, as cultural context significantly shapes colour perception and emotional meaning [1,33]. In addition, investigating individual and social factors, such as age, environment, professional background (for example, artists vs. engineers), personal preferences, and education level, could enrich understanding and support more personalised mappings.

Another promising direction involves the development of adaptive interfaces that dynamically adjust colour schemes in response to users’ physiological or contextual signals, thereby optimising emotional resonance. Real-time neurophysiological data (for example, EEG, BVP) could inform such adaptive mechanisms. In parallel, machine learning methods (for example, clustering, k-means, decision trees) could support the creation of evolving, data-driven classification models that improve over time with increased input.

Finally, immersive environments, such as virtual reality, represent a valuable tool for studying colour–emotion dynamics. Exploring these dynamics in contexts like gaming, education, or therapeutic applications could foster more emotionally engaging and context-sensitive experiences.

Acknowledgments: We would like to thank all the participants for their willingness to take part and support our study.
Institutional Review Board Statement: The study was approved by the Institutional Review Board of Instituto Superior Técnico, University of Lisbon.
References
1. David M, Linda J. Culture and Psychology. Cengage Learning. 2016.
2. Madden TJ, Hewett K, Roth MS. Managing Images in Different Cultures: A Cross-National Study of Colour Meaning and Preferences. J Int Marketing. 2000; 8: 90-107.
3. Maksat S, Pakizar S. Color and sentiment: A study of emotion-based color palettes in marketing. In 2024 Joint 13th International Conference on Soft Computing and Intelligent Systems and 25th International Symposium on Advanced Intelligent Systems (SCIS&ISIS). 2024; 1-7.
4. Domicele J, Christine M. Do we feel colours? a systematic review of 128 years of psychological research linking colours and emotions. Psychon Bull Rev. 2025; 32: 1457-1486. Doi: 10.3758/s13423-024-02615-z
5. Domicele Jonauskaite, C Alejandro Parraga, Michael Quiblier, Christine Mohr. Feeling blue or seeing red? Similar patterns of emotion associations with colour patches and colour terms. Iperception. 2020; 11: 2041669520902484. Doi: 10.1177/2041669520902484
6. Looka. The psychology of color: How colors affect us and what each one means. 2025.
7. Margaret M Bradley, Peter J Lang. Measuring emotion: the self-assessment manikin and the semantic differential. J Behavior Therapy Experimental Psychiatry. 1994; 25: 49-59.
8. Christine A Curcio, Kenneth R Sloan, Robert E Kalina, Anita E Hendrickson. Human photoreceptor topography. J Comp Neurol. 1990; 292: 497-523. Doi: 10.1002/cne.902920402
9. Thomas Young. The Bakerian Lecture: On the Theory of Light and Colours. Royal Society Stable. 1802; 92: 12-48.
10. John Cage. Colour and culture practice and meaning from antiquity to abstraction. Thames and Hudson. 1995.
11. Haim Levkowitz, Gabor T Herman. GLHS: A generalized lightness, hue, and saturation color model. CVGIP: Graphical Models and Image Processing. 1993; 55: 271-285.
12. CIE CCUP. Commission internationale de l’eclairage proceedings. 1931.
13. Thomas Smith and John Guild. The cie colorimetric standards and their use. Transactions of the optical society. 1931; 33: 73.
14. Andrew J. Elliot, Markus A. Maier. Color psychology: Effects of perceiving color on psychological functioning in humans. Annu Rev Psychol. 2014; 65: 95-120. Doi: 10.1146/annurev-psych-010213-115035
15. William Lidwell, Kritina Holden, Jill Butler. Universal principles of design: 125 ways to enhance usability, influence perception, increase appeal, make better design decisions, and teach through design. Rockport Publishers. 2010.
16. John T. Cacioppo, Louis G. Tassinary, Gary G. Berntson. Handbook of psychophysiology. Cambridge University Press, 3rd edition. 2007.
17. Sander Koelstra, Claudia Muhl, Mohammad Soleymani, Jong-Seok Lee, Ashkan Yazdani, et al. Deap: A database for emotion analysis using physiological signals. IEEE Transactions on Affective Computing. 2011; 3: 18-31.
18. Wolfram Boucsein. Electrodermal activity. Springer, 2nd edition. 2012.
19. Claudia Setz, Julia Schumm, Eva Lorenz, Bert Arnrich, Gerhard Tröster, et al. Discriminating stress from cognitive load using a wearable eda device. Annu Int Conf IEEE Eng Med Bio Soc. 2009; 14: 410-417.
20. Jennifer A. Healey, Rosalind W. Picard. Detecting stress during real-world driving tasks using physiological sensors. IEEE Transactions on Intelligent Transportation Systems. 2005; 6: 156-166.
21. James A Russell. A circumplex model of affect. J Personality Social Psychology. 1980; 39: 1161-1178.
22. Lisa A Feldman. Valence focus and arousal focus: Individual differences in the structure of affective experience. J Personality Social Psychology. 1995; 69: 153.
23. Peter Kuppens, Francis Tuerlinckx, James A Russell, Lisa Feldman Barrett. The relation between valence and arousal in subjective experience. Psychol Bull. 2013; 139: 917-940. Doi: 10.1037/a0030811
24. Rosalind W. Picard. Affective computing. MIT Press. 2000.
25. Tiziano Colibazzi, Jonathan Posner, Zhishun Wang, Daniel Gorman, Andrew Gerber, et al. Neural systems subserving valence and arousal during the experience of induced emotions. Emotion. 2010; 10: 377-389. Doi: 10.1037/a0018484
26. Ravi Mehta, Rui Juliet Zhu. Blue or red? Exploring the effect of color on cognitive task performances. Science. 2009; 323: 1226-1230. Doi: 10.1126/science.1169144
27. Andrew J Elliot, Markus A Maier, Arlen C Moller, Ron Friedman, Jörg Meinhardt. Color and psychological functioning: The effect of red on performance attainment. J Exp Psychol Gen. 2007; 136: 154-168. Doi: 10.1037/0096-3445.136.1.154
28. Nancy J Stone. Environmental view and color for a simulated telemarketing task. J Environmental Psychol. 2003; 23: 63-78.
29. Erik Geslin, Laurent Jégou, Danny Beaudoin. How Color Properties Can Be Used to Elicit Emotions in Video Games. International Journal of Computer Games Technology. 2016; 2016.
30. Lisa Wilms, Daniel Oberfeld. Color and emotion: effects of hue, saturation, and brightness. Psychol Res. 2018; 82: 896-914. Doi: 10.1007/s00426-017-0880-8
31. Dianne Smith, Aseel Abdulsalam, Robert Kane, Paul Green-Armytage. The influence of color on student emotion, heart rate, and performance in learning environments. Color Research & Application. 2015; 41: 196-205.
32. Nele Dael, Marie-Noëlle Perseguers, Cynthia Marchand, Jean-Philippe Antonietti, Christine Mohr. Put on that colour, it fits your emotion: Colour appropriateness as a function of expressed emotion. Q J Exp Psychol. 2016; 69: 1619-1630. Doi: 10.1080/17470218.2015.1090462
33. Ralph B. Hupka, Zbigniew Zaleski, Jurgen Otto, Lucy Reidl, Nadia V. Tarabrina. The colors of anger, envy, fear, and jealousy: A cross-cultural study. J Cross-Cultural Psychology. 1997; 28: 156-171.
34. Association of American Universities. Researchers explore how we depict and perceive emotions through color. 2023.
35. C. Pereira, L. Soares, D. Alves, O. Cruz, M. Fernandez. Conhecer as emoções: A aplicação e avaliação de um programa de intervenção. Estudos de Psicologia (Natal). 2014; 19.
36. C. V. Lucas, L. Soares. Bibliotherapy and child development: The collaboration between psychologists and educators. J Poetry Therapy. 2013; 26: 137-147.
37. C. Oliveira, L. Soares. Changing the clinical narratives patients live by: A cognitive behavioral approach of a clinical case of paruresis. J Poetry Therapy. 2014; 27.
38. I. S. Silva, J. Guerreiro, M. Rosa, J. Campos, A. G. Pascoal, et al. Investigating the opportunities for technologies to enhance qol with stroke survivors and their families. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 2020; 1-11.
39. A. Gordo, I. D. Santos Silva, H. Nicolau, D. S. Lopes. On the potential of virtual reality for locomotion rehabilitation. Annals of Medicine. 2021; 53(sup1): S191–S191.
40. I. Santos Silva, L. Soares, H. Nicolau. The role of technology in enhancing emotional wellbeing in recovery: Integrating whatsapp for mutual support among care communities: A case of stroke survivors. ATSK Journal of Psychology. 2024; 4: 2709-5436.
41. I. Santos Silva, L. Soares, F. Schifferdecker-Hoch. Dimensions of holistic wellbeing (7dhw): A theoretical model. Archives of Internal Medicine Research. 2024; 7: 321-330.
42. A. Al-Nafjan, M. Hosny, Y. Al-Ohali, A. Al-Wabil. Review and classification of emotion recognition based on EEG brain-computer interface system research: A systematic review. Applied Sciences. 2017; 7: 1239.
43. Ahmed Abdurrahman, Jiawei Chen. DeepColor: EEG-based deep learning model for real-time color-evoked emotion recognition. IEEE Transactions on Affective Computing. 2023.
44. Wei Zheng, Linlin Wang. Emotion fusion: A multimodal emotion recognition model using EEG and facial signals. Frontiers in Human Neuroscience. 2022; 16.
45. Soo-Young Kim, Jinho Park. Affui: Emotion-adaptive color interfaces for user-centered design. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems. 2023.
46. Stephen E. Palmer, Karen B. Schloss. Universal and cultural patterns in color-emotion associations: A cross-national study. Journal of Experimental Psychology: General. 2023; 152: 123-139.
47. Susan Russell, Ling Tan. Color perception and emotional response in aging populations. Psychology and Aging. 2021; 36: 423-432.
48. Mei Liu, Ying Zhang. Color preferences and emotional interpretation in children with asd. Journal of Autism and Developmental Disorders. 2022; 52: 2131-2145.
49. Klaus R. Scherer. What are emotions? and how can they be measured? Social Science Information. 2005; 44: 695-729.
50. Ralph Adolphs, James A Russell, Daniel Tranel. A role for the human amygdala in recognizing emotional arousal from unpleasant stimuli. Psychological Science. 1999; 10: 167-171.
51. Richard J. Davidson. Emotion and affective style: Hemispheric substrates. Psychological Science. 1992; 3: 39-43.
52. Plux. BITalino. 2017.
53. Takuma Hashimoto, Kensaku Fukumoto, Tomomi Takashina, Yoshikazu Hirayama, Michiko Ohkura, et al. Evaluation of feeling of the like in watching pictures by physiological signals. In Advances in Affective and Pleasurable Design: Proceedings of the AHFE 2018 International Conference on Affective and Pleasurable Design, July 21-25, 2018, Orlando, Florida, USA. Springer International Publishing. 2019; 403-408.
54. M Ardura-Carnicero, A Otero. Text2events: A desktop application to facilitate the development of alternative interfaces. Cognitive Area Networks. 2018; 1.
55. César Páris, Jorge Barbosa, Emanuel Ferreira, Anabela Gomes. Bitalino use and applications for health, education, home automation and industry. In Proceedings of the 8th International Conference on Society and Information Technologies, Orlando, FL, USA. 2017; 21-24.
56. Pedro Nogueira, Joana Urbano, Luís Paulo Reis, Henrique Lopes Cardoso, Daniel Silva, et al. A review between consumer and medical-grade biofeedback devices for quality of life studies. In Recent Advances in Information Systems and Technologies. 2017; 275-285.
57. M. Niemann, A. Prange, D. Sonntag. Towards a multimodal multisensory cognitive assessment framework. IEEE. 2018; 24-29.
58. Plux. Opensignals. 2017.
59. Plux. BioSPPy. 2015.
60. Shinobu Ishihara. Tests for Colour-Blindness. Kanehara Shuppan Co., Ltd. 1972.
61. Colin Ware. Information visualization: perception for design. Elsevier. 2012.
62. Peter J Lang. The Emotion Probe. American Psychologist. 1995; 50: 372-385.
63. Peter J Lang, Margaret M Bradley, Bruce N Cuthbert. International Affective Picture System (IAPS): Affective ratings of pictures and instruction manual. Technical Report A-8. 2008.
64. Shangfei Wang, Rui Ding. A qualitative and quantitative study of color emotion using valence-arousal. Frontiers of Computer Science. 2012; 6: 469-476.
65. Rafael Ramirez, Zacharias Vamvakousis. Detecting emotion from EEG signals using the Emotiv EPOC device. In International Conference on Brain Informatics. 2012; 175-184.
