Aura AI

Aura AI is a concept I designed and prototyped in React. I wanted to turn emotions into something you could see. It might sound odd to mechanize feelings, but shape and color can make emotions easier to name, track, and connect to the body. The project includes a color orb, a body map, and a system that ties emotional tone to visual traits like contrast and saturation, so your emotions and their physical effects become easier to understand.

Year

2024

Services

UX Concept & Dev

Client

Personal

/ Concept

I focused on the core idea: the Aura.
The vision was an AI chat on the left where users could talk through their thoughts, while the aura on the right updated with colors on an orb and a body view. I wanted to show the connection between what we feel and where it shows up physically.

/ Design

I created a system that mapped emotion to visual cues.
Core emotions were grounded in models like IBM’s emotion classification and James A. Russell’s Circumplex Model of Affect, which places emotions along valence and arousal axes. I used those two dimensions to capture emotional tone and intensity, then translated them into color logic through visual variables such as contrast and saturation, much as image-processing filters do. Each emotion also corresponded to a region of the body and came with simple, actionable suggestions to help users reflect and respond.
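As a rough sketch of that mapping (the names and exact ranges here are hypothetical, not the project's actual code), a valence-arousal point could drive an HSL color like this:

```typescript
// Map a point on the circumplex model (valence and arousal, each in [-1, 1])
// to an HSL color string: valence drives hue, arousal drives saturation
// and lightness, echoing how intensity reads in image processing.
interface EmotionPoint {
  valence: number; // negative ↔ positive feeling
  arousal: number; // calm ↔ activated
}

function auraColor({ valence, arousal }: EmotionPoint): string {
  // Clamp inputs to the model's [-1, 1] range.
  const v = Math.max(-1, Math.min(1, valence));
  const a = Math.max(-1, Math.min(1, arousal));

  // Hue: sweep from 0° (negative valence, red) to 120° (positive, green).
  const hue = Math.round(((v + 1) / 2) * 120);

  // Saturation: higher arousal reads as a more intense, saturated color.
  const saturation = Math.round(50 + ((a + 1) / 2) * 50);

  // Lightness: calmer states wash out toward lighter, softer tones.
  const lightness = Math.round(70 - ((a + 1) / 2) * 30);

  return `hsl(${hue}, ${saturation}%, ${lightness}%)`;
}
```

A high-valence, high-arousal state (excitement) would land on a saturated green, while a low-valence, low-arousal state (sadness) would wash out to a pale red.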

/ Result

Aura AI remains a concept, but the foundation is in place. I built the emotion-to-color system and mapped the core visual and physiological associations. Overall, it was a fun way to use my biomedical engineering background for something not only useful but beautiful.
