Minjun Chen

Stream (Sponsored by HTC Creative Labs)

A VR application that helps artists overcome creative blocks by integrating brain-sensing technology.

Type | MHCI+D Capstone Project, Jan 2018 - Aug 2018

My Role | UX Research, Interaction Design, 3D Interface Design, Video Production

Team | Minjun Chen, Will Wang, Brian Nguyen, Shin Liu

Sponsor | HTC Creative Labs

Stream

 

How might we use biometric data to support artists in overcoming their creative blocks?

The Challenge

 

The Solution

Stream helps artists overcome mental blocks during the creative process by integrating brain-sensing technology into virtual reality. Built on neurotechnology, electroencephalography (EEG) in particular, the affective environment responds to the artist's brainwaves, which are tracked by the MUSE headband.

 

The affective environment

In the white theme, orbs increase in size and redness as brain activity rises. In the black theme, particles form more connections between one another with higher brain activity. The affective room serves as ambient feedback on the user's current mental state while they create within it. This matters because a large part of artistic creation is externalizing the artist's mental state; representing that internal state through affective visualizations can support artists during their creative process.

Key Features

 
 

Timeline

Artists can go back and review a recreation of their previous projects, along with how they were feeling during the creative process. They can also mark any moment on the timeline to return to later.

 
 

Reflection Sphere

A sphere in the corner of the user's viewport helps artists self-reflect on their past mental states when revisiting a project. The more the sphere fills, the closer the user is to matching their previous mental state.
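As a rough illustration, the fill value could be computed from the distance between the current and recorded readings. The sketch below is hypothetical, not the project's actual code; it assumes normalized alpha/beta band powers and a shader with a "_Fill" property.

```csharp
using UnityEngine;

// Hypothetical sketch: fill the reflection sphere based on how closely the
// user's current EEG reading matches the reading recorded at this moment
// of the original session. Names and the "_Fill" property are illustrative.
public class ReflectionSphere : MonoBehaviour
{
    public Material sphereMaterial;   // assumed to expose a "_Fill" float

    // Normalized band powers in [0, 1]: x = alpha, y = beta.
    public Vector2 currentReading;
    public Vector2 recordedReading;

    void Update()
    {
        // sqrt(2) is the maximum possible distance between two points
        // whose coordinates each lie in [0, 1].
        float distance = Vector2.Distance(currentReading, recordedReading);
        float fill = 1f - Mathf.Clamp01(distance / Mathf.Sqrt(2f));

        // A fully filled sphere means the current state matches the recorded one.
        sphereMaterial.SetFloat("_Fill", fill);
    }
}
```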

 
 

Creation

Artists from different disciplines can pursue any type of creative activity in this virtual space, such as writing or musical composition. For example, artists can unleash their creativity with 3D brush strokes and create their own 3D paintings.

 

Technologies and Tools

How do we monitor EEG data?

We non-invasively collect and record the electrical activity of the brain and visualize this data in VR using the HTC VIVE and the MUSE headband. The Muse headband records brain activity and transmits these signals to the virtual space via Unity, which then reflects the intensity of the brainwaves through color and movement.

How do we integrate EEG data into virtual reality?

We prototyped bringing EEG data into VR with the Muse headband using the Open Sound Control (OSC) protocol. MuseLab is a supporting tool for the headband that, once configured, streams the data over a selected port. A wide range of values can be sent through MuseLab, such as delta, theta, alpha, beta, and gamma band powers as well as raw EEG. The alpha and beta waves were of particular interest, as these states are most associated with mindfulness and wakefulness. Ultimately, we used the alpha and beta values because they were the most consistent of all the data.
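To make the OSC step concrete, a minimal receiver might look like the sketch below. This is not the project's actual code: it uses no external OSC library, and the port and the Muse address pattern shown in the comments are assumptions based on how MuseLab is typically configured.

```csharp
using System;
using System.Collections.Generic;
using System.Net;
using System.Net.Sockets;
using System.Text;

// Minimal OSC-over-UDP receiver sketch. The port number and address
// pattern are assumptions, not the project's actual values.
public class MuseOscReceiver
{
    private readonly UdpClient udp;

    public MuseOscReceiver(int port)  // e.g. the port selected in MuseLab
    {
        udp = new UdpClient(port);
    }

    // Drain any pending packets and hand each message to the callback.
    public void Poll(Action<string, float[]> onMessage)
    {
        var remote = new IPEndPoint(IPAddress.Any, 0);
        while (udp.Available > 0)
        {
            byte[] packet = udp.Receive(ref remote);
            ParseMessage(packet, onMessage);
        }
    }

    private static void ParseMessage(byte[] data, Action<string, float[]> onMessage)
    {
        int pos = 0;
        // An OSC message starts with an address pattern, e.g.
        // "/muse/elements/alpha_absolute", followed by a type-tag
        // string such as ",ffff".
        string address = ReadOscString(data, ref pos);
        string typeTags = ReadOscString(data, ref pos);

        var values = new List<float>();
        foreach (char tag in typeTags)
        {
            if (tag == ',') continue;   // leading comma of the tag string
            if (tag != 'f') break;      // this sketch only handles floats
            // OSC floats are big-endian 32-bit; flip bytes for little-endian hosts.
            byte[] raw = { data[pos + 3], data[pos + 2], data[pos + 1], data[pos] };
            values.Add(BitConverter.ToSingle(raw, 0));
            pos += 4;
        }
        onMessage(address, values.ToArray());
    }

    // OSC strings are ASCII, null-terminated, and padded to a 4-byte boundary.
    private static string ReadOscString(byte[] data, ref int pos)
    {
        int start = pos;
        while (data[pos] != 0) pos++;
        string s = Encoding.ASCII.GetString(data, start, pos - start);
        pos += 4 - (pos % 4);  // skip the terminator plus padding
        return s;
    }
}
```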

How do we map EEG data into the visualization?

A script reads the port over which the data is sent and passes the values into Unity3D. The visualizations are particle systems within Unity that expose adjustable variables. Mapping the EEG data values onto the particle-system values creates a direct correlation: the higher the EEG reading, the higher the corresponding variable within the particle system. For our "light" visualization, this meant an increase in size and redness; for our "dark" visualization, it meant more connections between the dots in the space.
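A simplified sketch of this mapping for the "light" theme is shown below, assuming a receiver like the one above supplies a normalized activity value. The component and field names are illustrative, not the project's actual code.

```csharp
using UnityEngine;

// Hypothetical sketch of the "light" theme mapping: higher EEG activity
// drives larger, redder orbs. Field names are illustrative only.
public class LightThemeMapper : MonoBehaviour
{
    public ParticleSystem orbs;
    public float minSize = 0.1f;
    public float maxSize = 1.0f;

    // Latest EEG reading, normalized to [0, 1] (e.g. smoothed alpha/beta power).
    public float activity;

    void Update()
    {
        var main = orbs.main;
        // Direct correlation: higher reading -> higher particle-system values.
        main.startSize = Mathf.Lerp(minSize, maxSize, activity);
        main.startColor = Color.Lerp(Color.white, Color.red, activity);
    }
}
```

The "dark" theme would follow the same pattern, with the activity value driving the number of connections drawn between particles instead of size and color.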

 

Interaction Model (Controllers)