Minjun Chen

Shama

 

An interactive installation that encourages human-plant interaction by turning living plants into user interfaces: through simple touch, the plants respond with natural sounds and pixelated representations of the people touching them.

Type | Course Project for MHCI+D Physical Computing, March 2018 (1.5 weeks)

My Roles | Interaction Design, Rapid Prototyping, Usability Testing, Video Shooting and Editing

Team | Minjun Chen, Will Wang, Will Oberleitner


The Challenge

How might we leverage the physical environment around us to interact with digital information?

We rely almost entirely on personal digital devices, such as smartphones and laptops, to interact with digital information. Interactions with pixels on these devices are inconsistent with how we feel and sense the rest of the physical environment through our hands and bodies. Unfortunately, interaction increasingly happens through graphical user interfaces (GUIs), while we ignore the ways it could also take place through the variety of physical objects in our surroundings.


The Inspirations

Tangible interaction & biological design

Our original idea for Shama comes from our interest in tangible interaction, which seamlessly integrates digital information into everyday physical forms and objects. This concept inspired us to let users directly manipulate and control digital information through physical materials, using their hands.

The design of Shama is also driven by my desire to explore the role of design, life science and biology in our rapidly transforming society. Biological design inspired the physical form we might use. We thought living plants could be an appropriate medium for representing digital information because plants already naturally transform information, such as light. We therefore decided to explore how we might create a living plant interface that responds to users and displays information, by combining living plants with electronic components.


Technologies and Tools

How do we make living plants responsive to human touch?

We placed a capacitive sensor and a single wire in the bottom of the pot. The sensor turned the plant into part of an electric circuit with a low current running through it, allowing the plant to respond to human touch without being damaged. Arduino and Processing, working with the capacitive sensor, translate the analog sensor value into triggers for the visual and audio outputs.
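
Below is a minimal sketch of the Processing side of this pipeline, assuming the Arduino simply streams the capacitive reading over serial as newline-terminated numbers; the port index, baud rate, and threshold value are illustrative stand-ins for values we tuned per plant:

    import processing.serial.*;

    Serial arduino;              // serial link to the Arduino
    int TOUCH_THRESHOLD = 200;   // illustrative; a touch raises the reading past this
    boolean touched = false;

    void setup() {
      size(400, 400);
      // assumes the Arduino is the first listed serial port, at 9600 baud
      arduino = new Serial(this, Serial.list()[0], 9600);
      arduino.bufferUntil('\n');
    }

    void serialEvent(Serial s) {
      String line = trim(s.readStringUntil('\n'));
      if (line != null && line.length() > 0) {
        touched = int(line) > TOUCH_THRESHOLD;  // analog value from the capacitive sensor
      }
    }

    void draw() {
      // the real sketch activates the visual and audio outputs when `touched` is true
      background(touched ? color(60, 180, 90) : color(30));
    }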

How do we capture and translate real-time body movement to the iPad screen?

We used a depth-sensing camera, the Microsoft Kinect V2, to track body movement. The Kinect V2 measures how far the human body is from the camera and produces a depth image that captures body shape and movement. We cropped the depth image to fit the interaction area we designed. Using the Open Kinect for Processing library, we brought the depth image from the Kinect V2 into Processing, where a program we wrote renders the depth image's pixels as a field of dots.
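
A minimal sketch of that rendering loop follows, assuming Daniel Shiffman's Open Kinect for Processing library; the crop rectangle and dot spacing are placeholder values standing in for our actual interaction area:

    import org.openkinect.processing.*;

    Kinect2 kinect2;
    int STEP = 8;  // sample one dot per 8x8 block of depth pixels

    void setup() {
      size(512, 424);
      kinect2 = new Kinect2(this);
      kinect2.initDepth();
      kinect2.initDevice();
    }

    void draw() {
      background(0);
      PImage depth = kinect2.getDepthImage();      // grayscale depth frame (512x424)
      PImage area = depth.get(100, 60, 312, 300);  // crop to the interaction area
      area.loadPixels();
      fill(255);
      noStroke();
      for (int y = 0; y < area.height; y += STEP) {
        for (int x = 0; x < area.width; x += STEP) {
          // skip pixels with no depth signal; draw a dot wherever a body is in range
          if (brightness(area.pixels[y * area.width + x]) > 0) {
            ellipse(x, y, 5, 5);
          }
        }
      }
    }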

How do we trigger natural sounds?

We used Processing's Sound library to play sounds of streams and forests (MP3 files).
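
A minimal sketch, with illustrative file names in place of our actual MP3s:

    import processing.sound.*;

    SoundFile streamSound;
    SoundFile forestSound;

    void setup() {
      size(200, 200);
      // MP3 files live in the sketch's data folder; the names are illustrative
      streamSound = new SoundFile(this, "stream.mp3");
      forestSound = new SoundFile(this, "forest.mp3");
    }

    void mousePressed() {
      // in the installation, a plant touch rather than a click triggers playback
      if (!forestSound.isPlaying()) forestSound.play();
    }

    void draw() { }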

 

 

Design & Development Process

 

Iteration 1

 

“Is Nature Natural” is an interactive prototype that simulates a discussion about the human-plant relationship. Humans’ relationship to nature has never been entirely natural, as our interference with nature over time shows: we have altered nature through agriculture and through raising and domesticating animals, for instance.

We wanted to represent this relationship with our first prototype in order to bring human-plant interaction to life and understand how people react to it. By physically touching the plants, visitors could hear different sounds representing how humans have manipulated plants from the dawn of history to our shared future.


Challenge 1: What is the right prototype to build as an initial exploration, given the time constraint?

 
 
 

Building the first prototype with Scratch & Makey Makey

After researching prototyping materials, I proposed building an interactive prototype with these simple but powerful tools:

  • Scratch: an online platform that helps novices learn programming and easily create interactive games, animations and stories.

  • Makey Makey: an easy-to-use invention kit that turns conductive objects into computer keyboard keys (a rough Processing equivalent of our Scratch program is sketched below).
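
Our actual prototype was a Scratch program, but the mapping is easy to sketch in Processing: the Makey Makey registers a touch on each plant as an ordinary key press (arrow keys by default). The three file names here are illustrative stand-ins for our three sounds:

    import processing.sound.*;

    SoundFile soundA, soundB, soundC;

    void setup() {
      size(200, 200);
      // one sound per plant; file names are illustrative
      soundA = new SoundFile(this, "sound-a.mp3");
      soundB = new SoundFile(this, "sound-b.mp3");
      soundC = new SoundFile(this, "sound-c.mp3");
    }

    void keyPressed() {
      // each plant is wired to one Makey Makey input, i.e. one arrow key
      if (keyCode == LEFT)  soundA.play();
      if (keyCode == UP)    soundB.play();
      if (keyCode == RIGHT) soundC.play();
    }

    void draw() { }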


Concept Evaluation

To evaluate our initial concept, I created a video showing the human-plant interaction and shared it with participants.

Watching the video, participants could see how three different sounds were triggered when people touched the plants.

 
 
 

User Feedback

We identified value in encouraging users to interact with plants, as it helped us design for non-screen-based devices.

“It is so amazing!” All of our participants were fascinated by this unexpected human-plant interaction. We also reflected on several aspects of the design after this evaluation, which informed our next iteration.


Iteration 2

In this iteration, we replaced the three different sounds with sounds of forests and birds as the output. We chose forest and bird sounds because they evoke the idea of “nature.” But here we ran into a technical problem:

 
 

 
 

Challenge 2: How could we capture interaction without connecting users to an electric component, such as a wire or an alligator clip?

 
 
 

I researched human-plant interaction technology to understand existing solutions.

Disney Research has developed highly expressive interactive plants, called Botanicus Interacticus, which use capacitive sensing to turn almost any object into a touch- or gesture-sensitive interface.

 

We solved the challenge by replacing the Makey Makey with a capacitive sensor.

In this iteration, users could hear natural sounds after touching the plants without connecting their fingers to a wire or an alligator clip.


User Testing

Four participants touched the plant and heard the sounds during the test. Users were willing to interact with the plants without holding a wire or an alligator clip. However, testers still recommended two improvements, which we carried into the next iteration.


Iteration 3

 

We attempted to address the recommendations made by testers in the previous iteration, but we also ran into difficulties achieving the goals that grew out of the needs of potential users.

 

 
 

Challenge 3: How could our design go beyond simple touch and allow richer interaction?

 
 
 

We added pixelated representations of human body movement as another output. This was the more difficult of the two recommendations.

We thought visual feedback could engage users with the interface more intimately than audio feedback alone. Richer interaction combining auditory and visual information could make the interface more useful in both work and home contexts. We wanted to improve the human-plant interaction and make it more playful by allowing users to see themselves instead of remaining passive receivers.

We used the Microsoft Kinect V2 to track body movement and visualized it in Processing. When people touch the plant, they can see their bodies move in the visualization.
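
Here is a compact sketch of how the touch input gates the visualization, with the mouse standing in for the capacitive sensor so it runs on its own:

    import org.openkinect.processing.*;

    Kinect2 kinect2;
    boolean touched = false;  // in the installation this flag comes from the capacitive sensor

    void setup() {
      size(512, 424);
      kinect2 = new Kinect2(this);
      kinect2.initDepth();
      kinect2.initDevice();
    }

    void draw() {
      background(0);
      if (!touched) return;  // only show the pixelated body while the plant is touched
      PImage depth = kinect2.getDepthImage();
      depth.loadPixels();
      fill(255);
      noStroke();
      for (int y = 0; y < depth.height; y += 8) {
        for (int x = 0; x < depth.width; x += 8) {
          if (brightness(depth.pixels[y * depth.width + x]) > 0) {
            ellipse(x, y, 5, 5);
          }
        }
      }
    }

    // stand-in for the sensor: hold the mouse button to simulate touching the plant
    void mousePressed()  { touched = true; }
    void mouseReleased() { touched = false; }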


We also made these modifications in this iteration:

  • We replaced the domesticated plant with bamboo and a schefflera.

  • We gave each plant its own unique audio feedback (the visual output stayed consistent), offering different features such as talking about the weather or playing natural sounds.


User Testing

In this round of user testing, we were more interested in emotional responses, which we gauged through follow-up questions we asked the participants after they touched each plant.

An interesting finding was that when people interacted with the nature/digital hybrid, they immediately thought of voice assistants and how our approach could improve their form factor.

Two users thought it would be helpful to get weather or traffic information, while others were tired of having more voice assistants at home: “There are too many personal voice assistants, like Google Home and Amazon Alexa, in the market.” Even for people tired of voice assistants, our design can promote engagement by providing customized natural objects for a variety of functions, including meditation, installation, entertainment and education.

 
 

Insights from testing this iteration also resulted in these design recommendations that informed our final design.

Match the length of touches with the length of sounds

“The ongoing sound seems weird.” — Participant 1

Provide customized natural objects as the interface to improve engagement

“I would like to play with my favorite natural objects, like fire, wood or water.” — Participant 3

Hide the wires and other technical components that kept people away from touching the plants

“When I saw this initially, I wouldn’t touch it. I saw these wires.” — Participant 1

 

 

Final Design & Outcome

 

We made several modifications in response to user feedback:

  • Compared with the earlier iteration, we adjusted the length of the sounds to match how long the user touched the plant (see the sketch after this list).

  • We built presentation materials, such as the podium and iPad holders, so the wires and laptop could be completely visually concealed.

  • We added water as another organic interface for users to interact with, and put small rocks in the water pot to visually hide the capacitive sensor.
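
A minimal sketch of the adjusted sound behavior, reusing the serial-reading pattern from earlier: the sound loops only while the reading stays above the touch threshold, so playback lasts exactly as long as the touch (file name and threshold are illustrative):

    import processing.serial.*;
    import processing.sound.*;

    Serial arduino;
    SoundFile natureSound;
    int TOUCH_THRESHOLD = 200;  // illustrative

    void setup() {
      size(200, 200);
      natureSound = new SoundFile(this, "forest.mp3");  // illustrative file name
      arduino = new Serial(this, Serial.list()[0], 9600);
      arduino.bufferUntil('\n');
    }

    void serialEvent(Serial s) {
      String line = trim(s.readStringUntil('\n'));
      if (line == null || line.length() == 0) return;
      boolean touched = int(line) > TOUCH_THRESHOLD;
      // start the sound on touch, stop it on release
      if (touched && !natureSound.isPlaying()) natureSound.loop();
      if (!touched && natureSound.isPlaying()) natureSound.stop();
    }

    void draw() { }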


Final Exhibition

We received a lot of positive feedback from visitors, who were delighted by this novel interaction.

“I’m so surprised by the interaction that comes out.” “This is genius! I never knew plants can work like this. I would definitely use it a lot in my everyday life.”


 

Reflection

  • Prototyping is actually more about getting around technical constraints in order to explore designs.

  • Designing for non-screen-based devices requires engaging different senses.

Future direction

Our next step is to take Shama beyond an entertaining interface and turn it into a personal meditation assistant by integrating the Muse brain-sensing headband. By sensing the user’s brainwaves, the headband could measure their changing attention levels, which would be visualized in our system. After physically touching the natural objects, viewers could then control the body-tracking visualizations and natural sounds through their brainwaves. The changing visualizations and sounds would in turn help people learn the essence of meditation and mindfulness while enjoying a playful, immersive experience.

 

Special thanks to my awesome teammates ❤️🤭❤️