The Sensory Extension - Embedded Wearable and Sensing Technology

UX Research, Ideation, Prototype, Art Direction

Speculative Design, Interaction Design

February 2016

Schei Wang


With the advent of wearables and the Internet of Things, and with increasingly accurate voice, gestural, computer vision and brain-computer interfaces, our interactions are transitioning from point and click, multi-touch, and typing to talking, gesturing, behaving, and even thinking.


As everyday computational systems move from computers and phones to wearables, smart objects and environments, what are the implications for design? What are the new design patterns? How does the character of interaction change when there is no screen to look at or touch? What new uses will embedded interaction create?

What if one day we could upload all of our sensory data to the cloud, then retrieve it and re-experience it through simple gestural controls? The gestures are so subtle that they let users communicate with the computer silently.


Participants can use gesture controls to stream smells. This experiment was designed to let participants switch between two scents and make a deliberate choice about whether to fuse them. Both smells are emitted at the same time when the thumb stops at the midpoint of the index finger.
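The interaction rule above can be sketched in code. This is a minimal, hypothetical illustration, assuming a sensor that reports the thumb's normalized position along the index finger (0.0 at the base, 1.0 at the tip); the function and scent names are ours, not part of the project's implementation:

```python
def select_scents(thumb_position, tolerance=0.05):
    """Map the thumb's normalized position along the index finger
    to the scents to emit (0.0 = base of finger, 1.0 = fingertip).

    Near the midpoint (within `tolerance`), both scents are fused;
    otherwise the nearer half of the finger selects a single scent.
    """
    if abs(thumb_position - 0.5) <= tolerance:
        return ["scent_a", "scent_b"]  # fuse both smells
    return ["scent_a"] if thumb_position < 0.5 else ["scent_b"]
```

For example, `select_scents(0.2)` streams only the first scent, while `select_scents(0.5)` emits both at once, matching the "stop at the midpoint" behavior described above.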

How might it work?

For sensing different “feelings,” Sensory Extension chips would be implanted in our fingertips; we would be able to receive and emit olfactory, gustatory, tactile, visual, and acoustic impressions. The Sensory Extension would allow users to save, share, and receive sensory data, and even experience multiple sensory impressions simultaneously.
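One way to picture the save/share model is as a record of multimodal impressions stored in a shared service. The sketch below is purely illustrative of the speculative concept; every class and method name is our own invention, not an existing API:

```python
from dataclasses import dataclass, field
import time


@dataclass
class SensoryImpression:
    """A hypothetical record of one captured moment. Several modalities
    are stored together so they can be re-experienced simultaneously."""
    modalities: dict  # e.g. {"olfactory": "lavender", "tactile": "soft fur"}
    timestamp: float = field(default_factory=time.time)


class SensoryCloud:
    """Toy in-memory stand-in for the speculative cloud service."""

    def __init__(self):
        self._store = {}

    def save(self, user, impression):
        # a user uploads a captured sensory moment
        self._store.setdefault(user, []).append(impression)

    def share(self, sender, receiver):
        # the receiver retrieves the sender's moments to re-experience them
        return list(self._store.get(sender, []))
```

In this model, "feeling a moment from someone else's position" amounts to retrieving their stored impressions and replaying all modalities at once.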


Instead of external commands, we designed various gestures to perform multiple functions, since these feelings are more internal and intimate than speech.

The subtle gestures allow users to interact with computers secretly and privately. People can experience poetic sensations when they use their fingers to caress the surfaces of objects, and can even feel things they miss through the collaboration of imagination and technology. The experience is poetic, sensitive, and intimate.


We speculate that, in the future, people will be able to use their fingers to perceive, save, and share moments with others.

By sharing personal moments, viewing and experiencing things from other people’s perspectives is no longer impossible.

With a few simple gestural instructions and the help of the device, users will be able to enjoy a unique dining experience, or share it via social media. They will even be able to feel the pets they miss...

With this speculative device, users are not only capable of feeling more, but are also able to save and share their personal moments with others.


The project was inspired by what the sense of touch means to us in our daily lives. Tactile sensation plays a crucial part in helping us distinguish objects, confirm the existence of things, and, even more importantly, amplify other senses. Our interest in fingers was also aroused by cultures where people “describe in rhapsodic terms the advantages of eating with their fingers: the sensuous connection to the food, the feeling of sharing and community” (Mindess).


Related research, such as Disney’s 3D touch surface and Google’s Project Soli, focuses more on the technical engineering aspects and less on users’ mental or physical responses to the technology. What matters to us, instead, is bringing memory, emotion, and privacy into the project.


Experiment A - (15 participants) The sense of taste collaborates well with tactile, visual, and olfactory impressions. One of our experiments was about unveiling the interrelations among olfactory, gustatory, tactile, visual, and acoustic impressions. Participants were asked to touch different objects with their eyes covered, and then draw or describe how they thought the objects looked, smelled, tasted, and sounded.

Experiment B - We explored patterns for communicating our gustatory experience via touch. The tactile patterns are related to the feelings on our tongues while eating.

Experiment Insights

+ Sensations might be limited by our linguistic and visual vocabulary, leaving parts of sensory perception unexplored.

+ The sense of touch is not as sensitive as we expect it to be. For instance, through touch alone, participants could not tell whether their whole fingers were soaked in liquid without sensing differences in temperature or the resistance of the water.

+ Tactile and visual experiences are closely connected to the shape of objects, just as smell and taste are related to each other. Hearing, however, is isolated from the other sensory experiences.

+ When the senses collaborate well, our sensory capability can be amplified. We asked how we might apply these discoveries in further exploration.


As the functions of computers become more and more complicated and personal:

- How would emotions and memory be more involved in our conversations with computers?

- What would the relationship between humans and computers be like?

- Who would have more power in that relationship?

- Rather than mimicking the real world in virtual reality, can we compose the "real" differently or in a poetic way?

Anthropologist Mary Catherine Bateson said that “poetry is important for finding out about the world,” a perspective that can bring new meaning to our lives.

Future Possibilities

We speculate about the future of human senses. Besides fingertips, the sensory extension chips could also be implanted in other parts of the body.
What if we could create an inspiring experience through a poetic composition of senses? This is a question we also want to explore in future research. For instance, we could let participants see a yellow chair, hear water boiling, taste milk, and smell lavender at the same time.

- How would people perceive the simultaneous stimuli?

- How much could this technology make people more sensitive in their lives?

- How often might it distract people from their current experience?

︎     ︎     ︎