Silent Signals

Breaking the language barrier with technology

About the project

This experiment investigated how digital tools can create presence, community, and engagement across distance and time.

‘Silent Signals’ is an experiment that aims to break down language barriers between users in different locations by letting them send and receive text messages, in the language of the intended recipient, using simple gestures. Gesture detection is built on the PoseNet skeleton-tracking framework, and the PubNub API handles sending and receiving the messages. The interaction is meant to be seamless: users’ bodies become the controllers that trigger messages. It does away with the keyboard as an input device and moves communication into the physical realm, engaging people in embodied interaction. The system can involve multiple users, regardless of the spatial distance between them.
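The repository linked below holds the actual implementation; as a rough illustration of the pipeline described above, the TypeScript sketch below estimates a pose with PoseNet, applies a deliberately simple gesture rule, and publishes a pre-translated message to each recipient over PubNub. The keys, channel names, gesture rule, and message map are all hypothetical, not the project's real code.

```typescript
import * as posenet from '@tensorflow-models/posenet';
import PubNub from 'pubnub';

// Hypothetical keys and identity, for illustration only.
const pubnub = new PubNub({
  publishKey: 'YOUR_PUBLISH_KEY',
  subscribeKey: 'YOUR_SUBSCRIBE_KEY',
  userId: 'sender-1',
});

// One gesture maps to a message pre-translated into each
// recipient's language (names and text are illustrative).
const MESSAGES: Record<string, Record<string, string>> = {
  wave: { priya: 'नमस्ते', nadine: 'Bonjour' },
};

// A deliberately naive rule: right wrist raised above the nose
// counts as a "wave". The prototype's actual gestures differ.
function detectGesture(pose: posenet.Pose): string | null {
  const wrist = pose.keypoints.find((k) => k.part === 'rightWrist');
  const nose = pose.keypoints.find((k) => k.part === 'nose');
  if (!wrist || !nose) return null;
  const confident = wrist.score > 0.6 && nose.score > 0.6;
  return confident && wrist.position.y < nose.position.y ? 'wave' : null;
}

async function run(video: HTMLVideoElement) {
  const net = await posenet.load();
  setInterval(async () => {
    const pose = await net.estimateSinglePose(video, { flipHorizontal: true });
    const gesture = detectGesture(pose);
    if (!gesture) return;
    // Publish one message per recipient, each on their own channel.
    for (const [recipient, text] of Object.entries(MESSAGES[gesture])) {
      await pubnub.publish({ channel: recipient, message: { gesture, text } });
    }
  }, 500);
}
```

On the receiving side, each participant would subscribe to their own channel (`pubnub.subscribe({ channels: ['priya'] })`) and render incoming text from a message listener.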

 

Team
Jignesh, Priya & Nadine. Digital Futures M. Des

OCAD University, Toronto

My Role
Concept, Prototyping & Testing

Duration
2 weeks of exploration & prototyping

 

Prototype

Concept & Demonstration

 

Project Context

When we first started thinking about communication, we realized that between the three of us we had three different languages: Priya and Jignesh’s native Hindi, Nadine’s native French, and English, which all three shared as a second language. We imagined teams collaborating on projects across international borders, isolated seniors who may only speak one language, and globetrotting millennials who forge connections throughout the world. How could we enable them to connect across language barriers by making their bodies the interface?

Adaptive Challenges

There is currently no standard format for body-to-machine gestures. Gestures and their meanings vary from country to country.

  • While the thumbs up gesture has a positive connotation in the North American context, it has a vulgar connotation in West Africa and the Middle East.

  • The Indian head wobble has no single fixed meaning; the message is nonetheless conveyed to the other person through accompanying facial expressions and actions.

Observations

  • Participants described this form of embodied, gesture-based interaction as a unique and satisfying experience.

  • Users appreciated being able to develop their own set of gestures to communicate instead of being confined to existing sign languages.

  • Some users raised concerns about messages being sent when they were not intended.

Next Steps

  • Because gestures were falsely identified as users transitioned between poses, the team decided to test the prototype with image-recognition technologies instead of the PoseNet-based skeleton-detection framework (see the sketch after this list).

  • The application could combine voice, text, and gesture control in a single interface.
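As one plausible direction for that image-recognition test (an assumption on our part, not the team's final design), whole webcam frames can be classified by feeding MobileNet embeddings into a KNN classifier, so each user trains their own gesture vocabulary by example:

```typescript
import * as mobilenet from '@tensorflow-models/mobilenet';
import * as knnClassifier from '@tensorflow-models/knn-classifier';

// Transfer-learning sketch: MobileNet turns each frame into an
// embedding, and a KNN classifier matches it to user-taught labels.
async function buildGestureClassifier(video: HTMLVideoElement) {
  const net = await mobilenet.load();
  const classifier = knnClassifier.create();

  // Call repeatedly while the user holds a gesture,
  // e.g. addExample('hello') for a few dozen frames.
  const addExample = (label: string) => {
    const embedding = net.infer(video, true); // embedding, not logits
    classifier.addExample(embedding, label);
  };

  // Classify the current frame; accept only confident matches.
  // (Tensor disposal is omitted here for brevity.)
  const predict = async (): Promise<string | null> => {
    if (classifier.getNumClasses() === 0) return null;
    const embedding = net.infer(video, true);
    const result = await classifier.predictClass(embedding);
    return result.confidences[result.label] > 0.9 ? result.label : null;
  };

  return { addExample, predict };
}
```

Because this looks at whole frames rather than a skeleton, transitions between poses would tend to produce low-confidence frames that the threshold filters out, which may address the false triggers observed with the PoseNet version.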

 “Today’s technology can sometimes feel like it’s out of sync with our senses as we peer at small screens, flick and pinch fingers across smooth surfaces, and read tweets “written” by programmer-created bots. These new technologies can increasingly make us feel disembodied.”

Paul R. Daugherty, Olof Schybergson and H. James Wilson
Harvard Business Review

CODE ON GITHUB

https://github.com/jigneshgharat/Silent-Signals

REFERENCES

Oliveira, Joana. “Emoji, the New Global Language?” OpenMind, https://www.bbvaopenmind.com/en/technology/digital-world/emoji-the-new-global-language/. Accessed online November 14, 2019.

Evans, Vyvyan. The Emoji Code: The Linguistics Behind Smiley Faces and Scaredy Cats. Picador, 2018. Excerpt: https://us.macmillan.com/excerpt?isbn=9781250129062. Accessed online November 15, 2019.

Daugherty, Paul R., Olof Schybergson, and H. James Wilson. “Gestures Will Be the Interface for the Internet of Things.” Harvard Business Review, 8 July 2015, https://hbr.org/2015/07/gestures-will-be-the-interface-for-the-internet-of-things. Accessed online November 12, 2019.

Oved, Dan. “Real-time Human Pose Estimation in the Browser with TensorFlow.js.” Medium, 2018, https://medium.com/tensorflow/real-time-human-pose-estimation-in-the-browser-with-tensorflow-js-7dd0bc881cd5. Accessed online November 10, 2019.
