Banner

Introduction

SignSense is an innovative AI-powered sign language translator designed to foster inclusivity by seamlessly integrating into popular platforms such as Google Lens, CCTVs, and video conferencing applications. This pioneering solution addresses the communication barriers faced by the over 70 million individuals worldwide who rely on sign languages as their primary mode of communication.

Team

3 Designers

My Role

Responsible for UX Research & Conceptualisation

Duration

6 Weeks

Background

WFD: World Federation of the Deaf
More than 70 million people worldwide use sign languages as their primary means of communication.

Visual

Discover

We began by learning about how sign language functions, who uses it, how well it works, and the role of technology in it.

Bubble Diagram

Research

More than 300 sign languages exist around the world, and the meaning of a sign may vary depending on the region.

Most used sign languages globally:

• American Sign Language (ASL)
• Chinese Sign Language (CSL)
• British Sign Language (BSL)
• Indian Sign Language (ISL)

After going through numerous articles, blogs, and research papers about sign language, we gained insights into how it works, who uses it, and the challenges people face when using it to communicate.

We also came to understand our target audience:
it's not only individuals who use sign language who face challenges, but also those who wish to interact with them.

Pyramid

Design for All

Challenges

One major challenge they face is
"The Communication Barrier"

Visual

Define

After analyzing the existing solutions, we examined where they fall short and how we could turn those gaps into strengths, defining them as our areas of opportunity.

Areas of Opportunity

Visual

Accessibility

Visual

Emerging AI

As technology has progressed, we have seen the emergence of better AI tools that are more accessible and efficient. This has led us to consider using AI as a possible solution to our problems.


Problem Statement

How can AI be used to enhance the lives of individuals who use sign language?

Ideation

We began by brainstorming all possible ideas, taking into account our areas of opportunity and considering every potential scenario where sign language is used.

Ideation_Visual

We shortlisted the top ideas we wanted to take forward:

• Live AI Translation

• Integration into Google Lens

• Integration into online meetings

• Integration into CCTVs

• Extension to other systems, such as smart glasses

We then envisioned an ecosystem covering all these scenarios, to make the solution as efficient and effective as possible.

Supporting_Visuals
hero_page

An AI-powered sign language translator for greater inclusivity, integrating into Google Lens, CCTVs & video conferencing apps

Eco-System

Supporting_Visuals

How SignSense works – Let's build a scenario

Meet Vanshika

A deaf person who uses sign language. Her dog gets sick, but she doesn't know any nearby vets. She meets someone and tries to convey the message through signs, but the person is not familiar with sign language.

Persona
screen_design

So the other person opens Google Lens to translate her signs into text

screen_design

They tap the SignSense option

screen_design

The intro screen (splash screen) is shown

screen_design

Tap on the Record button to record the gestures

screen_design

The signs are translated into text in real time

screen_design
screen_design

Text to speech

Entering the reply in the form of text

After you stop recording, a screen will display the whole message

screen_design

The text message is translated into sign language through an avatar, making it easier for Vanshika to understand the message (in case she is not literate)

The camera then switches back towards Vanshika to translate her signs further, and the loop goes on.

Vanshika felt relieved and grateful that her dog could receive the care it needed, all because SignSense let her communicate with ease.
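The round-trip above (record signs → show text → type a reply → show the avatar) can be sketched as a simple loop. Everything below is a hypothetical illustration: `SIGN_VOCAB`, `translate_signs_to_text`, and `translate_text_to_avatar` are stand-in stubs, since SignSense's recognition and avatar components are design concepts, not public APIs.

```python
# Hypothetical sketch of the SignSense conversation loop.
# A real system would use a gesture-recognition model and an
# animation renderer; here both are stubbed for illustration.

SIGN_VOCAB = {
    ("point-at-dog", "sick-sign"): "My dog is sick, is there a vet nearby?",
}

def translate_signs_to_text(gestures):
    """Map a recorded gesture sequence to text (stubbed dictionary lookup)."""
    return SIGN_VOCAB.get(tuple(gestures), "[unrecognized signs]")

def translate_text_to_avatar(text):
    """Render a typed reply as an avatar signing animation (stubbed label)."""
    return f"<avatar signing: {text}>"

def conversation_turn(gestures, reply_text):
    """One loop of the flow: signs become text, the reply becomes avatar signs."""
    heard = translate_signs_to_text(gestures)
    shown = translate_text_to_avatar(reply_text)
    return heard, shown

heard, shown = conversation_turn(
    ["point-at-dog", "sick-sign"], "The vet is two blocks away"
)
```

Each call to `conversation_turn` corresponds to one cycle of switching the camera between the two speakers; in the concept, this loop repeats until the conversation ends.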

screen_design

We integrated SignSense into Google Lens because it comes built into most smartphones and most people are familiar with its interface. This makes SignSense more accessible to all.

Integration of SignSense

In today's fast-paced world, online meetings have gained immense popularity. Their ability to quickly connect a wide audience has made them a preferred choice.

Supporting_Visual

Let's see how this integration works

Once again, SignSense came to Vanshika's aid during an online meeting. In the past, she struggled to convey her thoughts by typing messages, which often took a lot of time.

screen_design

Now she can just turn on SignSense in the message tab

screen_design

A splash screen appears for 2-3 seconds

screen_design

Now, she can directly translate her sign language into text messages

This saves her time and energy by avoiding typing long messages

An emergency situation

Integrating SignSense into CCTVs

Consider a Scenario

In a crowded public place, Alex, who relies on sign language to communicate, suddenly faces an emergency. With no one else around, he spots a nearby CCTV camera and urgently signals for help using signs.

Integrating SignSense into the CCTV system would enable the person on surveillance duty to understand the individual's sign language and respond quickly with assistance

supporting_image

Admittedly, the probability of a sign language user being stuck in such a situation is much lower than for a hearing person.
