Augmented

Language

An Augmented Reality application integrated with hand-tracking to help users learn sign language

Application Development | Accessibility | Mixed Reality 

 

Tools: Blender, Unity

Device: Magic Leap 

Duration: Oct to Dec 2020

Project Overview

Deaf or Hard of Hearing (DHH) individuals primarily communicate through sign language, while hearing individuals communicate through spoken language, creating a communication gap between the two communities. Since these communities share the same spaces, whether co-enrolled in academic settings or working as colleagues in the same company, reducing this communication gap is essential.

 

Learning sign language is quite different from learning a spoken language: spoken language relies on speech and hearing, while sign language is visual and manual. Along with facial expressions and lip movements, sign language depends heavily on hand gestures performed in 3-dimensional space. Currently available resources such as images and videos are restricted to the 2-dimensional domain, making it difficult to master sign language with them alone. Utilizing Mixed Reality can help us break this 2D barrier.

Problem Statement

How can we improve the effectiveness of learning sign language using advanced technology?

Solution

A viable solution should provide

An interactive learning method

Remote and constant availability

Learning without expert supervision

Real-time feedback

With these requirements in mind, Mixed Reality emerges as a potential solution. The goal of this project is to develop an Augmented Reality (AR) application prototype integrated with Mixed Reality (MR) technologies such as hand-tracking.

 

The application would help the user visualize a particular sign's hand gestures and movement in AR and would perform hand-tracking to provide real-time feedback on whether the user is reproducing the sign correctly. To limit the scope of the prototype, this project focuses on self-learning British Sign Language (BSL) numbers from 0 to 10.

The application helps users self-learn British Sign Language (BSL) numbers from 0 to 10 by visualizing the signs in AR, while hand-tracking provides real-time feedback.

Mixed Reality Device

Magic Leap

3D development

Unity engine

3D graphics creator

Blender

Problem Scope

According to Global Market Insights, the AR market could surge up to

$165 billion

that is 85%

in the next 4 years

Target Domain

466 million

people worldwide have disabling hearing loss

Sign language is the primary mode of communication for individuals from the Deaf and Hard-of-Hearing community

User Flow

A user wears Magic Leap, a head-mounted device, and 3D instructional hands are displayed in augmented reality on the headset's screen. These instructional hands help the user visualize a particular sign's hand gesture and its movement in space. The user then reproduces the same gesture, and the headset performs hand-tracking to recognize the user's hand gesture in real time and determine whether it was performed correctly.

Sketches

Home Screen

All the signs are displayed on the home screen. The user selects the sign they want to learn, and it opens on the next screen.

ViewSign Screen

The selected sign is displayed, and the user can move to the previous or next sign in the list.

From here, the user can either view the hand movement of the sign by clicking the "View Animation" button or practice reproducing the sign.
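As a rough sketch of how this screen could be wired up in Unity, the snippet below cycles through the BSL number signs with the previous/next buttons and swaps the visible hand model; the class, fields, and method names are hypothetical and not taken from the project's code.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical sketch of the ViewSign screen: the Previous / Next buttons cycle
// through the BSL number signs (0-10) and swap the visible hand model.
public class ViewSignScreen : MonoBehaviour
{
    [SerializeField] private GameObject[] signModels;  // one hand model per BSL number 0-10
    [SerializeField] private Text signLabel;           // text box with the sign's detail

    private int current;

    public void ShowNext()     => Show((current + 1) % signModels.Length);
    public void ShowPrevious() => Show((current - 1 + signModels.Length) % signModels.Length);

    private void Show(int index)
    {
        // Hide the current hand model and show the newly selected one.
        signModels[current].SetActive(false);
        current = index;
        signModels[current].SetActive(true);
        signLabel.text = "BSL number " + current;
    }
}
```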

PracticeSign Screen

If the "Practice Sign" button is selected, the system moves to the next screen, where the user is asked to reproduce that particular sign. While the user performs the sign, the system runs gesture recognition to determine whether the sign was reproduced correctly.

Implementation

01

Created hand models

To create the desired application, I first needed to make the instructional hands and animate them. The hand models, their rigging, and their animations were all created in Blender 2.9.

02

Engine and device setup

To display these hand models on the Magic Leap device, the Unity engine was used, but the two technologies first had to be integrated. Magic Leap runs Lumin OS, so it was important to include the Magic Leap Lumin SDK v0.24.1 and the Magic Leap Unity Package v0.24.2.

03

Importing models into the engine

Once the Unity setup was ready, the objects were imported into Unity. As the hand models included animation, I exported them from Blender as .fbx files. Each model had its own animation defined in Blender and was assigned its own Animator Controller, providing the flexibility to control each object's animation.
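The snippet below is an editor-only sketch of how this per-model setup could be automated: it creates one Animator Controller from each animation clip found in the imported .fbx files. In the project this was done manually in the Unity editor, and the folder paths used here are assumptions.

```csharp
using UnityEditor;
using UnityEditor.Animations;
using UnityEngine;

// Editor-only sketch: builds one Animator Controller per animation clip found
// inside the imported .fbx hand models. The folders "Assets/HandModels" and
// "Assets/Controllers" are illustrative and assumed to exist.
public static class SignControllerBuilder
{
    [MenuItem("Tools/Build Sign Animator Controllers")]
    public static void Build()
    {
        foreach (string guid in AssetDatabase.FindAssets("t:Model", new[] { "Assets/HandModels" }))
        {
            string fbxPath = AssetDatabase.GUIDToAssetPath(guid);

            foreach (Object asset in AssetDatabase.LoadAllAssetsAtPath(fbxPath))
            {
                // Pick up the animation clips embedded in the .fbx, skipping Unity's preview clips.
                if (asset is AnimationClip clip && !clip.name.StartsWith("__preview__"))
                {
                    string controllerPath = "Assets/Controllers/" + clip.name + ".controller";
                    AnimatorController.CreateAnimatorControllerAtPathWithClip(controllerPath, clip);
                }
            }
        }
    }
}
```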

04

Scene setup and logic scripts

A plane had to be defined in order to display the signs from the user's perspective. All the signs were placed on this plane, each with a text box describing it. A script, AnimScript, was created to control each hand model's animation and display the frame where the sign is formed.
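A minimal sketch of the AnimScript idea is shown below: the hand model is held on the animation frame where the sign is formed, and the full hand movement is replayed on demand (for example, from the "View Animation" button). The animation state name and pose-frame value are assumptions, not the project's actual settings.

```csharp
using UnityEngine;

// Minimal sketch of the AnimScript idea: hold each hand model on the animation
// frame where its sign is formed, and replay the full hand movement on demand.
public class AnimScript : MonoBehaviour
{
    [SerializeField] private Animator animator;
    [SerializeField, Range(0f, 1f)] private float signPoseTime = 1f; // normalized time of the sign's pose frame (assumed)

    private void Start()
    {
        // Jump to the pose frame and freeze there. "Sign" is an assumed state name.
        animator.Play("Sign", 0, signPoseTime);
        animator.Update(0f);   // force an evaluation so the pose is applied immediately
        animator.speed = 0f;
    }

    public void PlayAnimation()
    {
        // Replay the full movement of the sign from the beginning.
        animator.speed = 1f;
        animator.Play("Sign", 0, 0f);
    }
}
```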

05

Magic Leap hand-tracking integration

Magic Leap's built-in hand-tracking recognizes the user's hand pose and shows whether the sign being performed is correct or wrong. If the user produces the correct sign within 3 seconds with a KeyPoseConfidence above 90%, a success message is shown; otherwise, a wrong-sign message is displayed on the screen.
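A simplified sketch of this check is shown below, using the 3-second window and 90% threshold described above. QueryKeyPose() and QueryKeyPoseConfidence() are placeholders that stand in for the Magic Leap hand-tracking queries (MLHands in the Lumin SDK); they are not actual SDK calls.

```csharp
using UnityEngine;

// Sketch of the practice check: within a 3-second window the user must produce
// the expected key pose with a KeyPoseConfidence above 90%.
public class PracticeSignChecker : MonoBehaviour
{
    [SerializeField] private string expectedKeyPose;          // key pose mapped to the current BSL number (assumed mapping)
    [SerializeField] private float timeLimit = 3f;            // seconds allowed to produce the sign
    [SerializeField] private float requiredConfidence = 0.9f; // 90% KeyPoseConfidence threshold

    private float elapsed;
    private bool finished;

    private void Update()
    {
        if (finished) return;
        elapsed += Time.deltaTime;

        bool correct = QueryKeyPose() == expectedKeyPose
                       && QueryKeyPoseConfidence() >= requiredConfidence;

        if (correct)
        {
            ShowResult("Correct sign!");
            finished = true;
        }
        else if (elapsed >= timeLimit)
        {
            ShowResult("Wrong sign, try again.");
            finished = true;
        }
    }

    // Placeholder: would read the currently detected key pose from the SDK.
    private string QueryKeyPose() { return ""; }

    // Placeholder: would read the confidence of the detected key pose.
    private float QueryKeyPoseConfidence() { return 0f; }

    // Placeholder: would show the message on the AR screen instead of the console.
    private void ShowResult(string message) { Debug.Log(message); }
}
```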

Final Result

Future Scope

Currently, Magic Leap detects only 8 key poses, which limits the scope of the project: I was only able to detect a few of the BSL signs. BSL numbers 0, 1, 5, 8, and 10 can be detected using the Magic Leap key poses, while BSL numbers 2, 3, 4, 6, 7, and 9 cannot. I tried using key points instead; however, Magic Leap does not provide a Boolean variable to indicate whether the desired key points are available.

This project focused on developing a functional model. In the future, we can work on the form of the project and improve the user experience.

The prototype can be extended into a complete application that helps users learn a wide range of signs from different sign languages. It can also be extended to include two-handed signs. Moreover, each sign's position could be displayed relative to the user's body to help users better understand the sign's orientation.