Rupert Bennett

About the Project

The Sign Language Interpreter was a project I took on for my final-year dissertation. I looked into the communication gap between deaf and non-deaf people and realised that, in the UK, deaf people using British Sign Language (BSL) can communicate with only 0.23% of the population. This is clearly an issue, and I hoped I could create an application to bridge the gap.

This project builds on the basic features of the Leap Motion controller and the Leap SDK to record and recognise hand gestures and poses, interpret them into words and sentences, and feed them back to the user via audio output. The ability to train the system is vital so that learnt poses can be recognised in the future. The idea is that a deaf person could use the application to sign what they wish to say, and the non-deaf participant could hear through the system what is being said without having to understand sign language themselves. This would take that 0.23% communication level up to 100% of the UK's population.

The Design

The design of this system is relatively simple and doesn't take much time to understand. The Leap Motion controller is connected to the PC via a USB cable, as is the Arduino (the Arduino is only needed for the "home automation" part of the system). The code was developed in Java using the NetBeans IDE. The Arduino code was developed in the Arduino IDE and talks to the Java code, allowing messages to be passed over a COM (serial) port.

Gestures and poses are stored in lists at run time and are serialised to a file when the program is closed. Upon launching the application, the learnt gestures and poses are deserialised back into the lists. Audio files are stored locally in the file system to allow quick access and prevent unnecessary lag. This could be changed to use a voice-as-a-service (VaaS) provider; audio files would then no longer need to be stored locally, but latency might become an issue.
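
A minimal sketch of how this save/load cycle can look using Java's built-in object serialisation. The GesturePose class and the file name here are illustrative placeholders, not the project's actual classes:

```java
import java.io.*;
import java.util.ArrayList;
import java.util.List;

// Illustrative container; the project's real gesture/pose classes may differ.
class GesturePose implements Serializable {
    private static final long serialVersionUID = 1L;
    String word;      // the word or letter this pose maps to
    float[] features; // recorded hand-feature values
}

public class GestureStore {
    // Write the learnt poses out when the program closes.
    static void save(List<GesturePose> poses, File file) throws IOException {
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(new ArrayList<>(poses));
        }
    }

    // Read them back into the in-memory list on launch.
    @SuppressWarnings("unchecked")
    static List<GesturePose> load(File file) throws IOException, ClassNotFoundException {
        if (!file.exists()) return new ArrayList<>();
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            return (List<GesturePose>) in.readObject();
        }
    }
}
```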

The Implementation

Implementation of the system was carried out in a structured way, having been planned in advance. As I had never used the Leap Motion before, I spent some time getting to understand its functionality and capabilities, as well as looking into the SDK, which can be downloaded online.

I then got the controller to recognise how many fingers I was extending and developed an algorithm to recognise the particular combinations of extended fingers that match American Sign Language (ASL) number poses. This became the number-recognition part of the system.
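
The Leap v2 Java SDK reports per-finger extension, so a sketch of this idea looks something like the following. The pattern-to-number mapping shown is a simplified illustration, not the full ruleset I used:

```java
import com.leapmotion.leap.*;

public class NumberRecogniser {

    // Encode which fingers are extended as a 5-bit pattern:
    // thumb = bit 0, index = bit 1, ..., pinky = bit 4.
    static int extendedPattern(Hand hand) {
        int pattern = 0;
        for (Finger finger : hand.fingers().extended()) {
            pattern |= 1 << finger.type().ordinal();
        }
        return pattern;
    }

    // Map a handful of patterns to ASL number poses.
    static String recognise(Frame frame) {
        if (frame.hands().isEmpty()) return null;
        switch (extendedPattern(frame.hands().frontmost())) {
            case 0b00010: return "one";    // index only
            case 0b00110: return "two";    // index + middle
            case 0b00111: return "three";  // thumb + index + middle
            case 0b11110: return "four";   // all four fingers, no thumb
            case 0b11111: return "five";   // open hand
            default:      return null;     // no match in this frame
        }
    }
}
```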

The next step was to record and recognise alphabetical letters. These algorithms involved pattern matching to determine which letter, if any, was being posed.
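
One common way to do this kind of pose matching (a sketch of the general technique, not necessarily my exact scoring) is nearest-neighbour comparison of fingertip positions relative to the palm, accepting a letter only when the best match falls under a distance threshold:

```java
import com.leapmotion.leap.*;
import java.util.Map;

public class LetterMatcher {

    // Describe a pose as each fingertip's offset from the palm,
    // flattened into a 15-value feature vector (5 fingers x 3 axes).
    static float[] features(Hand hand) {
        float[] f = new float[15];
        int i = 0;
        for (Finger finger : hand.fingers()) {
            Vector offset = finger.tipPosition().minus(hand.palmPosition());
            f[i++] = offset.getX();
            f[i++] = offset.getY();
            f[i++] = offset.getZ();
        }
        return f;
    }

    // Nearest-neighbour match against the learnt letter templates.
    static String match(Hand hand, Map<String, float[]> templates, double threshold) {
        float[] live = features(hand);
        String best = null;
        double bestDist = threshold; // anything further than this is "no letter"
        for (Map.Entry<String, float[]> entry : templates.entrySet()) {
            double sum = 0;
            for (int i = 0; i < live.length; i++) {
                double d = live[i] - entry.getValue()[i];
                sum += d * d;
            }
            double dist = Math.sqrt(sum);
            if (dist < bestDist) {
                bestDist = dist;
                best = entry.getKey();
            }
        }
        return best; // null means no letter was recognised
    }
}
```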

I was then able to expand on these algorithms so the system could record and recognise gestures that correspond to words, which can be strung together to create sentences. This was a challenging concept because I was no longer analysing an individual frame from the controller. Instead, I had to know when to start and stop recording frames, based on the user's hand velocity. Once all of these frames had been recorded, the algorithms analysed them to understand the direction of the user's hands in every frame.
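
The start/stop logic amounts to a small state machine gated on palm speed. A minimal sketch of that idea, using the SDK's palmVelocity reading; the threshold values here are illustrative and would need tuning against real signing speed:

```java
import com.leapmotion.leap.*;
import java.util.ArrayList;
import java.util.List;

public class GestureRecorder {
    // Speeds in mm/s; illustrative values only.
    private static final float START_SPEED = 300f;
    private static final float STOP_SPEED  = 100f;

    private final List<Frame> recording = new ArrayList<>();
    private boolean active = false;

    // Called once per controller frame. Returns the completed frame
    // list when a gesture has just finished, otherwise null.
    List<Frame> onFrame(Frame frame) {
        if (frame.hands().isEmpty()) return null;
        float speed = frame.hands().frontmost().palmVelocity().magnitude();

        if (!active && speed > START_SPEED) {
            active = true;            // hand started moving: begin recording
            recording.clear();
        }
        if (active) {
            recording.add(frame);
            if (speed < STOP_SPEED) { // hand came to rest: gesture complete
                active = false;
                return new ArrayList<>(recording);
            }
        }
        return null;
    }
}
```

Once a gesture completes, each recorded frame's palm velocity direction can be inspected to characterise how the hands moved over the course of the sign.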

Finally, I connected the Arduino to the system and developed a way of communicating between the Arduino and the Leap Motion. This allows an LED to be turned on and off based on the gestures the user performs.
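
On the Java side, the COM port messaging can be done with a serial library such as jSSC (assumed here for illustration; the project may have used a different library). The port name and the one-character on/off protocol are placeholders:

```java
import jssc.SerialPort;
import jssc.SerialPortException;

public class LedBridge {
    private final SerialPort port;

    // The port name ("COM3", "/dev/ttyACM0", ...) varies by machine.
    LedBridge(String portName) throws SerialPortException {
        port = new SerialPort(portName);
        port.openPort();
        port.setParams(SerialPort.BAUDRATE_9600, SerialPort.DATABITS_8,
                       SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);
    }

    // Single-character protocol: '1' turns the LED on, '0' turns it off.
    // The Arduino sketch reads this byte in loop() and drives a pin.
    void setLed(boolean on) throws SerialPortException {
        port.writeString(on ? "1" : "0");
    }
}
```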

The Product

ASL Sentences

ASL Numbers

ASL Alphabet

ASL Home Automation