Implementation of the system was carried out in a structured way, following the plan made in advance. As I had never used the Leap Motion before, I spent some time getting to understand its functionality and capabilities, and explored the SDK, which can be downloaded online.
I then got the controller to recognise how many fingers I was extending, and developed an algorithm to recognise particular combinations of extended fingers that matched American Sign Language poses. This became the number-recognition part of the system.
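The finger-combination idea can be sketched as a lookup from a pose, reduced to which fingers are extended, to a number. This is a minimal illustration under my own assumptions, not the project's actual code; the pose table below is indicative only.

```python
# Hypothetical sketch: each hand pose is reduced to a tuple of booleans,
# one per finger (thumb, index, middle, ring, pinky), indicating whether
# that finger is extended. The patterns below are illustrative ASL-style
# poses, not a transcription of the project's real tables.
ASL_NUMBER_POSES = {
    (False, True, False, False, False): 1,  # index only
    (False, True, True, False, False): 2,   # index + middle
    (True, True, True, False, False): 3,    # thumb + index + middle
    (False, True, True, True, True): 4,     # all fingers but thumb
    (True, True, True, True, True): 5,      # all five fingers
}

def recognise_number(extended_fingers):
    """Return the matched number, or None if the pose is unknown."""
    return ASL_NUMBER_POSES.get(tuple(extended_fingers))
```

A frame reporting index and middle fingers extended would then map to the number 2, and an unrecognised combination simply yields no match.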
The next step was to record and recognise alphabetical letters. These algorithms used pattern matching to determine which letter, if any, was being posed.
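One plausible shape for that pattern matching is nearest-neighbour comparison against recorded templates: each letter is stored as a feature vector (here, placeholder fingertip values I have invented for illustration), and an incoming pose is accepted only when its closest template lies within a tolerance.

```python
import math

# Hypothetical sketch of the pattern-matching idea. The template vectors and
# the tolerance value are assumptions for illustration; the real system would
# record its own templates from the controller.
LETTER_TEMPLATES = {
    "A": [0.0, 0.1, 0.2, 0.1, 0.0],
    "B": [0.9, 1.0, 1.0, 1.0, 0.9],
}

def recognise_letter(features, tolerance=0.5):
    """Return the closest letter template, or None if nothing is close enough."""
    best_letter, best_dist = None, float("inf")
    for letter, template in LETTER_TEMPLATES.items():
        dist = math.sqrt(sum((f - t) ** 2 for f, t in zip(features, template)))
        if dist < best_dist:
            best_letter, best_dist = letter, dist
    return best_letter if best_dist <= tolerance else None
```

The tolerance is what lets the system answer "no letter" for an ambiguous pose rather than forcing the nearest match.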
I was then able to expand on these algorithms so that the system could record and recognise gestures corresponding to words, which can be strung together to create sentences. This was a challenging concept because I was no longer analysing an individual frame from the controller. Instead, I had to decide when to start and when to stop recording frames, based on the user's hand velocity. Once all of these frames had been recorded, the algorithms analysed them to determine the direction of the user's hands in every frame.
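The velocity-gated recording described above can be sketched as a small state machine: recording begins when the hand's speed rises above a start threshold and ends when it falls below a stop threshold, at which point the captured frames are analysed for direction. The threshold values and the crude net-movement analysis are my assumptions, not the project's actual parameters.

```python
# Assumed thresholds, in mm/s (the units the Leap Motion reports velocity in).
START_SPEED = 150.0
STOP_SPEED = 50.0

class GestureRecorder:
    """Records frames between a speed rise and a speed fall."""

    def __init__(self):
        self.recording = False
        self.frames = []

    def on_frame(self, palm_velocity):
        """Feed one frame's palm velocity (vx, vy, vz). Returns the completed
        list of frames when a gesture ends, otherwise None."""
        speed = sum(v * v for v in palm_velocity) ** 0.5
        if not self.recording:
            if speed > START_SPEED:
                self.recording = True
                self.frames = [palm_velocity]
        else:
            self.frames.append(palm_velocity)
            if speed < STOP_SPEED:
                self.recording = False
                return self.frames
        return None

def dominant_direction(frames):
    """Net movement axis over the recorded frames: a crude stand-in for the
    per-frame direction analysis described above."""
    totals = [sum(axis) for axis in zip(*frames)]
    axis = max(range(3), key=lambda i: abs(totals[i]))
    sign = "+" if totals[axis] >= 0 else "-"
    return sign + "xyz"[axis]
```

Separating the recorder from the analysis mirrors the structure described in the text: segmentation happens frame by frame, while direction is computed only once a complete gesture is available.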
Finally, I connected an Arduino to the system and developed a way of communicating between the Arduino and the Leap Motion. This allows an LED to be turned on and off based on the gestures the user performs.
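A common way to bridge the two devices is a one-byte serial protocol: the recogniser maps each gesture to a command byte, which is written to the Arduino's serial port, and the Arduino toggles the LED when it reads that byte. The gesture names and byte values below are assumptions for illustration, not the project's actual codes.

```python
# Hypothetical sketch of the Leap-to-Arduino protocol. In the running system
# the command byte would be written to the Arduino's serial port (e.g. with a
# serial library); here only the gesture-to-command mapping is shown.
GESTURE_COMMANDS = {
    "light_on": b"1",   # assumed command byte: Arduino turns the LED on
    "light_off": b"0",  # assumed command byte: Arduino turns the LED off
}

def command_for(gesture):
    """Return the serial command byte for a recognised gesture, or None."""
    return GESTURE_COMMANDS.get(gesture)
```

On the Arduino side, a matching sketch would read one byte from `Serial` in its loop and drive the LED pin high or low accordingly.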