
Speaking with Your Hands


As an important means of non-verbal communication, hand gestures are commonly used in our daily lives. But can we make machines understand hand gestures too? For example, can we teach a social robot human sign language so that it can communicate and interact with deaf people naturally and non-verbally? In the living room, can we control the TV using hand gestures alone? In a virtual environment, can we manipulate virtual objects with our hands and control the display with gestures?

Although humans perform hand gestures naturally, analyzing and understanding them is not easy for machines. Compared with whole-body gestures, hands are much smaller objects with more complicated articulations, which makes them easily subject to image sensing errors. Robust and efficient hand gesture recognition for real-life applications is therefore very challenging.

Prof Junsong Yuan’s research group from NTU EEE has pioneered work in robust hand gesture understanding and interaction using commodity depth cameras, such as the Microsoft Kinect. Their work has been sponsored by the Nanyang Assistant Professorship and Microsoft.

While prior work at Microsoft utilized depth cameras to analyze whole-body motion, Prof Yuan’s team is among the first to analyze hands with depth cameras. Building on their novel technologies for hand gesture analysis with a depth sensor, they have developed pattern recognition and machine learning methods that enable intelligent human-machine interaction using hand gestures. By accurately capturing hand gestures, they have implemented prototypes for recognizing sign language, interacting with social robots and virtual objects by hand, and playing a virtual piano on a piece of white paper. Their work also received the best paper award of IEEE Transactions on Multimedia in 2016.
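To give a flavour of depth-based hand analysis, the sketch below shows one common first step used in Kinect-style pipelines: segmenting the hand as the region nearest the camera, using a depth threshold. This is a minimal illustrative example, not the team’s actual method; the function name, depth values, and the 150 mm band are all assumptions made for the illustration.

```python
# Illustrative sketch of a common first step in depth-based hand analysis:
# segment the hand as the connected band of depth values nearest the camera.
# All names and thresholds here are hypothetical, not from Prof Yuan's system.

def segment_hand(depth_map, band_mm=150):
    """Return a binary mask marking pixels within `band_mm` of the
    closest valid depth reading (a simple closest-object heuristic)."""
    # A depth value of 0 conventionally means "no reading" for depth sensors.
    valid = [d for row in depth_map for d in row if d > 0]
    if not valid:
        return [[0] * len(row) for row in depth_map]
    nearest = min(valid)
    return [
        [1 if 0 < d <= nearest + band_mm else 0 for d in row]
        for row in depth_map
    ]

# Toy 3x4 depth map in millimetres; the hand is the nearby ~400 mm blob
# and the background sits around 1.5 m.
depth = [
    [0,    400,  420,  0],
    [1500, 410,  430,  1600],
    [1520, 1510, 1490, 1580],
]
mask = segment_hand(depth)
# mask marks only the near pixels:
# [[0, 1, 1, 0],
#  [0, 1, 1, 0],
#  [0, 0, 0, 0]]
```

In a real system this crude mask would be refined (e.g. by removing the forearm and tracking over time) before gesture classification, but it illustrates why depth sensing simplifies the problem: the hand separates from the background by distance rather than by colour.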

By Professor Junsong Yuan



Published on: 18-August-2017
