Scientists have selected over twenty signs to develop a new visual interpretation system, which allows deaf people to carry out consultations in the language they commonly use.
Signs can vary slightly from user to user. Project researchers took this into account during trials carried out with different people, helping the system 'become familiarised' with this variability.
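One simple way a system can absorb this kind of per-user variability is to average several users' recordings of the same sign into a single template. The sketch below illustrates the idea only; the feature vectors and the averaging scheme are invented for illustration and are not the researchers' actual method.

```python
def build_template(samples):
    """Average feature vectors from different users into one template.

    Each sample is a (hypothetical) numeric summary of one user's
    performance of the same sign, e.g. hand-trajectory descriptors.
    """
    n = len(samples)
    return [sum(vals) / n for vals in zip(*samples)]

# Three hypothetical users signing the same word, slightly differently:
recordings = [
    [0.90, 0.10, 0.30],
    [0.86, 0.14, 0.28],
    [0.94, 0.12, 0.32],
]
template = build_template(recordings)
```

A template built this way sits "between" the individual variants, so a classifier comparing new input against it tolerates small per-user differences.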
The signs recognised by the system were programmed to allow deaf people to maintain a basic conversation, including asking for help or directions.
The hardware includes a video camera which records image sequences when it detects the presence of a user wanting to make a consultation.
A computer vision and machine learning system detects face, hand and arm movements, as well as any on-screen scrolling, and feeds these into a classification system that matches each movement to the word associated with the sign.
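The final matching step can be pictured as a nearest-template lookup: a feature vector summarising the detected movements is compared against stored sign templates, and the closest template's word is returned. This is a minimal sketch under that assumption; the sign names, feature vectors, and distance threshold are all illustrative, not taken from the actual system.

```python
import math

# Hypothetical sign vocabulary: word -> stored feature template.
SIGN_TEMPLATES = {
    "help":   [0.9, 0.1, 0.3],
    "where":  [0.2, 0.8, 0.5],
    "thanks": [0.4, 0.4, 0.9],
}

def classify(features, templates=SIGN_TEMPLATES, threshold=1.0):
    """Return the word whose template is nearest to the observed
    features, or None if nothing is close enough."""
    word, dist = min(
        ((w, math.dist(features, t)) for w, t in templates.items()),
        key=lambda pair: pair[1],
    )
    return word if dist <= threshold else None
```

For example, an observation close to the "help" template, such as `classify([0.85, 0.15, 0.25])`, returns `"help"`, while a vector far from every template falls below the threshold and returns `None`.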
One of the aspects worth highlighting is the ability to adapt the system to any other sign language, since the methodology used is general. The system would only need to be reprogrammed with the signs used in that specific language.
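The "general methodology" claim can be sketched as a recogniser whose logic is independent of the vocabulary: switching sign languages only means loading a different sign-to-template table. All the entries below are invented placeholders, not real sign data.

```python
import math

def nearest_word(features, vocabulary):
    """Return the vocabulary word whose template is closest to features."""
    return min(vocabulary, key=lambda w: math.dist(features, vocabulary[w]))

# Two hypothetical vocabularies for two different sign languages:
LSE_SIGNS = {"ayuda": [0.9, 0.1], "donde": [0.2, 0.8]}  # e.g. Spanish Sign Language
BSL_SIGNS = {"help": [0.7, 0.3], "where": [0.4, 0.6]}   # e.g. British Sign Language

# The same recogniser code runs against either table:
word_lse = nearest_word([0.85, 0.15], LSE_SIGNS)
word_bsl = nearest_word([0.45, 0.55], BSL_SIGNS)
```

Keeping the vocabulary as data rather than code is what makes "reprogramming" for another sign language a matter of recording new templates.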
The number of signs the system can recognise is also scalable, although the researchers admit that each new sign makes the set harder to differentiate.
Applications such as the one developed by CVC-UAB researchers require extreme precision in the identification phase and are very difficult to configure: the settings in which they are used involve changes in light and shadow, users with different physiognomies, and varying speeds at which the signs are formed.