Summary
Useful ICT innovations are continuously being developed, improving the quality of life for many people. However, such solutions typically do not include people with severe dual vision and hearing impairments, at times also coupled with cognitive disabilities. Deafblindness is a grave condition; though rare at birth, it can be acquired due to different causes. There are an estimated 2.5 million deafblind persons in the EU. Limited communication is a major problem for this group, and one that SUITCEYES will address in a novel way. The benefits are not limited to this group; rather, the solution will scale to other areas.
SUITCEYES proposes a new, intelligent, flexible and expandable mode of haptic communication via soft interfaces. Based on user needs and informed by disability studies, the project combines smart textiles, sensors, semantic technologies, image processing, face and object recognition, machine learning, and gamification. It will address three challenges: perception of the environment; communication and exchange of semantic content; and learning and joyful life experiences. SUITCEYES will extract the inner structure of high-dimensional environmental and linguistic cues and map it to low-dimensional spaces, which are then translated into haptic signals. It will also use image processing to map environmental data for enriched semantic reasoning. SUITCEYES’ intelligent haptic interface will help users learn activation patterns through a new medium. With this interface, users will be able to take a more active part in society, improving their possibilities for inclusion in social life and employment.
The solution will be developed in a user-centred iterative design process, with frequent evaluations and optimizations. The users’ learning experiences will be enriched through gamification and mediated social interactions. The proposed solution will take into account potential differences in levels of impairment and user capabilities, and adapt accordingly.
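To make the mapping idea above more concrete, the short sketch below reduces a high-dimensional feature vector to a handful of haptic actuator intensities. It is purely illustrative: the number of actuators, the use of PCA for the low-dimensional projection, and the sigmoid used to turn the embedding into drive levels are assumptions made for this example, not details taken from the project.

import numpy as np

# Illustrative only: reduce a high-dimensional feature vector (e.g. from image
# processing or semantic analysis) to a few values that could drive vibrotactile
# actuators in a garment. PCA and the sigmoid squashing are assumptions made for
# this sketch; the summary above does not specify the project's actual mapping.

N_ACTUATORS = 6                              # assumed number of haptic actuators
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 128))       # stand-in for 200 recorded 128-D cues

# Fit a linear low-dimensional projection (PCA via SVD on centred data)
mean = features.mean(axis=0)
_, _, vt = np.linalg.svd(features - mean, full_matrices=False)
projection = vt[:N_ACTUATORS]                # top principal directions

def to_haptic_signal(x):
    """Map one feature vector to actuator intensities in (0, 1)."""
    z = projection @ (x - mean)              # low-dimensional embedding
    return 1.0 / (1.0 + np.exp(-z))          # squash to per-actuator drive levels

print(to_haptic_signal(features[0]))         # six intensities, one per actuator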
More information & hyperlinks
Web resources: | https://cordis.europa.eu/project/id/780814 |
Start date: | 01-01-2018 |
End date: | 30-06-2021 |
Total budget - Public funding: | 2 359 963,00 Euro - 2 359 963,00 Euro |
Cordis data
Status: | CLOSED |
Call topic: | ICT-23-2017 |
Update date: | 27-10-2022 |