Sign languages allow Deaf people to express their thoughts, emotions, and opinions in a manner as complex and complete as oral languages. Each sign language is unique, with its own grammar, syntax, and vocabulary. Among them is Mexican Sign Language (LSM), which has two main components: (i) spelling, or dactylology, and (ii) ideograms. The first is analogous to spelling in oral languages and is used to communicate proper names, technical terms, or words for which there are no specific signs; it is performed using only the hands. Ideograms, in contrast, are signs that represent words or complete phrases and are produced not only with the hands but also with facial gestures and other body movements.
In our laboratory, studies have been carried out to address the recognition of LSM in both of its components, through automatic tracking of the body's joints and the application of machine learning techniques. The main objective of this research is to bridge the communication gap between the Deaf community and Spanish speakers in Mexico.
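As a rough illustration of this pipeline, the sketch below classifies sign recordings from tracked joint trajectories. It is not the lab's actual method: the joint data, the number of classes, and the feature choices are all assumptions made for the example.

```python
# Minimal sketch (illustrative assumptions, not the lab's pipeline):
# classifying signs from tracked 3D joint trajectories with scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def trajectory_features(joints):
    """joints: array of shape (frames, n_joints, 3) with x, y, z positions.
    Returns a fixed-length vector of simple per-joint statistics."""
    mean = joints.mean(axis=0)                           # average joint positions
    std = joints.std(axis=0)                             # spread of the movement
    vel = np.abs(np.diff(joints, axis=0)).mean(axis=0)   # mean per-joint speed
    return np.concatenate([mean.ravel(), std.ravel(), vel.ravel()])

# Placeholder dataset: 100 recordings of 60 frames x 20 joints, 5 sign classes.
rng = np.random.default_rng(0)
recordings = [rng.normal(size=(60, 20, 3)) for _ in range(100)]
labels = rng.integers(0, 5, size=100)

X = np.array([trajectory_features(r) for r in recordings])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, labels, cv=5).mean())
```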
This research is conducted in collaboration with the Regional Association of the Deaf in Ensenada (ARSE).
In recent years, there has been increasing interest in designing wearable devices to measure human gait parameters. The main advantages of wearable devices over specialized laboratories are their low cost and portability. Wearable devices can effectively complement traditional gait analysis systems and can continuously monitor gait parameters during daily activities, reducing the stress and anxiety experienced by individuals in controlled clinical gait studies.
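To make "gait parameters" concrete, the sketch below estimates cadence and stride time by detecting peaks in the vertical acceleration of a (hypothetical) waist-worn sensor. The sampling rate, signal shape, and thresholds are assumptions, and the synthetic signal stands in for real sensor data.

```python
# Hedged sketch: estimating cadence from vertical acceleration via peak
# detection. The signal below is synthetic (~1.8 steps/s plus noise).
import numpy as np
from scipy.signal import find_peaks

fs = 100.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
acc_vertical = np.sin(2 * np.pi * 1.8 * t) + 0.1 * np.random.randn(t.size)

# Treat each prominent peak as one heel strike.
peaks, _ = find_peaks(acc_vertical, height=0.5, distance=fs * 0.3)
step_times = t[peaks]
cadence = 60.0 * (len(peaks) - 1) / (step_times[-1] - step_times[0])  # steps/min
stride_time = 2 * np.diff(step_times).mean()  # seconds; assumes alternating feet
print(f"cadence = {cadence:.1f} steps/min, stride time = {stride_time:.2f} s")
```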
This research is conducted in collaboration with researchers of the Motion Analysis Lab of the National Institute of Rehabilitation, Mexico. Additionally, a concurrent validation was performed using a video-based system (Kinect) in collaboration with Universidad de Castilla-La Mancha.
Recently, various research studies have analyzed human actions based on wearable sensors. Many of these studies focus on identifying the most informative features that can be extracted from the action data, as well as on determining the most effective machine learning algorithms for classifying these actions. Wearable sensors attached to anatomical reference points, e.g., inertial and magnetic sensors (accelerometers, gyroscopes, and magnetometers), vital-sign processing devices (heart rate, temperature), and RFID tags, can be used to gather information about a person's behavioral patterns. Robustness to occlusion and lighting variations, as well as portability, are the major advantages of wearable sensors over visual motion-capture systems, which additionally require very specific settings to operate properly.
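The feature-extraction step mentioned above can be sketched as follows: common time-domain features computed over fixed-length windows of tri-axial accelerometer data. The window length and feature set here are assumptions chosen for illustration, not the features identified by any particular study.

```python
# Hedged sketch of window-based feature extraction for action recognition.
import numpy as np

def window_features(window):
    """window: array of shape (samples, 3) for one accelerometer window.
    Returns mean, standard deviation, and signal magnitude area (SMA)."""
    mean = window.mean(axis=0)                       # 3 values
    std = window.std(axis=0)                         # 3 values
    sma = np.abs(window).sum() / window.shape[0]     # 1 value
    return np.concatenate([mean, std, [sma]])

fs, win_s = 50, 2.56                   # 50 Hz sampling, 2.56 s windows (assumed)
samples_per_window = int(fs * win_s)
signal = np.random.randn(10 * samples_per_window, 3)  # placeholder recording

windows = signal.reshape(-1, samples_per_window, 3)
features = np.array([window_features(w) for w in windows])
print(features.shape)  # (n_windows, 7), ready for any classifier
```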
Wearable inertial and magnetic sensors have been used in several applications for human motion analysis. Measuring body movements accurately is crucial for identifying abnormal neuromuscular control, detecting biomechanical disorders, and preventing injuries. Quantitative analysis in daily living environments provides valuable information that complements what is obtained in laboratory tests.
Calculating biomechanical variables from wearable inertial sensors is possible by using computational techniques for information fusion. This research proposes using the angles between segments of opposite upper and lower limbs as the unit of measure for tracking human motion, because these angles are less sensitive to individual characteristics, such as height, weight, gender, and age, than other measures such as relative position. The research also proposes calculating these angles using only wearable inertial sensors, which can be easily worn and carried by people in daily scenarios.
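A minimal sketch of this idea, under assumed conventions: given each segment's orientation as a unit quaternion (as produced by a typical IMU fusion filter), the inter-segment angle is the angle between the segments' long axes after rotating them into a common frame. The choice of local long axis and the quaternion values below are illustrative, not real sensor output.

```python
# Hedged sketch: angle between two limb segments from orientation quaternions.
import numpy as np

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2 * np.cross(u, np.cross(u, v) + w * v)

def segment_angle(q_a, q_b, axis=np.array([0.0, 0.0, 1.0])):
    """Angle in degrees between the long axes of two segments; each segment's
    long axis is assumed to be `axis` in its local sensor frame."""
    a = rotate(q_a, axis)
    b = rotate(q_b, axis)
    cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Placeholder orientations: segment B rotated 30 degrees about the x-axis.
q_a = np.array([1.0, 0.0, 0.0, 0.0])
half = np.radians(30) / 2
q_b = np.array([np.cos(half), np.sin(half), 0.0, 0.0])
print(segment_angle(q_a, q_b))  # ~30.0
```

Angles computed this way depend only on relative segment orientation, which is why they are less sensitive to body size than position-based measures.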
The objective of this project is to determine behavior and lifestyle parameters that can be sensed by mobile devices in a naturalistic way and that are relevant to the study of aging and health.