Developing a large-scale foundation AI model capable of personalized situational awareness and intent understanding by jointly learning from diverse human-centered multimodal data such as biosignals, behavioral information, and language.
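As a deliberately minimal sketch of what joint learning over these modalities can look like, the PyTorch snippet below fuses a biosignal window, a behavioral feature vector, and a precomputed sentence embedding into a shared representation for intent classification. All module names, dimensions, and the late-fusion-by-concatenation design are illustrative assumptions, not the project's actual architecture.

```python
import torch
import torch.nn as nn

class MultimodalIntentModel(nn.Module):
    """Toy joint encoder over biosignal, behavior, and language inputs."""
    def __init__(self, bio_channels=8, behavior_dim=16, text_dim=32,
                 hidden_dim=64, num_intents=5):
        super().__init__()
        # 1D convolution over raw biosignal windows (e.g. EMG/PPG channels).
        self.bio_encoder = nn.Sequential(
            nn.Conv1d(bio_channels, hidden_dim, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Simple MLPs for behavioral features and precomputed text embeddings.
        self.behavior_encoder = nn.Sequential(nn.Linear(behavior_dim, hidden_dim), nn.ReLU())
        self.text_encoder = nn.Sequential(nn.Linear(text_dim, hidden_dim), nn.ReLU())
        # Late fusion by concatenation, then an intent classification head.
        self.head = nn.Linear(3 * hidden_dim, num_intents)

    def forward(self, bio, behavior, text):
        b = self.bio_encoder(bio).squeeze(-1)   # (batch, hidden_dim)
        h = self.behavior_encoder(behavior)     # (batch, hidden_dim)
        t = self.text_encoder(text)             # (batch, hidden_dim)
        return self.head(torch.cat([b, h, t], dim=-1))

model = MultimodalIntentModel()
logits = model(torch.randn(4, 8, 250),   # 4 windows of 8-channel biosignal
               torch.randn(4, 16),       # behavioral feature vectors
               torch.randn(4, 32))       # sentence embeddings
print(logits.shape)  # torch.Size([4, 5])
```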
Developing core AI technologies that integrate autonomy, adaptability, and empathy to realize intelligent mutualistic AI: a system that understands human intentions, emotions, and situational contexts in complex, dynamic human-robot interaction environments, enabling trust-based long-term collaboration.
Developing key embedded AI technologies for lightweight design, real-time operation, and continual learning, enabling high-performance autonomous inference and on-device adaptation in resource-constrained environments such as mobile robots and wearable devices while minimizing dependence on the cloud.
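One concrete ingredient of such lightweight design is post-training quantization. The sketch below applies PyTorch's dynamic quantization to a toy model so that Linear-layer weights are stored in int8, shrinking the model and speeding up CPU inference without retraining; the model and its dimensions are placeholders, and quantization is only one of the techniques (alongside pruning, distillation, and continual-learning methods) the line above implies.

```python
import torch
import torch.nn as nn

# A small dense network standing in for an on-device inference model.
model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10))

# Post-training dynamic quantization: Linear weights are stored in int8
# and dequantized on the fly at inference time.
quantized = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 128)
print(quantized(x).shape)  # torch.Size([1, 10])
```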
Researching core technologies to secure physical AI systems that interact closely with users and handle sensitive personal biosignals, while also pursuing explainable AI models.
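As an illustration of one basic building block for protecting biosignals at rest or in transit, the sketch below uses authenticated symmetric encryption from the cryptography package (Fernet). This is a minimal example, not the project's security design; in a deployed physical AI system the key would be provisioned in a secure element or OS keystore rather than generated in application memory.

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# Fernet = AES-128-CBC plus HMAC-SHA256, i.e. authenticated encryption.
key = Fernet.generate_key()
fernet = Fernet(key)

# A hypothetical single biosignal sample.
sample = {"sensor": "ppg", "t": 1700000000.0, "values": [0.81, 0.79, 0.84]}
token = fernet.encrypt(json.dumps(sample).encode("utf-8"))

# Decryption raises InvalidToken if the payload was tampered with.
restored = json.loads(fernet.decrypt(token).decode("utf-8"))
assert restored == sample
```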
Developing a skin-integrated multimodal sensing interface capable of acquiring high-quality biosignals and environmental interaction data, together with a sensor-AI computing accelerator network for efficient signal processing.
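A minimal sketch of the sensor-side preprocessing such an interface and accelerator network would feed on, assuming a single-channel 1 kHz stream (the sampling rate, filter, and window sizes are illustrative, not the project's specification):

```python
import numpy as np

def sliding_windows(signal, window, hop):
    """Segment a 1-D signal into overlapping windows for downstream inference."""
    starts = range(0, len(signal) - window + 1, hop)
    return np.stack([signal[s:s + window] for s in starts])

# Simulated 1 kHz biosignal stream (placeholder for the skin-integrated sensor).
fs = 1000
t = np.arange(0, 2.0, 1.0 / fs)
raw = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)

# Moving-average pre-filter, then 250 ms windows with 50% overlap, standing in
# for the preprocessing a sensor-side accelerator might perform before handing
# features to the AI model.
smoothed = np.convolve(raw, np.ones(5) / 5, mode="same")
batch = sliding_windows(smoothed, window=250, hop=125)
print(batch.shape)  # (15, 250)
```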