MICCRO
- Microfluidic Integration for Chemo-mechanical Cancer cell Regulation and Optical tracking -
R&D of advanced microfluidic platforms that enable precise control of the chemical and mechanical microenvironment surrounding live cancer cells. These systems allow us to apply tunable stress cues while observing real-time cellular responses at single-cell resolution. Integrated computational mechanobiology and machine-learning analytics enable migration tracking, shape-dynamics quantification, and phenotype-transition detection. The approach provides quantitative insight into the epithelial-mesenchymal transition (EMT), a key driver of acquired stemness, invasion, and metastatic potential. Ultimately, our work establishes high-fidelity models for probing tumor progression and supports the discovery of predictive biomarkers and therapeutic targets.
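The migration-tracking step mentioned above can be illustrated with a minimal sketch: greedy nearest-neighbor linking of detected cell centroids across time-lapse frames. Everything here (function name, gating distance, data layout) is an illustrative assumption, not the project's actual pipeline, which would typically use more robust assignment and segmentation methods.

```python
import math

def link_tracks(frames, max_dist=20.0):
    """Greedy nearest-neighbor linking of cell centroids across frames.

    frames: list of lists of (x, y) centroids, one list per time point.
    Returns a list of tracks, each a list of (frame_index, (x, y)).
    """
    tracks = [[(0, c)] for c in frames[0]]
    for t, centroids in enumerate(frames[1:], start=1):
        unclaimed = list(centroids)
        for track in tracks:
            last_t, (lx, ly) = track[-1]
            if last_t != t - 1 or not unclaimed:
                continue  # track already lost, or nothing left to assign
            # pick the closest detection; accept it only within the gating distance
            best = min(unclaimed, key=lambda c: math.hypot(c[0] - lx, c[1] - ly))
            if math.hypot(best[0] - lx, best[1] - ly) <= max_dist:
                track.append((t, best))
                unclaimed.remove(best)
        # any detection left unclaimed starts a new track (a newly appearing cell)
        tracks.extend([[(t, c)] for c in unclaimed])
    return tracks
```

Given three frames with two slowly moving cells, this yields two tracks of three points each; per-track displacements then give migration speed and directionality statistics.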
3DRTT
3D Radiation Therapy Training
3DRTT is a real-time, web-based 3D simulator for radiation therapy linear accelerators, designed to enhance treatment precision and accelerate training. By providing accurate visualization of hardware components and their motion, it improves radiation treatment planning and delivery. Integrated with patient CT data, it enables virtual pre-treatment setup and collision prediction for complex beam configurations. 3DRTT also strengthens radiation therapy education by offering interactive 3D environments tailored to specific systems. It supports training for technologists, dosimetrists, and physicists, improving understanding of linac components, motion limits, and accessories—ultimately advancing clinical preparedness, workflow safety, and treatment quality.
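Collision prediction of the kind described above can be sketched, in heavily simplified form, as a geometric check between the gantry head and the couch as a function of gantry angle. All geometry here (isocentric radius, couch height and width, head size, function names) is an invented toy model, not 3DRTT's actual machine data.

```python
import math

def gantry_head_position(gantry_deg, radius_cm=50.0):
    """Isocentric gantry head center: (x lateral, z vertical), isocenter at origin."""
    a = math.radians(gantry_deg)
    return (radius_cm * math.sin(a), radius_cm * math.cos(a))

def collides(gantry_deg, couch_top_z=-20.0, head_half_size=15.0):
    """Flag a potential collision when the head's bounding box dips below the
    couch-top plane while laterally over the couch (assumed 25 cm half-width)."""
    x, z = gantry_head_position(gantry_deg)
    over_couch = abs(x) < 25.0
    head_bottom = z - head_half_size
    return over_couch and head_bottom < couch_top_z
```

At gantry 0° (head above the patient) the check passes, while at 180° (head under the couch) it flags a collision; a real simulator would instead test full 3D meshes of every moving component against the couch and the CT-derived patient surface.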
Prof. Meena V. Cruz & Felix Hamza-Lup
Assessment using Brain Computer Interfaces (ABC-I)
Emotional Analysis Using Facial Expression
This study reviews machine learning and deep learning techniques for facial expression recognition, including traditional and modern neural network–based models. It highlights key datasets, evaluation metrics, and practical challenges affecting real-world emotion recognition systems.
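Among the traditional techniques such a review covers, one of the simplest is classifying a face-derived feature vector by its nearest class centroid. The sketch below assumes precomputed feature vectors (e.g., landmark distances or embeddings) and an illustrative label set; it is a baseline for exposition, not any specific model from the study.

```python
import math

def nearest_centroid_predict(train, query):
    """Classify a feature vector by its nearest class centroid (Euclidean distance).

    train: dict mapping emotion label -> list of feature vectors
    query: feature vector to classify
    """
    def centroid(vectors):
        n = len(vectors)
        return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    centroids = {label: centroid(vecs) for label, vecs in train.items()}
    return min(centroids, key=lambda label: dist(centroids[label], query))
```

Modern deep models replace the hand-crafted features and centroid rule with learned convolutional embeddings and a softmax classifier, but the evaluation metrics (per-class accuracy, confusion across similar expressions) stay the same.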
Emotion Quantification in the Metaverse
This study proposes an EEG-based brain–computer interface for objective emotion quantification in metaverse environments, using real-time signal processing and machine learning–based classification. The framework enables continuous, non-invasive assessment of affective states for adaptive virtual interactions and mental health applications.
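The signal-processing front end of such a framework commonly reduces each EEG window to band-power features before classification. A minimal sketch, using a naive DFT and assumed alpha/beta band limits (the function names and the beta/alpha ratio feature are illustrative, not the study's actual feature set):

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Average power of `signal` in the band [f_lo, f_hi] Hz via a naive DFT.

    signal: list of samples; fs: sampling rate in Hz.
    """
    n = len(signal)
    powers = []
    for k in range(1, n // 2):
        f = k * fs / n  # frequency of DFT bin k
        if f_lo <= f <= f_hi:
            re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(signal))
            im = -sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(signal))
            powers.append((re * re + im * im) / n)
    return sum(powers) / len(powers) if powers else 0.0

def affect_features(signal, fs):
    """Band-power features often used for affect estimation (assumed band limits)."""
    alpha = band_power(signal, fs, 8.0, 13.0)
    beta = band_power(signal, fs, 13.0, 30.0)
    return {"alpha": alpha, "beta": beta, "beta_alpha_ratio": beta / (alpha + 1e-12)}
```

A 10 Hz test tone lands squarely in the alpha band, so its alpha power dominates; in practice one would use Welch's method over sliding windows and feed the per-channel features to the downstream classifier.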
EEG-Based Brain–Computer Interfaces for Healthcare and Consciousness Research
This research advances EEG-based Brain–Computer Interface technologies to create objective, non-invasive solutions for healthcare, consciousness research, and mental well-being. By integrating advanced EEG signal processing, machine learning, and experimental neuroscience, the work enables reliable quantification of cognitive and emotional brain states. Through systematic BCI analysis and empirical studies on mindfulness and brain wave dynamics, the research bridges theoretical neuroscience with real-world clinical and mental health applications, delivering scalable and data-driven neurotechnological impact.