Research Projects
Human Computer (AI)nteraction
MicroExpressions
Rapid, involuntary facial movements that reveal authentic emotional states, often beyond conscious control. Their detection and analysis present a compelling challenge for computing research, requiring advances in computer vision, deep learning, and high-resolution spatiotemporal modeling. Studying microexpressions drives innovation in affective computing, real-time AI systems, and multimodal data fusion, while also pushing the field toward more interpretable and ethical machine-learning approaches. As human–computer interaction becomes increasingly natural and adaptive, microexpression research enables technologies that better understand human emotion, enhance decision-making, and support applications across healthcare, security, education, and collaborative environments.
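As a concrete illustration of the spotting step, the sketch below flags brief bursts of facial motion in a stream of landmark coordinates. The frame rate, motion threshold, and 500 ms duration bound are illustrative assumptions, not the project's actual parameters or pipeline.

```python
# Illustrative sketch (not the project's actual pipeline): spot candidate
# microexpression windows by thresholding frame-to-frame motion of facial
# landmarks. FPS, MOTION_THRESHOLD, and MAX_DURATION_S are assumed values.

FPS = 100                # assumed high-speed camera frame rate
MOTION_THRESHOLD = 0.8   # assumed per-frame landmark displacement threshold
MAX_DURATION_S = 0.5     # microexpressions typically last under ~500 ms

def frame_motion(prev, curr):
    """Mean Euclidean displacement across landmark (x, y) pairs."""
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(prev, curr)) / len(prev)

def spot_microexpressions(frames):
    """Return (start, end) frame indices of brief, rapid motion bursts."""
    events, start = [], None
    for i in range(1, len(frames)):
        moving = frame_motion(frames[i - 1], frames[i]) > MOTION_THRESHOLD
        if moving and start is None:
            start = i                              # burst begins
        elif not moving and start is not None:
            if (i - start) / FPS <= MAX_DURATION_S:  # keep only brief bursts
                events.append((start, i))
            start = None
    return events
```

A real system would feed this from a landmark detector and fuse it with appearance features; the point here is only the temporal criterion that separates microexpressions from ordinary expressions.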
Brain-Computer Interfaces
Brainwaves, the combined synchronized electrical activity of neurons in the brain, can be captured in real time using Brain-Computer Interfaces (BCIs). Portable and affordable BCIs have emerged recently and have the potential to objectively quantify attention and improve learning. We explore the use of BCIs to quantify user attention in various user interfaces, including an e-Learning platform. For a brief introduction to BCIs please see: Learning Assessment Using Brain-Computer Interfaces: Are You Paying Attention?
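As a simple illustration of how an EEG signal can be turned into an attention estimate, the sketch below computes the commonly used beta / (theta + alpha) band-power ratio from one window of samples. The sampling rate and band edges are assumptions for illustration; this is not the project's actual algorithm.

```python
# Illustrative sketch, not the project's algorithm: estimate attention from
# one EEG window as the ratio of beta-band power to theta + alpha power.
# FS and the band edges are assumed values.
import cmath

FS = 256  # assumed EEG sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # Hz

def band_power(samples, lo, hi):
    """Sum of DFT power over bins whose frequency lies in [lo, hi)."""
    n = len(samples)
    total = 0.0
    for k in range(1, n // 2):
        freq = k * FS / n
        if lo <= freq < hi:
            xk = sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                     for t in range(n))
            total += abs(xk) ** 2
    return total

def attention_index(samples):
    """Beta / (theta + alpha) power ratio; higher suggests more engagement."""
    theta = band_power(samples, *BANDS["theta"])
    alpha = band_power(samples, *BANDS["alpha"])
    beta = band_power(samples, *BANDS["beta"])
    return beta / (theta + alpha + 1e-12)
```

A production system would use a windowed FFT and per-user calibration; the ratio itself is a widely reported heuristic for engagement versus relaxation.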
Computer Security, Leveraging Enhanced Accurate Penetration with AI (LEAP-AI)
AI is rapidly reshaping cyber defense by automating vulnerability discovery and accelerating response with greater precision, even as adversaries weaponize the same capabilities. LEAP-AI investigates the integration of smart automation into penetration testing to evaluate both AI-driven attack techniques and AI-powered defensive strategies. A proof-of-concept system simulates intelligent cyberattacks and applies AI-enhanced penetration testing to uncover weaknesses and deploy adaptive mitigation measures, offering a dual perspective on emerging AI threats and the defenses needed to counter them.
Web3D - 3D Radiation Therapy Training (3DRTT) - Medical Simulation and Training
Part of this project is a web-based 3D graphical simulation tool, usable as a stand-alone application, that assists in external-beam treatment planning by visually simulating collisions between linear accelerator (LINAC) hardware modules. The system is based on X3D and Java, which allow accurate simulation of several LINAC hardware modules. It helps medical personnel determine the 3D collision space for a patient and is independent of any computing platform.
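The collision-space computation can be sketched with a first-pass bounding-box test between hardware modules. The module names and geometry below are hypothetical placeholders; the actual system models full LINAC and couch geometry in X3D rather than this simplification.

```python
# Illustrative sketch, not the 3DRTT implementation: a broad-phase collision
# test approximating each hardware module by an axis-aligned bounding box
# (AABB). Module names and extents are hypothetical.

def aabb_overlap(a, b):
    """Boxes given as ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] <= bmax[i] and bmin[i] <= amax[i] for i in range(3))

def find_collisions(modules):
    """Return name pairs of modules whose bounding boxes intersect."""
    names = sorted(modules)
    return [(m, n) for i, m in enumerate(names) for n in names[i + 1:]
            if aabb_overlap(modules[m], modules[n])]
```

In practice a broad-phase test like this prunes the search before a precise mesh-level check; both stages would run as gantry and couch angles are varied across the planned treatment positions.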
Web3D - Neuro-Pathways - Medical Simulation and Training (Demo)
Medical students historically have difficulty conceptualizing and projecting the three-dimensional aspects of neural pathways and embryonic organ development from two-dimensional text materials and electronic resources. This collaboration will support the development of interactive three-dimensional simulation systems that allow students to visualize and understand neural pathways and embryonic development.
Web3D - X3D for Physics and Chemistry
Many fundamental concepts in physics and chemistry require students to develop active mental models in order to understand the physical and mathematical components. We want to explore student attention and learning of difficult concepts while improving their mental models through 3D augmentation. The proposed research will improve faculty and student expertise in 3D computer-based simulations.
Web3D - VIEW (Virtual Interactive Engineering on the Web)
The main objective of this project is to implement virtual laboratories to supplement the course Introduction to Engineering Materials. Hands-on laboratories are an essential part of engineering and science education, wherein students are introduced to data analysis, report writing, finding empirical correlations between experimental variables, and validating theories against data.
HaptEK-16 (Haptic Environments for K16)
The term haptic originates from the Greek haptikos, meaning “able to touch.” Haptics is a transformative scientific and engineering discipline dedicated to integrating tactile sensation into digital environments, enabling users to experience realistic, physical feedback through the sense of touch. By bridging the gap between the virtual and physical worlds, haptic technologies convert abstract digital interactions into immersive, multisensory experiences.
This project advances the frontiers of haptic feedback by systematically promoting its integration across high-impact domains. We highlight a broad spectrum of applications poised to benefit from recent breakthroughs in haptic hardware and software, with particular emphasis on simulation and training platforms for education. From medical procedure rehearsal and engineering prototyping to immersive STEM learning and skill-based technical training, next-generation haptic systems significantly enhance realism, engagement, knowledge retention, and performance outcomes. Our work aims to accelerate adoption, expand accessibility, and redefine how learners interact with complex digital systems—transforming education through touch-enabled intelligence.