UAV-assisted integrated object detection and tracking system
Moving-object detection using self-attention and heatmap-based CNN model (sketch below)
DQN-based deep reinforcement learning object tracking
Application to urban environments under various weather conditions
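A minimal sketch of the detection model mentioned above: a small CNN backbone with one spatial self-attention layer producing a per-pixel object-center heatmap. The layer sizes, input resolution, and PyTorch framework choice are illustrative assumptions, not the actual model.

```python
# Hypothetical heatmap detector: CNN backbone + spatial self-attention + sigmoid heatmap head.
import torch
import torch.nn as nn

class SelfAttention2d(nn.Module):
    """Single-head self-attention over the spatial positions of a feature map."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key   = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)            # (B, HW, C/8)
        k = self.key(x).flatten(2)                               # (B, C/8, HW)
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), -1)    # (B, HW, HW)
        v = self.value(x).flatten(2).transpose(1, 2)             # (B, HW, C)
        out = (attn @ v).transpose(1, 2).reshape(b, c, h, w)
        return x + out                                           # residual connection

class HeatmapDetector(nn.Module):
    """Outputs a 1-channel heatmap of object-center likelihoods."""
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.attn = SelfAttention2d(64)
        self.head = nn.Conv2d(64, 1, 1)

    def forward(self, x):
        return torch.sigmoid(self.head(self.attn(self.backbone(x))))

heatmap = HeatmapDetector()(torch.randn(1, 3, 128, 128))  # -> (1, 1, 32, 32)
```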
Pedestrian-aware TSC (Traffic Signal Control)
Multi-objective function in terms of vehicle waiting time, emergency-vehicle preference, lane fairness, and maximum pedestrian waiting time requirements
Dueling DQN-based deep reinforcement learning architecture
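A minimal sketch of a dueling DQN head for such a signal controller, assuming a flat state vector (e.g., per-lane queue lengths and pedestrian wait times) and one discrete action per signal phase; the dimensions and the PyTorch framing are assumptions.

```python
# Dueling DQN: separate state-value and per-action advantage streams.
import torch
import torch.nn as nn

class DuelingDQN(nn.Module):
    def __init__(self, state_dim=16, n_phases=4):
        super().__init__()
        self.feature = nn.Sequential(nn.Linear(state_dim, 128), nn.ReLU())
        self.value = nn.Linear(128, 1)             # V(s): state value
        self.advantage = nn.Linear(128, n_phases)  # A(s,a): per-phase advantage

    def forward(self, state):
        f = self.feature(state)
        v, a = self.value(f), self.advantage(f)
        # Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)  (standard dueling aggregation)
        return v + a - a.mean(dim=-1, keepdim=True)

q_values = DuelingDQN()(torch.zeros(1, 16))  # one Q-value per signal phase
```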
Autonomous driving using DQN reinforcement learning
Implement digital twin environments
Considering obstacles, other vehicles, and traffic signals
Efficient DQN learning architecture
Reward function design
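A hedged example of the kind of reward shaping the last item refers to; the terms (forward progress, collision, signal compliance, obstacle distance) and all weights are assumptions, not the project's actual reward.

```python
# Illustrative per-step reward for a DQN driving agent.
def driving_reward(speed, collided, ran_red_light, dist_to_obstacle,
                   w_speed=0.1, w_obstacle=0.5):
    """Return a scalar reward for one simulation step."""
    reward = w_speed * speed                       # encourage forward progress
    if collided:
        reward -= 100.0                            # large terminal collision penalty
    if ran_red_light:
        reward -= 50.0                             # traffic-signal violation
    if dist_to_obstacle < 5.0:                     # soft penalty near obstacles (meters)
        reward -= w_obstacle * (5.0 - dist_to_obstacle)
    return reward
```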
Optimal vehicle traffic signal control
Reduce waiting time and queue length
Provide fairness and consider emergency vehicles
Support inter-signal coordination between intersections
Jointly consider vehicle traffic and pedestrian waiting time at the crosswalks.
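An illustrative multi-objective reward for such a controller, combining total vehicle waiting time, the worst pedestrian wait, a fairness term (variance of per-lane waits), and an emergency-vehicle term; the exact terms and weights are assumptions.

```python
# Illustrative traffic-signal reward: smaller waits, bounded pedestrian delay,
# even lane service, and fast clearance of waiting emergency vehicles.
def tsc_reward(lane_waits, ped_waits, emergency_waiting,
               w_veh=1.0, w_ped=0.5, w_fair=0.2, w_emg=5.0):
    mean_lane = sum(lane_waits) / len(lane_waits)
    fairness = sum((w - mean_lane) ** 2 for w in lane_waits) / len(lane_waits)
    return -(w_veh * sum(lane_waits)
             + w_ped * max(ped_waits)         # bound the worst pedestrian wait
             + w_fair * fairness              # penalize unequal lane service
             + w_emg * emergency_waiting)     # prioritize emergency vehicles
```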
Indoor disaster monitoring IoT sensor network implementation (Raspberry Pi embedded devices)
Q-learning-based sensing-data routing protocol implementation on IoT edge devices (sketch below)
GBM (Gradient Boosting Machine)-based ensemble prediction of disaster areas
Beacon IoT device implementation and RSS (Received Signal Strength)-based DNN (Deep Neural Network) positioning
Reinforcement learning-based optimal evacuation path selection
Wi-Fi and 5G-based communication for emergency information delivery
AR (Augmented Reality)-based individual navigation system: avoiding disaster areas and congested areas for fast and safe evacuation
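A sketch of the Q-learning forwarding rule that could run on each edge node (see the routing item above): a Q-table maps (destination, neighbor) pairs to an estimated delivery cost and is updated from the cost estimate reported back by the chosen neighbor. Class and parameter names are illustrative assumptions.

```python
# Per-node Q-learning router: pick the neighbor with the lowest learned cost.
import random
from collections import defaultdict

class QRouter:
    def __init__(self, neighbors, alpha=0.3, gamma=0.9, epsilon=0.1):
        self.neighbors = neighbors
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon
        self.q = defaultdict(float)                # (dest, neighbor) -> estimated cost

    def next_hop(self, dest):
        if random.random() < self.epsilon:         # occasional exploration
            return random.choice(self.neighbors)
        return min(self.neighbors, key=lambda n: self.q[(dest, n)])

    def update(self, dest, neighbor, link_cost, neighbor_best_cost):
        # Bellman-style update toward (hop cost + neighbor's best estimate to dest)
        target = link_cost + self.gamma * neighbor_best_cost
        self.q[(dest, neighbor)] += self.alpha * (target - self.q[(dest, neighbor)])

router = QRouter(neighbors=["node_b", "node_c"])
router.update("sink", "node_b", link_cost=1.0, neighbor_best_cost=2.0)
print(router.next_hop("sink"))
```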
Wildfire detection wireless sensor network topology simulation
Adaptive duty-cycle-enabled sensor node implementation for energy saving (sketch below)
DNN-based sensing-value estimation for inactive sensors using active-sensor information
RNN-based disaster area prediction
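An illustrative adaptive duty-cycle rule for a sensor node (see the duty-cycle item above): the node lengthens its sleep interval while readings are stable and shortens it when readings change quickly, e.g., a rising temperature near a fire front. The thresholds and interval bounds are assumptions.

```python
# Adaptive duty cycle: sample faster under activity, back off to save energy.
def next_sleep_interval(prev_value, new_value, interval,
                        min_s=5, max_s=300, threshold=2.0):
    delta = abs(new_value - prev_value)
    if delta > threshold:
        interval = max(min_s, interval / 2)    # activity detected: shorter sleep
    else:
        interval = min(max_s, interval * 1.5)  # stable readings: longer sleep
    return interval

print(next_sleep_interval(prev_value=24.0, new_value=31.5, interval=60))  # -> 30.0
```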
Ultrasonic sinusoidal signal generation (sketch below)
Doppler-effect visualization caused by hand gestures
CNN-based gesture recognition
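A small sketch of the signal side of this pipeline: generating a near-20 kHz sinusoidal carrier and estimating the Doppler shift a moving hand would induce; a spectrogram frame of the received signal would then be the input feature for the gesture CNN. The carrier frequency and sample rate are assumptions.

```python
# Ultrasonic carrier generation and approximate reflected-path Doppler shift.
import numpy as np

FS = 48_000          # sound-card sample rate (Hz), assumed
F_CARRIER = 20_000   # near-inaudible ultrasonic carrier (Hz), assumed

def carrier(duration_s=1.0):
    t = np.arange(int(FS * duration_s)) / FS
    return np.sin(2 * np.pi * F_CARRIER * t)

def doppler_shift(radial_speed_mps, c=343.0):
    # Two-way (reflected) Doppler shift for a hand moving toward the speaker/mic pair
    return 2 * radial_speed_mps / c * F_CARRIER   # ~58 Hz at 0.5 m/s

spectrogram_frame = np.abs(np.fft.rfft(carrier(0.05)))  # one CNN input frame
```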
MQTT IoT protocol using cloud server
Remote home automation IoT device control
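A minimal MQTT sketch for the cloud-relayed device control described above, written against the paho-mqtt 1.x-style callback API; the broker address and topic names are placeholders.

```python
# MQTT publish/subscribe sketch: the device subscribes to a command topic,
# the controller publishes commands through the cloud broker.
import paho.mqtt.client as mqtt

BROKER = "broker.example.com"            # hypothetical cloud MQTT broker
TOPIC_CMD = "home/livingroom/light/set"  # placeholder topic

def on_message(client, userdata, msg):
    # Device side: act on a received command, e.g. toggle a GPIO-driven relay
    print(f"command on {msg.topic}: {msg.payload.decode()}")

client = mqtt.Client()                   # paho-mqtt 1.x-style constructor
client.on_message = on_message
client.connect(BROKER, 1883)
client.subscribe(TOPIC_CMD)
client.publish(TOPIC_CMD, "on")          # controller side: issue a command
client.loop_forever()                    # blocks, dispatching incoming messages
```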
Sensing data acquisition using multiple UAVs
Optimal UAV path planning under energy and time constraints
UAV risk management under geographical conditions
Multiple UAV connectivity control and ad-hoc routing
Bio-inspired AI algorithm implementation: GA (Genetic Algorithm), ACO (Ant Colony Optimization), and PSO (Particle Swarm Optimization)
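A compact GA sketch for the UAV waypoint-ordering problem implied by the items above (visit all sensing points with minimum travel distance); the permutation encoding, operators, and parameters are assumptions rather than the project's actual algorithm.

```python
# Genetic algorithm over waypoint orderings (TSP-style tour length objective).
import random, math

def tour_length(order, points):
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def genetic_path(points, pop=50, gens=200, mut=0.2):
    n = len(points)
    population = [random.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda o: tour_length(o, points))
        parents = population[:pop // 2]                  # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)                 # ordered crossover
            child = a[:cut] + [g for g in b if g not in a[:cut]]
            if random.random() < mut:                    # swap mutation
                i, j = random.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            children.append(child)
        population = parents + children
    return min(population, key=lambda o: tour_length(o, points))

best = genetic_path([(random.random(), random.random()) for _ in range(10)])
```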
CNN-based lane and vanishing-point detection
Q-learning-based autonomous driving implementation on a mobile robot platform (sketch below)
Vehicle platooning implementation using reinforcement learning
DNN-based autonomous driving simulator implementation
Reinforcement learning-based VANET routing simulator
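A tabular Q-learning sketch matching the robot line-follower item above: the state is a discretized lane offset from the camera, the action a steering command. Bin counts, rewards, and hyperparameters are assumptions.

```python
# Tabular Q-learning for discrete steering decisions.
import random
from collections import defaultdict

ACTIONS = ["left", "straight", "right"]
Q = defaultdict(lambda: {a: 0.0 for a in ACTIONS})
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1

def choose_action(state):
    if random.random() < EPS:                 # epsilon-greedy exploration
        return random.choice(ACTIONS)
    return max(Q[state], key=Q[state].get)

def learn(state, action, reward, next_state):
    best_next = max(Q[next_state].values())
    Q[state][action] += ALPHA * (reward + GAMMA * best_next - Q[state][action])

# One illustrative step: offset bin -2 (far left), steering right recenters the robot
learn(state=-2, action="right", reward=+1.0, next_state=-1)
```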
Directional beam-forming for secure and reliable data transmission
On-demand ad-hoc routing between UAVs
Dynamic direction and angle adjustment algorithm implementation (sketch below)
GUI-based FANET simulator implementation
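A simple sketch of the beam-steering update referenced above: point the directional antenna toward a peer UAV's reported position and widen the beam when the pointing error grows. The geometry and the beamwidth rule are assumptions.

```python
# Compute azimuth/elevation toward a peer UAV and adapt the beamwidth.
import math

def steer(own, peer, last_az=None, base_beamwidth=20.0):
    dx, dy, dz = (p - o for p, o in zip(peer, own))
    az = math.degrees(math.atan2(dy, dx))                  # azimuth toward peer
    el = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation toward peer
    error = abs(az - last_az) if last_az is not None else 0.0
    beamwidth = min(60.0, base_beamwidth + error)          # widen if the peer moves fast
    return az, el, beamwidth

az, el, bw = steer(own=(0, 0, 100), peer=(300, 400, 150))
```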
Tactical military network environment configuration
Cognitive radio policy engine implementation
CR tactical cluster-based ad-hoc networking simulator
Dynamic frequency-hopping channel management (sketch below)
Common control channel selection simulator
GA (Genetic Algorithm)-based CR network coexistence management
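A sketch of a seeded hopping-sequence generator for the frequency-hopping item above: cluster members derive the same pseudo-random channel order from a shared seed and skip channels flagged as occupied by primary users. The channel count and the seed-distribution step (e.g., over a common control channel) are assumptions.

```python
# Deterministic, seed-synchronized hopping sequence avoiding blocked channels.
import random

def hop_sequence(shared_seed, n_channels=16, length=50, blocked=frozenset()):
    rng = random.Random(shared_seed)        # same seed -> identical sequence on all nodes
    allowed = [c for c in range(n_channels) if c not in blocked]
    return [rng.choice(allowed) for _ in range(length)]

# Both cluster members compute the same sequence, avoiding sensed channels 3 and 7
seq = hop_sequence(shared_seed=0xC0DE, blocked=frozenset({3, 7}))
```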