Abar:
Discovered a new probability distribution together with its probability density function.
MCP-H:
Introduced a new point process named the "Matérn cluster process with holes at the cluster centers."
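As a rough illustration of the construction, the sketch below samples such a process on a square window: parent points form a homogeneous Poisson process, and daughter points fall uniformly in an annulus around each parent, leaving a hole at the cluster center. The parameter names and values are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_mcp_h(lam_parent, mean_daughters, r_hole, r_cluster, side):
    """Sample a Matern cluster process with a hole of radius r_hole
    around each cluster center, on a [0, side]^2 window.
    Daughters fall uniformly in the annulus [r_hole, r_cluster]."""
    # Parent (cluster-center) points: homogeneous Poisson process.
    n_parents = rng.poisson(lam_parent * side**2)
    parents = rng.uniform(0, side, size=(n_parents, 2))
    points = []
    for p in parents:
        n = rng.poisson(mean_daughters)
        # Uniform in the annulus: radius via inverse CDF, angle uniform.
        u = rng.uniform(size=n)
        r = np.sqrt(r_hole**2 + u * (r_cluster**2 - r_hole**2))
        theta = rng.uniform(0, 2 * np.pi, size=n)
        points.append(p + np.column_stack((r * np.cos(theta),
                                           r * np.sin(theta))))
    return parents, np.vstack(points) if points else np.empty((0, 2))

parents, pts = sample_mcp_h(lam_parent=0.01, mean_daughters=10,
                            r_hole=2.0, r_cluster=10.0, side=100.0)
```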
Clustered Molecular Bio-Nano Networks:
Introduced the novel concept of clustered molecular bio-nano networks for faster cooperative bio-sensing.
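A minimal sketch of the intuition behind cooperative bio-sensing, assuming a simple majority-vote fusion of noisy binary detections within a cluster; the fusion rule, detection probabilities, and function names here are hypothetical stand-ins, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

def cooperative_detect(target_present, n_sensors, p_detect, p_false_alarm):
    """Hypothetical cluster-level decision fusion: each bio-nano sensor
    reports a noisy binary detection; the cluster takes a majority vote."""
    p = p_detect if target_present else p_false_alarm
    votes = rng.random(n_sensors) < p
    return votes.sum() > n_sensors / 2

# A single sensor vs. a cluster of 15 sensors over many trials:
trials = 10_000
single = np.mean([cooperative_detect(True, 1, 0.7, 0.1) for _ in range(trials)])
cluster = np.mean([cooperative_detect(True, 15, 0.7, 0.1) for _ in range(trials)])
print(f"single-sensor detection rate: {single:.3f}, cluster: {cluster:.3f}")
```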
Robust Hierarchical Federated Learning (MultiAirFed, QHetFed):
Proposed a new hierarchical federated learning framework with periodic gradient and model aggregations, demonstrating robustness and improved performance under noisy communication conditions and heterogeneous data distributions.
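The two-timescale structure can be sketched on toy quadratic client losses: gradients are aggregated within each cluster for several local rounds, and cluster models are aggregated globally at a longer period. The losses, aggregation intervals, and noiseless links below are illustrative simplifications; the papers' noisy-channel modeling is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy quadratic losses f_i(w) = 0.5 * ||w - t_i||^2 per client; the
# cluster layout and intervals are illustrative choices.
targets = rng.normal(size=(4, 3, 5))          # 4 clusters x 3 clients x dim 5

def grad(w, t):                               # gradient of the toy loss
    return w - t

w_global = np.zeros(5)
lr, grad_rounds, model_rounds = 0.1, 5, 20
for _ in range(model_rounds):                 # periodic model aggregation
    cluster_models = []
    for c in range(targets.shape[0]):
        w = w_global.copy()
        for _ in range(grad_rounds):          # periodic gradient aggregation
            g = np.mean([grad(w, t) for t in targets[c]], axis=0)
            w -= lr * g                       # cluster-level gradient step
        cluster_models.append(w)
    w_global = np.mean(cluster_models, axis=0)
print(w_global)                               # approaches the mean of all targets
```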
Weighted Over-the-Air Federated Learning (WAFeL):
Introduced a new approach to over-the-air federated learning that goes beyond the established CSIT-aware and blind paradigms by shifting the focus from the widely used fixed aggregation weights to their optimization. This weighted aggregation framework yields simultaneous gains in communication efficiency and in handling computational heterogeneity, requires no prior channel state information, and remains robust with limited antennas. It also achieves markedly lower complexity and resource usage, opening a new research direction in wireless federated learning.
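A minimal sketch of the weighted-aggregation idea, contrasting heterogeneity-aware weights with the fixed 1/K choice. The weights below are only a stand-in for WAFeL's actual weight optimization, and the channel is reduced to additive noise since the scheme assumes no prior CSI.

```python
import numpy as np

rng = np.random.default_rng(3)

K, d = 8, 16                                  # clients, model dimension (assumed)
updates = rng.normal(size=(K, d))             # heterogeneous local updates
local_steps = rng.integers(1, 10, size=K)     # computational heterogeneity

# Aggregation weights as optimization variables; this simple
# heterogeneity-aware choice stands in for WAFeL's optimized weights.
alpha = local_steps / local_steps.sum()

# Over-the-air aggregation: clients pre-scale and transmit simultaneously;
# the server receives the superposition plus noise.
noise = 0.01 * rng.normal(size=d)
aggregate = (alpha[:, None] * updates).sum(axis=0) + noise

fixed = updates.mean(axis=0)                  # the widely used fixed 1/K weights
print(np.linalg.norm(aggregate - fixed))      # the two aggregates differ
```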
Federated Learning via Joint Source-Channel Coding (FedCPU):
Introduced the first federated learning framework based on joint source-channel coding, marking a departure from traditional approaches that treat source and channel coding separately. This unified design leverages lattice coding to both quantize model parameters and ensure compatibility with existing digital hardware, while enabling error-free aggregation over wireless channels. By bridging coding theory and machine learning, this work lays a foundational path not only for federated learning but also for digital over-the-air computation, advancing beyond the limitations of previous analog paradigms.
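The key lattice property this design rests on, closure under addition, can be sketched with the simplest lattice, a scaled integer lattice; the paper's actual lattice code and channel model are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

def lattice_quantize(x, step):
    """Quantize to the scaled integer lattice step * Z^d; a stand-in for
    the paper's lattice code (any lattice is closed under addition)."""
    return step * np.round(x / step)

K, d, step = 5, 8, 0.25                       # illustrative parameters
updates = rng.normal(size=(K, d))
q = np.array([lattice_quantize(u, step) for u in updates])

# Closure under addition: the sum of lattice points is a lattice point,
# so a summed transmission maps back to a valid digital codeword and
# can be decoded without aggregation error.
total = q.sum(axis=0)
assert np.allclose(total, lattice_quantize(total, step))
print(total / K)                              # aggregated (averaged) model
```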
Multi-Layer Hierarchical Federated Learning (MLHFL):
Developed the first general framework for multi-layer hierarchical federated learning with theoretical guarantees, overcoming the traditional limitation of two-layer aggregation in federated learning systems. Our work supports an arbitrary number of aggregation layers and flexible network configurations, fully realizing the potential of hierarchical structures. This opens up new, unexplored capabilities for large-scale hierarchical learning.
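Arbitrary-depth aggregation can be sketched as a recursion over a tree, with leaves holding client models and internal nodes averaging their children; the uniform weights and the three-layer example are assumptions for illustration.

```python
import numpy as np

# An aggregation tree of arbitrary depth: leaves hold client models,
# internal nodes average their children (weights omitted for brevity).
def aggregate(node):
    if isinstance(node, np.ndarray):          # leaf: a client model
        return node
    return np.mean([aggregate(child) for child in node], axis=0)

# Three layers: cloud -> two edge servers -> four clients (shapes assumed).
clients = [np.full(4, v) for v in (1.0, 2.0, 3.0, 4.0)]
tree = [[clients[0], clients[1]], [clients[2], clients[3]]]
print(aggregate(tree))                        # [2.5 2.5 2.5 2.5]
```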
Unified Optimization Integration for Hierarchical Federated Learning (HierFADMM, HierF2ADMM):
Proposed a systematic approach to move beyond the widely used gradient-descent-based methods in federated learning. Our framework enables the integration of diverse optimization algorithms into hierarchical federated learning, allowing the complementary properties of different optimizers to be combined. This novel concept introduces a flexible optimization structure for federated learning, unlocking previously unattainable design possibilities.
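As one concrete instance of plugging a non-gradient optimizer into federated aggregation, the sketch below runs consensus ADMM on toy quadratic client losses, with the hierarchy flattened to a single server for brevity; the losses, penalty parameter, and closed-form local solve are illustrative, not the papers' formulation.

```python
import numpy as np

rng = np.random.default_rng(5)

# Consensus ADMM over toy quadratic client losses f_k(w) = 0.5||w - t_k||^2;
# the closed-form local solve stands in for an arbitrary local optimizer,
# which is the kind of flexibility the framework targets.
K, d, rho = 6, 4, 1.0
t = rng.normal(size=(K, d))                   # per-client optima
z = np.zeros(d)                               # server (consensus) variable
u = np.zeros((K, d))                          # scaled dual variables

for _ in range(50):
    # Local subproblem: argmin_w f_k(w) + (rho/2)||w - z + u_k||^2
    w = (t + rho * (z - u)) / (1 + rho)
    z = (w + u).mean(axis=0)                  # aggregation at the server
    u += w - z                                # dual update at each client

print(np.linalg.norm(z - t.mean(axis=0)))     # converges to the average optimum
```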