Wireless for ML - Over-the-Air Computation for Distributed Learning over Wireless Networks - Seminar @ InterDigital

Post date: Jun 26, 2021 2:47:47 PM

I gave a talk at InterDigital Scientific Seminars on June 25th, 2021.

Here is the abstract:

Over-the-air computation (AirComp) is a concept that exploits the superposition property of the wireless multiple-access channel to perform numerical calculations over the air. With AirComp, co-channel interference, often considered a major issue in wireless communication systems, turns into a useful phenomenon, particularly for addressing the latency and congestion problems in distributed learning scenarios over wireless networks. In this seminar, we focus on one of the well-known distributed learning paradigms, i.e., federated learning (FL), as a use case for a wireless network. With FL, a neural network is trained by using the local datasets at the edge devices (EDs) (e.g., Internet-of-Things (IoT) devices or sensors) without uploading them to a central server. Instead of the local datasets, for each iteration of training, a large number of neural network parameters or gradients from many EDs are transmitted to the edge server (ES) (e.g., a base station or access point) in the uplink for aggregation, which causes a major latency bottleneck for a wireless network with limited radio resources. In this seminar, we discuss state-of-the-art AirComp methods to address this latency problem. We particularly focus on AirComp methods compatible with orthogonal frequency division multiplexing (OFDM). The seminar introduces two new AirComp techniques that achieve reliable low-latency distributed learning in a mobile wireless network by eliminating the need for channel state information (CSI) at the EDs or ES.
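
For readers unfamiliar with AirComp, the sketch below illustrates the basic idea in NumPy: each ED pre-equalizes its local update by inverting its channel, the multiple-access channel superimposes the transmissions, and the ES reads off the averaged update directly from the received signal. This is only an illustrative toy example, not material from the talk; it assumes perfect CSI at the EDs (precisely the requirement the CSI-free techniques discussed in the seminar aim to remove), and all variable names and values are hypothetical.

```python
import numpy as np

# Minimal sketch of AirComp-based aggregation on a single subcarrier,
# assuming perfect CSI at the EDs so each device can invert its channel.
rng = np.random.default_rng(0)
K, d = 10, 5                       # number of edge devices, model dimension
updates = rng.normal(size=(K, d))  # hypothetical local gradients/updates
h = (rng.normal(size=K) + 1j * rng.normal(size=K)) / np.sqrt(2)  # fading

# Each ED scales its update by 1/h_k so the channel gains cancel out.
tx = updates / h[:, None]

# The multiple-access channel superimposes the transmitted signals;
# the ES observes their sum plus receiver noise.
noise = 0.01 * (rng.normal(size=d) + 1j * rng.normal(size=d))
rx = (h[:, None] * tx).sum(axis=0) + noise

# The ES recovers a noisy estimate of the average update in one shot,
# without decoding each ED's transmission separately.
aircomp_avg = rx.real / K
print(np.allclose(aircomp_avg, updates.mean(axis=0), atol=0.01))
```

The point of the toy example is that the aggregation happens in the channel itself: all K devices transmit simultaneously on the same resources, so the uplink cost does not grow with the number of participating EDs.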

The presentation is attached below.