Network Coding: Connections Between Information Theory and Estimation Theory

Connections between information theory and estimation theory are established for coded noisy networks. When the network coding information flow is contaminated by additive white Gaussian noise (AWGN), intimate connections emerge between information measures and estimation measures, namely the mutual information and the minimum mean squared error (MMSE). On the one hand, this extends the I-MMSE identity to a network I-MMSE version in which arbitrary parameters such as the precoding, the decoding, and the network topology play fundamental roles in its characterisation. On the other hand, it shifts the network coding problem from one that relies on a random representation of the encoding matrix to one that is deterministic. In particular, capitalising on gradients of the mutual information with respect to arbitrary network parameters, new Deterministic Linear Network Codes (DLNC) are highlighted [Ghanem19]; these are optimal for deterministic network topologies, where Random Linear Network Coding (RLNC) fails to be optimal.
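For reference, the scalar I-MMSE identity invoked above is the classical result of Guo, Shamai, and Verdu (2005); a standard statement is sketched below in LaTeX, together with its Gaussian-input instance for a linear noisy network. The matrices H (network transfer/topology) and G (precoder) are illustrative notation assumed here and are not necessarily the paper's own.

    % Scalar I-MMSE identity (Guo-Shamai-Verdu, 2005), mutual information in nats:
    \frac{\mathrm{d}}{\mathrm{d}\,\mathsf{snr}}\,
      I\!\left(X;\sqrt{\mathsf{snr}}\,X+N\right)
      = \frac{1}{2}\,\mathsf{mmse}(\mathsf{snr}),
    \qquad N\sim\mathcal{N}(0,1)\ \text{independent of}\ X.

    % Gaussian-input linear network instance (illustrative notation):
    %   y = sqrt(snr) H G x + n,  x ~ N(0, I),  n ~ N(0, I)
    I(\mathbf{x};\mathbf{y})
      = \tfrac{1}{2}\log\det\!\left(\mathbf{I}
        + \mathsf{snr}\,\mathbf{H}\mathbf{G}\mathbf{G}^{\mathsf T}\mathbf{H}^{\mathsf T}\right),
    \qquad
    \frac{\mathrm{d} I}{\mathrm{d}\,\mathsf{snr}}
      = \tfrac{1}{2}\,\mathrm{tr}\!\left(\mathbf{H}\mathbf{G}\,\mathbf{E}\,
        \mathbf{G}^{\mathsf T}\mathbf{H}^{\mathsf T}\right),
    \quad
    \mathbf{E}=\left(\mathbf{I}
      + \mathsf{snr}\,\mathbf{G}^{\mathsf T}\mathbf{H}^{\mathsf T}\mathbf{H}\mathbf{G}\right)^{-1}.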
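As a numerical companion, a minimal Python sketch is given below; it assumes the same illustrative linear Gaussian network model y = sqrt(snr) H G x + n with Gaussian input and verifies the vector I-MMSE relation by finite differences. The names H, G, mutual_info, and mmse_trace are assumptions introduced for illustration, not code from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative linear Gaussian network model (H, G are assumed notation):
    #   y = sqrt(snr) * H @ G @ x + n,  x ~ N(0, I_k),  n ~ N(0, I_m)
    # H : m x t topology/transfer matrix, G : t x k precoder.
    m, t, k = 4, 5, 3
    H = rng.standard_normal((m, t))
    G = rng.standard_normal((t, k))
    A = H @ G  # effective end-to-end channel

    def mutual_info(snr):
        """I(x; y) in nats for Gaussian input: (1/2) log det(I + snr * A A^T)."""
        sign, logdet = np.linalg.slogdet(np.eye(m) + snr * A @ A.T)
        return 0.5 * logdet

    def mmse_trace(snr):
        """(1/2) tr(A E A^T), with MMSE matrix E = (I + snr * A^T A)^{-1}."""
        E = np.linalg.inv(np.eye(k) + snr * A.T @ A)
        return 0.5 * np.trace(A @ E @ A.T)

    # Numerical check of the vector I-MMSE identity: dI/dsnr = (1/2) tr(A E A^T).
    snr, eps = 1.0, 1e-6
    deriv = (mutual_info(snr + eps) - mutual_info(snr - eps)) / (2 * eps)
    print(f"finite-difference dI/dsnr : {deriv:.6f}")
    print(f"(1/2) tr(A E A^T)         : {mmse_trace(snr):.6f}")

The two printed values agree to numerical precision, which is the push-through identity (I + snr A A^T)^{-1} A = A (I + snr A^T A)^{-1} at work; gradients with respect to the precoder or topology matrices follow in the same spirit.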

Keywords: Network Coding, Information Theory, Estimation Theory

[Ghanem19] Ghanem, "Forward and Reciprocal Noisy Coded Networks: Precoding, Topology, and Error Analysis," in Proc. IEEE ICCSPA, 2019.