Partial/Parallel Consensus - Arithmetic Average fusion

Finite Mixture Modeling (FMM)

Just as noise is commonly used to model unknown system inputs, one may use multiple hypotheses or a finite mixture model to deal with an uncertain state-space model or with data association. Meanwhile, consensus may be sought over cross-correlated sensors. All of this drives the need to represent the probability distribution by a finite mixture of properly weighted component distributions, which arithmetically averages the information gained from different models/hypotheses or from different sensors.

We are interested in the convex combination of the single-target probability densities or of the multi-target probability hypothesis densities (PHDs) from multiple sensors, namely a finite mixture of distributions/densities.
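As a minimal numerical sketch of such a convex combination (the densities, weights and variable names below are illustrative assumptions, not taken from any cited paper), the AA fusion of local densities is simply their weighted mixture evaluated pointwise, and the result is itself a valid density:

import numpy as np
from scipy.stats import norm

# Hypothetical local posterior densities from two sensors (1-D Gaussians)
# and normalized fusion weights; all numbers are illustrative only.
local_pdfs = [norm(loc=0.0, scale=1.0), norm(loc=1.5, scale=0.8)]
w = [0.6, 0.4]                                   # convex weights: w >= 0, sum to 1

def aa_density(x):
    # Arithmetic-average (AA) fused density: the weighted finite mixture.
    return sum(wi * d.pdf(x) for wi, d in zip(w, local_pdfs))

x = np.linspace(-4.0, 5.0, 901)
fused = aa_density(x)
print(np.trapz(fused, x))                        # ~1.0: the AA is a valid density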

1. Best Fit of Mixture:

A fusion rule that approaches the true multi-sensor posterior by fitting the mixture of the multiple sensor posteriors, or their first-order moments, using various optimization distances.

See: T. Li, et al., Best Fit of Mixture for Computationally Efficient Multi-Sensor Poisson Multi-Bernoulli Mixture Filtering, Signal Process., vol. 202, 2023, 108739.

Remark 1. When g(X) is the true/target distribution p(X), the above theorem indicates that the mixture average fits the target distribution better, on average, than the component distributions do. This provides an information-theoretic justification for distribution mixing/averaging, whether the information diversity is due to model/association uncertainty or to sensing diversity, and it holds regardless of the mixing/fusion weights. Optimized mixing weights will further accentuate the benefit of fusion.

Remark 2. When the fusion weights are properly designed, the mixture average may fit the target distribution better than even the best component.
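A quick numerical illustration of Remarks 1 and 2 (a sketch with arbitrary 1-D Gaussians; none of the numbers come from the cited paper): by convexity of the Kullback-Leibler divergence in its second argument, the divergence from the target to the mixture never exceeds the weighted average of the divergences to the components:

import numpy as np
from scipy.stats import norm

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def kl(p, q):
    # Numerical KL(p || q) on a grid, densities given as arrays.
    m = p > 1e-300
    return np.sum(p[m] * np.log(p[m] / q[m])) * dx

p  = norm(0.0, 1.0).pdf(x)                 # illustrative target density
g1 = norm(-1.0, 1.2).pdf(x)                # component densities
g2 = norm(1.5, 0.9).pdf(x)

w = 0.5                                    # mixing weight
mix = w * g1 + (1 - w) * g2

print(kl(p, g1), kl(p, g2))                # divergences to the components
print(w * kl(p, g1) + (1 - w) * kl(p, g2)) # their weighted average
print(kl(p, mix))                          # mixture: never above that average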

Since 2019, independent research groups worldwide, from China to Switzerland, Germany, Australia, Italy, South Korea, Sweden, the U.S. and Israel, have reported interesting theoretical properties and experimental advantages of the arithmetic average (AA) fusion (or its equivalent under another name) in comparison with the prevailing geometric average (GA) fusion approach, known as GCI or EMD, especially in the context of RFS fusion for decentralized target tracking; see the following references in addition to those of mine at the end of this page.

This unexpected resonance demonstrates that the AA fusion is not only effective for RFS fusion but also superior to its competitor, especially in dealing with frequent missed detections, closely spaced targets and large numbers of sensors, and in enabling homogeneous and heterogeneous RFS fusion based on the unified PHD-AA consistency, not to mention faster/online computing. These findings, based on rigorous theoretical and careful experimental studies, overturn some earlier influential yet superficial criticisms of the AA fusion.

The above Fréchet mean property of the AA and GA in the context of multi-target densities (and of RFS processes), though well known in pure mathematics in the context of random variables, was first pointed out in the following paper:

Remark 3. There is still no proper distance/divergence for LRFS densities with different, discrete labels. As such, the above minimization cannot in general be applied directly to labeled RFS densities; yet this constraint has often been violated in the literature.

See discussion given in: T. Li, Arithmetic Average Density Fusion - Part II: Unified Derivation For Unlabeled and Labeled RFS Fusion.  IEEE Transactions on Aerospace and Electronic Systems, vol. 60, no. 3, pp. 3255-3268, June 2024  @IEEE Xplore 

2. Partial Consensus --- Many Could be Better than All

We propose a novel consensus notion, called "partial consensus", for distributed Gaussian mixture probability hypothesis density fusion over a decentralized sensor network, in which only highly weighted Gaussian components (GCs) are exchanged and fused across neighboring sensors. It is shown that this not only gains high efficiency in both network communication and fusion computation but also significantly mitigates the effects of clutter and missed detections.
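A minimal sketch of the partial-consensus mechanics (the threshold, GM parameters and helper names are hypothetical, for illustration only): each sensor shares only its highly weighted GCs, and the received components are arithmetically averaged into the local GM-PHD by weight scaling and concatenation:

import numpy as np

# A GM-PHD is a list of (weight, mean, cov); the weights sum to the expected
# number of targets, not to one. All parameters below are illustrative.
def top_components(gm, t_share=0.1):
    # Partial consensus: exchange only components whose weight exceeds a threshold.
    return [(w, m, P) for (w, m, P) in gm if w > t_share]

def aa_fuse(local_gm, received_gms, omega):
    # AA fusion of GM-PHDs: scale each GM by its fusion weight and concatenate.
    fused = [(omega[0] * w, m, P) for (w, m, P) in local_gm]
    for k, gm in enumerate(received_gms, start=1):
        fused += [(omega[k] * w, m, P) for (w, m, P) in gm]
    return fused                     # merging/pruning of close GCs would follow

I2 = np.eye(2)
gm_a = [(0.90, np.array([0.0, 0.0]), I2), (0.05, np.array([5.0, 5.0]), I2)]
gm_b = [(0.80, np.array([0.2, 0.1]), I2), (0.04, np.array([9.0, 9.0]), I2)]

fused = aa_fuse(gm_a, [top_components(gm_b)], omega=[0.5, 0.5])
print(sum(w for w, _, _ in fused))   # fused expected number of targets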


The preliminary work for the above journal publication, and also the earliest in our series of works in this direction, is the following conference paper, which proposes the original idea of "combining" PHDs (and then merging or averaging them) for fusion:
T. Li, J.M. Corchado and S. Sun, On Generalized Covariance Intersection for Distributed PHD Filtering and a Simple but Better Alternative, FUSION’17 Xi’an, July 10-13, 2017  @ IEEE Xplore
Fig.2. Unweighted AA and GA of two weighted GMs. The dashed ellipse indicates the position of a potential target.

3. Parallel Consensus -- parallelization of filtering and communication/fusion operations

The first ever distributed filter based on a parallel filtering-communication/fusion mode!

We propose a particle-based distributed PHD filter for tracking an unknown, time-varying number of targets. To reduce communication, the local PHD filters at neighboring sensors communicate Gaussian mixture (GM) parameters. In contrast to most existing distributed PHD filters, our filter employs an "arithmetic average" fusion. For the particle-to-GM conversion, we use a method that avoids particle clustering and enables a significance-based pruning of the GM components. For the GM-to-particle conversion, we develop an importance-sampling-based method that enables a parallelization of the filtering and dissemination/fusion operations. The proposed distributed particle-PHD filter is able to integrate GM-based local PHD filters. Simulations demonstrate the excellent performance and small communication and computation requirements of our filter.
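The following is a rough sketch of the GM-to-particle step via importance sampling (the proposal choice, sample size and variable names are assumptions, not the paper's exact algorithm): particles are drawn from a broadened proposal mixture and re-weighted against the GM-PHD intensity, so that the particle weights sum approximately to the expected number of targets:

import numpy as np
from scipy.stats import multivariate_normal as mvn

def gm_eval(x, comps):
    # Evaluate a Gaussian mixture given as (weight, mean, cov) triples at points x.
    return sum(w * mvn(mean=m, cov=P).pdf(x) for (w, m, P) in comps)

def gm_to_particles(gm, n=2000, seed=0):
    # Draw particles from a broadened proposal mixture and weight them against
    # the GM-PHD intensity; the weights then sum to roughly the intensity mass.
    rng = np.random.default_rng(seed)
    w_gm = np.array([w for (w, _, _) in gm])
    probs = w_gm / w_gm.sum()                               # proposal mixing probabilities
    proposal = [(p, m, 2.0 * P) for p, (_, m, P) in zip(probs, gm)]  # inflated covariances
    idx = rng.choice(len(gm), size=n, p=probs)
    x = np.array([rng.multivariate_normal(proposal[i][1], proposal[i][2]) for i in idx])
    w = gm_eval(x, gm) / (n * gm_eval(x, proposal))
    return x, w

gm = [(0.9, np.zeros(2), np.eye(2)), (0.6, np.array([4.0, 4.0]), np.eye(2))]
pts, wts = gm_to_particles(gm)
print(wts.sum())                               # close to 0.9 + 0.6 = 1.5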

See also our attempt based on the GM filter:

Comparison between serial (a) and parallel (b) filtering-communication modes

Thanks to the importance sampling approach, the particle implementation achieves a higher degree of parallel filtering-communication than the GM implementation does. Further investigation is certainly of significance for other distributed filter designs, simply because the length of the green block (the time available for communication) is severely limited in practice!


The existing serial filtering-communication/consensus protocol is simply too idealized to be practically useful!

Average consensus is a good idea, but it has to be adapted to a real-time mode to be useful for online tracking.
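For reference, a bare-bones average consensus iteration over scalar statistics (the 4-node topology and step size are made up for illustration); in practice only a few such iterations fit into the real-time window between measurement updates, which is exactly why the protocol must be adapted for online tracking:

import numpy as np

# Hypothetical 4-node ring network (adjacency matrix) and local statistics.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([1.0, 3.0, 5.0, 7.0])             # local values; the network average is 4.0

eps = 0.3                                       # small step size, chosen for stability here
L = np.diag(A.sum(axis=1)) - A                  # graph Laplacian
W = np.eye(4) - eps * L                         # consensus weight matrix

for it in range(3):                             # only a few iterations fit in real time
    x = W @ x
    print(it + 1, x)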

4. More Realistic and Useful Implementations of AA Fusion:

See also works in which different communication protocols are employed:

5. AA Density Fusion - Part I: Statistical and Information-Theoretic Results

In the Bayesian inference framework, the quality of estimation should be assessed with regard to the posterior distribution, not merely a point estimate. Even so, the AA fusion has close connections with the well-known covariance intersection, covariance union, and so on.
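To make the connection concrete, here is a small comparison (illustrative numbers only) of fusing two Gaussian estimates by covariance intersection and by the AA rule, where the AA moments follow the standard mixture mean and covariance formulas:

import numpy as np

# Two hypothetical local Gaussian estimates of the same 2-D state.
m1, P1 = np.array([0.0, 0.0]), np.diag([1.0, 4.0])
m2, P2 = np.array([1.0, 0.5]), np.diag([4.0, 1.0])
w = 0.5                                         # fusion weight for sensor 1

# Covariance intersection (CI): convex combination in the information space.
P_ci = np.linalg.inv(w * np.linalg.inv(P1) + (1 - w) * np.linalg.inv(P2))
m_ci = P_ci @ (w * np.linalg.inv(P1) @ m1 + (1 - w) * np.linalg.inv(P2) @ m2)

# AA fusion: mean and covariance of the weighted two-component mixture.
m_aa = w * m1 + (1 - w) * m2
spread = lambda m: np.outer(m - m_aa, m - m_aa)
P_aa = w * (P1 + spread(m1)) + (1 - w) * (P2 + spread(m2))

print("CI:", m_ci, P_ci, sep="\n")
print("AA:", m_aa, P_aa, sep="\n")              # the AA covariance includes the spread of means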


The AA fusion works well with the benchmark Kalman filter and the Student's t filter!

See: 

It is reasonable yet simple to approximate the KL divergence between Student's t distributions by that between the corresponding Gaussian distributions.
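For instance, a sketch of this approximation (assuming degrees of freedom greater than 2 and using moment-matched covariances, which is one common convention rather than necessarily the paper's prescription):

import numpy as np

def gauss_kl(m0, P0, m1, P1):
    # Closed-form KL( N(m0, P0) || N(m1, P1) ).
    d = len(m0)
    P1_inv = np.linalg.inv(P1)
    dm = m1 - m0
    return 0.5 * (np.trace(P1_inv @ P0) + dm @ P1_inv @ dm - d
                  + np.log(np.linalg.det(P1) / np.linalg.det(P0)))

def t_kl_approx(m0, S0, v0, m1, S1, v1):
    # Proxy for the KL divergence between two Student's t densities: the KL between
    # Gaussians with the same locations and moment-matched covariances (dof > 2).
    return gauss_kl(m0, v0 / (v0 - 2.0) * S0, m1, v1 / (v1 - 2.0) * S1)

# Illustrative 2-D example.
m_a, S_a, v_a = np.zeros(2), np.eye(2), 5.0
m_b, S_b, v_b = np.array([1.0, 0.0]), 1.5 * np.eye(2), 7.0
print(t_kl_approx(m_a, S_a, v_a, m_b, S_b, v_b))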

6. AA Density Fusion - Part II: Unified Derivation for (unlabeled and labeled) RFS Fusion

The arithmetic average (AA) fusion is a fundamental information fusion methodology which has recently demonstrated great performance for multi-sensor tracking of a random number of objects based on the random finite set (RFS) theory. Since the multi-object set cardinality (namely the number of objects) and even the object identities need to be estimated jointly with the multi-object states, the AA fusion has to be tailored in various specific ways to accommodate different RFS filters. All of these variants can be strictly derived from the same unlabeled/labeled probability hypothesis density (PHD) averaging formulation, which ensures consistency in the (labeled) PHD estimation. In this paper, we first explain how the (labeled) PHD-consistency/AA fusion can lead to more accurate and robust detection and localization of the present objects, which provides a theoretically solid and physically meaningful rationale for fusion. Then, we derive and analyze the formulas of suitable RFS-AA fusion for different (labeled) RFS filters on the same basis of (labeled) PHD-AA consistency. These derivations are exact and need no approximation.

See:

T. Li, Arithmetic Average Density Fusion - Part II: Unified Derivation For Unlabeled and Labeled RFS Fusion. IEEE Transactions on Aerospace and Electronic Systems, vol. 60, no. 3, pp. 3255-3268, June 2024  @IEEE Xplore 
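As a small sanity check of the PHD-AA consistency discussed above (with made-up GM-PHD parameters), the expected cardinality of the AA-fused PHD equals the weighted average of the local expected cardinalities, since integration is linear:

import numpy as np

# Local GM-PHDs from two sensors as (weight, mean, cov) lists; the weight sums
# are the local expected numbers of targets. All numbers are illustrative.
I2 = np.eye(2)
phd_1 = [(0.9, np.zeros(2), I2), (0.8, np.array([5.0, 5.0]), I2)]
phd_2 = [(1.0, np.array([0.1, 0.2]), I2), (0.3, np.array([9.0, 9.0]), I2)]
omega = [0.6, 0.4]                                      # fusion weights

card_local = [sum(w for w, _, _ in phd) for phd in (phd_1, phd_2)]
phd_aa = [(omega[0] * w, m, P) for (w, m, P) in phd_1] \
       + [(omega[1] * w, m, P) for (w, m, P) in phd_2]
card_aa = sum(w for w, _, _ in phd_aa)

print(card_local)                                       # [1.7, 1.3]
print(card_aa, omega[0] * card_local[0] + omega[1] * card_local[1])   # both 1.54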

Realizations are shown in the following papers on heterogeneous fusion.

The above MSE^s is defined with regard to the probability volume of any local area s.

7. AA Density Fusion - Part III and IV: Heterogeneous Unlabeled and Labeled RFS Density Fusion

Thanks to the above fact that all AA-RFS filters are built on averaging their relevant unlabeled/labeled probability hypothesis densities (PHDs), the following paper proposes the first ever heterogeneous unlabeled and labeled RFS filter cooperation approach, based on Gaussian mixture implementations in which the local Gaussian components (L-GCs) are optimized so that the resulting unlabeled PHDs best fit their AA, regardless of the specific type of the local densities. For illustration, a computationally efficient, approximate approach is proposed which revises only the weights of the L-GCs, keeping their other parameters unchanged. In particular, the PHD filter and the unlabeled and labeled multi-Bernoulli (MB/LMB) filters are considered. Simulations have demonstrated the effectiveness of the proposed approach for both homogeneous and heterogeneous fusion of the PHD, MB and LMB filters in different configurations.
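A rough sketch of the weight-only revision idea (a grid-based non-negative least-squares fit here; the grid, solver and all parameters are illustrative assumptions and not the paper's algorithm): keep the means and covariances of the local GCs fixed and re-solve only their weights so that the local PHD best matches the AA of all local PHDs:

import numpy as np
from scipy.stats import norm
from scipy.optimize import nnls

# 1-D illustrative example: two local GM-PHDs given as (weight, mean, std).
phd_a = [(0.9, 0.0, 1.0), (0.5, 6.0, 1.0)]
phd_b = [(0.7, 0.5, 1.2), (0.6, 5.5, 0.8)]
omega = [0.5, 0.5]

x = np.linspace(-5.0, 12.0, 600)                 # evaluation grid

def phd_eval(phd):
    return sum(w * norm(m, s).pdf(x) for (w, m, s) in phd)

aa = omega[0] * phd_eval(phd_a) + omega[1] * phd_eval(phd_b)

# Revise only sensor a's component weights by a non-negative least-squares fit
# to the AA, keeping the means and covariances of its components unchanged.
basis = np.column_stack([norm(m, s).pdf(x) for (_, m, s) in phd_a])
new_w, _ = nnls(basis, aa)
print("old weights:", [w for w, _, _ in phd_a], "revised weights:", new_w)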

See:

T. Li, Arithmetic Average Density Fusion - Part III: Heterogeneous Unlabeled and Labeled RFS Filter Fusion,  IEEE Transactions on Aerospace and Electronic Systems, vol. 60, no. 1, pp. 1023-1034, Feb. 2024 @IEEE Xplore 

T. Li, Arithmetic Average Density Fusion - Part IV: Distributed Heterogeneous Unlabeled and Labeled RFS Filter Fusion via Variational Approximation, submitted to IEEE TSP, @arXiv.

Overview:  Beyond AA! 

From the celebrated Gaussian mixture and model averaging estimators to the cutting-edge multi-Bernoulli mixtures of various forms, ...