We analyze the connection between minimizers with good generalization properties and high local entropy regions of a threshold-linear classifier in Gaussian mixtures with the mean squared error loss function.
Local entropy is in general a difficult quantity to measure. Here we use the technique of replicating the estimator y times, coupling the replicas together by fixing their distance from a central replica, and using that central replica as the new estimator. The spirit is similar to ensemble learning, but our scheme allows us to control which states the equilibrium measure is biased towards.
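As a minimal sketch of this construction (the notation here is illustrative: $\mathcal{L}$ denotes a generic loss, $\beta$ an inverse temperature, $d(\cdot,\cdot)$ a distance, and $D$ the fixed replica-center distance; these symbols are assumptions, not necessarily the paper's), the biased measure over the central replica $\tilde{\mathbf{w}}$ can be written as
$$
P(\tilde{\mathbf{w}}) \;\propto\; \int \prod_{a=1}^{y} d\mathbf{w}^{a}\,
e^{-\beta \sum_{a=1}^{y} \mathcal{L}(\mathbf{w}^{a})}\,
\prod_{a=1}^{y} \delta\!\left( d(\mathbf{w}^{a}, \tilde{\mathbf{w}}) - D \right),
$$
so that increasing $y$ concentrates the measure on configurations $\tilde{\mathbf{w}}$ surrounded, within distance $D$, by many low-loss replicas, i.e. on high local entropy regions.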