The ML degree of a toric variety (discrete exponential model) is at most its degree. A toric variety twisted by generic weights achieves this upper bound. The locus of nongeneric weights that give rise to toric models with lower ML degree is the zero locus of the principal A-determinant (see here). Based on the principal A-determinant, characterize the ML degree stratification of all (smooth) toric surfaces (see this for a start); do the same for Segre products of projective spaces (the varieties of rank one tensors, i.e., independence models) and, more generally, for decomposable graphical models (see here), as well as for Veronese varieties. What does this stratification have to do with strict linear precision? Characterize all weights for the above families that bring the ML degree down to one (see this and this). Compute the ML degree stratification for the toric variety given by an associahedron; in particular, compute the minimum ML degree achievable. See Section 7.1 here for a concrete conjecture, and read this on scattering amplitudes and likelihood geometry for context.
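To see the weight dependence in the smallest example, here is a minimal sympy sketch; the model, the weights, and the data vector u = (4, 7, 5) are chosen purely for illustration. For the conic parametrized by p(s) proportional to (c0*s^2, c1*s, c2), generic weights give ML degree 2, equal to the degree, while the binomial weights (1, 2, 1) drop it to 1.

```python
import sympy as sp

# The second Veronese of P^1 twisted by weights c = (c0, c1, c2):
# a one-parameter toric model with p(s) proportional to (c0*s**2, c1*s, c2).
# Its ML degree is the number of complex critical points of the
# log-likelihood for generic counts u = (u0, u1, u2).
s = sp.symbols('s')
u0, u1, u2 = 4, 7, 5                     # an illustrative (generic) data vector
n = u0 + u1 + u2

def ml_degree(c):
    c0, c1, c2 = c
    p = [c0*s**2, c1*s, c2]
    Z = sum(p)                           # the normalizing constant
    L = u0*sp.log(p[0]) + u1*sp.log(p[1]) + u2*sp.log(p[2]) - n*sp.log(Z)
    num = sp.numer(sp.together(sp.diff(L, s)))   # critical equation, cleared of denominators
    roots = sp.solve(sp.expand(num), s)
    # keep only genuine critical points: discard roots where Z or a coordinate of p vanishes
    return len([r for r in roots if r != 0 and Z.subs(s, r) != 0])

print(ml_degree((1, 1, 1)))   # generic weights: ML degree 2, the degree of the conic
print(ml_degree((1, 2, 1)))   # binomial weights: the ML degree drops to 1
```

Consistently with the statement above, the weights (1, 2, 1) satisfy c1^2 - 4*c0*c2 = 0, so they lie on the zero locus of the principal A-determinant of the segment [0, 2].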
Is there a dual maximum likelihood estimator, a type of M-estimator (see Section 3 here), for toric varieties? If so, is there a dual ML degree? Take all the questions that have been asked (and in some cases answered) for the ML degree of toric varieties, and ask and answer them for the dual ML degree.
The set of m x n matrices of rank at most r is an algebraic variety defined by determinantal (polynomial) equations. The ML degree of such varieties has been studied very carefully here and here, though we still do not know the ML degree of these determinantal varieties in general. A (statistically) more interesting set consists of m x n matrices of nonnegative rank at most r. Such a matrix M has nonnegative entries and factors as M = AB, where A is m x r and B is r x n, both with nonnegative entries. The likelihood optimization problem over this set requires understanding its algebraic boundary inside the variety of m x n matrices of rank at most r. These algebraic boundaries are unions of algebraic varieties about which we know very little. The analogous problem can be stated for tensors. For some initial work see this, this, and this. Start with small values of m, n, and r (or tensors of small size) and compute/characterize the components of these algebraic boundaries. Compute the ML degree of these components.
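Since so little is known about these boundaries, even a numerical probe is useful. Below is a minimal numpy sketch, not a definitive membership test: it searches for a nonnegative factorization with inner dimension r via Lee-Seung multiplicative updates (the test matrix, iteration count, and seed are chosen only for illustration) and contrasts ordinary rank with nonnegative rank on the slack matrix of a square, a matrix of rank 3 whose nonnegative rank is 4.

```python
import numpy as np

# Heuristic: search for M ≈ A @ B with A (m x r) and B (r x n) entrywise
# nonnegative, using Lee-Seung multiplicative updates for the Frobenius error.
# A residual that stays away from zero suggests (but does not prove) that the
# nonnegative rank of M exceeds r.
def nmf_residual(M, r, iters=20000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = M.shape
    A = rng.random((m, r)) + 0.1
    B = rng.random((r, n)) + 0.1
    eps = 1e-12                                  # guards against division by zero
    for _ in range(iters):
        A *= (M @ B.T) / (A @ B @ B.T + eps)
        B *= (A.T @ M) / (A.T @ A @ B + eps)
    return np.linalg.norm(M - A @ B) / np.linalg.norm(M)

# Slack matrix of a square: rank 3, but nonnegative rank 4.
M = np.array([[0., 0., 1., 1.],
              [1., 0., 0., 1.],
              [1., 1., 0., 0.],
              [0., 1., 1., 0.]])
print(np.linalg.matrix_rank(M))   # 3
print(nmf_residual(M, 3))         # nonzero: no exact nonnegative factorization with r = 3 exists
print(nmf_residual(M, 4))         # typically near zero, since M itself has nonnegative rank 4
```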
An m-dimensional centered Gaussian distribution is characterized by its covariance matrix, an m x m symmetric positive definite matrix. Algebraic Gaussian models are defined by polynomial equations in the entries of these covariance matrices. Again, the ML degree of such a model is the number of complex critical points of the Gaussian likelihood function over the model. For a good introduction see Section 2.1 of this. There has been an amazing amount of work on the ML degree and the likelihood geometry of Gaussian models such as concentration models (see also the following important papers: 1, 2, 3). An outstanding open problem asks for a characterization of algebraic Gaussian models with ML degree one. This might be too hard. If so, how about characterizing one-dimensional Gaussian models with ML degree one (see this for the discrete case)?
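To make the critical equations concrete, here is a minimal sympy sketch for one of the simplest one-dimensional Gaussian models, the bivariate correlation model Sigma(t) = [[1, t], [t, 1]]; the sample covariance entries are chosen only for illustration. Clearing denominators in the likelihood equation gives a cubic in t, so this particular model has ML degree 3, not 1.

```python
import sympy as sp

# Bivariate correlation model: covariance matrices Sigma(t) = [[1, t], [t, 1]].
# Up to constants, the Gaussian log-likelihood for a sample covariance S is
#     l(t) = -log det Sigma(t) - trace(S * Sigma(t)^(-1)).
t = sp.symbols('t')
s11, s12, s22 = 3, 1, 2                          # an illustrative sample covariance
Sigma = sp.Matrix([[1, t], [t, 1]])
S = sp.Matrix([[s11, s12], [s12, s22]])
L = -sp.log(Sigma.det()) - (S * Sigma.inv()).trace()
crit = sp.Poly(sp.numer(sp.cancel(sp.diff(L, t))), t)   # cleared critical equation
# Neither t = 1 nor t = -1 (where det Sigma vanishes) is a root, so every root
# of this cubic is a critical point on the model: the ML degree is 3.
assert crit.eval(1) != 0 and crit.eval(-1) != 0
print(crit.degree())                             # 3
```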
Every maximum likelihood problem starts with data and a model X. For generic data, the number of complex critical points of the likelihood function is the ML degree mld(X). As the data varies smoothly, these critical points move smoothly. If one takes a closed path in data space and moves along this path from the starting point back to itself, the critical points are permuted. These permutations make up the monodromy group, a subgroup of the symmetric group on mld(X) elements. Computing and understanding monodromy groups is a crucial task in numerical algebraic geometry (see here). Compute monodromy groups in the likelihood geometry of algebraic statistical models. Characterize the models whose monodromy group is not the full symmetric group, and find geometric reasons why this happens.
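As a toy illustration of such a monodromy computation, here is a numpy-only sketch; the model, the weight choice, and the loop center and radius are all chosen for illustration. For the twisted conic of the first problem with generic weights, mld = 2, and the two critical points are the roots of an explicit quadratic whose coefficients depend on the data u. Tracking them around a loop in (complexified) data space that encircles a branch point of this quadratic produces a transposition, so the monodromy group of this small model is the full symmetric group S_2.

```python
import numpy as np

# Toric model p(s) proportional to (s**2, s, 1) (generic weights), ML degree 2.
# For counts u = (u0, u1, u2), with n = u0 + u1 + u2 and a = 2*u0 + u1, the
# critical points of the log-likelihood are the roots of
#     (a - 2n)*s**2 + (a - n)*s + a = 0.
def critical_poly(u0, u1=1.0, u2=1.0):
    n = u0 + u1 + u2
    a = 2*u0 + u1
    return np.array([a - 2*n, a - n, a])

# With u1 = u2 = 1 the discriminant (a - n)**2 - 4*a*(a - 2n), as a polynomial
# in u0, equals u0**2 + 22*u0 + 13; its zeros are the branch points of the roots.
branch = np.roots([1.0, 22.0, 13.0])
center = branch[np.argmax(branch.real)]          # the branch point near u0 = -0.61

# Walk u0 once around a small loop encircling this branch point and track the
# two critical points by nearest-neighbour matching at each step.
radius, steps = 0.3, 400
start = np.roots(critical_poly(center + radius))
s_curr = start.copy()
for k in range(1, steps + 1):
    u0 = center + radius*np.exp(2j*np.pi*k/steps)
    s_new = np.roots(critical_poly(u0))
    # keep the labelling that moves each tracked point the least
    if (abs(s_new[0] - s_curr[0]) + abs(s_new[1] - s_curr[1]) >
            abs(s_new[1] - s_curr[0]) + abs(s_new[0] - s_curr[1])):
        s_new = s_new[::-1]
    s_curr = s_new

# The loop permutes the critical points; encircling a simple branch point
# swaps them, giving the transposition [1, 0].
perm = [int(np.argmin(abs(start - s))) for s in s_curr]
print("monodromy permutation:", perm)
```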