Abstract: The sample maximum likelihood (SML) method is frequently used to identify errors-in-variables (EIV) systems. It generates the estimate by minimizing a cost function built on the averaged input-output data and the sample noise variances. To help gradient-based algorithms overcome local convergence, we examine the attraction domains of the SML cost. It is shown in this paper that the asymptotic convergence properties of the objective can be studied equivalently through its noiseless version. Moreover, we present some special attraction domains that contain the global minimum under certain model structures. For these particular models, initializing the algorithm within such a domain leads it to the global minimum.
Keywords: EIV system; Maximum likelihood estimation; Gradient-based optimization; Global and local convergence; Attraction domain
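The initialization idea in the abstract can be illustrated on a toy problem. The sketch below is purely hypothetical (the cost, gradient, and step size are stand-ins, not the actual SML objective): gradient descent started inside the attraction domain of the global minimum reaches it, while a start in the other basin is trapped at a local minimum.

```python
def cost(theta):
    # Hypothetical non-convex 1-D cost standing in for the SML objective:
    # global minimum near theta = -1.04, local minimum near theta = +0.96.
    return (theta**2 - 1.0) ** 2 + 0.3 * theta

def grad(theta):
    # Analytic derivative of the toy cost above.
    return 4.0 * theta * (theta**2 - 1.0) + 0.3

def gradient_descent(theta0, step=0.02, iters=2000):
    # Plain fixed-step gradient descent from the initial point theta0.
    theta = theta0
    for _ in range(iters):
        theta -= step * grad(theta)
    return theta

# Initialization inside the global minimum's attraction domain ...
theta_good = gradient_descent(-2.0)
# ... versus initialization in the basin of the local minimum.
theta_bad = gradient_descent(2.0)
```

Both runs converge, but only the first finds the global minimum, which is why characterizing an attraction domain that contains the global minimum makes careful initialization effective.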