Abstract: Minimum divergence estimators have proved to be useful tools in the area of robust inference. The robustness of such estimators is often measured through the classical influence function analysis. However, many complex situations, such as testing a composite null hypothesis, require the estimators to be restricted to a proper subspace of the parameter space. The robustness of these restricted minimum divergence estimators is crucial for achieving overall robust inference. In this paper, we provide a comprehensive description of the robustness of such restricted estimators, in terms of their influence functions, for a general class of density-based divergences, along with that of their unrestricted versions. In particular, the robustness of several popular minimum divergence estimators is also demonstrated under certain common restrictions through examples. The paper thus provides a general framework for the influence function analysis of a large class of minimum divergence estimators, with or without restrictions on the parameters, and offers theoretical tools for measuring the impact of parameter restrictions on the robustness of the corresponding estimators.