Cell dwell time (DT) and unencumbered interruption time (IT) are fundamental time-interval variables in the teletraffic analysis and performance evaluation of mobile cellular networks. Although a diverse set of general distributions has been proposed to model these time-interval variables, the effect of their moments beyond the expected value on system performance has not been reported in the literature. In this paper, the sensitivity of teletraffic performance metrics of mobile cellular networks to the first three standardized moments of both DT and IT is investigated in a comprehensive manner. A mathematical analysis is developed under the assumption that both DT and IT are phase-type distributed random variables. Extensive numerical results are provided to quantify the dependence of system-level performance metrics on the values of the first three standardized moments of both DT and IT. For instance, for a high-mobility scenario in which DT is modeled by a hyper-Erlang distribution, we found that the call forced termination probability decreases by around 60% as the coefficient of variation (CoV) and skewness of DT simultaneously change from 1 to 20 and from 60 to 2, respectively. The numerical results also confirm that as link unreliability increases, the forced termination probability increases while both the new call blocking and handoff failure probabilities decrease. In addition, the results indicate that, for low values of skewness, the performance metrics are highly sensitive to changes in the CoV of either IT or DT. In general, system performance is observed to be more sensitive to the statistics of IT than to those of DT. Such an understanding of teletraffic engineering issues is vital for planning, designing, dimensioning, and optimizing mobile cellular networks.
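As a point of reference for the standardized moments discussed above, the mean, CoV, and skewness of a hyper-Erlang distribution (a mixture of Erlang branches, one of the phase-type models mentioned for DT) can be computed in closed form from its raw moments. The short Python sketch below illustrates that computation only; the branch weights, shapes, and rates in the example are arbitrary illustrative values and are not parameters taken from this paper.

import math

def hyper_erlang_raw_moments(alphas, ks, lambdas):
    # Raw moments E[X^n], n = 1..3, of a hyper-Erlang mixture.
    # alphas: branch probabilities (sum to 1); ks: Erlang shapes; lambdas: Erlang rates.
    def erlang_raw(n, k, lam):
        # E[X^n] for Erlang(k, lam) equals k(k+1)...(k+n-1) / lam^n
        return math.prod(k + i for i in range(n)) / lam ** n
    return [sum(a * erlang_raw(n, k, lam) for a, k, lam in zip(alphas, ks, lambdas))
            for n in (1, 2, 3)]

def standardized_moments(alphas, ks, lambdas):
    # Mean, coefficient of variation, and skewness of the hyper-Erlang mixture.
    m1, m2, m3 = hyper_erlang_raw_moments(alphas, ks, lambdas)
    var = m2 - m1 ** 2
    std = math.sqrt(var)
    return m1, std / m1, (m3 - 3 * m1 * var - m1 ** 3) / std ** 3

# Illustrative two-branch hyper-Erlang (parameter values are assumptions, not from the paper)
mean, cov, skew = standardized_moments(alphas=[0.9, 0.1], ks=[2, 1], lambdas=[2.0, 0.05])
print(f"mean = {mean:.3f}, CoV = {cov:.3f}, skewness = {skew:.3f}")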