In this paper, we consider a linear regression model whose error terms follow a multivariate $t$ distribution, and we examine the effects of the departure from normality of the error terms on the exact distributions of the coefficient of determination (say, $R^2$) and the adjusted coefficient of determination (say, $\bar{R}^2$). We derive exact formulas for the density function, the distribution function, and the $m$-th moment, and we conduct a numerical analysis based on these formulas. It is shown that the upward bias of $R^2$ becomes serious and the standard error of $R^2$ becomes large as the degrees of freedom of the multivariate $t$ error distribution (say, $\nu_0$) get small. The confidence intervals of $R^2$ and $\bar{R}^2$ are also examined, and it is shown that when the values of $\nu_0$ and the parent coefficient of determination (say, $\Phi$) are small, the upper confidence limits are very large relative to the value of $\Phi$.
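For concreteness, the display below sketches the standard definitions behind the quantities named above; these are not restated in the abstract and are given here only as common conventions, assuming a model $y = X\beta + \varepsilon$ with $n$ observations and $k$ regressors including a constant term (the paper's own parameterization, in particular that of $\Phi$, may differ):
$$
R^2 \;=\; 1 - \frac{\hat{\varepsilon}'\hat{\varepsilon}}{\sum_{i=1}^{n}(y_i - \bar{y})^2},
\qquad
\bar{R}^2 \;=\; 1 - \frac{n-1}{n-k}\,\bigl(1 - R^2\bigr),
$$
where $\hat{\varepsilon}$ denotes the OLS residual vector. A multivariate $t$ error vector with $\nu_0$ degrees of freedom admits the usual scale-mixture representation $\varepsilon = u/\sqrt{w/\nu_0}$, with $u \sim N(0, \sigma^2 I_n)$ independent of $w \sim \chi^2_{\nu_0}$, which reduces to the normal error model as $\nu_0 \to \infty$.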