Why does heteroskedasticity distort the results of a regression analysis?
Because regression analysis relies on the assumption that the residuals are all from the same normal distribution (with the same variance), and evidence of heteroskedasticity shows this assumption is not valid.
Heteroskedasticity (literally "differing variance") is the condition in which the variance of the response variable changes depending on the value of the input variable.
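A minimal sketch of what this looks like, using made-up data: simulate a linear relationship whose noise standard deviation grows with x, then compare the spread of the response at a low x value against a high one. All numbers here are hypothetical, chosen only to make the pattern visible.

```python
import random
import statistics

random.seed(42)

# True relationship: y = 2 + 3x, but the noise sd grows with x
# (sd = 0.5 near x = 1, sd = 5.0 near x = 10) -- a classic
# heteroskedastic pattern.
low_x = [random.gauss(2 + 3 * 1.0, 0.5) for _ in range(500)]
high_x = [random.gauss(2 + 3 * 10.0, 5.0) for _ in range(500)]

print(statistics.stdev(low_x))   # spread of the response near x = 1
print(statistics.stdev(high_x))  # spread of the response near x = 10, much larger
```

A residual plot of such data shows the familiar "fan" shape: residuals tightly clustered at one end of the x range and widely scattered at the other.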
If there is evidence of heteroskedasticity in the model, then predictions in regions where the residual variance is low are actually more precise than the model's single pooled error estimate suggests, while predictions in regions of high residual variance are less precise than it suggests.
Heteroskedasticity also affects the validity of confidence intervals (C.I.'s) for such predicted values, since the C.I. formula for any predicted value uses the statistic s, the standard error of the estimate, which is computed under the assumption of a single common residual variance.
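To make the role of s concrete, here is a hedged sketch of the usual prediction-interval half-width for a new observation at x0, computed by hand on made-up data: s * sqrt(1 + 1/n + (x0 - xbar)^2 / Sxx). The single pooled s in this formula is exactly what heteroskedasticity invalidates. For brevity the normal quantile 1.96 stands in for the t critical value.

```python
import math
import statistics

# Hypothetical data with an approximate slope of 2.
x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
y = [2.1, 4.3, 5.8, 8.4, 9.9, 12.2, 13.7, 16.4]

n = len(x)
xbar, ybar = statistics.fmean(x), statistics.fmean(y)
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

# Pooled residual standard error: one variance for the whole sample.
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
s = math.sqrt(sse / (n - 2))

x0 = 4.5
half_width = 1.96 * s * math.sqrt(1 + 1 / n + (x0 - xbar) ** 2 / sxx)
print(f"prediction at x0: {b0 + b1 * x0:.2f} +/- {half_width:.2f}")
```

Because the same s is used at every x0, the interval is too wide where the true variance is small and too narrow where it is large.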
Tests like Bartlett's test for heteroskedasticity can be performed to validate the assumption of equal variance across the sample. If the test shows a significant departure from equal variance, then the computed estimates of the regression coefficients remain unbiased, but their standard errors, and therefore the t-tests and confidence intervals built from them, are unreliable.
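As an illustration, Bartlett's test statistic can be computed directly from its textbook formula (in practice one would use a library routine such as scipy.stats.bartlett). The residual values below are hypothetical: one pair of groups with similar variances, one pair with very different variances.

```python
import math
import statistics

def bartlett_statistic(groups):
    """Bartlett's test statistic for equal variances across groups.

    Under the null hypothesis of equal variances (and normal data),
    the statistic is approximately chi-square with k - 1 degrees of
    freedom, where k is the number of groups.
    """
    k = len(groups)
    ns = [len(g) for g in groups]
    n_total = sum(ns)
    variances = [statistics.variance(g) for g in groups]

    # Pooled variance across all groups.
    sp2 = sum((ni - 1) * vi for ni, vi in zip(ns, variances)) / (n_total - k)

    numerator = (n_total - k) * math.log(sp2) - sum(
        (ni - 1) * math.log(vi) for ni, vi in zip(ns, variances)
    )
    correction = 1 + (
        sum(1 / (ni - 1) for ni in ns) - 1 / (n_total - k)
    ) / (3 * (k - 1))
    return numerator / correction

# Residuals split into low-x and high-x halves (made-up values):
equal = [[1.0, -0.9, 1.1, -1.2, 0.8], [0.9, -1.1, 1.2, -0.8, 1.0]]
unequal = [[0.1, -0.1, 0.2, -0.2, 0.1], [3.0, -2.5, 2.8, -3.1, 2.6]]

print(bartlett_statistic(equal))    # small: no evidence against equal variance
print(bartlett_statistic(unequal))  # large: strong evidence of heteroskedasticity
```

A statistic exceeding the chi-square critical value (3.84 at the 5% level for two groups) is evidence against the equal-variance assumption. Note that Bartlett's test itself assumes the data are normally distributed, so a large statistic can also reflect non-normality.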