Slide 2
Outliers Impact
Slide 3
Assumptions
Parametric tests based on the normal distribution assume:
Additivity and linearity
Normality something or other
Homogeneity of Variance
Independence
Slide 4
Additivity and Linearity
The outcome variable is, in reality, linearly related to any predictors.
If you have several predictors then their combined effect is best described by adding their effects together.
If this assumption is not met then your model is invalid.
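As an illustration of what "additive, linear effects" means in practice (a sketch of my own, not from the slides), the outcome is modelled as an intercept plus a weighted sum of the predictors. The variable names and numbers below are invented.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 100

# Two hypothetical predictors and an outcome built from their additive, linear effects
anxiety = rng.normal(50, 10, n)
revision = rng.normal(20, 5, n)
exam_score = 10 + 0.5 * anxiety + 1.2 * revision + rng.normal(0, 5, n)

# Fit: exam_score = b0 + b1*anxiety + b2*revision + error
X = sm.add_constant(np.column_stack([anxiety, revision]))
model = sm.OLS(exam_score, X).fit()
print(model.params)  # estimates of b0, b1, b2 should land near 10, 0.5, 1.2
```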
Slide 5
Normality Something or Other
The normal distribution is relevant to:
Parameters
Confidence intervals around a parameter
Null hypothesis significance testing
This assumption tends to get incorrectly translated as ‘your data need to be normally distributed’.
Slide 6
When does the Assumption of Normality Matter?
In small samples.
The central limit theorem allows us to forget about this assumption in larger samples.
In practical terms, as long as your sample is fairly large, outliers are a much more pressing concern than normality.
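A minimal simulation of the central limit theorem point (my own sketch, not from the slides): even when raw scores come from a clearly skewed population, the distribution of sample means becomes more symmetric as the sample size grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Skewed (exponential) population: raw scores are clearly non-normal
population = rng.exponential(scale=2.0, size=100_000)

# Sampling distribution of the mean for different sample sizes
for n in (5, 30, 200):
    means = np.array([rng.choice(population, size=n).mean() for _ in range(5_000)])
    # As n grows, the distribution of means becomes more symmetric (skew -> 0)
    skew = ((means - means.mean()) ** 3).mean() / means.std() ** 3
    print(f"n={n:>3}  skew of sample means = {skew:.2f}")
```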
Slide 7
Spotting Normality
Slide 8
The P-P Plot
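The slide shows a P-P plot; below is one way to produce one with statsmodels (the tooling is my assumption, the slides do not prescribe software). Points hugging the 45-degree line suggest the observed cumulative probabilities match those expected under a normal distribution.

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.gofplots import ProbPlot

rng = np.random.default_rng(7)
scores = rng.exponential(scale=2.0, size=200)  # deliberately skewed illustrative data

# P-P plot: observed cumulative probabilities against those expected under a
# fitted normal distribution; systematic departures from the line signal non-normality
ProbPlot(scores, fit=True).ppplot(line="45")
plt.title("P-P plot of scores")
plt.show()
```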
Slide 9
Assessing Skew and Kurtosis
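A short sketch of computing skew and kurtosis numerically with SciPy (software choice is mine, not the slides'); the data are simulated purely for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
scores = rng.exponential(scale=2.0, size=150)  # hypothetical skewed sample

print("skew:", stats.skew(scores))          # > 0 indicates positive (right) skew
print("kurtosis:", stats.kurtosis(scores))  # excess kurtosis; 0 for a normal distribution

# Formal tests of whether skew/kurtosis differ from what a normal distribution implies
print(stats.skewtest(scores))
print(stats.kurtosistest(scores))
```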
Slide 10
Slide 11
Homoscedasticity/Homogeneity of Variance
When testing several groups of participants, samples should come from populations with the same variance.
In correlational designs, the variance of the outcome variable should be stable at all levels of the predictor variable.
Violations can affect the two main things that we might do when we fit models to data:
– Parameters
– Null hypothesis significance testing
Slide 12
Assessing Homoscedasticity/Homogeneity of Variance
Graphs (see lectures on regression)
Levene's Test
Tests if variances in different groups are the same.
Significant = Variances not equal
Non-Significant = Variances are equal
Variance Ratio
With 2 or more groups
VR = Largest variance/Smallest variance
If VR < 2, homogeneity can be assumed.
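A sketch of both checks in Python with SciPy (the tooling is my assumption): Levene's test across three hypothetical groups, followed by the variance ratio rule of thumb from the slide.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Three hypothetical groups, the third with a noticeably larger spread
g1 = rng.normal(10, 2, 30)
g2 = rng.normal(10, 2, 30)
g3 = rng.normal(10, 5, 30)

# Levene's test: a significant p-value (< .05) suggests the variances are not equal
stat, p = stats.levene(g1, g2, g3)
print(f"Levene W = {stat:.2f}, p = {p:.3f}")

# Variance ratio: largest sample variance over the smallest
variances = [np.var(g, ddof=1) for g in (g1, g2, g3)]
vr = max(variances) / min(variances)
print(f"Variance ratio = {vr:.2f}")  # VR < 2 taken as homogeneity on the slide's rule of thumb
```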
Slide 13
Slide 14
Homogeneity of Variance
Slide 15
Independence
The errors in your model should not be related to each other.
If this assumption is violated: Confidence intervals and significance tests will be invalid.
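The slide does not name a specific check, but one common way to assess independence of errors in a regression model is the Durbin-Watson statistic; the sketch below uses statsmodels and simulated data purely for illustration.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(11)
x = rng.normal(size=100)
y = 2 + 0.5 * x + rng.normal(size=100)

# Fit a simple regression and inspect its residuals (the model errors)
residuals = sm.OLS(y, sm.add_constant(x)).fit().resid

# Durbin-Watson statistic: values near 2 suggest uncorrelated errors,
# values well below 2 suggest positive autocorrelation between errors
print(durbin_watson(residuals))
```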
Slide 16
Reducing Bias
Trim the data: Delete a certain amount of scores from the extremes.
Winsorizing: Substitute outliers with the highest value that isn't an outlier (see the sketch after this list).
Analyze with Robust Methods: Bootstrapping
Transform the data: By applying a mathematical function to scores
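As an illustration of the winsorizing option above (the slides give no code), a minimal SciPy sketch with made-up scores; the 10% upper limit is an arbitrary choice.

```python
import numpy as np
from scipy.stats.mstats import winsorize

scores = np.array([2, 3, 3, 4, 4, 5, 5, 6, 6, 40])  # 40 is an obvious outlier

# Winsorize the top 10% of scores: the outlier is replaced by the
# highest value that does not fall in that upper tail (here, 6)
clean = winsorize(scores, limits=[0, 0.1])
print(clean)
```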
Slide 17
Trimming the Data
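A minimal sketch of trimming with SciPy (my tooling choice, not the slides'): the trimmed mean simply ignores a set proportion of scores at each extreme before averaging.

```python
import numpy as np
from scipy import stats

scores = np.array([2, 3, 3, 4, 4, 5, 5, 6, 6, 40])  # same made-up data as above

# 10% trimmed mean: drops the most extreme 10% of scores from each end
print("ordinary mean:", scores.mean())
print("10% trimmed mean:", stats.trim_mean(scores, proportiontocut=0.1))

# trimboth returns the data with both tails removed, if you want to analyse it directly
print(stats.trimboth(scores, proportiontocut=0.1))
```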
Slide 18
Robust Methods
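One robust method named on the earlier slide is bootstrapping; below is a bare-bones percentile bootstrap for the mean using NumPy resampling, with invented data. Details such as the number of resamples are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(8)
scores = rng.exponential(scale=2.0, size=40)  # small, skewed illustrative sample

# Percentile bootstrap: resample with replacement many times and take the
# middle 95% of the resampled means as a confidence interval
boot_means = np.array([rng.choice(scores, size=scores.size, replace=True).mean()
                       for _ in range(10_000)])
lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"95% bootstrap CI for the mean: [{lower:.2f}, {upper:.2f}]")
```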
Slide 19
Transforming Data
Slide 20
Log Transformation
Slide 21
Square Root Transformation
Slide 22
Reciprocal Transformation
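To make the three transformations above concrete, a small sketch applying the log, square root and reciprocal functions to the same made-up, positively skewed scores (the data and details are mine, not from the slides).

```python
import numpy as np

scores = np.array([1.0, 2.0, 2.0, 3.0, 4.0, 5.0, 8.0, 25.0])  # positively skewed

# Log transformation: pulls in the long right tail.
# Add a constant (e.g. x + 1) before logging if any scores are zero.
log_scores = np.log(scores)

# Square root transformation: a milder correction for positive skew.
sqrt_scores = np.sqrt(scores)

# Reciprocal transformation: a strong correction, but it reverses the ordering
# of scores (large values become small), so scores are often reverse-scored first.
recip_scores = 1 / scores

for name, transformed in [("log", log_scores), ("sqrt", sqrt_scores), ("1/x", recip_scores)]:
    print(f"{name:>4}: {np.round(transformed, 2)}")
```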
Slide 23
But …