Chapter 3
One-way Analysis of Variance (ANOVA)
...
Example: Fifteen students took part in an experiment to assess the effect of study habits on the retention of material. The students were allocated, five to each of three treatments:
· Treatment one was a control in which the students simply read the material,
· Treatment two involved reading the material and then producing a summary,
· Treatment three involved skimming, thinking about and then reading the material.
Each student was then assessed on his or her knowledge of the material by means of a multiple-choice exam.
...
The question of interest: Does the study method affect the retention of the material?
The Hypotheses: H0: μ1 = μ2 = μ3 versus H1: not all of the treatment means μi are equal.
But why is a series of t-tests not an appropriate way of answering this question?
Suppose a t-test is used to compare 'control' v 'reading & summary' with a level of significance of α. Then
P(Type 1 error) = P(reject null hypothesis when it is true) = α
P(accept null hypothesis when it is true) = 1 - α
Similarly, for a test of 'control' v 'skimming & reading',
P(accept null hypothesis when it is true) = 1 - α
And for a test of 'skimming & reading' v 'reading & summary',
P(accept null hypothesis when it is true) = 1 - α
If the three tests are independent, then P(at least one Type 1 error in the three tests) = 1 - (1 - α)^3. More generally, for k treatments there are N = k(k-1)/2 pairwise tests. Taking α = 0.05, the following values of P = probability of incorrectly rejecting at least one null hypothesis are obtained:

k = No. of treatments    2      3      4      5      10
N = No. of pairs         1      3      6      10     45
P = 1 - (0.95)^N         0.05   0.14   0.26   0.40   0.90

So with 5 treatments the probability of at least one incorrect rejection is about 0.40; with 10 treatments, the value rises to about 0.90. This is why a single overall test, the analysis of variance F-test, is used instead.
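The inflation of the overall Type 1 error rate is easy to check numerically. The short Python sketch below (not part of the original notes) simply evaluates 1 - (1 - α)^N for the numbers of pairwise tests in the table above.

# Sketch: probability of at least one Type 1 error when N independent
# pairwise t-tests are each carried out at significance level alpha.
alpha = 0.05
for k in (2, 3, 4, 5, 10):          # number of treatments
    n_pairs = k * (k - 1) // 2      # number of pairwise comparisons
    p_any_error = 1 - (1 - alpha) ** n_pairs
    print(f"k={k:2d}  pairs={n_pairs:2d}  P(at least one Type 1 error)={p_any_error:.2f}")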
In general, assume the experimental units are homogeneous and the treatments are randomly allocated to the experimental material. Suppose there are k treatments; for each treatment group we record the responses together with their sample mean and sample variance.
Assumptions:
1. Within each treatment group the responses are Normally distributed (normal assumption).
2. The population variances of the k groups are equal, σ1² = σ2² = ... = σk² = σ² (homogeneity of variances).
3. The observations are independent.
The Model: y_ij = μ_i + ε_ij for i = 1, ..., k and j = 1, ..., n_i, where the errors ε_ij are independent N(0, σ²) random variables. (Notice the Normal assumption for the errors.)
Two sample variances are considered: one measuring the variation between groups, the other measuring the variation within groups. If there is a difference between the population means of the groups, then the variation between groups will be greater than the variation within groups. The test statistic is the ratio MS(Treatments)/MS(Error); this test statistic has the F-distribution if the null hypothesis is true.
The ANOVA table:

Source       d.f.    Sum of Squares    Mean Square                               MS Ratio
Treatments   k - 1   SS(Treatments)    MS(Treatments) = SS(Treatments)/(k - 1)   MS(Treatments)/MS(Error)
Error        N - k   SS(Error)         MS(Error) = SS(Error)/(N - k)
Total        N - 1   SS(Total)
The F-Test: Reject H0 if MS(Treatments)/MS(Error) > F(k-1, N-k; α), the upper α point of the F-distribution with k - 1 and N - k degrees of freedom. Otherwise there is no evidence to reject H0.
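For readers who want to reproduce the F-test outside SPSS, the following Python sketch computes the ANOVA quantities directly and then via scipy.stats.f_oneway. The three lists of scores are hypothetical, since the original data table is not reproduced in this extract.

import numpy as np
from scipy import stats

# Hypothetical scores for three study-habit groups (five students each);
# the real data from the notes are not reproduced here.
groups = [np.array([23, 31, 28, 25, 30]),
          np.array([35, 29, 33, 36, 30]),
          np.array([40, 34, 38, 41, 36])]

k = len(groups)
N = sum(len(g) for g in groups)
grand_mean = np.concatenate(groups).mean()

ss_treat = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_error = sum(((g - g.mean()) ** 2).sum() for g in groups)
ms_treat = ss_treat / (k - 1)
ms_error = ss_error / (N - k)
F = ms_treat / ms_error
p = stats.f.sf(F, k - 1, N - k)      # upper-tail probability of the F-distribution
print(F, p)

# The same F statistic and p-value in a single call:
print(stats.f_oneway(*groups))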
The SPSS output
This is the data spreadsheet in SPSS (the screenshot of the spreadsheet is not reproduced in this extract).
Using Graphs/Boxplots we obtain side-by-side box-plots of the examination scores for the three study habits (Control-reading only; Skimming, thinking, reading; Reading and summary), each based on N = 5 students, with Examination scores on the vertical axis and Type of study habit on the horizontal axis.
Using Analyze/Compare Means/One-Way ANOVA we obtain the following table
ANOVA (SCORES)
                  Sum of Squares   df   Mean Square     F      Sig.
Between Groups          360.000     2       180.000   3.529    .062
Within Groups           612.000    12        51.000
Total                   972.000    14
The equivalent table from Analyze/General Linear Model/Univariate (Tests of Between-Subjects Effects, Dependent Variable: SCORES) is:

Source              Type III SS   df   Mean Square        F     Sig.
Corrected Model      360.000(a)    2       180.000     3.529    .062
Intercept          15360.000       1     15360.000   301.176    .000
FACTOR               360.000       2       180.000     3.529    .062
Error                612.000      12        51.000
Total              16332.000      15
Corrected Total      972.000      14
a  R Squared = .370 (Adjusted R Squared = .265)
From either table, F = 3.529 with p = 0.062. Since p > 0.05, there is no evidence to reject the null hypothesis of equal mean scores for the three study habits. The summary measures confirm the earlier calculations.
Residual Analysis and Graphical output
The assumptions to be checked are:
· Normality of the errors
· Equal Variances (homogeneity of variances)
Plots and common sense will usually suffice.
The Assumption of Normality
The model is y_ij = μ_i + ε_ij.
The estimated equation is ŷ_ij = ȳ_i, the sample mean of the corresponding treatment group.
The residual is e_ij = y_ij - ȳ_i.
If the Normality assumption is valid then the residuals should be approximately Normally distributed. Two useful checks are:
i) a histogram of the residuals;
ii) a Q-Q plot or Normal Probability Plot for the residuals.
The residuals may be stored and analyzed graphically: to store the residuals in SPSS use General Linear Model/Univariate/Save and select one of the types of residuals as in the example below. You can then plot a histogram of the residuals.
(Histogram of the standardized residuals for SCORES; the plot reports Mean = 0.00 and Std. Dev. = .93.)
The Q-Q plot should be a straight line if the Normal assumption is valid.
(Normal Q-Q plot of the standardized residuals: expected Normal value plotted against observed value.)
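The same residual checks can be produced outside SPSS. The Python sketch below assumes the residuals have already been computed (a hypothetical array is used here) and draws a histogram and a Normal Q-Q plot with matplotlib and scipy.

import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Hypothetical residuals (observation minus its group mean);
# replace with the residuals saved from your own fit.
residuals = np.array([-3.0, 1.0, 4.0, -2.0, 0.0,
                       2.0, -4.0, 1.0, 3.0, -2.0,
                      -1.0, 5.0, -3.0, 0.0, -1.0])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.hist(residuals, bins=6)                        # histogram of the residuals
ax1.set_title("Histogram of residuals")
stats.probplot(residuals, dist="norm", plot=ax2)   # Normal Q-Q plot
ax2.set_title("Normal Q-Q plot")
plt.tight_layout()
plt.show()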
The Assumption of Equal Variances
This is usually checked visually: box-plots of the data, distinguished by Factor level, should show approximately equal variability. Alternatively, we can test the homogeneity of variances using Levene's test; a p-value greater than 0.05 gives no evidence against the assumption.
(Levene's Test of Equality of Error Variances, Dependent Variable: SCORES — the SPSS table gives the F statistic, its degrees of freedom and the significance value.)
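Levene's test is also available outside SPSS, for example in scipy. The sketch below uses hypothetical score vectors rather than the original data; a p-value above 0.05 would give no evidence against the equal-variance assumption.

from scipy import stats

# Hypothetical scores for the three study-habit groups (not the original data).
control  = [23, 31, 28, 25, 30]
reading  = [35, 29, 33, 36, 30]
skimming = [40, 34, 38, 41, 36]

stat, p = stats.levene(control, reading, skimming, center="median")
print(f"Levene statistic = {stat:.3f}, p-value = {p:.3f}")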
Multiple Comparisons
If the null hypothesis of equal means is rejected at the 0.05 level, then further analysis may be required to ascertain where the differences lie. Fundamental to any multiple comparisons are the following:
Types of Comparison
· a-priori comparisons (decided upon before experimentation)
· post hoc comparisons (conducted after experimentation)
Error Rates (where error = P(Type 1 error) = P(reject H0 when H0 is true)):
· error rate per comparison, e.g. the error rate when comparing Treatment 1 v Treatment 2
· error rate per experiment, i.e. the number of Type 1 errors which would be expected for the experiment as a whole
Post Hoc Multiple Comparisons
Some multiple comparison tests are:
LSD: Fisher's least significant difference procedure.
Tukey: Tukey's honestly significant difference (HSD) procedure.
Both the Tukey and Fisher procedures test all pairs of treatments.
Other methods exist, in particular Scheffé's method, the Bonferroni or Dunn test, Holm's multistage test, the Newman-Keuls test, etc.
In SPSS, selecting these options in the Post Hoc dialog will produce the following table.
(SPSS Multiple Comparisons table for SCORES, giving Tukey HSD and LSD comparisons between each pair of study habits: the mean difference for each pair, its standard error (4.5166 in every case), the significance value and a 95% confidence interval (lower and upper bounds). Differences significant at the .05 level are flagged with *. Based on observed means.)
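A comparable table of pairwise comparisons can be obtained in Python with statsmodels' pairwise_tukeyhsd, sketched below. The group labels follow the example, but the scores themselves are made up.

import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Hypothetical scores and their group labels (three study habits, five scores each).
scores = np.array([23, 31, 28, 25, 30,
                   35, 29, 33, 36, 30,
                   40, 34, 38, 41, 36])
habit = (["control"] * 5) + (["reading+summary"] * 5) + (["skim+think+read"] * 5)

result = pairwise_tukeyhsd(endog=scores, groups=habit, alpha=0.05)
print(result)   # mean differences, adjusted p-values and 95% confidence intervals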
Contrasts
A contrast is a linear combination of the group means, C = c1 μ1 + c2 μ2 + ... + ck μk, whose coefficients sum to zero (Σ ci = 0).
Example: Consider three group means μ1, μ2, μ3.
i) A comparison of group 1 (e.g. the control group) with the average of the means of group 2 and group 3 is represented by the contrast
C1 = μ1 - ½μ2 - ½μ3, i.e. c1 = 1, c2 = -½, c3 = -½.
ii) A difference between the means of group 2 and group 3 is represented by the contrast
C2 = 0·μ1 + 1·μ2 + (-1)·μ3 = μ2 - μ3.
An estimate of C is given by the corresponding sample contrast
Ĉ = c1 ȳ1 + c2 ȳ2 + ... + ck ȳk
The standard error of Ĉ is
s_Ĉ = s_p √(c1²/n1 + c2²/n2 + ... + ck²/nk),  where s_p² = MS(Error).
To test the hypotheses
H0: C = 0 versus H1: C ≠ 0
use the test statistic
T = Ĉ / s_Ĉ
If H0 is true, then T ~ t_(N-k).
A 100(1 - α)% confidence interval for C is
Ĉ ± t_(N-k, α/2) s_Ĉ
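The contrast calculations above are straightforward to reproduce. The Python sketch below computes Ĉ, its standard error, the t statistic, a two-sided p-value and a 95% confidence interval from group means, group sizes and MS(Error); the numerical inputs are illustrative only, not taken from the notes.

import numpy as np
from scipy import stats

# Illustrative inputs: group means, group sizes, MS(Error) and error d.f.
means    = np.array([26.0, 32.0, 38.0])
n        = np.array([5, 5, 5])
ms_error = 51.0
df_error = 12

# Contrast comparing group 1 with the average of groups 2 and 3.
c = np.array([1.0, -0.5, -0.5])
assert abs(c.sum()) < 1e-12                    # contrast coefficients sum to zero

C_hat = np.dot(c, means)                       # sample contrast
se_C  = np.sqrt(ms_error * np.sum(c**2 / n))   # standard error of the contrast
t     = C_hat / se_C
p     = 2 * stats.t.sf(abs(t), df_error)       # two-sided p-value
ci    = (C_hat - stats.t.ppf(0.975, df_error) * se_C,
         C_hat + stats.t.ppf(0.975, df_error) * se_C)
print(C_hat, se_C, t, p, ci)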
Orthogonal Contrasts
Two linear contrasts are said to be orthogonal if
c11 c21 / n1 + c12 c22 / n2 + ... + c1k c2k / nk = 0.
The SS(Treatments) can be partitioned into k - 1 independent sums of squares, each with 1 d.f. and each corresponding to one of a set of k - 1 orthogonal contrasts, where SS(Contrast) = Ĉ² / (c1²/n1 + ... + ck²/nk). Separate hypothesis tests are now provided by
F = MS(Contrast) / MS(Error)  on 1 and N - k d.f.
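Orthogonality and the partition of SS(Treatments) can be verified numerically. The sketch below uses the same illustrative group means and sizes as above and the standard result SS(Contrast) = Ĉ²/(Σ ci²/ni).

import numpy as np

# Illustrative group means and equal group sizes (not the original data).
means = np.array([26.0, 32.0, 38.0])
n     = np.array([5, 5, 5])

c1 = np.array([2.0, -1.0, -1.0])   # group 1 v average of groups 2 and 3
c2 = np.array([0.0, 1.0, -1.0])    # group 2 v group 3

# Orthogonality: the sum of c1i * c2i / ni must be zero.
print(np.sum(c1 * c2 / n))

def ss_contrast(c):
    return np.dot(c, means) ** 2 / np.sum(c**2 / n)

grand_mean = np.average(means, weights=n)
ss_treat = np.sum(n * (means - grand_mean) ** 2)
# The two 1-d.f. contrast sums of squares add up to SS(Treatments).
print(ss_contrast(c1), ss_contrast(c2), ss_treat)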
Contrasts in SPSS
Suppose we planned the following comparisons prior to experimentation:
i) the control group against the average of the other two study methods, i.e. H0: C1 = μ1 - ½(μ2 + μ3) = 0 against H1: C1 ≠ 0;
ii) study method 2 against study method 3, i.e. H0: C2 = μ2 - μ3 = 0 against H1: C2 ≠ 0.
By entering the contrast coefficients c1 = 2, c2 = -1, c3 = -1 for C1 and c1 = 0, c2 = 1, c3 = -1 for C2 in the Contrasts dialog, the following tables are produced.
Contrast Tests (SCORES), assuming equal variances:

Contrast   Value of Contrast   Std. Error      t     df   Sig. (2-tailed)
C1                 -18.00           7.82    -2.301   12        .040
C2                  -6.00           4.52    -1.328   12        .209

(SPSS also reports the corresponding results when equal variances are not assumed.)
Interpretation of output
For the test of H0: C1 = 0, t = -2.301 with p = 0.040. Since p < 0.05, reject H0 and conclude that there is a significant difference on average between the score on study habit 1 and the mean of the scores on the other two methods.
For the test of H0: C2 = 0, t = -1.328 with p = 0.209. Since p > 0.05, do not reject H0 and conclude that there is no significant difference between the mean scores of study method 2 and study method 3.
6 Violations of Assumptions in ANOVA
Lack of Normality has only a slight effect on the Type I error rate. We say the F test statistic is robust (i.e. insensitive to non-normality) with respect to the normality assumption.
For unequal variances, if the group sizes are equal or approximately equal (largest/smallest < 1.5), the F test is again robust: the actual significance level is close to the nominal level. If, however, the group sizes are quite unequal (largest/smallest > 1.5) and a statistical test shows that the population variances are unequal, then a transformation of the data should be considered.
Transformations
Here are some suggested transformations:
Logarithmic transformation
The logarithmic transformation is appropriate when
i) the standard deviation is proportional to the mean (a plot of s.d. against mean shows a linear relationship);
ii) the data are markedly positively skewed.
The form of the transformation is y = log(x), where x is the original value. You can use any base for the logarithmic transformation.
Square root transformation
This transformation is appropriate for count data, e.g. Poisson counts. The form of the transformation is y = √x. If the values of x are fairly small (< 10), then use y = √(x + 0.5).
Reciprocal transformation
This transformation is appropriate for data such as survival times. The form of the transformation is y = 1/x.
Arcsine transformation
This transformation is appropriate when the values are proportions, p. The form of the transformation is y = 2 sin⁻¹(√p), where p is the proportion.
Summary of transformations:

Transformation                    Comments
none
square root, √x                   Poisson data (counts)
arcsine, 2 sin⁻¹(√p)              Data are proportions
logarithm, log(x) or ln(x)        Only for positive values
reciprocal square root, 1/√x
reciprocal, 1/x                   Survival time data
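As a quick illustration (not part of the original notes), the transformations in the table can be applied in Python with numpy:

import numpy as np

x = np.array([1.0, 4.0, 9.0, 16.0, 25.0])    # hypothetical positive measurements
p = np.array([0.05, 0.20, 0.50, 0.80, 0.95]) # hypothetical proportions

sqrt_x     = np.sqrt(x)                  # counts (Poisson-like data)
sqrt_small = np.sqrt(x + 0.5)            # small counts (values < 10)
log_x      = np.log(x)                   # s.d. proportional to the mean; positive data only
recip_root = 1 / np.sqrt(x)
recip_x    = 1 / x                       # e.g. survival times
arcsine_p  = 2 * np.arcsin(np.sqrt(p))   # proportions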
Practical 3
...
One-way ANOVA: example 1
In a study of memory recall, 50 subjects are allocated randomly to one of 5 tasks
...
The Adjective group must think of a word to describe the other word
...
The Intentional group is told they will be tested on their recall
...
After a specified length of time the number of words recalled was recorded as follows:
count:      9   8   6   8  10   4   6   5   7   7
rhyme:      7   9   6   6   6  11   6   3   8   7
adjective: 11  13   8   6  14  11  13  13  10  11
image:     12  11  16  11   9  23  12  10  19  11
intent:    19  19  14   5  10  11  14  15  11  11
1. Enter the recall data into one named column and create a grouping variable for the five task groups.
2. Assign the following values to the groups: 1 = count, 2 = rhyme, 3 = adjective, 4 = image, 5 = intent.
3. Click Analyze / Compare Means / One-Way ANOVA to open the One-way ANOVA dialog box.
4. Click on the variable to be tested and then click on > to move it to the Dependent List box.
5. Click on the grouping variable, then click > to move it to the Factor box.
6. Click OK.
7. Investigate the one-way ANOVA options.
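If you want a quick check on your SPSS output for example 1, the same one-way ANOVA can be run in Python. This is an optional sketch, not part of the practical, using the recall data listed above.

from scipy import stats

# Recall data from example 1, one list per task group.
count     = [9, 8, 6, 8, 10, 4, 6, 5, 7, 7]
rhyme     = [7, 9, 6, 6, 6, 11, 6, 3, 8, 7]
adjective = [11, 13, 8, 6, 14, 11, 13, 13, 10, 11]
image     = [12, 11, 16, 11, 9, 23, 12, 10, 19, 11]
intent    = [19, 19, 14, 5, 10, 11, 14, 15, 11, 11]

F, p = stats.f_oneway(count, rhyme, adjective, image, intent)
print(f"F = {F:.2f}, p = {p:.4g}")   # compare with the SPSS ANOVA table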
...
Two hours after treatment, each student tapped and the number of taps per minute was recorded.
1. Clear the data sheet by clicking on File / New / Data and then type in the data.
2. Click on Analyze / General Linear Model / Univariate to produce an ANOVA table.
3. Check the assumption of equal variances by clicking on Options and selecting homogeneity tests.
4. Use the Post Hoc option to produce the LSD, Tukey and Dunnett tests.
...
1
General Questions
i)
What defines the one way analysis of variance?
a) the number of levels in the factor
b) the number of factors in the analysis
ii)
Why is the use of a Box-plot a good idea in the one way analysis of variance?
iii)
What are the underling assumptions in the analysis of variance?
For each of the dataset in practical 3
...
iv)
What statistical models correspond to the null and alternative hypothesis?
v)
What is the result of your test? State clearly the conclusion?
vi)
How you can check the assumption of equal variance (homogeneity)? Is this
assumption satisfied?
vii)
Which of your assumptions does the residual analysis check? What are your findings?
iix)
31
For dataset 2, use the post hoc test to find whether there is a difference in the means in
Tapping between level 0 and level 300 in caffeine dose
...
Practical 3: Example 2
An experiment was conducted to study the effect of food and/or water deprivation on behaviour in a learning task, with five animals in each of five treatment groups. Treatments 1 and 2 were control groups; in treatment 3 animals were food deprived, in treatment 4 they were water deprived, and in treatment 5 they were deprived of both food and water.
Food and Water Deprived: 12, 11, 8, 13, 11 (the data for the other four treatment groups are not reproduced in this extract).
Enter the data into one named column and create a grouping variable for the treatment groups. Use the Contrasts option to specify linear contrasts to compare (a possible set of coefficient vectors is sketched after the list):
a) the combined control groups (treatments 1 and 2) versus the combined experimental groups;
b) the control groups with each other;
c) the singly deprived treatments with the doubly deprived treatment;
d) the singly deprived treatments with each other.
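One possible set of coefficient vectors for contrasts a)–d) is sketched below; these particular coefficients are a suggestion, not taken from the notes. The final lines check that each vector sums to zero and that, for equal group sizes, the contrasts are mutually orthogonal.

import numpy as np

# Treatments: 1, 2 = controls; 3 = food deprived; 4 = water deprived;
# 5 = food and water deprived. Suggested contrast coefficients:
a = np.array([3, 3, -2, -2, -2])   # controls v combined experimental groups
b = np.array([1, -1, 0, 0, 0])     # the two control groups against each other
c = np.array([0, 0, 1, 1, -2])     # singly deprived v doubly deprived
d = np.array([0, 0, 1, -1, 0])     # the two singly deprived groups against each other

for v in (a, b, c, d):
    assert v.sum() == 0             # every contrast must sum to zero

# With equal group sizes, orthogonality reduces to a zero dot product.
for i, v in enumerate((a, b, c, d)):
    for w in (a, b, c, d)[i + 1:]:
        print(int(np.dot(v, w)), end=" ")   # all zeros => mutually orthogonal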
Questions for Practical 3
Behaviour in learning experiment
i) Interpret the results of the analysis of variance.
ii) State the null and alternative hypotheses for using the two contrasts to test ...
iii) Show that the contrasts are orthogonal.