===== Models with Multiple Factors and Their Interaction =====

The example below shows how to test and examine multiple factors and their interaction in (mixed-effects) meta-regression models.

==== Data Preparation ====

For the example, we will use the data from the meta-analysis by Raudenbush (1984) (see also Raudenbush & Bryk, 1985) of studies examining teacher expectancy effects on pupil IQ (''dat.raudenbush1985'').

<code rsplus>
library(metafor)
dat <- dat.raudenbush1985
</code>

I copy the dataset into ''dat'', which is a bit shorter and therefore easier to type below.

For illustration purposes, we will categorize the ''weeks'' variable (the number of weeks of prior contact between the teachers and the pupils) into a factor with three levels:

<code rsplus>
dat$weeks <- cut(dat$weeks, breaks=c(-Inf, 1, 3, Inf)) # cutoffs chosen for illustration
</code>
Variable ''tester'' indicates whether the test was administered by an aware or a blind tester. The ''relevel()'' function can be used to set the reference level of this factor explicitly:

<code rsplus>
dat$tester <- relevel(factor(dat$tester), ref="aware") # reference level chosen for illustration
</code>

A table with the number of factor level combinations (and margin counts) can be obtained with:

<code rsplus>
addmargins(table(dat$weeks, dat$tester))
</code>

So, all factor level combinations are present in this dataset.
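Checking for empty factor level combinations can also be done programmatically. The snippet below is a generic sketch on a pair of toy factors (not the dataset above), where one combination is deliberately missing:

```r
# toy factors: the combination c/y never occurs
f1 <- factor(c("a", "a", "b", "b", "c"))
f2 <- factor(c("x", "y", "x", "y", "x"))
any(table(f1, f2) == 0) # TRUE: at least one combination is empty
```

With crossed factors, an empty cell would make some of the models considered below inestimable, so this check is worth running before fitting.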
We start by fitting a model that only contains the main effects of the two factors:

<code rsplus>
res.a1 <- rma(yi, vi, mods = ~ weeks + tester, data=dat)
res.a1
</code>
<code output>
Test for Residual Heterogeneity:
QE(df = 15) = 24.1362, p-val = 0.0628

Test of Moderators (coefficients 2:4):
QM(df = 3) = 9.9698, p-val = 0.0188
</code>

Let's start with the table at the bottom containing the model results. The first coefficient in the table is the model intercept ($b_0 = 0.4020$), which is the estimated average standardized mean difference when both factors are equal to their reference levels. The second and third coefficients correspond to the two dummy variables for the ''weeks'' factor and indicate how the average effect differs for the second and third level of this factor compared to its reference level. Note that we have not yet tested the ''weeks'' factor as a whole (i.e., both of its coefficients simultaneously). We can do so with:
<code rsplus>
anova(res.a1, btt=2:3)
</code>
<code output>
Test of Moderators (coefficients 2:3):
QM(df = 2) = 9.1550, p-val = 0.0103
</code>

This is a Wald-type chi-square test with 2 degrees of freedom, indicating that the factor is in fact significant ($p = .0103$).
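As a small sanity check, the p-value of this omnibus test can be reproduced directly from the chi-square distribution with base R:

```r
# upper tail probability of a chi-square(2) distribution at the observed QM value
pchisq(9.1550, df=2, lower.tail=FALSE) # approximately 0.0103
```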
We have also not yet tested whether there is a difference between the second and the third level of the ''weeks'' factor. This contrast can be tested by specifying the corresponding linear combination of the model coefficients via the ''X'' argument:

<code rsplus>
anova(res.a1, X=c(0,-1,1,0))
</code>

The ''linearHypothesis()'' function from the ''car'' package can be used to carry out the same test:

<code rsplus>
library(car)
linearHypothesis(res.a1, c(0,-1,1,0))
</code>

So, we actually do not have sufficient evidence to conclude that the average effects differ between these two levels of the factor.
The ''glht()'' function from the ''multcomp'' package can be used to test all pairwise comparisons between the levels of the ''weeks'' factor:

<code rsplus>
library(multcomp)
# contrasts: level 2 vs 1, level 3 vs 1, and level 3 vs 2 of the 'weeks' factor
summary(glht(res.a1, linfct=rbind(c(0,1,0,0), c(0,0,1,0), c(0,-1,1,0))), test=adjusted("none"))
</code>
<code output>
(Adjusted p values reported -- none method)
</code>

No adjustment to the p-values has been used for these tests (''test=adjusted("none")''), but such an adjustment may be advisable when conducting multiple tests.
Finally, the last (4th) coefficient in the model table corresponds to the dummy variable for the ''tester'' factor and indicates how the average effect differs between the two types of testers.

The output includes some additional results. First of all, the results given under ''Test for Residual Heterogeneity'' indicate whether the moderators account for all of the heterogeneity in the effects.

Based on the fitted model, we can also obtain the estimated average effect for each combination of the factor levels. For this, we can use the ''predict()'' function:
<code rsplus>
# rows: the three 'weeks' levels at the reference level of 'tester'
predict(res.a1, newmods=rbind(c(0,0,0), c(1,0,0), c(0,1,0)))
</code>

For the same levels of the ''weeks'' factor, but the other level of the ''tester'' factor, we can use:

<code rsplus>
predict(res.a1, newmods=rbind(c(0,0,1), c(1,0,1), c(0,1,1)))
</code>

Note that, by default, the intercept is automatically included in the calculation of these predicted values (so only a vector of length 3 or a matrix with 3 columns should be specified via the ''newmods'' argument).
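A convenient way to construct such a ''newmods'' matrix without hand-coding the dummy variables is to apply ''model.matrix()'' to a small data frame of the desired level combinations and then drop the intercept column. The sketch below uses hypothetical level names (''w1''/''w2''/''w3'' and ''t1''/''t2''), not the ones in the dataset:

```r
# hypothetical three-level and two-level factors, mirroring the structure above
newdat <- expand.grid(weeks  = factor(c("w1", "w2", "w3")),
                      tester = factor(c("t1", "t2")))
X <- model.matrix(~ weeks + tester, data=newdat)
X[, -1] # drop the intercept column: each row can then serve as a row of 'newmods'
```

This guarantees that the dummy coding in the prediction rows matches the coding used when the model was fitted.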
We can use an alternative model specification, removing the intercept from the model:

<code rsplus>
res.a2 <- rma(yi, vi, mods = ~ weeks + tester - 1, data=dat)
res.a2
</code>

Note that the first three coefficients now directly correspond to the estimated average effects for the three levels of the ''weeks'' factor (when ''tester'' is equal to its reference level).
The model including only the main effects of the two factors assumes that the influence of the two factors is additive. We can illustrate the implications graphically by plotting the estimated average effects for all factor level combinations:

<code rsplus>
# the two profiles are parallel, reflecting the additivity assumption
plot(coef(res.a2)[1:3], type="o", pch=19, xaxt="n", xlab="weeks", ylab="Estimated Average Effect")
lines(coef(res.a2)[1:3] + coef(res.a2)[4], type="o", pch=17)
axis(side=1, at=1:3, labels=levels(dat$weeks))
</code>
Now let's move on to a model where we allow the two factors in the model to interact (note that it is questionable whether such a complex model should even be considered for these data; again, these analyses are for illustration purposes only). Such a model can be fitted with:

<code rsplus>
res.i1 <- rma(yi, vi, mods = ~ weeks * tester, data=dat)
res.i1
</code>

As before, the first coefficient in the table is the model intercept ($b_0 = 0.3067$), which is the estimated average effect when both factors are equal to their reference levels, and the next three coefficients are the main-effect terms. The last two coefficients are the interaction terms, which allow the influence of the ''weeks'' factor to differ depending on the level of the ''tester'' factor. Whether the interaction is actually needed can be tested with:
<code rsplus>
anova(res.i1, btt=5:6)
</code>
<code output>
Test of Moderators (coefficients 5:6):
QM(df = 2) = 0.8576, p-val = 0.6513
</code>

Based on this Wald-type chi-square test, we find no significant evidence that an interaction is actually present ($Q_M = 0.86$, $df = 2$, $p = .65$).
Alternatively, we can carry out a likelihood ratio test (LRT), comparing the model without and with the interaction terms. Both models must then be fitted using maximum likelihood estimation:

<code rsplus>
res.a1.ml <- rma(yi, vi, mods = ~ weeks + tester, data=dat, method="ML")
res.i1.ml <- rma(yi, vi, mods = ~ weeks * tester, data=dat, method="ML")
anova(res.i1.ml, res.a1.ml)
</code>

The results from the LRT ($\chi^2 = 0.67$, $df = 2$, $p = .71$) lead to the same conclusion.
For illustration purposes, let us however proceed with a further examination of this model. As before, the ''predict()'' function can be used to obtain the estimated average effect for each combination of the factor levels:

<code rsplus>
# rows: the 6 factor level combinations; columns: the 5 non-intercept coefficients
predict(res.i1, newmods=rbind(c(0,0,0,0,0), c(1,0,0,0,0), c(0,1,0,0,0),
                              c(0,0,1,0,0), c(1,0,1,1,0), c(0,1,1,0,1)))
</code>

We can also make use of an alternative parameterization of the model, which is particularly useful in this context:
<code rsplus>
res.i2 <- rma(yi, vi, mods = ~ weeks:tester - 1, data=dat)
res.i2
</code>

Now, the table with the model results directly provides the estimated average effect for each factor level combination. It is now also quite easy to test particular factor level combinations against each other. For example, to test the difference between the first and the second level of the ''weeks'' factor within the first level of the ''tester'' factor, we can use:
<code rsplus>
anova(res.i2, X=c(1,-1,0,0,0,0))
</code>

Or with the ''linearHypothesis()'' function from the ''car'' package:
<code rsplus>
linearHypothesis(res.i2, c(1,-1,0,0,0,0))
</code>

To test the same contrast within the second level of the ''tester'' factor, we can use:
<code rsplus>
anova(res.i2, X=c(0,0,0,1,-1,0))
</code>

Or again with the ''linearHypothesis()'' function:
<code rsplus>
linearHypothesis(res.i2, c(0,0,0,1,-1,0))
</code>

If one were inclined to do so, one could even test all 6 factor level combinations against each other (i.e., all possible pairwise comparisons). The ''glht()'' function from the ''multcomp'' package makes this relatively easy:
<code rsplus>
# contrMat() constructs all pairwise (Tukey-type) contrasts for 6 groups
summary(glht(res.i2, linfct=contrMat(rep(1,6), type="Tukey")), test=adjusted("none"))
</code>
<code output>
(Adjusted p values reported -- none method)
</code>

As before, no adjustment to the p-values has been used for these tests, but this would probably be advisable when conducting so many tests.
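With all $\binom{6}{2} = 15$ pairwise comparisons, a correction for multiple testing can also be applied after the fact with base R's ''p.adjust()''. The p-values below are hypothetical placeholders, purely to show the mechanics:

```r
# hypothetical unadjusted p-values, for illustration only
pvals <- c(0.001, 0.010, 0.040, 0.200)
p.adjust(pvals, method="holm")        # Holm step-down adjustment
p.adjust(pvals, method="bonferroni")  # simple Bonferroni adjustment
```

The Holm method is uniformly more powerful than Bonferroni while still controlling the familywise error rate, so it is usually the better default.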
We can also illustrate the results from the interaction model by plotting the estimated average effects for all factor level combinations:

<code rsplus>
# profiles of the 'weeks' levels for each 'tester' level; non-parallel lines reflect the interaction
plot(coef(res.i2)[1:3], type="o", pch=19, xaxt="n", xlab="weeks", ylab="Estimated Average Effect")
lines(coef(res.i2)[4:6], type="o", pch=17)
axis(side=1, at=1:3, labels=levels(dat$weeks))
</code>
tips/multiple_factors_interactions.txt · Last modified: 2024/05/14 13:09 by Wolfgang Viechtbauer