[R-lang] Re: Fwd: Simple effects with lmer
T. Florian Jaeger
tiflo@csli.stanford.edu
Sat Jul 30 18:06:20 PDT 2011
Dear Claire,
this never seems to have been answered, so let me see what I can do.
> I need to do simple effect analyses for a mixed model.
> I started with what was discussed here:
> http://pidgin.ucsd.edu/pipermail/r-lang/2009-December/thread.html#202
>
> My model, however, is slightly different: 5 predictors, and one interacts
> significantly with two predictors (one has 2 levels, one 3). Predictors with
> 2 levels were centered, the others were ordered.
> It was suggested to do a contrast treatment for both predictors in
> interaction, and I assumed I had to run my final model and just replace the
> original predictors with the contrast treatment.
>
> QUESTION 1: Is it the right thing to do, or shall I run a model with just
> the interaction?
>
That depends on your question. Can you say more about your model? For
questions like yours, it would help to see an excerpt of the data
(head(data)), the model call, and an explanation of all predictors and the
outcome variable.
As for how to start your analysis, I would indeed suggest that you start
with contrast coding (it's closest to ANOVA). This will allow you to read
the group differences off the model output.
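As a sketch of what I mean by contrast coding (the variable and factor names here are made up, since I haven't seen your data):

```r
## Hypothetical sketch: sum-code the factors so that coefficients
## are deviations from the grand mean, as in an ANOVA parameterization.
library(lme4)

d$cond  <- factor(d$cond)                  # 2 levels
d$group <- factor(d$group)                 # 3 levels
contrasts(d$cond)  <- contr.sum(2)
contrasts(d$group) <- contr.sum(3)

m <- lmer(rt ~ cond * group + (1 | subject), data = d)
summary(m)
```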
Now, it seems that you want to perform some sort of simple effect analysis
for a higher-order interaction? Or is it a two-way interaction? Why don't
you just paste the model and then point to the effects for which you'd like
to conduct a simple effect analysis?
My first advice is: consider using an ANOVA if you know what to do in that
framework. As long as your DV is continuous, approximately normally
distributed, and the variances are homogeneous, this might be the fastest
solution for you.
So, now let's assume that you insist on doing this in a regression model ;).
Then, to quote http://www.davidakenny.net/cm/moderation.htm, "There are two
ways to determine simple effects. The first and relatively simpler way is to
estimate the simple effects within the regression equation. The second way
is to estimate separate regression equations for each level of the
moderator. The latter strategy is preferable if there are differences in
error variance for the different levels of the moderator."
The second approach (splitting your data) suffers from a loss of power, but
if you have enough data and a big enough effect, this might be the easiest
bet.
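In lmer terms, the split-data strategy would look roughly like this (again with made-up names):

```r
## Sketch: fit a separate model at each level of the moderator.
## The 'cond' coefficient in each model is then the simple effect
## of cond within that level of 'group'.
library(lme4)

m.A <- lmer(rt ~ cond + (1 | subject), data = subset(d, group == "A"))
m.B <- lmer(rt ~ cond + (1 | subject), data = subset(d, group == "B"))
summary(m.A)
summary(m.B)
```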
If you want to calculate simple effects over the whole data set, this
essentially comes down to re-coding your variables so that one of the
parameters in the model encodes the simple effect. Then you ask whether that
parameter is different from zero. That's all I can say without knowing more
about your data and model.
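For example (hypothetical names again): with R's default treatment coding, the lower-order coefficient is the simple effect at the moderator's reference level, so re-coding can be as simple as releveling:

```r
## Sketch: with treatment contrasts (R's default for factors), the
## coefficient for 'cond' estimates the simple effect of cond at
## the reference level of 'group'.
library(lme4)

d$group <- relevel(factor(d$group), ref = "A")  # level at which to test cond
m <- lmer(rt ~ cond * group + (1 | subject), data = d)
## The 'cond' coefficient is now the effect of cond when group == "A".
## Re-relevel and refit to get the simple effects at the other levels.
```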
> This works fine with the 2*2 interaction, with an acceptable correlation
> between the predictors and the interaction.
> For the second interaction however, this is not working: I had to
> residualize terms of the interaction to remove collinearity, and,
> obviously, the contrast-treated variables are still highly collinear.
>
This is hard to follow without the data and model. Ideally, please also
provide a commented script of what you did.
>
> QUESTION 2: What is the next step to have simple effects for this
> interaction? Can we residualise contrast treated predictors?
>
You can, but the result becomes completely uninterpretable. I am also not
yet sure how you ended up in a situation with 5 predictors that do not seem
to be balanced.
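Just so we're talking about the same thing, by residualizing I mean something like this (a sketch with made-up names):

```r
## Sketch: replace x2 by the part of it that is not linearly
## predictable from x1, then use that residual in the mixed model.
library(lme4)

d$x2.res <- residuals(lm(x2 ~ x1, data = d))
m <- lmer(rt ~ x1 * x2.res + (1 | subject), data = d)
```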
Florian
>
> I really hope this makes sense.
> Many thanks for your input.
>
> Best,
>
> Claire Delle Luche
> Research fellow
> University of Plymouth
>
>