What about PLS Regression in the structural model?
It is one possible solution when facing multicollinearity between exogenous latent variables.
Please participate in the poll.
PLS Regression in Structural Model
- joerghenseler
- PLS Expert User
- Posts: 39
- Joined: Fri Oct 14, 2005 9:59 am
- Real name and title:
In the very last step of PLS path modeling, (multiple) regressions are run in which the latent variable scores serve as the dependent and independent variables.
For several purposes (e.g., to overcome multicollinearity), it can come in handy to run a PLS regression (see, e.g., Tenenhaus 1998) instead of the usual (multiple) regressions.
Last edited by joerghenseler on Tue Oct 25, 2005 4:09 pm, edited 1 time in total.
The problem, I think, is that such a PLS regression does not really address the multicollinearity at the level of your latent variables, and it shares many (but not all) of the problems that plague Principal Component Regression.
On the positive side, you do not have the problem of the arbitrary number of components, and you know that the components are chosen to optimally predict Y (unlike in PCR). On the negative side, if you want to decompose the resulting structural parameters back to your original latent variables, you may find that the signs are "wrong" (i.e., latent variables that should have a positive relationship with your dependent variable have a negative parameter instead). Or they may not--the problem is that you don't know! And in domains beyond something like customer satisfaction, you may not know a priori what signs your parameters should have, so you are very much at a loss.
I cannot think of instances in the social sciences where you would not want to decompose the results back to your original latent variables, so this is a big problem!
John J. Sailors, PhD
Associate Professor of Marketing
The University of St. Thomas
Opus College of Business
Minneapolis, MN