Independent Component Analysis (ICA) should be able to provide you with a good solution. It is able to decompose non-orthogonal components (like in your case) by assuming that your measurements result from a mixture of statistically independent variables. There are plenty of good tutorials on the Internet, and quite a few freely available implementations to try out (for example in scikit or MDP). As with other algorithms, ICA is optimal when the assumptions for which it was derived apply: the independent components are non-Gaussian. ICA returns an estimate of the mixing matrix and the independent components.

When your sources are Gaussian, ICA cannot find the components. Imagine you have two independent Gaussian components: their joint distribution is rotationally symmetric, which means that the probability distribution does not change under rotation. Hence, ICA cannot find the mixing matrix from the data.

There are also PCA-like procedures for the so-called "oblique" case. In statistical software like SPSS (and possibly also in its freeware clone PSPP) one finds these under the equivalent name "oblique rotations", with instances named "oblimin", "promax", and more. If I understand things correctly, the software tries to "rectangularize" the factor loadings by re-calculating their coordinates in an orthogonal, Euclidean space (as shown in your picture, for instance) into coordinates of a space whose axes are non-orthogonal, perhaps with some technique known from multiple regression. Moreover, I think this works only iteratively and consumes one or more degrees of freedom in the statistical testing of the model. See Wikipedia for rotation methods in factor analysis, and an article with an example comparing PCA and oblique rotation. The reference manual of SPSS (at the IBM site) for oblique rotations even contains formulae for the computation.

One common reason for running Principal Component Analysis (PCA) or Factor Analysis (FA) is variable reduction. In other words, you may start with a 10-item scale meant to measure something like Anxiety, which is difficult to accurately measure with a single question. You could use all 10 items as individual variables in an analysis, perhaps as predictors in a regression model. Not only would you have trouble interpreting all those coefficients, but you're likely to have multicollinearity problems. And most importantly, you're not interested in the effect of each of those individual 10 items on your outcome. You're interested in the effect of Anxiety as a whole. So we turn to a variable reduction technique like FA or PCA to turn 10 related variables into one that represents the construct of Anxiety.

FA and PCA have different theoretical underpinnings and assumptions and are used in different situations, but the processes are very similar. So let's say you have successfully come up with a good factor analytic solution and have found that, indeed, these 10 items all represent a single factor that can be interpreted as Anxiety. There are two similar, but theoretically distinct, ways to combine these 10 items into a single index.

Part of the factor analysis output is a table of factor loadings. Each item's loading represents how strongly that item is associated with the underlying factor. Some loadings will be so low that we would consider that item unassociated with the factor, and we wouldn't want to include it in the index. But even among items with reasonably high loadings, the loadings can vary quite a bit. If those loadings are very different from each other, you'd want the index to reflect that each item has an unequal association with the factor.

One approach to combining items is to calculate an index variable via an optimally-weighted linear combination of the items, called the Factor Scores. Factor scores are essentially a weighted sum of the items. Each item's weight is derived from its factor loading, so each item's contribution to the factor score depends on how strongly it relates to the factor. Because those weights are all between -1 and 1, the scale of the factor scores will be very different from a pure sum.