Published on Mon Nov 12 2018

On Asymptotic Covariances of A Few Unrotated Factor Solutions

Xingwei Hu

We provide explicit formulas for the asymptotic covariances of unrotated factor loading estimates and unique variance estimates. These estimates are extracted by least squares, principal, iterative principal component, alpha, or image factor analysis. A simulation study shows that the formulas provide reasonable results.

Abstract

In this paper, we provide explicit formulas, in terms of the covariances of sample covariances or sample correlations, for the asymptotic covariances of unrotated factor loading estimates and unique variance estimates. These estimates are extracted by least squares, principal, iterative principal component, alpha, or image factor analysis. If the sample is taken from a multivariate normal population, these formulas, together with the delta method, produce standard errors for the rotated loading estimates. A simulation study shows that the formulas provide reasonable results.
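To make the last point concrete, the sketch below shows how the delta method turns an asymptotic covariance matrix for the unrotated loadings into standard errors for rotated loadings. Everything here is illustrative: the varimax routine, the finite-difference Jacobian, and the placeholder covariance matrix are assumptions standing in for the paper's explicit formulas.

    import numpy as np

    def varimax(Phi, gamma=1.0, max_iter=100, tol=1e-8):
        # Standard Kaiser varimax rotation (an illustrative choice of rotation).
        p, k = Phi.shape
        R = np.eye(k)
        d = 0.0
        for _ in range(max_iter):
            L = Phi @ R
            u, s, vt = np.linalg.svd(
                Phi.T @ (L ** 3 - (gamma / p) * L @ np.diag(np.diag(L.T @ L))))
            R = u @ vt
            d_new = np.sum(s)
            if d_new < d * (1.0 + tol):
                break
            d = d_new
        return Phi @ R

    def numerical_jacobian(g, x, eps=1e-6):
        # Central-difference Jacobian of a vector-valued map g at x.
        f0 = g(x)
        J = np.zeros((f0.size, x.size))
        for j in range(x.size):
            dx = np.zeros_like(x)
            dx[j] = eps
            J[:, j] = (g(x + dx) - g(x - dx)) / (2.0 * eps)
        return J

    # Hypothetical inputs: unrotated loading estimates (p x k) and the covariance
    # of vec(L_hat); in practice the latter would come from the paper's formulas
    # rather than the placeholder used here.
    p, k = 6, 2
    rng = np.random.default_rng(0)
    L_hat = np.abs(rng.normal(size=(p, k)))
    acov_unrotated = 0.01 * np.eye(p * k)

    g = lambda v: varimax(v.reshape(p, k)).ravel()
    J = numerical_jacobian(g, L_hat.ravel())
    acov_rotated = J @ acov_unrotated @ J.T
    se_rotated = np.sqrt(np.diag(acov_rotated)).reshape(p, k)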

Fri Sep 26 2014
Machine Learning
Order-invariant prior specification in Bayesian factor analysis
The loading matrix is identified only up to orthogonal rotation. We show how a minor modification of the approach allows one to compute with the identifiable lower-triangular loading matrix while maintaining invariance properties under reordering.
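The identifiability issue this snippet refers to can be seen in a few lines: any orthogonal rotation of the loadings leaves the implied covariance unchanged, while an LQ-style decomposition picks out a lower-triangular representative. The construction below is only an illustration of that point, not the paper's prior specification.

    import numpy as np

    rng = np.random.default_rng(1)
    p, k = 5, 2
    Lam = rng.normal(size=(p, k))
    Psi = np.diag(rng.uniform(0.5, 1.5, size=p))

    # Rotating the loadings by any orthogonal Q leaves Lam Lam' + Psi unchanged,
    # so Lam itself is identified only up to rotation.
    theta = 0.7
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    Sigma = Lam @ Lam.T + Psi
    assert np.allclose((Lam @ Q) @ (Lam @ Q).T + Psi, Sigma)

    # A lower-triangular representative of the same equivalence class can be
    # read off from a QR decomposition of Lam.T (equivalently, an LQ of Lam).
    Q_r, R_r = np.linalg.qr(Lam.T)   # Lam = R_r.T @ Q_r.T
    L_tri = R_r.T                    # leading k x k block is lower triangular
    assert np.allclose(L_tri @ L_tri.T + Psi, Sigma)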
Sun Aug 12 2018
Machine Learning
Robust high dimensional factor models with applications to statistical machine learning
Factor models are a class of powerful statistical models. They have been used to deal with dependent measurements that arise frequently in fields ranging from genomics and neuroscience to economics and finance. We show that classical methods can be tailored to many new problems and provide powerful tools for statistical estimation and inference.
Sat May 26 2012
Machine Learning
Sparse estimation via nonconcave penalized likelihood in a factor analysis model
The penalized likelihood procedure can be viewed as a generalization of the traditional two-step approach. Monte Carlo simulations are conducted to investigate the performance of our modeling strategy. A real data example is also given to illustrate our procedure.
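As a rough illustration of the kind of criterion the title refers to, the sketch below writes down a Gaussian factor-analysis discrepancy plus a nonconcave (MCP) penalty on the loadings; the paper's exact penalty, tuning, and optimization scheme are not given in this snippet, so every choice here is an assumption.

    import numpy as np

    def mcp_penalty(Lam, lam=0.1, gamma=3.0):
        # Minimax concave penalty, one common nonconcave choice for sparsity.
        a = np.abs(Lam)
        piecewise = np.where(a <= gamma * lam,
                             lam * a - a ** 2 / (2.0 * gamma),
                             0.5 * gamma * lam ** 2)
        return np.sum(piecewise)

    def penalized_negloglik(Lam, psi, S, lam=0.1, gamma=3.0):
        # Gaussian discrepancy log|Sigma| + tr(S Sigma^{-1}) with
        # Sigma = Lam Lam' + diag(psi), plus the loading penalty.
        Sigma = Lam @ Lam.T + np.diag(psi)
        _, logdet = np.linalg.slogdet(Sigma)
        return logdet + np.trace(np.linalg.solve(Sigma, S)) + mcp_penalty(Lam, lam, gamma)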
Wed Apr 22 2015
Machine Learning
Rebuilding Factorized Information Criterion: Asymptotically Accurate Marginal Likelihood
Factorized information criterion (FIC) is a recently developed approximation for the marginal log-likelihood. This paper reconsiders FIC and fills theoretical gaps of previous FIC studies.
Thu Jan 18 2018
Machine Learning
Computation of the Maximum Likelihood estimator in low-rank Factor Analysis
Factor analysis is popularly used as a fundamental tool for dimensionality reduction. We reformulate the low-rank ML factor analysis problem as a nonlinear, nonsmooth semidefinite optimization problem. We propose fast and scalable algorithms based on difference-of-convex (DC) optimization.
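For context, the sketch below fits the underlying Gaussian ML factor-analysis criterion with a generic quasi-Newton solver; it is not the semidefinite reformulation or the DC-based algorithm the paper proposes, just a baseline illustrating the objective being optimized.

    import numpy as np
    from scipy.optimize import minimize

    def negloglik(params, S, p, k):
        # ML factor-analysis criterion: log|Sigma| + tr(S Sigma^{-1}),
        # with Sigma = Lam Lam' + diag(psi).
        Lam = params[:p * k].reshape(p, k)
        psi = params[p * k:]
        Sigma = Lam @ Lam.T + np.diag(psi)
        _, logdet = np.linalg.slogdet(Sigma)
        return logdet + np.trace(np.linalg.solve(Sigma, S))

    def fit_ml_fa(S, k, n_starts=5, seed=0):
        # Multi-start L-BFGS-B with box constraints keeping psi positive.
        p = S.shape[0]
        rng = np.random.default_rng(seed)
        bounds = [(None, None)] * (p * k) + [(1e-4, None)] * p
        best = None
        for _ in range(n_starts):
            x0 = np.concatenate([0.1 * rng.normal(size=p * k), np.full(p, 0.5)])
            res = minimize(negloglik, x0, args=(S, p, k),
                           method="L-BFGS-B", bounds=bounds)
            if best is None or res.fun < best.fun:
                best = res
        return best.x[:p * k].reshape(p, k), best.x[p * k:]

    # Toy usage: recover a 2-factor structure from simulated data.
    rng = np.random.default_rng(1)
    Lam_true = rng.normal(size=(8, 2))
    Sigma_true = Lam_true @ Lam_true.T + np.diag(rng.uniform(0.3, 1.0, size=8))
    X = rng.multivariate_normal(np.zeros(8), Sigma_true, size=2000)
    Lam_hat, psi_hat = fit_ml_fa(np.cov(X, rowvar=False), k=2)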
Sat Nov 26 2011
Machine Learning
Learning a Factor Model via Regularized PCA
We consider the problem of learning a linear factor model. We propose a regularized form of principal component analysis (PCA). We demonstrate the superiority of the resulting estimates to those produced by pre-existing factor analysis approaches.
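The snippet does not say which regularizer the authors use, so the sketch below shows one common "regularized PCA" estimator, soft-thresholding of the singular values, purely as an assumed stand-in for the paper's proposal.

    import numpy as np

    def soft_threshold_pca(X, lam):
        # Soft-threshold the singular values of the centered data matrix and
        # keep only the components that survive the shrinkage.
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        s_shrunk = np.maximum(s - lam, 0.0)
        rank = int(np.sum(s_shrunk > 0))
        X_lowrank = (U[:, :rank] * s_shrunk[:rank]) @ Vt[:rank]
        return X_lowrank, Vt[:rank], s_shrunk[:rank]

    # Toy usage: a 3-factor structure plus noise; lam controls how many
    # components are retained.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3)) @ rng.normal(size=(3, 20)) + 0.1 * rng.normal(size=(500, 20))
    X_hat, components, weights = soft_threshold_pca(X, lam=5.0)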