This paper considers smooth principal component analysis for high-dimensional data with a very large observation dimension p and a moderate number of individuals N. Our setting is similar to that of traditional PCA, but we assume the factors are smooth and design a new approach to estimate them. By connecting the estimator to a singular value decomposition subject to
penalized smoothing, our algorithm is linear in the dimensionality of the data, and it favors block computation and sequential memory access. Unlike most existing methods, we avoid extracting eigenfunctions by smoothing a high-dimensional covariance operator. Under regularity
assumptions, our results indicate that a faster convergence rate can be attained by exploiting the smoothness assumption. We also extend our method to settings where each subject performs multiple tasks, adopting a two-way ANOVA model that further demonstrates the advantages of our approach.
Keywords: Principal Component Analysis; Penalized Smoothing; Asymptotics; Multilevel; fMRI