Page 1:

EPS 651 Multivariate Analysis

Factor Analysis, Principal Components Analysis,

and Neural Network Analysis (Self-Organizing Maps)

For next week:

Ninness, C., Lauter, J., Coffee, M., Clary, L., Kelly, E., Rumph, M., Rumph, R., Kyle, R., & Ninness, S. (2012). Behavioral and Biological Neural Network Analyses: A Common Pathway toward Pattern Recognition and Prediction. The Psychological Record, 62, 579-598. TPR_VOL62 NO4.pdf

Continue with T&F Chapter 13, and please read the Ninness et al. (2012) study posted on our webpage.

Page 2:

T&F Chapter 13 → Section 13.5.3, page 642

http://www.ats.ucla.edu/stat/spss/output/factor1.htm

Several slides are based on material from the UCLA SPSS Academic Technology Services

Page 3:

Principal components analysis (PCA) and Factor Analysis are methods of data reduction: 

Suppose that you have a dozen variables that are correlated.  You might use principal components analysis to reduce your 12 measures to a few principal components.  For example, you may be most interested in obtaining the component scores (which are variables that are added to your data set) and/or to look at the dimensionality of the data.  For example, if two components are extracted and those two components accounted for 68% of the total variance, then we would say that two dimensions in the component space account for 68% of the variance.  Unlike factor analysis, principal components analysis is not usually used to identify underlying latent variables. 

[direct quote from below].

http://www.ats.ucla.edu/stat/spss/output/factor1.htm
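To make the data-reduction idea concrete, here is a minimal sketch (not from the UCLA page or the course materials), assuming a NumPy array X standing in for a dozen correlated measures; it standardizes the variables, asks PCA for two components, and reports how much of the total variance those components account for.

```python
# Minimal PCA data-reduction sketch (illustrative only; X stands in for 12 correlated measures)
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))                 # placeholder data: 200 cases, 12 variables

Z = StandardScaler().fit_transform(X)          # standardize so the analysis runs on the correlation matrix
pca = PCA(n_components=2).fit(Z)

print(pca.explained_variance_ratio_.sum())     # proportion of total variance the two components account for
scores = pca.transform(Z)                      # component scores: new variables that can be added to the data set
```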

Page 4:

If raw data are used, the procedure will create the original correlation matrix or covariance matrix, as specified by the user.  If the correlation matrix is used, the variables are standardized and the total variance will equal the number of variables used in the analysis (because each standardized variable has a variance equal to 1).  If the “covariance matrix” is used, the variables will remain in their original metric.  However, one must take care to use variables whose variances and scales are similar.  Unlike factor analysis, which analyzes the common variance, the original matrix in a principal components analysis analyzes the total variance.  Also, principal components analysis assumes that each original measure is collected without measurement error [direct quote]. 

FA and PCA: Data reduction methods 

http://www.ats.ucla.edu/stat/spss/output/factor1.htm
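The correlation-versus-covariance point above is easy to verify numerically. The sketch below (with made-up data) shows that after standardization the total variance, i.e. the trace of the correlation matrix, equals the number of variables, whereas the covariance matrix keeps each variable in its original metric.

```python
# Total variance under the correlation vs. covariance matrix (illustrative data)
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 4)) * np.array([1.0, 5.0, 10.0, 50.0])   # four variables on very different scales

cov = np.cov(X, rowvar=False)        # covariance matrix: original metric preserved
R = np.corrcoef(X, rowvar=False)     # correlation matrix: variables standardized

print(np.trace(cov))                 # depends on the original scales
print(np.trace(R))                   # 4.0 -- one unit of variance per standardized variable
```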

Page 5:

Factor analysis is also a method of data reduction – more forgiving than PCA. Factor analysis seeks to find underlying unobservable (latent) variables that are reflected in the observed variables (manifest variables). There are many different methods that can be used to conduct a factor analysis (such as principal axis factor, maximum likelihood, generalized least squares, unweighted least squares). There are also many different types of rotations that can be done after the initial extraction of factors, including orthogonal rotations, such as varimax and equimax, which impose the restriction that the factors cannot be correlated, and oblique rotations, such as promax, which allow the factors to be correlated with one another. You also need to determine the number of factors that you want to extract. Given the number of factor analytic techniques and options, it is not surprising that different analysts could reach very different results analyzing the same data set. However, all analysts are looking for a simple structure: a pattern of results such that each variable loads highly onto one and only one factor. [direct quote]

http://www.ats.ucla.edu/stat/spss/output/factor1.htm

Spin Control
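For readers working outside SPSS, here is a hedged sketch of one extraction-plus-rotation combination named above (a maximum-likelihood-style factor analysis with a varimax rotation) using scikit-learn; the data array is a placeholder, and this is not the course's SPSS/psyNet procedure.

```python
# Factor analysis with varimax rotation -- a sketch, not the SPSS/psyNet workflow
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
X = rng.normal(size=(300, 6))                      # placeholder for six observed (manifest) variables

Z = StandardScaler().fit_transform(X)
fa = FactorAnalysis(n_components=2, rotation="varimax").fit(Z)

loadings = fa.components_.T                        # variables-by-factors loading matrix
print(np.round(loadings, 3))                       # look for simple structure: each variable high on only one factor
```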

Page 6:

FA vs. PCA conceptually

FA produces factors; PCA produces components.

[Diagrams: an FA model in which indicators I1, I2, I3 reflect a latent factor, and a PCA model in which indicators I1, I2, I3 combine into a component.]

Page 7:

Kinds of Research Questions re PCA and FA

• What does each factor mean? Interpretation? Your call.

• What is the percentage of variance in the data accounted for by the factors? SPSS & psyNet will show you.

• Which factors account for the most variance? SPSS & psyNet.

• How well does the factor structure fit a given theory? Your call.

• What would each subject's score be if they could be measured directly on the factors? Excellent question!

Page 8:

Kaiser-Meyer-Olkin Measure of Sampling Adequacy - This measure varies between 0 and 1, and values closer to 1 are better. A value of .6 is a suggested minimum. It answers the question: is there enough data relative to the number of variables? (KMO should be > .6)

Bartlett's Test of Sphericity - This tests the null hypothesis that the correlation matrix is an identity matrix. An identity matrix is a matrix in which all of the diagonal elements are 1 and all off-diagonal elements are 0. Ostensibly, you want to reject this null hypothesis. This, of course, is psychobabble. (Significance should be < .05)

Taken together, these two tests provide a minimum standard which should be passed before a factor analysis (or a principal components analysis) should be conducted.

Before you can even start to answer these questions using FA
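Bartlett's test can be computed directly from the correlation matrix. Below is a sketch using the standard chi-square approximation (n is the sample size, p the number of variables); it is an illustration rather than the SPSS output, and the KMO statistic (not shown) is available from packages such as factor_analyzer.

```python
# Bartlett's test of sphericity from a correlation matrix -- illustrative sketch
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, n):
    """H0: the correlation matrix R is an identity matrix (nothing worth factoring)."""
    p = R.shape[0]
    statistic = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2.0
    return statistic, chi2.sf(statistic, df)

rng = np.random.default_rng(3)
X = rng.normal(size=(150, 5))                     # placeholder data
R = np.corrcoef(X, rowvar=False)
print(bartlett_sphericity(R, n=X.shape[0]))       # want p < .05 before proceeding with FA/PCA
```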

Page 9:

What is a Common Factor?

It is an abstraction, a "hypothetical construct," that relates at least two of our measurement variables to a factor. In FA, psychometricians / statisticians try to estimate the common factors that contribute to the variance in a set of variables. Is this an act of logical conclusion, a creation, or a figment of a psychometrician's imagination? Depends on who you ask.

Page 10:

What is a Unique Factor?

It is a factor that contributes to the variance in only one variable. There is one unique factor for each variable. The unique factors are unrelated to one another and unrelated to the common factors. We want to exclude these unique factors from our solution.

Seems reasonable … right?

Page 11:

Assumptions

Factor analysis needs large samples, and this is one of its only drawbacks.

• The more reliable the correlations are, the smaller the number of subjects needed

• Need enough subjects for stable estimates – how many is enough?

Page 12:

Assumptions

Take-home hint:

• 50 very poor, 100 poor, 200 fair, 300 good, 500 very good, and 1000+ excellent

• Shoot for a minimum of 300, usually

• More highly correlated markers → fewer subjects needed

Page 13:

Assumptions

No outliers – their obvious influence on correlations would bias results.

Multicollinearity:

In PCA it is not a problem; no matrix inversions are required. In FA, if det(R) or any eigenvalue approaches 0, multicollinearity is likely.
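The det(R) / eigenvalue screen mentioned above can be automated in a few lines; the tolerance used here is an arbitrary illustrative cutoff, not a published rule.

```python
# Multicollinearity screen for FA: inspect det(R) and the smallest eigenvalue of R
import numpy as np

def multicollinearity_check(R, tol=1e-5):          # tol is an illustrative cutoff
    det_R = np.linalg.det(R)
    smallest = np.linalg.eigvalsh(R).min()         # R is symmetric, so eigvalsh is appropriate
    return det_R, smallest, (det_R < tol or smallest < tol)

rng = np.random.default_rng(4)
X = rng.normal(size=(100, 4))
X = np.column_stack([X, X[:, 0] + 1e-6 * rng.normal(size=100)])   # add a nearly duplicate variable
R = np.corrcoef(X, rowvar=False)
print(multicollinearity_check(R))                  # det(R) near 0 -> multicollinearity is likely
```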

Page 14:

The above Assumptions at Work:

Note that the metric for all these variables is the same (since they employed a rating scale). So do we run the FA on the correlation or the covariance matrix, and does it matter?

Page 15:

Sample Data Set from Chapter 13 (p. 617), Tabachnick and Fidell

Principal Components and Factor Analysis

Skiers   Cost   Lift   Depth   Powder
S1         32     64      65       67
S2         61     37      62       65
S3         59     40      45       43
S4         36     62      34       35
S5         62     46      43       40

Variables

Keep in mind, multivariate normality is assumed when statistical inference is used to determine the number of factors. The above dataset is far too small to fulfill the normality assumption. However, even large datasets frequently violate this assumption and compromise the analysis. Multivariate normality also implies that relationships among pairs of variables are linear. The analysis is degraded when linearity fails, because correlation measures linear relationship and does not reflect nonlinear relationship. Linearity among variables is assessed through visual inspection of scatterplots.

Page 16:

           Cost        Lift        Depth       Powder
Cost        1          -0.952990   -0.055276   -0.129999
Lift       -0.952990    1          -0.091107   -0.036248
Depth      -0.055276   -0.091107    1           0.990174
Powder     -0.129999   -0.036248    0.990174    1

Correlation matrix w/ 1s in the diag

Large correlation between Cost and Lift and another between Depth and Powder

Looks like two possible factors – why?
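The matrix above can be reproduced from the five skiers' raw scores; a short NumPy check (an illustration, not the SPSS run):

```python
# Reproduce the ski-data correlation matrix from the raw scores
import numpy as np

# Rows: S1-S5; columns: Cost, Lift, Depth, Powder
ski = np.array([[32, 64, 65, 67],
                [61, 37, 62, 65],
                [59, 40, 45, 43],
                [36, 62, 34, 35],
                [62, 46, 43, 40]], dtype=float)

R = np.corrcoef(ski, rowvar=False)
print(np.round(R, 6))    # Cost-Lift about -0.953, Depth-Powder about 0.990
```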

Equations – Extractions - Components

Page 17:

Page 18:

L = V′RV, i.e., EigenvalueMatrix = (EigenvectorMatrix)′ × CorrelationMatrix × EigenvectorMatrix.

We are reducing to a few factors which duplicate the matrix? Does this seem reasonable?

Are you sure about this?

Page 19:

In a two-by-two matrix we derive two eigenvalues, with two eigenvectors each containing two elements.

In a four-by-four matrix we derive four eigenvalues, with four eigenvectors each containing four elements.

Equations – Extraction - Obtaining components

L = V′RV – it is important to know how L is constructed,

where L is the eigenvalue matrix and V is the eigenvector matrix. This diagonalizes the R matrix and reorganizes the variance into eigenvalues. A 4 × 4 matrix can be summarized by 4 numbers instead of 16.
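A sketch of that diagonalization step with NumPy, applied to the ski-data correlation matrix R (illustrative; the eigenvalue order may differ from the SPSS listing):

```python
# Diagonalize R: L = V'RV, where V holds the eigenvectors and L the eigenvalues
import numpy as np

ski = np.array([[32, 64, 65, 67], [61, 37, 62, 65], [59, 40, 45, 43],
                [36, 62, 34, 35], [62, 46, 43, 40]], dtype=float)
R = np.corrcoef(ski, rowvar=False)

eigenvalues, V = np.linalg.eigh(R)              # R is symmetric; eigh returns an orthonormal V
L = V.T @ R @ V                                 # V'RV reorganizes the variance into eigenvalues
print(np.round(np.diag(L), 3))                  # four numbers summarize the 4 x 4 matrix
print(np.allclose(L, np.diag(eigenvalues)))     # True: the off-diagonal elements are (numerically) zero
```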

Page 20:

With a two-by-two matrix we derive two eigenvalues, with two eigenvectors each containing two elements.

With a four-by-four matrix we derive four eigenvalues, with four eigenvectors each containing four elements.

it simply becomes a longer polynomial

Remember this?

For the matrix

$$\begin{bmatrix} 5 & 1 \\ 4 & 2 \end{bmatrix},$$

set the determinant of the matrix minus $\lambda$ times the identity to zero:

$$\begin{vmatrix} 5-\lambda & 1 \\ 4 & 2-\lambda \end{vmatrix} = (5-\lambda)(2-\lambda) - (4)(1) = \lambda^{2} - 5\lambda - 2\lambda + (5)(2) - (1)(4) = \lambda^{2} - 7\lambda + 6 = 0$$

Page 21:

$\lambda^{2} - 7\lambda + 6 = 0$ is an equation of the second degree with two roots [eigenvalues]. Apply the quadratic formula

$$\lambda_{i} = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a},$$

where a = 1, b = -7, and c = 6:

$$\lambda_{i} = \frac{7 \pm \sqrt{(-7)^{2} - 4(1)(6)}}{2(1)} = \frac{7 \pm 5}{2}, \qquad \lambda_{1} = 6, \quad \lambda_{2} = 1.$$

These are the same roots obtained from expanding the determinant $(5-\lambda)(2-\lambda) - (4)(1) = \lambda^{2} - 7\lambda + 6 = 0$.
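The same two roots fall out of a quick numerical check (a sketch of the 2 × 2 example above):

```python
# Numerical check: eigenvalues of [[5, 1], [4, 2]] are 6 and 1
import numpy as np

A = np.array([[5.0, 1.0],
              [4.0, 2.0]])
print(np.roots([1, -7, 6]))      # roots of the characteristic polynomial lambda^2 - 7*lambda + 6
print(np.linalg.eigvals(A))      # the same values, taken directly from the matrix
```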

Page 22:

Page 23:

From Eigenvalues to Eigenvectors

Page 24:

Equations – Extraction – Obtaining components

R = VLV′

• SPSS matrix output

Careful here: 1.91 is correct, but it appears as a "2" in the text.

Skiers   Cost   Lift   Depth   Powder
S1         32     64      65       67
S2         61     37      62       65
S3         59     40      45       43
S4         36     62      34       35
S5         62     46      43       40

Variables

Page 25:

Our original correlation matrix

Obtaining V′ and V; L = the eigenvalue matrix.

Page 26:

Bartlett's Test of Sphericity - This tests the null hypothesis that the correlation matrix is an identity matrix. An identity matrix is a matrix in which all of the diagonal elements are 1 and all off-diagonal elements are 0.

Page 27:

Equations – Extraction – Obtaining Components

Page 28:

Other than the magic "2" below (the correct value is 1.91), this is a decent example.

We have "extracted" two factors from four variables using a small data set.

Page 29:

Following SPSS extraction and rotation and all that jazz… in this case, not much difference [other data sets show big changes].

          Factor 1   Factor 2
Cost        -0.401      0.907
Lift         0.251     -0.954
Depth        0.933      0.351
Powder       0.957      0.288

Here we see that Factor 1 is mostly Depth and Powder (a Snow Condition Factor). Factor 2 is mostly Cost and Lift, which is a Resort Factor. Both factors have complex loadings.

Page 30:

Skiers   Cost   Lift   Depth   Powder
S1         32     64      65       67
S2         61     37      62       65
S3         59     40      45       43
S4         36     62      34       35
S5         62     46      43       40

Variables

This is a variation on your homework. Just use your own numbers and replicate the process.

(we may use this hypothetical data as part of a study)

Using SPSS 12, SPSS 20 and psyNet.SOM

Page 31:

Here is an easier way than doing it by hand:

Arrange data in Excel Format as below: SPSS 20

Page 32:

Select Data Reduction: SPSS 12

Page 33:

Select Data Reduction: SPSS 20

Page 34:

Select Variables and Descriptives: SPSS 12

Page 35:

Select Variables and Descriptives: SPSS 20

Page 36:

Start with a basic run using Principal Components: SPSS 12

Eigenvalues over 1

Page 37:

Fixed number of factors

Start with a basic run using Principal Components: SPSS 12

Page 38:

Select Varimax: SPSS 12

Page 39:

Select Varimax: SPSS 20

Page 40:

Under Options, select exclude cases listwise and sort by size: SPSS 12

Page 41:

Under Options, select exclude cases listwise and sort by size: SPSS 20

Page 42:

Under Scores, select “save variables” and “display matrix”: SPSS 20

Page 43:

Watch what pops out of your oven. A real time saver.

Page 44:

Page 45:

Matching psyNet PCA correlation matrix with SPSS FA

This part is the same but the rest of PCA goes in an entirely different direction

Page 46:

Remember these guys?

An MSA of .9 is marvelous; .4 is not too impressive – hey, it was a small sample. Normally, variables with small MSAs should be deleted.

Kaiser's measure of sampling adequacy: Values of .6 and above are required for a good FA.

Page 47:

Looks like two factors can be isolated/extracted – which ones, and what shall we call them?

Page 48:

Here they are again // they have eigenvalues > 1

We are reducing to a few factors which duplicate the matrix?

Page 49:

Fairly Close

Page 50:

Rotations – Nice hints here

Page 51:

SPSS will provide an Orthogonal Rotation without your help – look at the iterations.

Page 52:

Extraction, Rotation, and Meaning of Factors

Orthogonal Rotation [assumes no correlation among the factors]

Loading Matrix – correlation between each variable and the factor

Oblique Rotation [assumes possible correlations among the factors]

Factor Correlation Matrix – correlation between the factors
Structure Matrix – correlation between factors and variables

Page 53:

Oblique Rotations – Fun but not today

Factor extraction is usually followed by rotation in order to maximize large correlations and minimize small correlations. Rotation usually increases simple structure and interpretability. The most commonly used is the varimax variance-maximizing procedure, which maximizes the variance of the factor loadings.

Page 54:

Rotating your axis “orthogonally” ~ sounds painfully chiropractic

Where are your components located on these graphs? What are the upper and lower limits on each of these axes?

Cost and Lift may be a factor, but they are polar opposites.

Page 55:

The factor weight matrix [B] is found by "dividing" the loading matrix [A] by the correlation matrix [R], i.e., multiplying by its inverse (see matrix output):

$B = R^{-1}A$

Abbreviated Equations

Factor scores [F] are found by multiplying the standardized scores [Z] for each individual by the factor weight matrix [B] and adding them up:

$F = ZB$
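Here is a sketch of those two abbreviated equations in NumPy, using the rotated loadings shown a few slides back as A; the resulting scores are illustrative and will not reproduce the SPSS matrix output digit for digit.

```python
# Abbreviated equations in code: B = R^{-1} A (factor weights), F = Z B (factor scores)
import numpy as np

ski = np.array([[32, 64, 65, 67], [61, 37, 62, 65], [59, 40, 45, 43],
                [36, 62, 34, 35], [62, 46, 43, 40]], dtype=float)

Z = (ski - ski.mean(axis=0)) / ski.std(axis=0)     # standardized scores, one row per skier
R = np.corrcoef(ski, rowvar=False)

A = np.array([[-0.401,  0.907],                    # loading matrix from the rotated solution shown earlier
              [ 0.251, -0.954],
              [ 0.933,  0.351],
              [ 0.957,  0.288]])

B = np.linalg.solve(R, A)                          # "dividing" A by R, i.e., R^{-1} A
F = Z @ B                                          # each skier's factor scores: weighted sums of z-scores
print(np.round(F, 3))
```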

Page 56:

Abbreviated Equations

$Z = FA'$ – the standardized scores reproduced from the factor scores and the transposed loading matrix.

The specific goals of PCA or FA are to summarize patterns of correlations among observed variables, to reduce a large number of observed variables to a smaller number of factors, to provide an operational definition (a regression equation) for an underlying process by using observed variables, or to test a theory about the nature of underlying processes.

Page 57:

You can also estimate what each subject would score on the “standardized variables.”

This is a revealing procedure—often overlooked.

Standardized variables as factors

Page 58:

Scores for the five skiers: 1.1447, 0.96637, -0.41852, -1.11855, -0.574

Skier     1     2     3     4     5
Cost     32    61    59    36    62
Lift     64    37    40    62    46
Depth    65    62    45    34    43
Powder   67    65    43    35    40

Predictions based on Factor analysis: Standard-Scores

Page 59:

Scores for the five skiers: 1.18534, -0.90355, -0.70694, 0.98342, -0.55827

Skier     1     2     3     4     5
Cost     32    61    59    36    62
Lift     64    37    40    62    46
Depth    65    62    45    34    43
Powder   67    65    43    35    40

Predictions based on Factor analysis: Standard-Scores

Interesting stuff… what about cost?

Page 60:

Scores for the five skiers: 0.39393, -0.59481, -0.73794, -0.64991, 1.58873

Predictions based on Factor analysis: Standard-Scores

And this is supposed to represent ?

Page 61:

Skiers   Cost   Lift   Depth   Powder
S1         32     64      65       67
S2         61     37      62       65
S3         59     40      45       43
S4         36     62      34       35
S5         62     46      43       40

Variables

SOM Classification of Ski Data

Skiers    S1    S2    S3    S4    S5
Cost      32    61    59    36    62
Lift      64    37    40    62    46
Depth     65    62    45    34    43
Powder    67    65    43    35    40

Variables

Page 62:

Transpose data before saving as a CSV file.

Page 63:

Transpose data to analyze by class/factors: 4 rows × 5 columns (variables as rows, skiers as columns), in CSV format.
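A sketch of the transpose-and-save step with pandas (the file name is just an example):

```python
# Transpose the ski data (variables become rows, skiers become columns) and save as CSV
import pandas as pd

ski = pd.DataFrame({"Cost":   [32, 61, 59, 36, 62],
                    "Lift":   [64, 37, 40, 62, 46],
                    "Depth":  [65, 62, 45, 34, 43],
                    "Powder": [67, 65, 43, 35, 40]},
                   index=["S1", "S2", "S3", "S4", "S5"])

ski.T.to_csv("ski_transposed.csv")   # 4 rows (variables) by 5 columns (skiers)
```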

Page 64:

SOM Classification of Ski Data

SOM classification 1: Depth and Powder across 5 Ss. Nice match with FA 1.

Raw scores:

Skier     1     2     3     4     5
Cost     32    61    59    36    62
Lift     64    37    40    62    46
Depth    65    62    45    34    43
Powder   67    65    43    35    40

Standardized scores:

Skier        1          2          3          4          5
Cost     -1.36772    0.835832   0.683862  -1.06379    0.911816
Lift      1.27029   -1.14505   -0.87668    1.091376  -0.33994
Powder    1.285737   1.031973  -0.40602   -1.33649   -0.5752
Depth     1.275638   1.125563  -0.52526   -1.12556   -0.75038
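psyNet.SOM is the course tool; as a rough stand-in, here is a sketch using the third-party MiniSom package, trained on the standardized variable profiles from the table above. The 2 × 2 map size, neighborhood width, and iteration count are arbitrary illustrative choices.

```python
# Self-organizing map sketch with MiniSom (a stand-in for psyNet.SOM, not the course software)
import numpy as np
from minisom import MiniSom

# Transposed ski data: one row per variable, one column per skier (S1-S5)
names = ["Cost", "Lift", "Depth", "Powder"]
data = np.array([[32, 61, 59, 36, 62],
                 [64, 37, 40, 62, 46],
                 [65, 62, 45, 34, 43],
                 [67, 65, 43, 35, 40]], dtype=float)

# Standardize each variable across the five skiers (matches the z-scores in the table above)
Z = (data - data.mean(axis=1, keepdims=True)) / data.std(axis=1, keepdims=True)

som = MiniSom(x=2, y=2, input_len=5, sigma=0.8, learning_rate=0.5, random_seed=0)
som.train_random(Z, num_iteration=500)

for name, row in zip(names, Z):
    print(name, som.winner(row))     # variables sharing a winning node fall into the same SOM class
```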

Page 65:

Skier     1     2     3     4     5
Cost     32    61    59    36    62
Lift     64    37    40    62    46
Depth    65    62    45    34    43
Powder   67    65    43    35    40

SOM classification 3: Lift across 5 Ss

SOM classification 2: Cost across 5 Ss

Lift class/factor – near match with FA 2

Skier        1          2          3          4          5
Cost     -1.36772    0.835832   0.683862  -1.06379    0.911816
Lift      1.27029   -1.14505   -0.87668    1.091376  -0.33994
Powder    1.285737   1.031973  -0.40602   -1.33649   -0.5752
Depth     1.275638   1.125563  -0.52526   -1.12556   -0.75038

Cost class/factor ??

Page 66:

SOM classification 1: Depth and Powder across 5 Ss – nice match with FA 1

Factor 1: Appears to address Depth and Powder

This could be placed into a logistic regression and predict with reasonable accuracy

Page 67:

SOM classification 3: Lift across 5 Ss

Factor 2: Appears to address Lift

Page 68:

SOM classification 2: Cost across 5 Ss

Predictions based on Factor analysis: Standard-Scores

Factor Analysis Factor 3: ??

Page 69:

Center for Machine Learning and Intelligent Systems

Iris Setosa, Iris Versicolour, Iris Virginica

This dataset has provided the foundation for multivariate statistics and machine learning.

Page 70:

Transpose data before saving as a CSV file.

Page 71:

Transpose data to analyze by class/factors: 4 rows × 150 columns, in CSV format.

Page 72:

Factor Analysis: Factor 1

Factor Analysis: Factor 2

Variables: sepal length in cm, sepal width in cm, petal length in cm, petal width in cm
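A sketch of a two-factor run on the iris measurements with scikit-learn's built-in copy of the UCI data; the loadings will be in the same spirit as the slides but will not match the SPSS/psyNet output exactly.

```python
# Two-factor analysis of the four iris measurements -- illustrative sketch
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

iris = load_iris()                            # 150 flowers, 4 measurements, 3 species
Z = StandardScaler().fit_transform(iris.data)

fa = FactorAnalysis(n_components=2, rotation="varimax").fit(Z)
for name, row in zip(iris.feature_names, fa.components_.T):
    print(name, row.round(3))                 # loading of each measurement on Factor 1 and Factor 2
```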

Page 73:

SOM Neural Network: Class 1

SOM Neural Network: Class 2

Variables: sepal length in cm, sepal width in cm, petal length in cm, petal width in cm

Page 74:

Factor Analysis: Factor 1

SOM Neural Network: Class 1

sepal length in cm, sepal width in cm, petal length in cm

This could be placed into a logistic regression and predict with near perfect accuracy
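The logistic-regression remark can be sketched as follows: feed the two factor scores into a multinomial logistic regression predicting species. This illustrates the idea; the exact accuracy depends on the settings and is not a result from the text.

```python
# Factor scores -> logistic regression on iris species (illustrative sketch)
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

iris = load_iris()
Z = StandardScaler().fit_transform(iris.data)
scores = FactorAnalysis(n_components=2, rotation="varimax").fit_transform(Z)   # factor scores per flower

clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, scores, iris.target, cv=5).mean())   # cross-validated accuracy over the 3 species
```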

Page 75:

Really ?? Look at the original

Everybody but psychologists seems to understand this.