Teoria bàsica i exemples: regressió simple, múltiple i logística (Basic theory and examples: simple, multiple and logistic regression)


Transcript of Teoria bàsica i exemples: regressió simple, múltiple i logística

Page 1:

BASIC THEORY AND EXAMPLES: SIMPLE, MULTIPLE AND LOGISTIC REGRESSION

Albert Satorra

Mètodes Estadístics, UPF, Winter 2014


Page 2:

1 Data description and statistical inference

2 Bivariate distribution: simple regression

3 Multiple regression (Robust s.e., optional)

4 Logistic regression (Case influence statistics; multiple regression and multicollinearity)

5 Second data set: multiple regression and logistic regression


Page 3:

Data file

Random sample of size n = 800 from a population.
Variables: despesa (expenditure), renda (income), genere (gender: 1/0, boy = 1), vot (vote: 1/0, party A = 1).
The data file is on the web (two options, .sav and .txt):

library(foreign)
data = read.spss("http://www.econ.upf.edu/~satorra/dades/M2014dadesSIM.sav")

data = read.table("http://www.econ.upf.edu/~satorra/dades/M2013RegressioSamp.txt", header = T)

names(data)
"Lrenda"    "Ldespeses" "Genere"    "Vot"

head(data)
  Lrenda Ldespeses Genere Vot
1  9.477     4.503      1   1
2 11.435     6.147      1   0
3 10.686     4.961      0   0
4 10.407     3.993      0   0
5 10.814     5.746      0   0
6  9.944     4.950      0   1


Page 4:

Six rows of the data file

  Lrenda Ldespeses Genere Vot
1  9.477     4.503      1   1
2 11.435     6.147      1   0
3 10.686     4.961      0   0
4 10.407     3.993      0   0
5 10.814     5.746      0   0
6  9.944     4.950      0   1


Page 5:

Univariate analysis (replicate it with SPSS). Basic description:

attach(data)
renda = exp(Lrenda)

summary(renda)
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   1306   11250   23970   36940   44270  528600

Means and standard deviations:

apply(data, 2, mean)
   Lrenda Ldespeses    Genere       Vot
10.031189  4.978471  0.515000  0.550000

apply(data, 2, sd)
   Lrenda Ldespeses    Genere       Vot
1.0032099 0.6615195 0.5000876 0.4978049

Univariate distribution (renda, Ldespeses):

sd(renda) = 44358.15

summary(Ldespeses)
  Min. 1st Qu. Median   Mean 3rd Qu.   Max.
 2.232   4.572  4.968  4.978   5.421  7.759
sd(Ldespeses) = 0.6615195

Scatter plot: Lrenda vs Ldespeses.


Page 6:

Renda: mean, sd

Mean: 36940
sd = 44358.15

> 44358.15/sqrt(800)
[1] 1568.297
> 36890 + 2*(44358.15/sqrt(800))
[1] 40026.59
> 36890 - 2*(44358.15/sqrt(800))
[1] 33753.41

95% CI for the population mean: (33753.41, 40026.59)
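A minimal sketch of the same interval computed directly from the sample (assumes renda from the earlier slides; 2 is used in place of the exact normal quantile 1.96, as above):

xbar <- mean(renda); s <- sd(renda); n <- length(renda)
se <- s / sqrt(n)                  # standard error of the mean
c(xbar - 2 * se, xbar + 2 * se)    # approximate 95% CI for the population mean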


Page 7:

Histogram

Figure: Histogram (frequencies) of the variable renda (x axis: renda, 0 to 5e+05; y axis: frequency, 0 to 600).
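A minimal sketch to reproduce this figure in base R (assumes renda from above; title and label as on the slide):

hist(renda, main = "Histograma (freq.) de variable renda", xlab = "renda")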


Page 8:

Histogram of the variable log of renda

Lrenda = log(renda)

Figure: Histogram (frequencies) of log(renda) (x axis: log(renda), roughly 7 to 13; y axis: frequency, 0 to 150).


Page 9:

Comment on, for this data set:

1 Types of variables; type of distribution of the continuous variables

2 Standardized variable X: x* = (x_i - x̄)/s_x  ( scale(Lrenda) ; see the sketch below)

3 Inference on the mean income µ of the population (estimation, confidence interval, ...)

4 Sample size for a given precision: inference on mean income, on vot = 1, ...
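A minimal sketch of item 2, standardizing by hand versus with scale() (assumes Lrenda is available after attach(data)):

z.by.hand <- (Lrenda - mean(Lrenda)) / sd(Lrenda)   # (x_i - xbar)/s_x
z.scale   <- as.numeric(scale(Lrenda))              # scale() centers and divides by the sd
all.equal(z.by.hand, z.scale)                       # TRUE: the two agree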


Page 10:

Bivariate relation: scatter plot

Figure: Scatter plot of despeses vs. renda.
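A minimal sketch to reproduce this scatter plot. The transcript never shows how despeses was built; by analogy with renda = exp(Lrenda), it is assumed here that despeses = exp(Ldespeses):

despeses <- exp(Ldespeses)   # assumption: expenditure back on the original scale
plot(renda, despeses, xlab = "renda", ylab = "despeses")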

Page 11:

Bivariate relation: scatter plot

Figure: Scatter plot of Ldespeses vs. renda.

Page 12:

Bivariate relation: scatter plot

Figure: Scatter plot of Ldespeses on Lrenda, standardized data.

Page 13:

Correlation coefficient, r

> cor(renda, despeses)
[1] 0.2613614
> cor(renda, Ldespeses)
[1] 0.32058
> cor(Lrenda, Ldespeses)
[1] 0.4385204

> round(cor(Lrenda, Ldespeses), 2)
[1] 0.44

> (cor(Lrenda, Ldespeses))^2
[1] 0.1923001

The correlation coefficient r between log expenditure and log income is r = 0.44.

The square r² of the correlation coefficient, 0.1923, is the coefficient of determination R² of the next topic, regression.


Page 14:

Conditional expectation function E(Y | X). Linear regression:

Simple linear regression:

    Y = α + βX + ε

where ε is independent of (uncorrelated with) X. Multiple linear regression:

    Y = α + β1·X1 + β2·X2 + ··· + βk·Xk + ε

where ε is independent of (uncorrelated with) X1, ..., Xk.

Nomenclature: α is the intercept (the constant); the βs are regression coefficients. In multiple regression, β1, ..., βk are partial regression coefficients. ε is the disturbance term of the model.
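A minimal sketch of fitting both specifications with lm(), using the variables of this data set (the same fits appear on later slides):

fit.simple   <- lm(Ldespeses ~ Lrenda)            # Y = alpha + beta*X + eps
fit.multiple <- lm(Ldespeses ~ Lrenda + Genere)   # Y = alpha + beta1*X1 + beta2*X2 + eps
summary(fit.simple)                               # estimates, s.e., R^2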


Page 15:

Figure: Regression effect. Scatter plot of scale(Ldespeses) vs. scale(Lrenda), with the line Y = X and the regression line.

Page 16:

Figure: Francis Galton's (1822-1911) data: regression line of sons' heights vs. fathers' heights.


Page 17:

Example of simple regression (standardized data)

library(texreg)
texreg(lm(scale(Ldespeses) ~ scale(Lrenda)), scriptsize = T, stars = c(.05))

                 Model 1
(Intercept)      0.00
                 (0.03)
scale(Lrenda)    0.44*
                 (0.03)
R2               0.19
Adj. R2          0.19
Num. obs.        800
*p < 0.05

Table: Regression with standardized data


Page 18:

Example of simple regression
Regression model: Y = α + βX + ε, ε ~ (0, σ²_ε), where Y = Ldespeses (log expenditure) and X = Lrenda (log income).

Estimates of α and β, and R²:

a:  2.08***
    (0.21)
b:  0.29***
    (0.02)
R2        0.19
Adj. R2   0.19
Num. obs. 800
***p < 0.001, **p < 0.01, *p < 0.05, ·p < 0.1

Table: Results

19% of the variation of Y is explained by the variation of X.
The regression coefficient of Y on X is positive, 0.29, and highly significant (p < 0.001).
A one-unit increase of X is associated with a 0.29 increase of the expected value of Y (variables expressed in logarithms).
Beta coefficient of Lrenda = 0.44 (computed on the next slide).


Page 19:

The beta coefficient (standardized regression coefficients)

These are the regression coefficients when the variables are standardized; in that case α = 0.

coef.beta = 0.28916*(1.0032099)/0.6615195
[1] 0.4385179
> (coef.beta)^2
[1] 0.192298
> cor(Ldespeses, Lrenda)
[1] 0.4385204
> (cor(Ldespeses, Lrenda))^2
[1] 0.1923001


Page 20:

Multiple regression

re = lm(Ldespeses ~ Lrenda + Genere)
texreg(re)

Regression model: Y = α + β1·X1 + β2·X2 + ε, ε ~ (0, σ²_ε), where Y = Ldespeses, X1 = Lrenda, X2 = Genere.

Table: Multiple regression

             Estimates
(Intercept)   2.98*
             (0.20)
Lrenda        0.23*
             (0.02)
Genere       -0.55*
             (0.04)
R2    0.35
AR2   0.35
n     800
(this is an OLS analysis)


Page 21:

The linear multiple regression model (a bit of theory)

It assumes the regression function E(Y | X) is linear in its inputs X1, X2, ..., Xk; i.e. E(Y) = α + β1·X1 + ··· + βk·Xk.

β1 is the expected change in Y when we increase X1 by one unit, ceteris paribus, all the other variables being held constant.

For prediction purposes it can sometimes outperform fancier, more complicated models, especially in situations with small sample sizes.

It applies to transformed variables, so it encompasses a large variety of functions for E(Y | X).

The X variables are required to be continuous or binary.

We have Y = E(Y | X) + ε, where the disturbance term ε is a random variable assumed to be independent of X, typically with variance that does not change with X (homoscedastic residuals).

For the fitted model we have Ŷ = a + b1·X1 + ··· + bk·Xk, where the bs are partial regression coefficients (usually obtained by OLS), and e = Y − Ŷ defines the residuals.

Note that E(Y | X1) is different from E(Y | X1, X2) or E(Y | X1, X2, ..., Xk). So the regression coefficient b1 of X1 will typically change depending on which additional variables, besides X1, we condition on (illustrated in the sketch below).

In causal analysis, researchers are interested in the change in Y when we change X1. This is a complicated issue that can only be answered properly with more context regarding the design of the data collection. So far we have been dealing only with a conditional expectation model (no elements have been introduced yet for proper causal analysis).
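A minimal sketch of the conditioning point above, with this data set (the two values, about 0.29 and 0.23, are compared again on the next slide):

coef(lm(Ldespeses ~ Lrenda))["Lrenda"]            # simple regression: about 0.29
coef(lm(Ldespeses ~ Lrenda + Genere))["Lrenda"]   # partial coefficient: about 0.23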


Page 22:

Regressio Multiple

1 35% de variacio de Y es explicada per la variacio conjunta deLrenda i Genere

2 Comparem el coeficients de regressio de Lrenda de la regressiosimple i multiple: 0.29 versus 0.23

3 Interpretacio dels coeficients de regressio: coeficients deregressio parcials. Variacio de Y quan variem X1 ceteris paribus(control) les altres var. explicatives

4 La despesa difereix per genere ?

5 . . .


Page 23:

Figure: Residuals versus fitted values for lm(Ldespeses ~ Lrenda + Genere); cases 164, 305 and 201 are flagged.
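A minimal sketch of this diagnostic plot with base R (re is the fit from the earlier slide; which = 1 selects the residuals-vs-fitted panel of plot.lm):

plot(re, which = 1)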


Page 24:

library(faraway); prplot(re,1)

Figure: Partial regression plot: Y versus X1 (x axis: Lrenda; y axis: beta*Lrenda + res).

Page 25:

library(faraway); prplot(re,2)

Figure: Partial regression plot: Y versus X2 (x axis: Genere; y axis: beta*Genere + res).

Page 26:

Example of multiple regression: the Paisos.sav data

Question: do calories in the diet affect life expectancy? (SPSS syntax.)


Page 27:

Reading the data

library(foreign)
data = read.spss("http://www.econ.upf.edu/~satorra/dades/PAISOS.SAV")
attach(data)

names(data)
CALORIES[(CALORIES == 9999)] = NA


Page 28:

Missing values?

> ESPVIDA
  [1] 46.4 52.1 47.5 39.0 50.7 53.5 44.9 50.2 55.6 43.5 47.5 56.5 45.6 47.3 ...
(160 values of life expectancy by country)

> CALORIES
  [1] 1680 2021 1610 1695 9999 1957 2162 1941 2019 2556 1989 2135 1827 1821 ...
(160 values; the code 9999 marks a missing value)

> CALORIES[(CALORIES == 9999)] = NA
> CALORIES
  [1] 1680 2021 1610 1695   NA 1957 2162 1941 2019 2556 1989 2135 1827 1821 ...
(the 9999 codes are now NA)

Page 29:

Matrix plot

Figure: Matrix plot (scatter-plot matrix) of ESPVIDA, CALORIES, SANITAT, NIVELL, ALFAB, DIARIS, TV and AGRICULT.
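A minimal sketch to reproduce the matrix plot (assumes the variables read from PAISOS.SAV above; read.spss returns a list, hence the coercion):

vars <- c("ESPVIDA", "CALORIES", "SANITAT", "NIVELL", "ALFAB", "DIARIS", "TV", "AGRICULT")
pairs(as.data.frame(data)[, vars])   # one scatter plot per pair of variables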

Page 30:

Simple regression: ESPVIDA vs. CALORIES

              Model 1
(Intercept)   28.7629*
              (2.7159)
CALORIES      0.0135*
              (0.0010)
R2        0.5481
Adj. R2   0.5451
Num. obs. 152
*p < 0.05

Table: Statistical models

length(ESPVIDA) = 160 (only 152 observations enter the fit, since the 9999 CALORIES codes were set to NA)

Page 31:

Multiple regression: ESPVIDA vs. CALORIES, ALFAB

              Model 1
(Intercept)   28.1428*
              (1.9458)
CALORIES      0.0062*
              (0.0009)
ALFAB         0.2714*
              (0.0227)
R2        0.7698
Adj. R2   0.7667
Num. obs. 152
*p < 0.05

Table: Statistical models

Page 32:

ESPVIDA vs. CALORIES, ALFAB, NBAIX, NALT and HABMETG

              Model 1
(Intercept)   49.0691*
              (3.0704)
CALORIES      0.0030*
              (0.0008910)
ALFAB         0.1170*
              (0.0268)
NBAIXTRUE     -6.6066*
              (1.3683)
NALTTRUE      4.8071*
              (1.0609)
HABMETG       -0.0001752*
              (0.0000526)
R2        0.8589
Adj. R2   0.8535
Num. obs. 136
*p < 0.05

Table: Statistical models

table(NIVELL):  baix 47, mitja 54, alt 59
NALT = NIVELL == "alt"; NBAIX = NIVELL == "baix"; HABMETG[HABMETG == 99999] = NA
re = lm(ESPVIDA ~ CALORIES + ALFAB + NBAIX + NALT + HABMETG)

Page 33:

Figure: Partial regression plot: ESPVIDA versus CALORIES (y axis: beta*CALORIES + res).

Page 34:

Figure: Partial regression plot: ESPVIDA versus ALFAB (y axis: beta*ALFAB + res).

Page 35:

(Optional) Regression with regular s.e.

http://diffuseprior.wordpress.com/2012/06/15/standard-robust-and-clustered-standard-errors-computed-in-r/

r1 = lm(Ldespeses ~ Lrenda + Genere)

             Estimate Std. Error t value Pr(>|t|)
(Intercept)   2.97911    0.19970   14.92   <2e-16 ***
Lrenda        0.22736    0.01927   11.80   <2e-16 ***
Genere       -0.54637    0.03867  -14.13   <2e-16 ***

# get X matrix/predictors
X <- model.matrix(r1)
# number of obs
n <- dim(X)[1]
# number of predictors
k <- dim(X)[2]
# calculate standard errors as above:
# square root of the diagonal elements of vcov
se <- sqrt(diag(solve(crossprod(X)) * as.numeric(crossprod(resid(r1))/(n-k))))

> se
(Intercept)      Lrenda      Genere
 0.19969731  0.01927412  0.03866520


Page 36:

(Optional) Regression with heteroskedasticity-robust s.e.

r1 = lm(Ldespeses ~ Lrenda + Genere)

X <- model.matrix(r1)
n <- dim(X)[1]
k <- dim(X)[2]

# residual vector
u <- matrix(resid(r1))
# "meat" part: Sigma is a diagonal matrix with u^2 as its elements
meat1 <- t(X) %*% diag(diag(crossprod(t(u)))) %*% X
# degrees-of-freedom adjustment
dfc <- n/(n-k)
# as before
se <- sqrt(dfc*diag(solve(crossprod(X)) %*% meat1 %*% solve(crossprod(X))))

> se
(Intercept)      Lrenda      Genere
 0.19980279  0.01945393  0.03799626


Page 37:

(Optional) Regression with s.e. robust to clustering

# clustered standard errors in regression
# by: http://thetarzan.wordpress.com/2011/06/11/clustered-standard-errors-in-r/
cl <- function(dat, fm, cluster){
  require(sandwich, quietly = TRUE)
  require(lmtest, quietly = TRUE)
  M <- length(unique(cluster))
  N <- length(cluster)
  K <- fm$rank
  dfc <- (M/(M-1))*((N-1)/(N-K))
  uj <- apply(estfun(fm), 2, function(x) tapply(x, cluster, sum))
  vcovCL <- dfc*sandwich(fm, meat = crossprod(uj)/N)
  coeftest(fm, vcovCL)
}


Page 38:

(Optional) Regression with s.e. robust to clustering

r1=lm(Ldespeses ˜ Lrenda + Genere)summary(r1)

Estimate Std. Error t value Pr(>|t|)(Intercept) 2.97911 0.19970 14.92 <2e-16 ***Lrenda 0.22736 0.01927 11.80 <2e-16 ***Genere -0.54637 0.03867 -14.13 <2e-16 ***

clust= sample(1:40,800, replace=T)

> tabulate(clust)[1] 14 22 16 16 23 21 25 19 21 29 17 20 21 22 16 19 23 17 19 22 23 25 26 17 17

[26] 10 25 26 24 22 21 17 14 18 21 15 17 26 16 18

cl(cbind(Ldespesesa,Genere, clust), fit, clust)Estimate Std. Error t value Pr(>|t|)

(Intercept) 2.139410 0.126312 16.938 < 2.2e-16 ***Lrenda -0.182044 0.011657 -15.617 < 2.2e-16 ***Genere 0.459628 0.031029 14.813 < 2.2e-16 ***


Page 39:

Additional material on simple and multiple regression

1 Course web, M2014 (M2012, Setmanes12): details of simple and multiple linear regression + SPSS syntax

2 IDRE UCLA: SPSS Web Books, Regression with SPSS


Page 40:

Binary dependent variable Y

Until now, Y was a continuous variable.

In logistic regression (and probit regression) Y is binary.

As in ordinary regression, the explanatory variables can be continuous or binary.


Page 41:

Why doesn't linear regression work?

The relationship is non-linear.

The error terms are heteroskedastic.

The error term does not have a normal distribution.

Example: Y = Vot, X = Lrenda

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept)   2.89760    0.15615   18.56   <2e-16 ***
Lrenda       -0.23403    0.01549  -15.11   <2e-16 ***
Multiple R-squared: 0.2224

Y = 2.89 − 0.23 × Lrenda + ε,  R² = 0.22


Page 42:

Figure: Vot vs. Lrenda (the 0/1 votes plotted against log income).


Page 43:

Logistic regression (the model)

Suppose Y_i ~ Bernoulli(π_i), with π_i = P(Y_i = 1), i = 1, ..., n.
Probabilities → odds → logit:

    π → o (odds) = π/(1 − π) → L (logit) = ln(o)

    L_i = ln( π_i/(1 − π_i) )  ↔  π_i = 1/(1 + e^(−L_i))

Linear logit model: L_i = β0 + β1·X1 + β2·X2 + β3·X3

Non-linear probability model:

    π_i = 1/(1 + e^(−(β0 + β1·X1 + β2·X2 + β3·X3)))
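A minimal sketch of the probability ↔ logit maps in base R (plogis and qlogis are the logistic cdf and its inverse):

p <- 0.8
o <- p / (1 - p)   # odds: 4
L <- log(o)        # logit
plogis(L)          # back to the probability: 0.8
qlogis(p)          # the same logit, directly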


Page 44:

Fitting the logistic regression

Example: Y = Vot, X = Lrenda

    π_i = 1/(1 + e^(−L_i)),  with L_i = 12.389 − 1.208 × Lrenda_i,
    i.e. π_i = 1/(1 + e^(−(12.389 − 1.208 × Lrenda_i)))

glm(Vot ~ Lrenda, family = "binomial")
Coefficients:
             Estimate Std. Error z value Pr(>|z|)
(Intercept)    12.389      1.027   12.07   <2e-16 ***
Lrenda         -1.208      0.101  -11.96   <2e-16 ***

Number of Fisher Scoring iterations: 4

L = 12.389 − 1.208 × Lrenda


Page 45:

Figure: Vot vs. Lrenda: linear versus logistic fits.


Page 46:

Interpretation of the parameters

L = 12.389 − 1.208 × Lrenda

exp(−1.208) = 0.2987943 → 0.2987943 − 1 = −0.7012057
The odds decrease by 70% when X → X + 1.

In general, (exp(β) − 1) × 100 is the % increase/decrease of the odds.


Page 47:

Vot versus Lrenda + Genere

glm(formula = Vot ~ Lrenda + Genere, family = "binomial")
             Estimate Std. Error z value Pr(>|z|)
(Intercept)   12.2964     1.2207   10.07   <2e-16 ***
Lrenda        -1.3238     0.1229  -10.77   <2e-16 ***
Genere         2.6149     0.2017   12.97   <2e-16 ***

L = 12.2964 − 1.3238 × Lrenda + 2.6149 × Genere

> 100*(exp(-1.3238)-1)
[1] -73.38779
> 100*(exp(2.6149)-1)
[1] 1266.585


Page 48:

Lrenda → Lrenda + 1 decreases the odds of Vot = 1 by 73%, controlling for gender.

The odds of Vot = 1 for boys (Genere = 1) are 1266% higher than those for girls (Genere = 0), controlling for income.


Page 49:

Figure: Logistic curves of Vot vs. Lrenda, marginal (simple regression) and conditional (multiple regression): Simple, Mult. Homes (boys), Mult. Dones (girls).


Page 50:

(Optional) More on logistic regression: lrm

lrm(Vot ~ Lrenda + Genere, y = T, x = T)   # lrm() is provided by the rms package

Logistic Regression Model

            Model Likelihood     Discrimination   Rank Discrim.
            Ratio Test           Indexes          Indexes
Obs   800   LR chi2     413.74   R2      0.540    C      0.881
 0    360   d.f.             2   g       2.372    Dxy    0.763
 1    440   Pr(> chi2) <0.0001   gr     10.718    gamma  0.764
max |deriv| 5e-07                gp      0.379    tau-a  0.378
                                 Brier   0.140

            Coef     S.E.    Wald Z  Pr(>|Z|)
Intercept   12.2964  1.2207   10.07  <0.0001
Lrenda      -1.3238  0.1229  -10.77  <0.0001
Genere       2.6149  0.2017   12.97  <0.0001


Page 51:

(Optional) More on logistic regression: e^b and the % of increment of the odds

Suppose the fitted logistic regression

Logit1 = −2 + 2·x

and a unit increase of x, x → x + 1, so that

Logit2 = −2 + 2·(x + 1)

The odds are then multiplied by e^b = e^2, a percent increment of

(e^b − 1) · 100 = (e^2 − 1) · 100 ≈ 639%

The implied change in probability, however, depends on where p starts:

### when p is around 0.5
x = 1
Logit1 = -2 + 2*x
Logit2 = -2 + 2*(x+1)
prob1 = 1/(1 + exp(-Logit1))
prob2 = 1/(1 + exp(-Logit2))
((prob2 - prob1)/prob1)*100

> prob1
[1] 0.5
> prob2
[1] 0.8807971

### p very low
x = 1
Logit1 = -10 + 2*x
Logit2 = -10 + 2*(x+1)
(exp(2) - 1)*100
prob1 = 1/(1 + exp(-Logit1))
prob2 = 1/(1 + exp(-Logit2))
((prob2 - prob1)/prob1)*100

> prob1
[1] 0.0003353501
> prob2
[1] 0.002472623

### p very high
x = 1
Logit1 = 1 + 2*x
Logit2 = 1 + 2*(x+1)
prob1 = 1/(1 + exp(-Logit1))
prob2 = 1/(1 + exp(-Logit2))
((prob2 - prob1)/prob1)*100

> prob1
[1] 0.9525741
> prob2
[1] 0.9933071

In all three cases the odds increase by the same 639%, but the relative increase of the probability is about 76% around p = 0.5, about 637% when p is very low, and only about 4% when p is very high.


Page 52

Optional: Case influence

[Figure: scatter plot of Y against X with 22 labelled points; fitted lines: all, exclude 2 and 11, exclude 1, exclude 1 and 14]

Figure: Case influence in regression

Page 53

(Optional): multicollinearity

datafile = read.table("/Users/albertsatorra/Rstudio/DataSets/RegressioMulticol.dat")
reg = lm(Y ~ X1 + factor(X2) + X3, data = datafile)

summary(reg)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept)   0.4987     1.4834   0.336  0.73751
X1           -0.8042     3.3921  -0.237  0.81310
factor(X2)1   3.1534     1.7499   1.802  0.07475 .
factor(X2)2   5.4445     1.8216   2.989  0.00357 **
factor(X2)3   7.3730     2.4295   3.035  0.00311 **
X3            5.8343     3.3368   1.748  0.08365 .
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 6.082 on 94 degrees of freedom
Multiple R-squared: 0.433, Adjusted R-squared: 0.4028
F-statistic: 14.36 on 5 and 94 DF, p-value: 1.961e-10
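The large standard errors of X1 and X3 relative to their estimates are the usual symptom of collinearity; a quick numerical check with variance inflation factors (a sketch using Fox's car package, which these slides load later; output not shown here):

library(car)
vif(reg)   # VIF values well above 4 would confirm that X1 and X3 are collinear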



Page 55

(Optional) s.e. robust to cluster effects: library rms

library(rms)
fit = lrm(Vot ~ Lrenda + Genere, y = TRUE, x = TRUE)
length(Vot)
[1] 800

# assume we have a variable clust
clust = sample(1:40, 800, replace = TRUE)

robcov(fit, cluster = clust)

Logistic Regression Model

               Model Likelihood       Discrimination    Rank Discrim.
               Ratio Test             Indexes           Indexes
Obs       800  LR chi2     413.74     R2      0.540     C       0.881
 0        360  d.f.             2     g       2.372     Dxy     0.763
 1        440  Pr(> chi2) <0.0001     gr     10.718     gamma   0.764
max |deriv| 5e-07                     gp      0.379     tau-a   0.378
                                      Brier   0.140

           Coef     S.E.    Wald Z  Pr(>|Z|)
Intercept  12.2964  1.3397    9.18  <0.0001
Lrenda     -1.3238  0.1359   -9.74  <0.0001
Genere      2.6149  0.1790   14.61  <0.0001


Page 56

(Optional) s.e. robust to cluster effects: bootstrap

> bootcov(fit, cluster = clust)

Logistic Regression Model

lrm(formula = Vot ~ Lrenda + Genere, x = T, y = T)

           Coef     S.E.    Wald Z  Pr(>|Z|)
Intercept  12.2964  1.4004    8.78  <0.0001
Lrenda     -1.3238  0.1420   -9.32  <0.0001
Genere      2.6149  0.1814   14.41  <0.0001

Compare with the model-based standard errors on page 50 (1.2207, 0.1229, 0.2017) and the cluster-robust ones from robcov on page 55: the three methods give similar, but not identical, precision here.


Page 57

Additional material on logistic regression

1. Course website, M2014: Slides Logit Regression, with more detail on logistic regression, plus other material in the logistic regression section.

2. IDRE UCLA:

   SPSS Data Analysis Examples: Logit Regression
   R Data Analysis Examples: Logit Regression


Page 58

Data file

Random sample of size n = 1000 from a population.
Variables:

data = read.table("http://www.econ.upf.edu/~satorra/M/DadesRegressio2014.txt", header = T)

#data = read.spss("http://www.econ.upf.edu/~satorra/M/dadesME2014.sav")
#data = as.data.frame(data)

names(data)
"Y1" "Y2" "X1" "X2" "X3" "X4" "X5" "X6"

> head(data)
      Y1 Y2    X1    X2   X3   X4 X5 X6
1 -19.18  0  0.96  2.78 0.84 2.32  0  2
2 -19.66  0  4.25 -0.44 3.82 3.24  0  2
3 -24.35  1  2.47  1.04 2.85 3.23  0  4
4 -20.75  0  3.10  0.90 1.66 2.63  0  2
5 -22.46  0  2.60  1.36 1.84 2.36  0  3
6 -22.82  1 -0.17  3.95 0.77 2.49  1  3

Y1 is log expenses
Y2 is voting
X5 is home = 1
X6 is categorical
X1 to X4 are indicators related to income (latent)
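Since X6 is categorical, the regression models below wrap it in factor(); equivalently it can be declared once up front (a minimal sketch):

data$X6 = factor(data$X6)   # so that lm(Y1 ~ ... + X6) expands it into dummy variables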


Page 59

[Figure: scatterplot matrix with panels Y1, Y2, X1, X2, X3, X4, X5]

Figure: Matrix plot of the new data set


Multiple regression

fit1 = lm(Y1 ~ X1 + X2 + X3 + X4 + X5 + factor(X6))
summary(fit1)

Coefficients:
            Estimate Std. Error t value Pr(>|t|)
(Intercept) -22.98561    0.71205 -32.281   <2e-16 ***
X1            0.20459    0.17081   1.198   0.2313
X2           -0.28337    0.16639  -1.703   0.0889 .
X3           -0.04619    0.05681  -0.813   0.4163
X4            0.00525    0.11536   0.046   0.9637
X5           -0.07055    0.12551  -0.562   0.5742
factor(X6)2   2.83711    0.14656  19.359   <2e-16 ***
factor(X6)3  -0.08415    0.12816  -0.657   0.5116
factor(X6)4   0.02103    0.12875   0.163   0.8703

Residual standard error: 1.437 on 991 degrees of freedom
Multiple R-squared: 0.454,  Adjusted R-squared: 0.4496
F-statistic: 103 on 8 and 991 DF,  p-value: < 2.2e-16
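Note the pattern in this fit: the model as a whole is strongly significant (F = 103, R-squared about 0.45), yet neither X1 nor X2 reaches significance individually, a classic symptom of collinearity between the two. A minimal sketch, assuming fit1 from above, that tests them jointly instead:

# Sketch: joint F-test of X1 and X2 via nested models; under collinearity
# a pair can be jointly significant even when neither t-statistic is.
fit0 = lm(Y1 ~ X3 + X4 + X5 + factor(X6))  # model without X1 and X2
anova(fit0, fit1)                          # tests H0: beta_X1 = beta_X2 = 0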


Multiple regression (excluding X2)

fit1 = lm(Y1 ~ X1 + X3 + X4 + X5 + factor(X6))
summary(fit1)

Coefficients:
             Estimate Std. Error t value Pr(>|t|)
(Intercept) -24.074327   0.313959 -76.680  < 2e-16 ***
X1            0.472673   0.066398   7.119 2.09e-12 ***
X3           -0.042238   0.056819  -0.743    0.457
X4            0.002787   0.115466   0.024    0.981
X5           -0.100423   0.124402  -0.807    0.420
factor(X6)2   2.822129   0.146432  19.273  < 2e-16 ***
factor(X6)3  -0.070031   0.128018  -0.547    0.584
factor(X6)4   0.028213   0.128803   0.219    0.827

Residual standard error: 1.438 on 992 degrees of freedom
Multiple R-squared: 0.4524,  Adjusted R-squared: 0.4485
F-statistic: 117.1 on 7 and 992 DF,  p-value: < 2.2e-16
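Dropping X2 barely changes the fit (R-squared 0.454 vs 0.4524), but the standard error of X1 shrinks from 0.171 to 0.066 and X1 becomes highly significant: X1 alone carries essentially the information the collinear pair carried together. A sketch for comparing the two specifications formally, assuming distinct names for the two fits:

# Sketch: keep both models under separate names and compare them.
fitA = lm(Y1 ~ X1 + X2 + X3 + X4 + X5 + factor(X6))  # with X2
fitB = lm(Y1 ~ X1 + X3 + X4 + X5 + factor(X6))       # without X2
AIC(fitA, fitB)   # lower AIC indicates the preferred model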


Diagnostics in linear multiple regression (Fox's library(car))

library(car)
fit1 = lm(Y1 ~ X1 + X2 + X3 + X4 + X5 + factor(X6))

# Evaluate collinearity
vif(fit1)             # variance inflation factors
sqrt(vif(fit1)) > 2   # problem? VIF > 4 ?

ncvTest(fit1)
Non-constant Variance Score Test
Variance formula: ~ fitted.values
Chisquare = 0.2619034    Df = 1    p = 0.6088155

> durbinWatsonTest(fit1)
 lag Autocorrelation D-W Statistic p-value
   1     -0.03516656      2.069103   0.286
 Alternative hypothesis: rho != 0

## Multicollinearity
> vif(fit1)
                GVIF Df GVIF^(1/(2*Df))
X1         14.969943  1        3.869101
X2         14.524505  1        3.811103
X3          1.136309  1        1.065978
X4          1.632153  1        1.277557
X5          1.907518  1        1.381129
factor(X6)  1.401702  3        1.057895

# Evaluate nonlinearity: component + residual plot
crPlots(fit1)
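The GVIF values near 15 for X1 and X2 (square roots near 3.9, well above the rule-of-thumb threshold of 2) confirm the collinearity suspected earlier. The same package also provides case-influence diagnostics; a minimal sketch, assuming fit1 from above:

# Sketch: case-influence diagnostics from library(car).
outlierTest(fit1)     # Bonferroni test of the largest studentized residual
influencePlot(fit1)   # studentized residuals vs hat-values, sized by Cook's distance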


[Component + Residual Plots (figure): one panel per term (X1, X2, X3, X4, X5, factor(X6)); vertical axis: Component + Residual(Y1).]

Figure: component + residual plot
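When the points in each panel scatter around a straight line, the linearity assumption holds for that term; systematic curvature would call for a transformation or a polynomial term. A numeric companion to the visual check, sketched under the assumption that fit1 from above is still in scope:

residualPlots(fit1)   # residual-vs-predictor plots with Tukey curvature tests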


Regression with a principal component

> F1 = princomp(cbind(X1, X2, X3, X4))$scores[, 1]
> fit1 = lm(Y1 ~ F1 + X5 + factor(X6))
> summary(fit1)

Call:
lm(formula = Y1 ~ F1 + X5 + factor(X6))

Residuals:
    Min      1Q  Median      3Q     Max
-4.6418 -0.9351  0.0393  0.9297  4.1060

Coefficients:
             Estimate Std. Error  t value Pr(>|t|)
(Intercept) -23.20763    0.11828 -196.204  < 2e-16 ***
F1            0.30851    0.03699    8.341 2.43e-16 ***
X5           -0.10188    0.12490   -0.816    0.415
factor(X6)2   2.82883    0.14666   19.288  < 2e-16 ***
factor(X6)3  -0.06937    0.12767   -0.543    0.587
factor(X6)4   0.02032    0.12875    0.158    0.875
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Residual standard error: 1.439 on 994 degrees of freedom
Multiple R-squared: 0.4509,  Adjusted R-squared: 0.4482
F-statistic: 163.3 on 5 and 994 DF,  p-value: < 2.2e-16
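By default princomp() works on the covariance matrix, so the first component is dominated by whichever predictor has the largest variance. A hedged variant, assuming the predictors are on different scales, that extracts the first component from the correlation matrix instead:

# Sketch: principal component from standardized variables (cor = TRUE).
F1s = princomp(cbind(X1, X2, X3, X4), cor = TRUE)$scores[, 1]
fit1s = lm(Y1 ~ F1s + X5 + factor(X6))
summary(fit1s)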


Logistic Regression

library(rms)
lrm(formula = Y2 ~ X1 + X2 + X3 + X4 + X5 + factor(X6))

                      Model Likelihood     Discrimination    Rank Discrim.
                         Ratio Test           Indexes           Indexes
Obs          1000    LR chi2     405.30    R2       0.445    C       0.842
 0            537    d.f.             8    g        2.003    Dxy     0.685
 1            463    Pr(> chi2) <0.0001    gr       7.412    gamma   0.686
max |deriv| 1e-07                          gp       0.341    tau-a   0.341
                                           Brier    0.162

          Coef    S.E.   Wald Z Pr(>|Z|)
Intercept -2.0695 1.2476 -1.66  0.0972
X1         0.0250 0.3004  0.08  0.9336
X2         1.3035 0.2989  4.36  <0.0001
X3        -0.0816 0.0967 -0.84  0.3987
X4        -0.0963 0.2039 -0.47  0.6365
X5        -0.1624 0.1986 -0.82  0.4134
X6=2      -2.6178 0.2996 -8.74  <0.0001
X6=3       0.3319 0.2102  1.58  0.1144
X6=4       0.5034 0.2117  2.38  0.0174
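For comparison, a minimal sketch of the same model fitted with base R's glm(); the coefficients should agree with lrm() up to numerical tolerance, and exponentiating them gives odds ratios:

fitg = glm(Y2 ~ X1 + X2 + X3 + X4 + X5 + factor(X6), family = binomial)
summary(fitg)
exp(coef(fitg))   # odds ratios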
