Playing with features for learning and prediction. Jongmin Kim, Seoul National University.
Page 1

Playing with features for learning and prediction
Jongmin Kim, Seoul National University

Page 2
Problem statement
• Predicting outcome of surgery
Page 3
Predicting outcome of surgery
• Ideal approach

[Diagram: Training Data → ? → predicted outcome of surgery]

Page 4
Predicting outcome of surgery
• Initial approach: predicting partial features
• Predict which features?

Page 5
Predicting outcome of surgery
• 4 surgeries: DHL+RFT+TAL+FDO
• flexion of the knee (min / max)
• dorsiflexion of the ankle (min)
• rotation of the foot (min / max)

Page 6
Predicting outcome of surgery
• Are these good features?
• Number of training samples:
  - DHL+RFT+TAL: 35
  - FDO+DHL+TAL+RFT: 33

Page 7
Machine learning and features

[Diagram: Data → Feature representation → Learning algorithm]

Page 8
Features in motion

• Joint position / angle
• Velocity / acceleration
• Distance between body parts
• Contact status
• …
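Motion features like these can be derived from raw joint trajectories with finite differences. A minimal sketch, assuming positions sampled at a fixed frame rate; the array shapes and the moving-joint example are hypothetical illustrations, not the lecture's data:

```python
# Sketch: simple motion features from a joint-position trajectory.
# Assumes positions sampled at a fixed frame rate (dt seconds/frame).
import numpy as np

def motion_features(positions, dt=1.0 / 30.0):
    """positions: (frames, joints, 3) array of 3-D joint positions."""
    velocity = np.gradient(positions, dt, axis=0)      # finite differences over time
    acceleration = np.gradient(velocity, dt, axis=0)
    # distance between the first two joints in every frame
    dist = np.linalg.norm(positions[:, 0] - positions[:, 1], axis=-1)
    return velocity, acceleration, dist

# toy trajectory: joint 1 moves along x at one unit per frame
frames = np.zeros((5, 2, 3))
frames[:, 1, 0] = np.arange(5)
vel, acc, dist = motion_features(frames, dt=1.0)
```

Contact status, by contrast, is usually thresholded from such distances (e.g. foot-to-floor height below some tolerance) rather than computed analytically.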
Page 9
Features in computer vision
• SIFT
• HoG
• Textons
• Spin image
• RIFT
• GLOH

Page 10
Machine learning and features
Page 11
Outline
• Feature selection
  - Feature ranking
  - Subset selection: wrapper, filter, embedded
  - Recursive Feature Elimination
  - Combination of weak priors (boosting): AdaBoost (classification) / joint boosting (classification) / gradient boosting (regression)
• Prediction results with feature selection
• Feature learning?

Page 12
Feature selection
• Alleviates the effect of the curse of dimensionality
• Improves prediction performance
• Faster and more cost-effective
• Provides a better understanding of the data

Page 13
Subset selection
• Wrapper
• Filter
• Embedded
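Of these three, the filter approach is the simplest: score each feature independently of any model, then keep the top-scoring ones. A minimal sketch using absolute correlation with the target as the score; the data and function name are hypothetical illustrations (a wrapper would instead retrain the predictor on each candidate subset, and an embedded method folds selection into training itself):

```python
# Filter-style subset selection: rank features by |correlation| with
# the target and keep the k best. Purely illustrative data below.
import numpy as np

def filter_select(X, y, k):
    """Return indices of the k features most correlated with y."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    scores = np.abs(Xc.T @ yc) / (np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc))
    return np.argsort(scores)[::-1][:k]   # indices, best first

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = 3.0 * X[:, 2] + 0.1 * rng.normal(size=100)   # only feature 2 matters
selected = filter_select(X, y, k=1)
```

Filters are fast but model-agnostic, which is exactly why wrappers (and Recursive Feature Elimination, which repeatedly drops the weakest feature of a trained model) can find subsets a per-feature score would miss.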
Page 14
Feature learning?
• Can we automatically learn a good feature representation?
• Known as: unsupervised feature learning, feature learning, deep learning, representation learning, etc.
• Hand-designed features (by humans): 1. need expert knowledge; 2. require time-consuming hand-tuning
• When it is unclear how to hand-design features: automatically learned features (by machine)

Page 15
Learning Feature Representations
• Key idea:
  - Learn the statistical structure or correlations of the data from unlabeled data
  - The learned representations can be used as features in supervised and semi-supervised settings

Page 16
Learning Feature Representations
[Diagram: Input (image / features) → Encoder (feed-forward / bottom-up path) → Output features; a Decoder (feed-back / generative / top-down path) maps the features back to the input]

Page 17
Learning Feature Representations
• Predictive Sparse Decomposition [Kavukcuoglu et al., '09]

[Diagram: input patch x → encoder filters W and sigmoid σ(.) produce features z = σ(Wx), with an L1 sparsity penalty on z → decoder filters D reconstruct the patch as Dz]
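The objective behind this diagram can be written down compactly: reconstruction error of the decoder plus an L1 penalty on the code. A minimal numpy sketch, with random placeholder filters rather than trained ones; note the full method also optimizes the code z jointly, whereas here the encoder output σ(Wx) is used directly as the code, a simplification for illustration:

```python
# Sketch of the Predictive Sparse Decomposition-style objective:
# encoder z = sigmoid(W x), linear decoder D z, L1 sparsity on z.
# W, D, and x are random placeholders, not trained filters.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def psd_loss(x, W, D, lam=0.1):
    z = sigmoid(W @ x)                                  # encoder: predicted code
    recon = D @ z                                       # decoder: reconstruction
    loss = np.sum((x - recon) ** 2) + lam * np.sum(np.abs(z))
    return loss, z

rng = np.random.default_rng(0)
x = rng.normal(size=16)               # one flattened input patch
W = 0.1 * rng.normal(size=(8, 16))    # encoder filters
D = 0.1 * rng.normal(size=(16, 8))    # decoder filters
loss, z = psd_loss(x, W, D)
```

Training would alternate gradient steps on W and D to drive this loss down over many patches.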
Page 18
Stacked Auto-Encoders
[Diagram: Input image → (Encoder / Decoder) → Features → (Encoder / Decoder) → Features → (Encoder / Decoder) → Class label]
[Hinton & Salakhutdinov Science ‘06]
Page 19
At Test Time
[Diagram: Input image → Encoder → Features → Encoder → Features → Encoder → Class label]
[Hinton & Salakhutdinov Science ‘06]
• Remove the decoders
• Use only the feed-forward path
• Gives a standard (convolutional) neural network
• Can fine-tune with backprop
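The test-time network described above is just the encoders chained together with a classifier on top. A minimal sketch with random placeholder weights standing in for the pretrained ones:

```python
# Sketch of "test time": keep only the encoders, chain them into a
# feed-forward net, and read class scores off a linear top layer.
# All weights are random placeholders, not pretrained values.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def feed_forward(x, encoder_weights, classifier_W):
    h = x
    for W in encoder_weights:          # stacked encoders, decoders removed
        h = sigmoid(W @ h)             # each layer maps features to features
    return classifier_W @ h            # class scores from the top features

rng = np.random.default_rng(1)
encoders = [rng.normal(size=(32, 64)), rng.normal(size=(16, 32))]
clf = rng.normal(size=(10, 16))        # 10 classes on 16-dim top features
x = rng.normal(size=64)                # flattened input image
scores = feed_forward(x, encoders, clf)
```

Fine-tuning with backprop would then adjust all of these weights jointly against a labeled objective, rather than keeping the pretrained encoders frozen.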
Page 20
Status & plan
• Understanding the data / survey of learning techniques…
• Plan: finish experiments in November; write the paper in December; submit to SIGGRAPH in January; present in the US in August
• But before all of that….
Page 21
Deep neural nets vs. boosting

• Deep nets:
  - a single, highly non-linear system
  - a "deep" stack of simpler modules
  - all parameters are subject to learning

• Boosting & forests:
  - a sequence of "weak" (simple) classifiers that are linearly combined to produce a powerful classifier
  - subsequent classifiers do not exploit the representations of earlier classifiers; it is a "shallow" linear mixture
  - typically, features are not learned
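The "shallow linear mixture" point is easiest to see in code: AdaBoost's final prediction is just a weighted sum of threshold rules, with no layer learning features for the next. A minimal sketch on 1-D toy data with decision stumps; all data and names here are hypothetical illustrations:

```python
# Minimal AdaBoost with 1-D threshold stumps: each round picks the
# stump with lowest weighted error, then reweights the samples.
# The final classifier is a *linear* combination of the stumps.
import numpy as np

def adaboost(x, y, rounds=10):
    """x: (n,) inputs, y: (n,) labels in {-1, +1}."""
    n = len(x)
    w = np.full(n, 1.0 / n)                 # sample weights
    ensemble = []                            # (alpha, threshold, sign)
    for _ in range(rounds):
        best = None
        for t in np.unique(x):               # candidate thresholds
            for s in (1, -1):                # stump orientation
                pred = s * np.sign(x - t + 1e-12)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, t, s, pred)
        err, t, s, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # stump's linear weight
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, t, s))
    return ensemble

def predict(ensemble, x):
    total = sum(a * s * np.sign(x - t + 1e-12) for a, t, s in ensemble)
    return np.sign(total)

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost(x, y, rounds=5)
```

Contrast with a deep net: here each stump sees the raw input x, never a representation built by earlier stumps, and only the mixing weights alpha tie the rounds together.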
Page 22
Deep neural nets vs. boosting