Introduction to Deep Learning (I2DL)


Transcript of Introduction to Deep Learning (I2DL)

Page 1: Introduction to Deep Learning (I2DL)


Introduction to Deep Learning (I2DL)


Exercise 4: Simple Classifier

Page 2: Introduction to Deep Learning (I2DL)


Today’s Outline
• The Pillars of Deep Learning
• Exercise 4: Simple Classifier
  – Housing Dataset
  – Submission 2
• Backpropagation
• Outlook: Lecture 5 + Exercise 5


Page 3: Introduction to Deep Learning (I2DL)


The Pillars of Deep Learning


Page 4: Introduction to Deep Learning (I2DL)


The Pillars of Deep Learning


• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation

Page 5: Introduction to Deep Learning (I2DL)


The Pillars of Deep Learning


• Data: Dataset, Dataloader
  – Covered in Exercise 3: Dataset and Dataloader

Page 6: Introduction to Deep Learning (I2DL)


The Pillars of Deep Learning


• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation
  – Covered in Exercise 4: Simple Classifier, Exercise 5: Simple Network, and Exercise 6: Hyperparameter Tuning

Page 7: Introduction to Deep Learning (I2DL)


Goal: Exercise 4


• Goal: the training process, i.e. the Solver pillar (Optimizer, Training Loop, Validation)
• Skip: the Model pillar
• Simplified model: a classifier that is a 1-layer neural network

Page 8: Introduction to Deep Learning (I2DL)


Goals: Exercises 5++


• Ex 3 + 4: data loading and the training process
• Ex 5++: expand the exercises to more interesting model architectures (the Model pillar: Network, Loss/Objective)

Page 9: Introduction to Deep Learning (I2DL)


The Pillars of Deep Learning


• Data: Dataset, Dataloader
• Model: Network, Loss/Objective
• Solver: Optimizer, Training Loop, Validation

Can be implemented once and used in multiple projects

Page 10: Introduction to Deep Learning (I2DL)


Exercise 4: Simple Classifier


Page 11: Introduction to Deep Learning (I2DL)


Overview Exercise 4
• One notebook
  – Logistic regression model
• Submission 2
  – Several implementation tasks in the notebook
  – Submission file creation in the notebook


Fixed deadline: Nov 17, 2022, 15:59

Page 12: Introduction to Deep Learning (I2DL)


Housing Dataset


• Housing Dataset: data on ~1400 houses, including 81 features such as Neighborhood, GrLivArea, YearBuilt, etc.

• Simplified model: a single input feature is used to predict the house price (a minimal loading sketch follows below)
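A minimal loading sketch, assuming the data ships as a CSV file; the file name housing.csv and the target column SalePrice are assumptions (the exercise notebook provides its own dataset class), while GrLivArea is one of the 81 features listed above:

```python
# Minimal sketch: load the housing data and keep a single input feature.
# "housing.csv" and the target column "SalePrice" are assumptions; "GrLivArea"
# is one of the 81 features mentioned above.
import pandas as pd

df = pd.read_csv("housing.csv")                  # ~1400 rows, 81 feature columns
X = df[["GrLivArea"]].to_numpy(dtype=float)      # single input feature, shape (N, 1)
prices = df["SalePrice"].to_numpy(dtype=float)   # raw sale prices, shape (N,)
print(X.shape, prices.shape)
```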

Page 13: Introduction to Deep Learning (I2DL)


Submission 4 - Classifying House Prices


ML model $M(\mathbf{x}) = y$: each house $\mathbf{x}$ is mapped to one of two classes:

• Expensive: $y = 1$

• Low-priced: $y = 0$
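A small sketch of how such binary labels could be derived from the raw prices; the placeholder prices and the median threshold are assumptions, since the exercise notebook may define the "expensive" split differently:

```python
import numpy as np

# Sketch: turn continuous sale prices into binary labels y in {0, 1}.
# The placeholder prices and the median threshold are assumptions.
prices = np.array([125000., 214000., 89500., 310000., 180000.])
threshold = np.median(prices)           # assumed "expensive" cut-off
y = (prices > threshold).astype(float)  # 1 = expensive, 0 = low-priced
print(y)                                # [0. 1. 0. 1. 0.]
```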

Page 14: Introduction to Deep Learning (I2DL)


Solver

3rd Pillar of Deep Learning


The solver ties together: training data, validation data, loss, optimizer, and the model (see the sketch below).
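A minimal sketch of that loop on synthetic data, assuming a sigmoid classifier trained with plain gradient descent; the function names, data, and hyperparameters are placeholders, not the exercise's actual Solver API:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def bce(y, y_hat, eps=1e-12):
    y_hat = np.clip(y_hat, eps, 1.0 - eps)  # avoid log(0)
    return -np.mean(y * np.log(y_hat) + (1.0 - y) * np.log(1.0 - y_hat))

# Synthetic training/validation splits stand in for the housing data.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(80, 2)), rng.integers(0, 2, 80).astype(float)
X_val, y_val = rng.normal(size=(20, 2)), rng.integers(0, 2, 20).astype(float)

w, lr = np.zeros(2), 0.1
for epoch in range(10):
    y_hat = sigmoid(X_train @ w)                         # model: forward pass
    grad = X_train.T @ (y_hat - y_train) / len(y_train)  # loss gradient (backward pass)
    w -= lr * grad                                       # optimizer: gradient descent step
    print(f"epoch {epoch}: train {bce(y_train, y_hat):.3f}, val {bce(y_val, sigmoid(X_val @ w)):.3f}")
```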

Page 15: Introduction to Deep Learning (I2DL)


Backpropagation


Forward pass: $M_\theta(\mathbf{x}) = \hat{y}$, then evaluate the loss $L(y, \hat{y})$

Backward pass: $\mathrm{d}L/\mathrm{d}\hat{y}$, $\mathrm{d}\hat{y}/\mathrm{d}\theta$, $\mathrm{d}L/\mathrm{d}\theta$

Binary cross-entropy loss: $L(y, \hat{y}) = -\left[\, y \log \hat{y} + (1 - y) \log(1 - \hat{y}) \,\right]$
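A NumPy sketch of this loss; averaging over the samples and clipping $\hat{y}$ to avoid $\log(0)$ are implementation choices, not given on the slide:

```python
import numpy as np

def bce_loss(y, y_hat, eps=1e-12):
    """Binary cross-entropy: -[y*log(y_hat) + (1-y)*log(1-y_hat)], averaged over samples."""
    y_hat = np.clip(y_hat, eps, 1.0 - eps)  # numerical safety: avoid log(0)
    return -np.mean(y * np.log(y_hat) + (1.0 - y) * np.log(1.0 - y_hat))

print(bce_loss(np.array([1.0, 0.0]), np.array([0.9, 0.2])))  # ~0.16, small for good predictions
```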

Page 16: Introduction to Deep Learning (I2DL)


Backpropagation


(Same forward/backward pass diagram as on the previous slide.)

Optimization with gradient descent: $\theta_{k+1} = \theta_k - \lambda \cdot \nabla_\theta L$
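A one-line sketch of this update rule; the learning rate value and the dummy gradient below are placeholders:

```python
import numpy as np

def gradient_descent_step(theta, grad, lr=1e-2):
    """theta_{k+1} = theta_k - lambda * grad, with lr playing the role of lambda."""
    return theta - lr * grad

theta = np.zeros(3)
grad = np.array([0.5, -1.0, 0.25])          # would come from the backward pass
print(gradient_descent_step(theta, grad))   # [-0.005   0.01   -0.0025]
```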

Page 17: Introduction to Deep Learning (I2DL)


Backpropagation


Page 18: Introduction to Deep Learning (I2DL)


Model


• Input: $X \in \mathbb{R}^{N \times (D+1)}$, representing our data with $N$ samples and $D+1$ feature dimensions

• Output: binary labels $y_i \in \{0, 1\}$

• Model: classifier of the form $\hat{y} = \sigma(X w)$

• Sigmoid function: $\sigma(t) = \frac{1}{1 + e^{-t}}$ with values in $(0, 1)$

• Weights of the classifier: $w \in \mathbb{R}^{(D+1) \times 1}$
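A minimal sketch of this classifier's forward pass; treating the bias as an extra column of ones in $X$ (matching the $D+1$ input dimensions above) is an assumption about the data layout:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def forward(X, w):
    """y_hat = sigmoid(X @ w): X has shape (N, D+1), w has shape (D+1,), output has shape (N,)."""
    return sigmoid(X @ w)

N, D = 4, 1
X = np.hstack([np.random.rand(N, D), np.ones((N, 1))])  # one feature plus a bias column
w = np.zeros(D + 1)
print(forward(X, w))  # all 0.5 for zero weights
```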

Page 19: Introduction to Deep Learning (I2DL)


Forward Pass


[Slide shows a worked example of the forward pass on a sample input.]

Page 20: Introduction to Deep Learning (I2DL)


Input Data X


Page 21: Introduction to Deep Learning (I2DL)


Forward Pass

[Slide shows a worked example of the forward pass on sample data.]

Page 22: Introduction to Deep Learning (I2DL)


Backward Pass

[Slide shows a worked example of the backward pass on sample data.]

Page 23: Introduction to Deep Learning (I2DL)


Backward Pass


• Backward pass: derivative of the loss function with respect to the weights of our classifier

• Attention: Make sure you understand the dimensions here

• Step 1: forward + backward pass for one sample

• Step 2: forward + backward pass for N samples (a vectorized sketch follows below)
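A vectorized sketch of both steps for the sigmoid classifier with binary cross-entropy, using the well-known simplification $\mathrm{d}L/\mathrm{d}w = \frac{1}{N} X^\top (\hat{y} - y)$; the exercise may ask you to build this gradient up factor by factor instead:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def forward_backward(X, w, y):
    """Forward + backward pass for N samples: X is (N, D+1), w is (D+1,), y is (N,)."""
    y_hat = sigmoid(X @ w)                               # forward pass, shape (N,)
    loss = -np.mean(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))
    dw = X.T @ (y_hat - y) / X.shape[0]                  # backward pass, shape (D+1,)
    return loss, dw

X = np.array([[0.5, 1.0], [1.5, 1.0], [2.5, 1.0]])       # N=3 samples, D+1=2 (feature + bias)
y = np.array([0.0, 1.0, 1.0])
w = np.zeros(2)
loss, dw = forward_backward(X, w, y)
print(round(loss, 3), dw.shape)                          # 0.693 (chance level) and (2,)
```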

Page 24: Introduction to Deep Learning (I2DL)


Outlook


Page 25: Introduction to Deep Learning (I2DL)


Upcoming Lectures

• Next lecture: Lecture 5: Stochastic Gradient Descent

• Next Thursday: Exercise 5: Two-layer Neural Network


Model pillar: Network, Loss/Objective

Page 26: Introduction to Deep Learning (I2DL)


Summary
• Monday 15.11: Watch Lecture 5
  – Scaling Optimization to Large Data, Stochastic Gradient Descent

• Wednesday 17.11, 15:59: Submit Exercise 4
  – Simple Classifier

• Thursday 18.11: Tutorial 5
  – Neural Networks


Page 27: Introduction to Deep Learning (I2DL)


See you next week :)
