Introduction to Neural Nets in TensorFlow, with Application

Jackson Barkstrom (with advisor Michael Lee)

Transcript of “Introduction to Neural Nets in TensorFlow, with Application”

Page 1:

Introduction to Neural Nets in TensorFlow, with Application

Jackson Barkstrom (with advisor Michael Lee)

Page 2:

Example Task: Image Classification Using the MNIST Database of Handwritten Digits

Page 3:

[0, 0, 0, 1, 0, 0, 0, 0, 0, …, 1, 0, 0, 0, 0, 0]

Process: Reshaping Data and Implementing a Neural Net

Note: we could look at the data in two dimensions and use a Convolutional Neural Network, and TensorFlow can handle as many as four dimensions. But for our simplified example, we converted the data to one dimension.
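The slide's code isn't captured in this transcript, so here is a minimal sketch of that reshaping step, assuming the standard tf.keras MNIST loader (the variable names are ours, not the slide's):

```python
# Minimal sketch (not the slide's original code): flatten each 28x28 MNIST
# image into a 784-length vector and one-hot encode the digit labels.
import tensorflow as tf

(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Flatten the 2D images to one dimension and scale pixels to [0, 1].
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0
x_test = x_test.reshape(-1, 784).astype("float32") / 255.0

# One-hot encode labels 0-9 into length-10 vectors.
y_train = tf.keras.utils.to_categorical(y_train, 10)
y_test = tf.keras.utils.to_categorical(y_test, 10)

print(x_train.shape, y_train.shape)  # (60000, 784) (60000, 10)
```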

Page 4:

Introduction: What is a Neural Net? The Basic Framework

[Figure: basic framework diagram; only the example values 0.1, 0.8, 1, and 8 survived extraction]

Page 5:

Page 6:

Mathematical Code Walkthrough

Specifying Input (x) and Output (y)
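The code itself is not in the transcript; the sketch below shows one way to specify x and y in the TensorFlow 1.x placeholder style (via tf.compat.v1 on newer installs), which is an assumption about the style used in the talk:

```python
# Sketch only: specify the input (x) and target output (y_) as placeholders.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

# x: a batch of flattened 28x28 images, 784 pixel values each.
x = tf.placeholder(tf.float32, [None, 784])

# y_: the matching one-hot labels over the ten digit classes.
y_ = tf.placeholder(tf.float32, [None, 10])
```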

Page 7:

The Neural Net in Code
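Again, the slide's code is not in the transcript; continuing the placeholder sketch above, a single-layer softmax classifier could look like this:

```python
# Sketch only: weights and biases for a single softmax layer.
W = tf.Variable(tf.zeros([784, 10]))  # one column of weights per digit class
b = tf.Variable(tf.zeros([10]))       # one bias per digit class

# Predicted class probabilities for each image: softmax(xW + b).
y = tf.nn.softmax(tf.matmul(x, W) + b)
```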

Page 8:

The Math, explained
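In equation form (matching the single-layer sketch above):

```latex
% Each logit z_j is a weighted sum of the 784 pixel inputs plus a bias;
% the softmax then turns the vector of logits into class probabilities.
z_j = \sum_{i=1}^{784} x_i W_{ij} + b_j, \qquad y = \operatorname{softmax}(z)
```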

Page 9:

Page 10:

Softmax Activation Function: Great for One-Hot Encoding

The entries sum to one, and the largest entry will be nearly one while the smaller ones will be nearly zero.
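For reference, the softmax definition that gives those properties:

```latex
% Exponentiate each logit and normalize, so the outputs are positive and
% sum to one; the largest logit gets most of the probability mass.
\operatorname{softmax}(z)_j = \frac{e^{z_j}}{\sum_{k} e^{z_k}},
\qquad \sum_{j} \operatorname{softmax}(z)_j = 1
```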

Page 11:

Gradient Descent Optimization

Cross-Entropy Error Function
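The two pieces in standard notation, where y' is the one-hot label, y is the network's prediction, and η is the learning rate:

```latex
% Cross-entropy error between the one-hot label y' and the prediction y,
% and the gradient descent update applied to the parameters theta (W and b).
H(y', y) = -\sum_{j} y'_j \log y_j, \qquad
\theta \leftarrow \theta - \eta \, \nabla_{\theta} H
```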

Page 12:

Results: 96% Accuracy After Feeding in Data

Note: We could probably improve these results by using a Convolutional Neural Network
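The training loop itself isn't in the transcript; below is a sketch of how a number like this could be produced, continuing the earlier sketches (the loss form, learning rate, batch size, and step count here are our assumptions):

```python
# Sketch only: train with gradient descent on cross-entropy, then measure
# accuracy on the test set. Builds on x, y_, W, b, y, x_train, ... above.
import numpy as np

# Cross-entropy loss (small epsilon added for numerical stability).
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y + 1e-10), axis=1))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

# Accuracy: fraction of images whose most probable class matches the label.
correct = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        idx = np.random.choice(len(x_train), 100)        # random mini-batch
        sess.run(train_step, {x: x_train[idx], y_: y_train[idx]})
    print(sess.run(accuracy, {x: x_test, y_: y_test}))   # roughly 0.9+
```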

Page 13:

An Application: Predicting Fires in Charlottesville, Virginia

Note: It’s best to use neural networks when we have a LOT of data, and for this example we didn’t have enough data to make the neural network the best approach. However, using a neural net for this is still an interesting application as far as learning is concerned.

Page 14:

How I Made the Data

● Started with four datasets: one of all commercial homes, one of all residential homes, one of all fires before 2016, and one of all fires after 2018.

● Modified addresses, cleaned things up a lot, and merged the fire records into the commercial and residential datasets by address.

● One-hot encoded all categorical variables, and normalized all numerical variables (a minimal sketch of this step follows below).
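A hypothetical sketch of that last step with pandas; the column names are invented for illustration and are not from the real datasets:

```python
# Hypothetical sketch: one-hot encode categorical columns and normalize
# numerical columns. "zoning" and "square_feet" are made-up column names.
import pandas as pd

df = pd.DataFrame({
    "zoning": ["R-1", "R-2", "R-1"],
    "square_feet": [1200.0, 2500.0, 900.0],
    "had_fire": [0, 1, 0],
})

# One-hot encode each categorical column into 0/1 indicator columns.
df = pd.get_dummies(df, columns=["zoning"])

# Normalize each numerical column to zero mean and unit variance.
df["square_feet"] = (df["square_feet"] - df["square_feet"].mean()) / df["square_feet"].std()
```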

Page 15:

Page 16:

We have to create a “dataset” object with TensorFlow to feed data into, and we have to specify how we want to iterate through the data. Other than that, it’s almost exactly the same code as before!
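The code is not reproduced in the transcript; here is a sketch of that dataset-and-iterator setup in the TF1 style used above, with placeholder arrays standing in for the prepared address data:

```python
# Sketch only: wrap the prepared feature matrix and labels in a tf.data
# Dataset and define how to iterate through it. The arrays below are dummy
# stand-ins for the real commercial/residential address data.
import numpy as np
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

features = np.random.rand(1000, 20).astype("float32")               # fake features
labels = np.random.randint(0, 2, size=(1000, 1)).astype("float32")  # fire / no fire

dataset = tf.data.Dataset.from_tensor_slices((features, labels))
dataset = dataset.shuffle(buffer_size=1000).batch(100).repeat()

# The iterator specifies how we step through the data; get_next() yields
# the next batch of (features, labels) each time it is evaluated.
iterator = tf.data.make_one_shot_iterator(dataset)
next_features, next_labels = iterator.get_next()
```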

Page 17:

100% of addresses were usually predicted NOT to have fires. This is not surprising! But don’t worry, we still have something...

Page 18:

Our neural net outputs its confidence in its predictions, and we can use this to generate arbitrary “risk levels.”
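One hypothetical way to turn those confidences into risk levels; the thresholds below are arbitrary, chosen only to illustrate the binning idea:

```python
# Hypothetical sketch: bin the net's predicted probability of fire into
# coarse "risk levels". The cutoffs are arbitrary illustrations.
import numpy as np

def risk_level(fire_probability):
    """Map a predicted fire probability to a coarse risk category."""
    if fire_probability >= 0.10:
        return "high"
    if fire_probability >= 0.03:
        return "medium"
    return "low"

predicted = np.array([0.001, 0.04, 0.15])      # example model outputs
print([risk_level(p) for p in predicted])      # ['low', 'medium', 'high']
```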

Page 19:

~8% of commercial addresses in our highest risk category had fires within the next 2 years! Around 5% of homes in our middle risk category, and around 2.5% of homes in the lowest risk category, also had fires within the next two years.

Insight: we can tell that some houses have triple the fire risk of others! This is useful!

Page 20:

~6% of residential addresses in our highest risk category had fires within the next 2 years. Around 4% of homes in our middle risk category, and around 2.5% of homes in the lowest risk category, also had fires within the next two years. But this doesn’t make sense… we have more data for residential homes, both in number and in detail, so shouldn’t we do better here?

Overfitting is a likely culprit here. It would be useful to use a validation set.

Page 21:

Possible Improvements

● Train/test splitting by time (checking if our model can help us understand what will happen in the future)

● Methodically eliminate less useful columns to prevent overfitting

● Improve how we normalize columns--the difference between 1 and 20 in a column could be very meaningful, but if we have a gigantic value in the column, it could make this difference look like nothing to the neural net (see the sketch after this list)

● Change an activation function? Add more layers?
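To illustrate the normalization point, a tiny example: with one gigantic value in a column, min-max scaling squashes the 1-vs-20 difference to almost nothing. A log transform is shown as one possible fix, not necessarily what this project would use:

```python
# Illustration of the normalization bullet above.
import numpy as np

col = np.array([1.0, 20.0, 1_000_000.0])

# Min-max scaling: the gigantic value dominates, so 1 vs 20 both land near 0.
min_max = (col - col.min()) / (col.max() - col.min())
print(min_max)    # [0.0, ~1.9e-05, 1.0]

# One possible fix: a log transform keeps the 1-vs-20 difference visible.
logged = np.log1p(col)
scaled = (logged - logged.min()) / (logged.max() - logged.min())
print(scaled)     # [0.0, ~0.18, 1.0]
```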

Page 22:

We Can Do a Lot with Neural Nets

Both regression and classification!

(as long as we have a lot of data and we figure out a good way to format the input and the output)

Page 23:

Thanks for Listening!