9/7/2014 Convolutional Neural Network Workbench - CodeProject
http://www.codeproject.com/Articles/140631/Convolutional-Neural-Network-MNIST-Workbench 1/8
About Article

Type: Article
Licence: CPOL
First Posted: 28 Dec 2010
Views: 200,281
Bookmarked: 260 times
Filip D'haene, 8 Jul 2014
Convolutional Neural Network Workbench
A workbench to create, train, and test convolutional neural networks against the MNIST and CIFAR-10 datasets
Download CNNWB Sources
Download Setup
Introduction
This article describes a framework in C# 4.0 that lets you create, train, and test convolutional neural networks against the MNIST dataset of handwritten digits and the CIFAR-10 dataset of 10 classes of natural objects. I initially based my work on an article by Mike O'Neill on The Code Project and gradually added features I found interesting in research papers on the internet. Dr. Yann LeCun's paper Gradient-Based Learning Applied to Document Recognition is a great place to get a better understanding of the principles of convolutional neural networks and the reason why they are so successful in the area of machine vision.
The Code
The main goal of this project was to build a more flexible and extensible managed version of Mike O'Neill's excellent C++ project. I've included and used the splendid WPF TaskDialog Wrapper from Sean A. Hanley, the Extended WPF Toolkit, and, for unzipping the CIFAR-10 dataset, the open-source SharpZipLib module from SharpDevelop. Visual Studio 2012/2013 and Windows 7 are the minimum requirements. I made maximal use of the parallel functionality offered in C# 4.0: the user can choose at all times how many logical cores are used in the parallel-optimized code parts, with a simple slider next to the View combobox.
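Capping the number of cores this way maps directly onto the Task Parallel Library's `ParallelOptions.MaxDegreeOfParallelism`. A minimal sketch of the technique (not the workbench's actual code; the hard-coded core count here stands in for the value chosen with the slider):

```csharp
using System;
using System.Threading.Tasks;

class ParallelismDemo
{
    static void Main()
    {
        // Stand-in for the value the user picks with the slider in the UI.
        int maxCores = Math.Min(2, Environment.ProcessorCount);

        var options = new ParallelOptions { MaxDegreeOfParallelism = maxCores };

        double[] sums = new double[8];
        // Parallel.For honors MaxDegreeOfParallelism, so at most
        // maxCores iterations run concurrently.
        Parallel.For(0, sums.Length, options, i =>
        {
            sums[i] = i * i;   // placeholder for per-sample network work
        });

        Console.WriteLine(string.Join(",", sums));
    }
}
```

Passing the same `ParallelOptions` instance to every `Parallel.For`/`Parallel.ForEach` call is what makes a single UI control govern all the parallel code paths at once.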
Using the Code
Here is example code that constructs a LeNet-5 network (see the InitializeDefaultNeuralNetwork() function in MainViewWindows.xaml.cs):
NeuralNetwork cnn = new NeuralNetwork(DataProvider, "LeNet-5", 10, 0.8D,
    LossFunctions.MeanSquareError, DataProviderSets.MNIST,
    TrainingStrategy.SGDLevenbergMarquardt, 0.02D);
cnn.AddLayer(LayerTypes.Input, 1, 32, 32);
cnn.AddLayer(LayerTypes.Convolutional, ActivationFunctions.Tanh, 6, 28, 28, 5, 5);
cnn.AddLayer(LayerTypes.AveragePooling, ActivationFunctions.Tanh, 6, 14, 14, 2, 2);

bool[] maps = new bool[6 * 16]
{
    true,  false, false, false, true,  true,  true,  false, false, true,  true,  true,  true,  false, true,  true,
    true,  true,  false, false, false, true,  true,  true,  false, false, true,  true,  true,  true,  false, true,
    true,  true,  true,  false, false, false, true,  true,  true,  false, false, true,  false, true,  true,  true,
    false, true,  true,  true,  false, false, true,  true,  true,  true,  false, false, true,  false, true,  true,
    false, false, true,  true,  true,  false, false, true,  true,  true,  true,  false, true,  true,  false, true,
    false, false, false, true,  true,  true,  false, false, true,  true,  true,  true,  false, true,  true,  true
};

cnn.AddLayer(LayerTypes.Convolutional, ActivationFunctions.Tanh, 16, 10, 10, 5, 5, new Mappings(maps));
cnn.AddLayer(LayerTypes.AveragePooling, ActivationFunctions.Tanh, 16, 5, 5, 2, 2);
cnn.AddLayer(LayerTypes.Convolutional, ActivationFunctions.Tanh, 120, 1, 1, 5, 5);
cnn.AddLayer(LayerTypes.FullyConnected, ActivationFunctions.Tanh, 10);
cnn.InitializeWeights();
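The maps array encodes which of the 6 input feature maps feed each of the 16 output maps, following the connection table in LeCun's LeNet-5 paper. Later versions of the workbench can also generate mappings from a density percentage; a hypothetical helper illustrating the idea (the name, signature, and seeded random choice are mine, not the workbench API):

```csharp
using System;

class MappingDemo
{
    // Hypothetical helper: build an inputMaps x outputMaps connection
    // table where roughly `density` of the entries are connected.
    static bool[] GenerateMappings(int inputMaps, int outputMaps,
                                   double density, int seed)
    {
        var rnd = new Random(seed);
        var maps = new bool[inputMaps * outputMaps];
        for (int i = 0; i < maps.Length; i++)
            maps[i] = rnd.NextDouble() < density;
        return maps;
    }

    static void Main()
    {
        bool[] maps = GenerateMappings(6, 16, 0.66, seed: 42);
        int connected = 0;
        foreach (bool b in maps)
            if (b) connected++;

        Console.WriteLine(maps.Length);   // 96 entries, like the LeNet-5 table
        Console.WriteLine(connected > 0); // some connections were generated
    }
}
```

Partial connectivity like this breaks the symmetry between feature maps and reduces the number of weights, which is exactly why LeNet-5 uses it between the second pooling layer and the third convolutional layer.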
Design View
In Design View, you can see how your network is defined and get a good picture of the current distribution of weight
values in all the layers concerned.
Training View
In Training View you can train the network. The 'Play' button opens the 'Select Training Parameters' dialog, where you define all the training parameters. The 'Training Scheme Editor' button lets you build training schemes to experiment with. The training can be paused or aborted at any time. The 'Star' button forgets (resets) all the learned weight values.
Testing View
In Testing View you get a better picture of which testing (or training) samples are not recognized correctly.
Calculate View
In Calculate View you can run a single testing or training sample with the desired properties and get a graphical view of all the output values in every layer.
Final Words
I would love to see GPU integration for offloading the highly parallel task of training the neural network. I made an attempt to use a simple MVVM structure in this WPF application. In the Model folder you find the NeuralNetwork and DataProvider classes, which provide all the neural network code and deal with loading and providing the necessary MNIST and CIFAR-10 training and testing samples. A NeuralNetworkDataSet class is used to load and save neural network definitions. The View folder contains four different PageViews and a global PageView which acts as the container for all the different views (Design, Training, Testing, and Calculate). I hope there's someone out there who can actually use the code and improve on it: extend it with an unsupervised learning stage, for example (encoder/decoder construction), implement better loss functions and more training strategies (conjugate gradient, L-BFGS, ...), add more datasets, use new activation functions, etc.
History
1.0.3.7: (07-08-14)
BugFix: slow speed resolved in Testing view
Added SGDLevenbergMarquardtModA training strategy. This can be used with a softmax output layer
Possibility to save the weights while training: just click on Pause and then Save/Save as...
Various smaller fixes and optimizations
1.0.3.6: (05-02-14)
Choice between four Training Strategies:
SGDLevenbergMarquardt
SGDLevenbergMarquardtMiniBatch
SGD
SGDMiniBatch
BugFix: derivative of the ReLU activation function
Added SoftSign activation function
Overall 20% more training performance over previous version
Faster binary save of the network weights
Various smaller fixes and optimizations
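The ReLU derivative fixed in 1.0.3.6 is easy to get wrong when implementing backpropagation by hand. For reference, ReLU and its derivative are simply (a sketch, not the workbench's internal code):

```csharp
using System;

class ReluDemo
{
    static double Relu(double x) => Math.Max(0.0, x);

    // The derivative is 1 for positive inputs and 0 otherwise;
    // the value at exactly 0 is a convention (0 is used here).
    static double ReluDerivative(double x) => x > 0.0 ? 1.0 : 0.0;

    static void Main()
    {
        Console.WriteLine(Relu(-2.5));           // 0
        Console.WriteLine(Relu(3.0));            // 3
        Console.WriteLine(ReluDerivative(-2.5)); // 0
        Console.WriteLine(ReluDerivative(3.0));  // 1
    }
}
```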
1.0.3.5: (03-13-14)
StochasticPooling and L2Pooling layers now correctly implemented
Native C++ implementation + managed C++/CLI wrapper
1.0.3.4: (12-10-13)
Support for Stochastic Pooling layers
Much faster binary load and save of a cnn
ReLU activation function now working as expected
1.0.3.3: (11-26-2013)
Bugfix: Download datasets now working as expected
Bugfix: Softmax function corrected
Bugfix: DropOut function corrected
1.0.3.2: (11-15-2013)
Bugfix: Local layer and Convolutional layer now work properly
Bugfix: Cross Entropy loss now works better (in combination with a SoftMax activation function in a final fully connected layer)
Added LeCun's standard LeNet-5 training scheme
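Cross entropy paired with a softmax output layer, as in the 1.0.3.2 fix, is the standard combination for classification because the output-layer error gradient simplifies to output minus target. A minimal sketch of the pairing with made-up values (not the workbench's code):

```csharp
using System;
using System.Linq;

class SoftmaxDemo
{
    // Numerically stable softmax: subtract the max before exponentiating.
    static double[] Softmax(double[] z)
    {
        double max = z.Max();
        double[] e = z.Select(v => Math.Exp(v - max)).ToArray();
        double sum = e.Sum();
        return e.Select(v => v / sum).ToArray();
    }

    // Cross-entropy loss for a one-hot target with true class `label`.
    static double CrossEntropy(double[] p, int label) => -Math.Log(p[label]);

    static void Main()
    {
        double[] logits = { 2.0, 1.0, 0.1 };
        double[] p = Softmax(logits);

        // The softmax outputs form a probability distribution (sum to 1).
        Console.WriteLine(Math.Abs(p.Sum() - 1.0) < 1e-9);
        // The loss is smaller when the target matches the most likely class.
        Console.WriteLine(CrossEntropy(p, 0) < CrossEntropy(p, 2));
    }
}
```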
1.0.3.1: (11-09-2013)
Now the last min/max display preference is saved
Added some extra predefined training parameters and schemes
Bugfix: Average error not showing correctly after switching between neural networks with a different objective function
Bugfix: Sample not always showing correctly in calculate view
Bugfix: The end result in Testing view now displays the correct values
1.0.3.0: (11-06-2013)
Supports dropout
Supports the Local layer type (Same as the convolution layer but with non-shared kernel weights)
Supports padding in the Local and Convolution layers (gives ability to create deep networks)
Supports overlapping receptive fields in the pooling layers
Supports weightless pooling layers
Supports all the commonly used activation functions
Supports Cross Entropy objective function in combination with a SoftMax activation function in the output layer
Ability to specify a density percentage to generate mappings between layers far more easily
Improved DataProvider class with a much reduced memory footprint and common logical functionality shared across all datasets (easier to add datasets)
Much improved UI speed and general functionality

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL).

About the Author

Filip D'haene
Software Developer
Belgium
Last Updated 8 Jul 2014
Article Copyright 2010 by Filip D'haene. Everything else Copyright © CodeProject, 1999-2014.