This is a fully connected neural network implemented from the ground up, without any external libraries. It was written for practice by following the book *Neural Networks and Deep Learning*, and features both a (very slow) Python3 implementation and a (fast) C++ one.

Features:
- networks with an arbitrary number of layers
- multiple built-in activation and cost functions; users can define additional ones by extending the relevant class
- optimization via stochastic gradient descent, optionally keeping track of momentum
- tunable hyperparameters: epochs, mini-batch size, learning rate (eta), regularization and momentum

When trained on the MNIST training data set with a hidden layer of 100 neurons, the network correctly classified roughly 97%-98% of the test digits.
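The stochastic gradient descent update with momentum and L2 regularization can be sketched as below. This is a minimal illustration of the technique, not the repository's actual API; all names (`sgd_momentum_step`, `eta`, `mu`, `lmbda`) are assumptions chosen for clarity.

```python
def sgd_momentum_step(weights, grads, velocity, eta=0.5, mu=0.9, lmbda=0.0, n=1):
    """One SGD update with momentum and L2 regularization.
    weights, grads, velocity: flat lists of floats (illustrative layout).
    eta: learning rate, mu: momentum coefficient,
    lmbda: regularization strength, n: training set size."""
    for k in range(len(weights)):
        # v <- mu*v - eta*(gradient + L2 weight-decay term)
        velocity[k] = mu * velocity[k] - eta * (grads[k] + (lmbda / n) * weights[k])
        # w <- w + v
        weights[k] += velocity[k]
    return weights, velocity
```

With `mu=0` and `lmbda=0` this reduces to plain SGD, which is the default behaviour described above when momentum is not used.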
Stypox/neural-network
Fully-connected neural network trained using derivatives
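Training "using derivatives" means backpropagation multiplies each layer's error by the derivative of its activation function. As an example, here is the classic sigmoid and its derivative (standard textbook definitions, not code taken from this repository):

```python
import math

def sigmoid(z):
    """Sigmoid activation: squashes any real z into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def sigmoid_prime(z):
    """Derivative used by backpropagation: sigma'(z) = sigma(z) * (1 - sigma(z))."""
    s = sigmoid(z)
    return s * (1.0 - s)
```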