@YF-Tung commented Feb 18, 2019

An NN should always have activation functions (such as ReLU); without them it is just a trivial linear model, since a stack of affine layers collapses to a single affine map.
Applying ReLU raises the accuracy of neural_network.py from 92% to 95%, and of neural_network_raw.py from 92% to 94%.
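Since the diff isn't visible in this thread, here is a minimal sketch of what the change looks like in TF 1.x-style code such as neural_network_raw.py; the layer sizes and variable names are illustrative, not necessarily the example's actual ones:

```python
import tensorflow as tf

# Illustrative MNIST-like sizes; the actual example defines its own.
n_input, n_hidden_1, n_hidden_2, n_classes = 784, 256, 256, 10

x = tf.placeholder(tf.float32, [None, n_input])
weights = {
    'h1': tf.Variable(tf.random_normal([n_input, n_hidden_1])),
    'h2': tf.Variable(tf.random_normal([n_hidden_1, n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_hidden_2, n_classes])),
}
biases = {
    'b1': tf.Variable(tf.random_normal([n_hidden_1])),
    'b2': tf.Variable(tf.random_normal([n_hidden_2])),
    'out': tf.Variable(tf.random_normal([n_classes])),
}

# Without the relu calls, W2(W1 x + b1) + b2 reduces to a single affine map,
# so the hidden layers add no modeling power over plain logistic regression.
layer_1 = tf.nn.relu(tf.add(tf.matmul(x, weights['h1']), biases['b1']))
layer_2 = tf.nn.relu(tf.add(tf.matmul(layer_1, weights['h2']), biases['b2']))
# Output stays linear; the softmax is applied inside the loss.
logits = tf.add(tf.matmul(layer_2, weights['out']), biases['out'])
```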
The learning rate is also too large, even for an example; in practice it is typically set somewhere between 1e-2 and 1e-4.
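For instance (a sketch with a stand-in loss just so the snippet runs on its own; in the example it would be the softmax cross-entropy over the logits above):

```python
import tensorflow as tf

# Stand-in variable and loss, purely to make the snippet self-contained.
w = tf.Variable([1.0, 2.0])
loss = tf.reduce_mean(tf.square(w))

# Pick a rate inside the usual 1e-2 .. 1e-4 range rather than a large one;
# 1e-3 is a common default for Adam.
train_op = tf.train.AdamOptimizer(learning_rate=1e-3).minimize(loss)
```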
