
Conversation

Elvis-ever

Going through the practice results, it looks like the NN needs an activation function (such as ReLU); otherwise it is just a trivial linear model. After applying ReLU, accuracy for neural_network.py goes from 92% to 95%, and accuracy for neural_network_raw.py goes from 92% to 94%. Also, the learning rate is too large, even for an example; in practice it is usually set between 1e-2 and 1e-4.
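For reference, a minimal sketch of where the activation would go in the low-level version. This is not the repository's exact code: the function name and the weights/biases dictionary keys are illustrative assumptions.

```python
import tensorflow as tf

def neural_net(x, weights, biases):
    # Hidden layer 1: affine transform followed by ReLU. Without tf.nn.relu,
    # the stacked layers collapse into a single linear map.
    layer_1 = tf.nn.relu(tf.matmul(x, weights["h1"]) + biases["b1"])
    # Hidden layer 2, also followed by ReLU.
    layer_2 = tf.nn.relu(tf.matmul(layer_1, weights["h2"]) + biases["b2"])
    # Output layer stays linear; softmax/cross-entropy is applied in the loss.
    return tf.matmul(layer_2, weights["out"]) + biases["out"]

# Learning rate in the suggested 1e-2 to 1e-4 range instead of a larger value.
learning_rate = 1e-3
```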

xy-always commented Aug 30, 2022 via email

jiandandema commented Aug 30, 2022 via email
