Better HN
cglace · 9y ago
Using all inputs and six layers of varying sizes, after about 500 iterations:
http://i.imgur.com/x1MOpvl.jpg
visarga · 9y ago
Just 100 iterations, learning rate 0.03, tanh activation, and L2 regularization with rate 0.01. The network has 8, 8, 8 neurons per layer.
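This reads like a TensorFlow Playground configuration. As a rough sketch, a comparable setup can be reproduced in code with scikit-learn's `MLPClassifier`; the dataset below (concentric circles) is an assumption standing in for the playground's toy problems, not part of the original comment:

```python
# Hedged sketch of the setup described above: 100 iterations,
# learning rate 0.03, tanh activation, L2 regularization 0.01,
# and three hidden layers of 8 neurons each.
from sklearn.datasets import make_circles
from sklearn.neural_network import MLPClassifier

# Assumed stand-in dataset: two concentric circles, like the
# playground's "circle" problem.
X, y = make_circles(n_samples=400, noise=0.1, factor=0.4, random_state=0)

clf = MLPClassifier(
    hidden_layer_sizes=(8, 8, 8),  # "8,8,8 neurons per layer"
    activation="tanh",
    solver="sgd",
    learning_rate_init=0.03,
    alpha=0.01,                    # L2 regularization strength
    max_iter=100,                  # "just 100 iterations"
    random_state=0,
)
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

Note that the playground's "regularization rate" maps onto scikit-learn's `alpha` parameter only approximately; the two tools scale the L2 penalty differently, so identical numbers need not give identical decision boundaries.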