backpropagation - Large values of weights in neural network


I use a neural network as a function approximator for Q-learning. After several training iterations, the weights acquire values in the range 0 to 10. Can weights take such values, or does this indicate bad network parameters?

Weights can take those values. When you propagate gradients over a large number of training iterations, connections that need to be 'heavy' become 'heavier'.

There are plenty of examples of neural networks whose weights are larger than 1.
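As a minimal sketch of why this happens (this is an illustrative toy regression, not the questioner's Q-learning setup): if the target function itself has a coefficient greater than 1, gradient descent must drive the weight above 1 to fit it.

```python
import numpy as np

# Toy problem: the target is y = 5 * x, so the single weight
# must converge to roughly 5 -- well above 1 -- to fit the data.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=100)
y = 5.0 * x

w = 0.0    # single weight, no bias
lr = 0.1   # learning rate
for _ in range(200):
    grad = np.mean(2.0 * (w * x - y) * x)  # d/dw of mean squared error
    w -= lr * grad

print(round(w, 2))  # converges close to 5.0
```

Nothing in backpropagation penalizes a weight for being large; only explicit regularization (e.g. weight decay) or the data itself pulls weights toward small values.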

Also, as the following image shows, there is no inherent limit on weight values:

[image legend]

