backpropagation - Large values of weights in neural network
I use a neural network as a Q-learning function approximator. After several training iterations, the weights acquire values in the range 0 to 10. Can weights take such values, or does this indicate bad network parameters?
Weights can take such values. When you backpropagate over a large number of iterations, some connections need to become "heavy", i.e. weighted much more strongly than others.
There are plenty of examples showing neural network weights larger than 1; here is one example.
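To see this concretely, here is a minimal sketch (pure NumPy, hypothetical data): the target function is y = 5x, so gradient descent on a single linear weight must drive that weight to about 5, well outside [0, 1], and the model is perfectly healthy.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=(100, 1))   # inputs
y = 5.0 * x                                  # targets: the true weight is 5

w = np.zeros((1, 1))                         # single linear weight
lr = 0.1
for _ in range(500):
    pred = x @ w
    grad = 2.0 * x.T @ (pred - y) / len(x)   # gradient of mean squared error
    w -= lr * grad

print(w)  # converges near 5.0 -- larger than 1, and that's fine
```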
Also, as the following image illustrates, there is no such thing as a limit on weight values:
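If you want to sanity-check your own Q-network, one simple approach is to log the weight magnitudes per layer during training and watch whether they stabilise or keep growing. Below is a rough sketch assuming a PyTorch model (the `q_net` argument is hypothetical); weights sitting in the 0-10 range that stay flat are usually fine, whereas magnitudes that keep climbing every epoch are more likely a sign of divergence (e.g. a learning rate that is too high).

```python
import torch

def log_weight_stats(q_net: torch.nn.Module) -> None:
    # Print min/max/mean absolute weight for each parameter tensor.
    for name, param in q_net.named_parameters():
        w = param.detach().abs()
        print(f"{name}: min={w.min():.3f} max={w.max():.3f} mean={w.mean():.3f}")
```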