How to register a custom gradient for an operation composed of tf operations


More specifically, I have a simple fprop that is a composition of tf operations. I want to override the TensorFlow gradient computation with my own gradient method using RegisterGradient.

What's wrong with this code?

import tensorflow as tf
from tensorflow.python.framework import ops

@ops.RegisterGradient("MyopGrad")
def frop_grad(op, grad):
    x = op.inputs[0]
    return 0 * x  # zero it out to see the difference

def fprop(x):
    x = tf.sqrt(x)
    out = tf.maximum(x, .2)
    return out

a = tf.Variable(tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32))
h = fprop(a)
h = tf.identity(h, name="Myop")
grad = tf.gradients(h, a)

g = tf.get_default_graph()
with g.gradient_override_map({'Myop': 'MyopGrad'}):
    with tf.Session() as sess:
        sess.run(tf.initialize_all_variables())
        result = sess.run(grad)

print(result[0])

I want to see zeros in the print, but instead I am getting:

[ 0.2236068   0.25000003  0.28867513  0.35355341  0.5       ] 

You need to define the op within the scope of the with g.gradient_override_map(...) block.

Also, you need to map the op type 'Identity', rather than the op name 'Myop', to your new gradient: gradient_override_map keys on op types, not on the names of individual ops.
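You can inspect an op to see which type it has; a quick check, assuming TensorFlow 1.x graph mode as in the question:

import tensorflow as tf  # TF 1.x graph mode assumed

h = tf.identity(tf.constant([1.]), name="Myop")
print(h.op.type)  # 'Identity' -- the type that gradient_override_map keys on
print(h.op.name)  # 'Myop'     -- the instance name, which the map ignores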

Here is the full code:

import tensorflow as tf
from tensorflow.python.framework import ops

@ops.RegisterGradient("MyopGrad")
def frop_grad(op, grad):
    x = op.inputs[0]
    return 0 * x  # zero it out to see the difference

def fprop(x):
    x = tf.sqrt(x)
    out = tf.maximum(x, .2)
    return out

a = tf.Variable(tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32))
h = fprop(a)

g = tf.get_default_graph()
with g.gradient_override_map({'Identity': 'MyopGrad'}):  # key is the op type
    h = tf.identity(h, name="Myop")  # the op must be created inside this scope
    grad = tf.gradients(h, a)

with tf.Session() as sess:
    sess.run(tf.initialize_all_variables())
    result = sess.run(grad)

print(result[0])

Output:

[ 0.  0.  0.  0.  0.] 
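As an aside, newer TensorFlow versions (1.7 and later, including 2.x) offer tf.custom_gradient, which replaces the override-map pattern entirely. A minimal sketch of the same zero-gradient fprop, assuming TF 2.x eager execution:

import tensorflow as tf  # assuming TensorFlow 2.x

@tf.custom_gradient
def fprop(x):
    out = tf.maximum(tf.sqrt(x), .2)
    def grad(dy):
        return 0 * dy  # zero gradient, as in the example above
    return out, grad

a = tf.constant([5., 4., 3., 2., 1.], dtype=tf.float32)
with tf.GradientTape() as tape:
    tape.watch(a)  # constants are not watched automatically
    h = fprop(a)
print(tape.gradient(h, a))  # [0. 0. 0. 0. 0.]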
