TensorFlow LSTM: Flexible number of hidden units
I am trying to train an LSTM with different numbers of hidden units, but it only works when the number of hidden units equals the last input dimension, numfreqs. Do I need to implement a separate input or output layer? I couldn't find a clear solution when I googled this.
num_hidden = 512
numfreqs = 2048
batch_size = 10
numwindows = 686

input_data = tf.placeholder(tf.float32, [batch_size, numwindows, numfreqs])
target_data = tf.placeholder(tf.float32, [batch_size, numwindows, numfreqs])

cell = tf.contrib.rnn.LSTMCell(num_hidden, forget_bias=1.0)
val = tf.nn.dynamic_rnn(cell, input_data, dtype=tf.float32)
netoutput = val[0]

weight = tf.Variable(tf.truncated_normal([numwindows, num_hidden]))
bias = tf.Variable(tf.zeros([num_hidden]))
prediction = tf.add(tf.multiply(netoutput, weight), bias)

cost = tf.reduce_mean(tf.square(prediction - target_data))
If I try a different value for num_hidden, I can't calculate the cost, because the shape of prediction is [10, 686, 512] while the shape of target_data is [10, 686, 2048]. If I change the shape of target_data and input_data to have 512 as the last dimension, I downsample the data too much. I still want the output to cover 2048 frequencies.
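For reference, the shape mismatch comes from using an elementwise multiply instead of a projection: a weight matrix of shape [num_hidden, numfreqs], shared across all time steps, maps each 512-dim LSTM output back to 2048 frequencies. Here is a minimal numpy sketch of just the shapes (not the original TensorFlow graph; the arrays stand in for the LSTM output and the projection variables):

```python
import numpy as np

batch_size, numwindows, num_hidden, numfreqs = 10, 686, 512, 2048

# Stand-in for the LSTM output: one num_hidden-dim vector per window.
netoutput = np.random.randn(batch_size, numwindows, num_hidden).astype(np.float32)

# Projection weight maps num_hidden -> numfreqs; the same matrix is
# applied at every (batch, window) position.
weight = (np.random.randn(num_hidden, numfreqs) * 0.01).astype(np.float32)
bias = np.zeros(numfreqs, dtype=np.float32)

# Batched matmul broadcasts over the leading (batch, window) axes,
# so prediction has shape [batch_size, numwindows, numfreqs].
prediction = netoutput @ weight + bias

print(prediction.shape)  # (10, 686, 2048)
```

With this shape, prediction can be compared directly against a [10, 686, 2048] target, and num_hidden can be varied freely without touching the data.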