Python 2.7 - Dimensions in batch normalization
I'm trying to build a generalized batch normalization function in TensorFlow.
I learned about batch normalization from an article I found helpful.
I have a problem with the dimensions of the scale and beta variables: in my case batch normalization is applied to each activation of each convolutional layer, so if the output of a convolutional layer is a tensor of size:
[57, 57, 96]
do scale and beta need to have the same dimensions as the convolutional layer output? Is that correct?
Here's my function. The program runs, but I don't know if it is correct.
def batch_normalization_layer(batch):
    # calculate batch mean and variance over all but the depth dimension
    batch_mean, batch_var = tf.nn.moments(batch, axes=[0, 1, 2])
    # apply the initial batch normalizing transform
    scale = tf.Variable(tf.ones([batch.get_shape()[1], batch.get_shape()[2], batch.get_shape()[3]]))
    beta = tf.Variable(tf.zeros([batch.get_shape()[1], batch.get_shape()[2], batch.get_shape()[3]]))
    normalized_batch = tf.nn.batch_normalization(batch, batch_mean, batch_var, beta, scale, 0.0001)
    return normalized_batch
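As a sanity check on the shapes (a NumPy sketch, not the TensorFlow function above): computing the mean and variance over axes [0, 1, 2] of a [batch, 57, 57, 96] tensor yields statistics of shape [96], one per depth channel. A full [57, 57, 96] scale/beta still broadcasts against them, so the program runs, but it carries far more parameters than the per-channel form the documentation describes. The array names below are my own, for illustration only.

```python
import numpy as np

# Hypothetical input: a batch of 4 feature maps, each shaped [57, 57, 96]
x = np.random.randn(4, 57, 57, 96).astype(np.float32)

# Equivalent of tf.nn.moments(x, axes=[0, 1, 2]): one statistic per depth channel
mean = x.mean(axis=(0, 1, 2))   # shape (96,)
var = x.var(axis=(0, 1, 2))     # shape (96,)

# A [57, 57, 96] scale/beta (as in the function above) broadcasts against
# the (96,)-shaped statistics and the (4, 57, 57, 96) input without error
scale = np.ones((57, 57, 96), dtype=np.float32)
beta = np.zeros((57, 57, 96), dtype=np.float32)
y = scale * (x - mean) / np.sqrt(var + 1e-4) + beta

print(mean.shape, y.shape)  # (96,) (4, 57, 57, 96)
```

So the function is not wrong, but the per-channel variant uses 96 parameters per variable instead of 57 * 57 * 96.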
From the documentation of tf.nn.batch_normalization:
mean, variance, offset and scale are all expected to be of one of two shapes:
In all generality, they can have the same number of dimensions as the input x, with identical sizes as x for the dimensions that are not normalized over (the 'depth' dimension(s)), and dimension 1 for the others which are being normalized over. mean and variance in this case would typically be the outputs of tf.nn.moments(..., keep_dims=True) during training, or running averages thereof during inference.
In the common case where the 'depth' dimension is the last dimension in the input tensor x, they may be one dimensional tensors of the same size as the 'depth' dimension. This is the case for example for the common [batch, depth] layout of fully-connected layers, and [batch, height, width, depth] for convolutions. mean and variance in this case would typically be the outputs of tf.nn.moments(..., keep_dims=False) during training, or running averages thereof during inference.
For the default values (scale=1.0 and offset=0) you can provide the value None.
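Following the second case in the quote above, here is a NumPy sketch of the per-channel parameterization for a convolutional output, where scale and beta are one-dimensional tensors of size 'depth' (96 here). The function and variable names are my own, not from the question.

```python
import numpy as np

def batch_norm_per_channel(batch, scale, beta, eps=1e-4):
    """Normalize over [batch, height, width]; one scale/offset per depth channel."""
    # like tf.nn.moments(batch, axes=[0, 1, 2], keep_dims=False): shape (depth,)
    mean = batch.mean(axis=(0, 1, 2))
    var = batch.var(axis=(0, 1, 2))
    return scale * (batch - mean) / np.sqrt(var + eps) + beta

x = np.random.randn(8, 57, 57, 96).astype(np.float32)
scale = np.ones(96, dtype=np.float32)   # per-channel gamma, shape (96,)
beta = np.zeros(96, dtype=np.float32)   # per-channel beta, shape (96,)
y = batch_norm_per_channel(x, scale, beta)
```

With scale=1 and beta=0 the normalized output has per-channel mean approximately 0, which is a quick way to check the axes are right.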