tf.layers.Dense() is widely used in models built with TensorFlow. In this tutorial, we will use some examples to show you how to use it.
tf.layers.Dense()
tf.layers.Dense() is defined as:
__init__(
    units,
    activation=None,
    use_bias=True,
    kernel_initializer=None,
    bias_initializer=tf.zeros_initializer(),
    kernel_regularizer=None,
    bias_regularizer=None,
    activity_regularizer=None,
    kernel_constraint=None,
    bias_constraint=None,
    trainable=True,
    name=None,
    **kwargs
)
It implements the operation outputs = activation(inputs * kernel + bias), where * is matrix multiplication, kernel is the layer's weight matrix, and bias is added only when use_bias is True.
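To make this operation concrete, here is a minimal sketch that writes the same computation out by hand with tf.matmul; the names x, w and b are only illustrative, not part of the API:

import tensorflow as tf

x = tf.random_normal([5, 3])                             # inputs, shape (5, 3)
w = tf.get_variable("w", shape=[3, 10])                  # kernel, shape (3, 10)
b = tf.get_variable("b", shape=[10],
                    initializer=tf.zeros_initializer())  # bias, shape (10,)
y = tf.nn.relu(tf.matmul(x, w) + b)                      # outputs = activation(inputs * kernel + bias)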
To understand the structure of tf.layers.Dense() (a fully connected layer), you can read this tutorial:
Understand Dense Layer (Fully Connected Layer) in Neural Networks – Deep Learning Tutorial
Important parameters explained
units: dimensionality of the output space, for example 64
activation: the activation function to use, such as relu, sigmoid, tanh, etc.
use_bias: whether to add the bias term in activation(inputs * kernel + bias)
kernel_initializer: Initializer function for the weight matrix. If None (default), weights are initialized using the default initializer used by tf.get_variable()
bias_initializer: Initializer function for the bias.
trainable: Boolean; if True, the layer's variables are also added to the graph collection GraphKeys.TRAINABLE_VARIABLES
name: String, the name of the layer.
reuse: Boolean, whether to reuse the weights of a previous layer by the same name (this one is a parameter of the functional tf.layers.dense() rather than of the class constructor). A short sketch that puts several of these parameters together follows this list.
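To illustrate these parameters together, a layer could be built like the sketch below; the unit count, initializer and layer name here are arbitrary example choices, not recommendations:

import tensorflow as tf

layer = tf.layers.Dense(
    units=64,                                            # dimensionality of the output space
    activation=tf.nn.relu,                               # activation function
    use_bias=True,                                       # add a bias term
    kernel_initializer=tf.glorot_uniform_initializer(),  # example weight initializer
    bias_initializer=tf.zeros_initializer(),             # default bias initializer
    trainable=True,                                      # add variables to GraphKeys.TRAINABLE_VARIABLES
    name="my_dense")                                     # example layer name

x = tf.random_normal([5, 3])
y = layer(x)  # y has shape (5, 64)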
How to use tf.layers.Dense()?
We will use an example to show you how to use it.
import tensorflow as tf

x = tf.random_normal([5, 3])  # input tensor, shape (5, 3)
y = tf.layers.dense(inputs=x, units=10, activation=tf.nn.relu)
print(y)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y))
In this example code, the input is x, whose shape is 5×3.
units = 10, so the shape of the output y is 5×10.
tf.layers.Dense() will create two TensorFlow variables:
w, the weight (kernel), whose shape is 3×10
b, the bias, whose shape is (10,)
Both variables can also be fetched by name; see the sketch after the output below.
Run this code and you will get a result like the following (the exact values differ from run to run because x and the weights are randomly initialized):
y is:
Tensor("dense/Relu:0", shape=(5, 10), dtype=float32)
The value of y is:
[[0.19549479 0.         0.04906832 0.         0.         0.45005807 0.         0.         0.10907209 0.        ]
 [0.16909868 0.         0.         0.5597176  0.06139323 0.8804685  0.63529086 0.         0.12375151 0.        ]
 [0.51807237 0.         0.9474018  0.8525198  0.         0.9306468  1.0625012  0.         0.49360478 1.0925933 ]
 [0.         0.68350804 0.         0.9666059  0.8174535  0.         0.77449316 0.4195258  0.         1.035249  ]
 [0.         0.57167256 0.         0.         0.6504928  0.         0.         0.07965806 0.         0.        ]]
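Because the kernel and bias are ordinary graph variables, we can also fetch them by name and recompute y by hand. Here is a small sketch, assuming the layer above is the first dense layer in the graph, so its variables are named dense/kernel:0 and dense/bias:0 (these names are confirmed in the next section):

graph = tf.get_default_graph()
w = graph.get_tensor_by_name("dense/kernel:0")  # kernel, shape (3, 10)
b = graph.get_tensor_by_name("dense/bias:0")    # bias, shape (10,)
y_manual = tf.nn.relu(tf.matmul(x, w) + b)      # the same operation the layer performs

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # fetched in one run call, y and y_manual are identical
    y_val, y_manual_val = sess.run([y, y_manual])
    print(y_val)
    print(y_manual_val)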
How to regularize weights in tf.layers.Dense()?
The simplest way is to first find all of the trainable weights created by tf.layers.Dense(). Here is an example:
for n in tf.trainable_variables():
    print(n.name)
    print(n)
Run this code and you may get this result:
dense/kernel:0
<tf.Variable 'dense/kernel:0' shape=(3, 10) dtype=float32_ref>
dense/bias:0
<tf.Variable 'dense/bias:0' shape=(10,) dtype=float32_ref>
From the result, we can see that the weight created by tf.layers.Dense() is named dense/kernel:0 and the bias is named dense/bias:0.
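Knowing the kernel name, one common way to regularize it is to add an L2 penalty over all kernel variables (skipping the biases) to the training loss. Here is a minimal, self-contained sketch; the task loss and the 0.001 penalty weight are made-up examples:

import tensorflow as tf

x = tf.random_normal([5, 3])
y = tf.layers.dense(inputs=x, units=10, activation=tf.nn.relu)

# L2 penalty over every dense kernel; biases are usually left unregularized
l2_penalty = tf.add_n([tf.nn.l2_loss(v) for v in tf.trainable_variables()
                       if "kernel" in v.name])

data_loss = tf.reduce_mean(tf.square(y))     # hypothetical task loss, replace with your own
total_loss = data_loss + 0.001 * l2_penalty  # 0.001 is an example regularization weight

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(total_loss))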
To see how to regularize these weights in a full model, you can read this tutorial:
Multi-layer Neural Network Implements L2 Regularization in TensorFlow – TensorFLow Tutorial