Understand tf.contrib.layers.fully_connected(): How to Use and Regularization – TensorFlow Tutorial

March 29, 2021

In TensorFlow, tf.contrib.layers.fully_connected() allows us to create a fully connected layer. In this tutorial, we will discuss some details about it.

tf.contrib.layers.fully_connected()

tf.contrib.layers.fully_connected() is defined as:

tf.contrib.layers.fully_connected(
    inputs,
    num_outputs,
    activation_fn=tf.nn.relu,
    normalizer_fn=None,
    normalizer_params=None,
    weights_initializer=initializers.xavier_initializer(),
    weights_regularizer=None,
    biases_initializer=tf.zeros_initializer(),
    biases_regularizer=None,
    reuse=None,
    variables_collections=None,
    outputs_collections=None,
    trainable=True,
    scope=None
)

It will add a fully connected layer to our model.

Important parameters

inputs: the input tensor, which should have a rank of at least 2

num_outputs: the dimensionality of the output, for example 100

activation_fn: the activation function, for example tf.nn.relu, tf.nn.sigmoid or tf.nn.tanh, as shown in the sketch below
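Here is a minimal sketch of these parameters (assuming TensorFlow 1.x, where tf.contrib is still available; the shapes are only for illustration):

import tensorflow as tf

# a placeholder for a batch of 32 examples with 64 features each
x = tf.placeholder(tf.float32, shape=[32, 64])

# a fully connected layer with 100 output units and a sigmoid activation
y = tf.contrib.layers.fully_connected(
    inputs=x,
    num_outputs=100,
    activation_fn=tf.nn.sigmoid)

print(y)  # a tensor of shape (32, 100)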

tf.contrib.layers.fully_connected() vs tf.layers.Dense()

tf.layers.Dense() can also create a fully connected layer; however, there are some differences between them.

Understand tf.layers.Dense(): How to Use and Regularization – TensorFlow Tutorial

tf.contrib.layers.fully_connected() calls tf.layers.Dense() in its source code; a simplified sketch of that logic is shown below.

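Roughly speaking, the contrib implementation builds a core Dense layer from its arguments and applies it to the inputs. The following is only a simplified paraphrase of that logic, not the exact source code:

import tensorflow as tf

def fully_connected_sketch(inputs, num_outputs,
                           activation_fn=tf.nn.relu,
                           weights_initializer=tf.contrib.layers.xavier_initializer(),
                           weights_regularizer=None,
                           biases_initializer=tf.zeros_initializer(),
                           biases_regularizer=None,
                           trainable=True):
    """Simplified paraphrase of tf.contrib.layers.fully_connected():
    build a tf.layers.Dense layer and apply it to the inputs.
    (normalizer_fn, scopes and variable collections are omitted here)"""
    layer = tf.layers.Dense(
        units=num_outputs,
        activation=None,  # the activation is applied after the Dense layer
        kernel_initializer=weights_initializer,
        bias_initializer=biases_initializer,
        kernel_regularizer=weights_regularizer,
        bias_regularizer=biases_regularizer,
        trainable=trainable)
    outputs = layer.apply(inputs)
    if activation_fn is not None:
        outputs = activation_fn(outputs)
    return outputs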

The weights in tf.contrib.layers.fully_connected() are initialized by xavier_initializer(), whereas the weights in tf.layers.Dense() are initialized by glorot_uniform_initializer().
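If you want both layers to use exactly the same initialization, you can pass an initializer explicitly. A small sketch (assuming TensorFlow 1.x; note that xavier_initializer() with its default uniform=True and glorot_uniform_initializer() implement the same Glorot/Xavier uniform scheme):

import tensorflow as tf

x = tf.random_normal([5, 2, 2])

# explicitly set the Xavier/Glorot uniform initializer for both layers
y1 = tf.contrib.layers.fully_connected(
    inputs=x,
    num_outputs=10,
    weights_initializer=tf.contrib.layers.xavier_initializer())

y2 = tf.layers.Dense(
    units=10,
    kernel_initializer=tf.glorot_uniform_initializer())(x)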

We will use an example to show you how to use and regularize the weights in tf.contrib.layers.fully_connected(). The same approach applies to tf.layers.Dense().

How to use and regularize weights in tf.contrib.layers.fully_connected()?

Look at this example:

import tensorflow as tf

# a random input tensor of shape (5, 2, 2)
x = tf.random_normal([5, 2, 2])

# a fully connected layer with 10 output units and a relu activation
y = tf.contrib.layers.fully_connected(inputs=x, num_outputs=10, activation_fn=tf.nn.relu)
print(y)

# print the trainable variables (weights and biases) created by the layer
for n in tf.trainable_variables():
    print(n.name)
    print(n)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y))

Run this code and we will get:

The weight and bias variables created by tf.contrib.layers.fully_connected() are named:

fully_connected/weights:0
<tf.Variable 'fully_connected/weights:0' shape=(2, 10) dtype=float32_ref>
fully_connected/biases:0
<tf.Variable 'fully_connected/biases:0' shape=(10,) dtype=float32_ref>

In this example, the output dimensionality is 10 and the last dimension of the input is 2, so the weight created by tf.contrib.layers.fully_connected() has shape 2 × 10.
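You can also verify this from the output shape. Continuing the example above, only the last dimension of the input is mapped to num_outputs:

print(x.shape)  # (5, 2, 2)
print(y.shape)  # (5, 2, 10): only the last dimension changes to num_outputs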

In order to regularize the weights in tf.contrib.layers.fully_connected(), you can read this tutorial:

Multi-layer Neural Network Implements L2 Regularization in TensorFlow – TensorFLow Tutorial
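For reference, here is a minimal sketch of L2 regularization with weights_regularizer (assuming TensorFlow 1.x). The penalty terms are collected in tf.GraphKeys.REGULARIZATION_LOSSES; the task loss below is only a placeholder for this sketch:

import tensorflow as tf

x = tf.random_normal([5, 2, 2])

# attach an L2 penalty to the layer weights
y = tf.contrib.layers.fully_connected(
    inputs=x,
    num_outputs=10,
    activation_fn=tf.nn.relu,
    weights_regularizer=tf.contrib.layers.l2_regularizer(scale=0.001))

# the penalty terms are stored in the REGULARIZATION_LOSSES collection
reg_losses = tf.get_collection(tf.GraphKeys.REGULARIZATION_LOSSES)
reg_loss = tf.add_n(reg_losses)

# add the penalty to your task loss (a placeholder loss is used here)
task_loss = tf.reduce_mean(tf.square(y))
total_loss = task_loss + reg_loss

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run([reg_loss, total_loss]))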
