Layers (contrib)

Ops for building neural network layers, regularizers, summaries, etc.

Higher level ops for building neural network layers

This package provides ops that take care of creating, in a consistent way, the variables they use internally, and that serve as building blocks for many common machine learning algorithms.

Aliases for fully_connected that set a default activation function are available: relu, relu6 and linear.
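Conceptually, a fully connected layer computes activation(inputs @ weights + biases). The following is a minimal NumPy sketch of that computation, not the library's actual implementation; the function name, initialization, and defaults here are illustrative only.

```python
import numpy as np

def fully_connected(inputs, num_outputs, activation_fn=None, seed=0):
    """Sketch of a fully connected layer: activation(inputs @ W + b)."""
    rng = np.random.default_rng(seed)
    num_inputs = inputs.shape[-1]
    weights = rng.normal(0.0, 0.1, size=(num_inputs, num_outputs))
    biases = np.zeros(num_outputs)
    outputs = inputs @ weights + biases
    return activation_fn(outputs) if activation_fn is not None else outputs

# An "alias" in the sense above: the same layer with a fixed activation.
def relu_layer(inputs, num_outputs):
    return fully_connected(inputs, num_outputs,
                           activation_fn=lambda x: np.maximum(x, 0.0))

x = np.ones((2, 3))
y = relu_layer(x, 4)  # shape (2, 4), all entries non-negative
```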

A stack operation is also available; it builds a stack of layers by applying a layer repeatedly.
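The idea behind stacking can be sketched in plain Python: apply the same layer function once per entry in a list of sizes, feeding each output into the next call. The toy "layer" below is a stand-in for illustration only.

```python
def stack(inputs, layer_fn, sizes):
    """Apply layer_fn once per entry in sizes, feeding each output forward."""
    outputs = inputs
    for size in sizes:
        outputs = layer_fn(outputs, size)
    return outputs

# A toy "layer" that pads or truncates a list to the requested size.
def resize_layer(xs, n):
    return (xs + [0] * n)[:n]

result = stack([1, 2, 3], resize_layer, [5, 2])  # -> [1, 2]
```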


Regularization can help prevent overfitting. Regularizers have the signature fn(weights) and return a scalar penalty; that loss is typically added to the tf.GraphKeys.REGULARIZATION_LOSSES collection.
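A minimal sketch of the fn(weights) contract, using an L2-style penalty. The exact scaling the library applies may differ; the list below is only a stand-in for the REGULARIZATION_LOSSES collection.

```python
import numpy as np

def l2_regularizer(scale):
    """Returns a regularizer with the fn(weights) signature described above."""
    def fn(weights):
        return scale * float(np.sum(np.square(weights)))  # scalar penalty
    return fn

regularization_losses = []  # stand-in for the REGULARIZATION_LOSSES collection
weights = np.array([1.0, -2.0])
regularization_losses.append(l2_regularizer(0.5)(weights))
# 0.5 * (1^2 + (-2)^2) = 2.5
```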


Initializers are used to initialize variables with sensible values given their size, data type, and purpose.
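As an example of an initializer that accounts for a variable's size, here is a sketch of a Glorot/Xavier-style uniform initializer, which scales its range by the fan-in and fan-out of the weight matrix. This is an illustration of the idea, not the library's implementation.

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, seed=0):
    """Glorot/Xavier-style init: uniform in [-limit, limit],
    where limit = sqrt(6 / (fan_in + fan_out))."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    rng = np.random.default_rng(seed)
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

w = xavier_uniform(300, 100)  # shape (300, 100), entries bounded by the limit
```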


Optimization ops adjust weights to minimize a given loss.
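At its core, optimizing weights given a loss means repeatedly stepping against the loss gradient. A minimal gradient-descent sketch (function names and defaults here are illustrative assumptions):

```python
import numpy as np

def optimize_loss(weights, grad_fn, learning_rate=0.1, steps=100):
    """Plain gradient descent: repeatedly step against the loss gradient."""
    w = np.asarray(weights, dtype=float)
    for _ in range(steps):
        w = w - learning_rate * grad_fn(w)
    return w

# Minimize loss(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = optimize_loss(np.array([0.0]), lambda w: 2.0 * (w - 3.0))
# w converges toward 3.0
```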


Helper functions to summarize specific variables or ops.

The layers module defines convenience functions summarize_variables, summarize_weights and summarize_biases, which set the collection argument of summarize_collection to VARIABLES, WEIGHTS and BIASES, respectively.
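The convenience-wrapper pattern described above can be sketched as follows; the dict here is a stand-in for TF's named collections, and the "summary" is reduced to a simple mean for illustration.

```python
import numpy as np

VARIABLES, WEIGHTS, BIASES = "variables", "weights", "biases"

# Stand-in for TF graph collections: name -> tensor value.
collections = {WEIGHTS: {"w1": np.array([1.0, 3.0])},
               BIASES: {"b1": np.array([0.5])}}

def summarize_collection(collection):
    """Record a simple scalar summary (here, the mean) per collection item."""
    return {name: float(np.mean(v)) for name, v in collections[collection].items()}

def summarize_weights():
    # Convenience wrapper: fixes the collection argument, as described above.
    return summarize_collection(WEIGHTS)

summarize_weights()  # -> {'w1': 2.0}
```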

Feature columns

Feature columns provide a mechanism to map data to a model.
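One common mapping a feature column performs is turning a raw categorical value into a one-hot vector a model can consume. A pure-Python sketch of that mapping (the function name and vocabulary are hypothetical):

```python
def one_hot_column(vocabulary):
    """Sketch of a categorical feature column: raw value -> one-hot vector."""
    index = {value: i for i, value in enumerate(vocabulary)}
    def transform(value):
        vec = [0] * len(vocabulary)
        vec[index[value]] = 1
        return vec
    return transform

country = one_hot_column(["US", "CA", "MX"])
country("CA")  # -> [0, 1, 0]
```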