
larq.activations

Activations can be used either through an Activation layer or through the activation argument supported by all forward layers:

import tensorflow as tf
import larq as lq

model = tf.keras.models.Sequential()
model.add(lq.layers.QuantDense(64))
model.add(tf.keras.layers.Activation('hard_tanh'))

This is equivalent to:

model.add(lq.layers.QuantDense(64, activation='hard_tanh'))

You can also pass an element-wise TensorFlow function as an activation:

model.add(lq.layers.QuantDense(64, activation=lq.activations.hard_tanh))
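
For context, here is a minimal end-to-end sketch combining the options above; the input shape and layer sizes are illustrative, not part of the API:

import tensorflow as tf
import larq as lq

model = tf.keras.models.Sequential([
    # Pass the activation by name ...
    lq.layers.QuantDense(64, activation='hard_tanh', input_shape=(32,)),
    # ... or pass the function itself; both forms are equivalent.
    lq.layers.QuantDense(64, activation=lq.activations.hard_tanh),
])

outputs = model(tf.random.uniform((1, 32)))  # forward pass on a dummy batch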


hard_tanh

larq.activations.hard_tanh(x)

Hard tanh activation function.

Arguments

  • x tf.Tensor: Input tensor.

Returns

Hard tanh activation.
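
Hard tanh is the identity inside [-1, 1] and saturates at ±1 outside that interval, i.e. it clips its input. A quick sanity check with illustrative sample values:

import tensorflow as tf
import larq as lq

x = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])
y = lq.activations.hard_tanh(x)
print(y.numpy())  # values are clipped to [-1, 1]: [-1.  -0.5  0.   0.5  1. ]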



leaky_tanh

larq.activations.leaky_tanh(x, alpha=0.2)

Leaky tanh activation function. Similar to hard tanh, but with a non-zero slope outside [-1, 1], as in leaky ReLU.

Arguments

  • x tf.Tensor: Input tensor.
  • alpha float: Slope of the activation function outside of [-1, 1].

Returns

Leaky tanh activation.
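
A small illustration of the shape described above: identity inside [-1, 1] and slope alpha outside of it. The sample values are illustrative, and the commented outputs assume exactly that piecewise-linear definition:

import tensorflow as tf
import larq as lq

x = tf.constant([-3.0, -1.0, 0.0, 1.0, 3.0])
y = lq.activations.leaky_tanh(x, alpha=0.2)
# Assuming the piecewise-linear shape above, leaky_tanh(3.0) = 1 + 0.2 * (3 - 1) = 1.4,
# so this would print roughly [-1.4 -1.   0.   1.   1.4]
print(y.numpy())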