
Activation Functions in Neural Networks

Activation functions are an important concept in neural networks. TensorFlow provides a variety of activation functions that can be used in our models. In this tutorial, we'll learn what activation functions are and what role they play in neural networks.

What are Activation Functions?

  • Activation functions are mathematical functions applied to the outputs of the neurons in a neural network.

  • Many real-world relationships cannot be expressed in a purely linear form, so activation functions introduce the non-linearity that lets a network model them (see the sketch after this list).

  • The choice of activation function has a large effect on how well and how quickly a model learns.
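
To see where the non-linearity comes from, here is a minimal sketch (the layer size and example inputs are made up for illustration) comparing a Dense layer with no activation to the same kind of layer with a ReLU activation:

import tensorflow as tf

x = tf.constant([[-2.0], [0.5], [3.0]])

linear_layer = tf.keras.layers.Dense(4)                        # no activation: output is a linear transform of x
nonlinear_layer = tf.keras.layers.Dense(4, activation="relu")  # ReLU introduces the non-linearity

print(linear_layer(x))      # a linear combination of the inputs
print(nonlinear_layer(x))   # a linear combination passed through max(0, x)

Stacking layers without activations still collapses to a single linear transform, which is why the activation step matters.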

There are many different types of activation functions that can be used in neural networks. Some of the most common activation functions in TensorFlow are:

Sigmoid Function - 

  • The sigmoid function is one of the most commonly used activation functions in TensorFlow.

  • It maps any input value x to a value between 0 and 1, using the formula 1 / (1 + e^(-x)).

  • This makes it useful for modeling probabilities, for example in the output layer of a binary classifier.

  • In TensorFlow, the sigmoid function can be implemented using the tf.sigmoid function. 

Code

import tensorflow as tf

# Apply the sigmoid function element-wise to a sample tensor
a = tf.constant([22, 11, 45, 80.9, 4])
tf.sigmoid(a)

Output

<tf.Tensor: shape=(5,), dtype=float32, numpy=
array([1.        , 0.9999833 , 1.        , 1.        , 0.98201376],
      dtype=float32)>
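
As a quick sanity check, the last value in the output above can be reproduced directly from the sigmoid formula 1 / (1 + e^(-x)) using nothing but Python's math module; this is just a sketch:

import math

x = 4.0
print(1 / (1 + math.exp(-x)))   # ≈ 0.98201379, matching the last entry above

The other inputs (22, 11, 45, 80.9) are large enough that the sigmoid saturates, which is why they print as 1 (or very nearly 1) in the tensor above.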

 

Hyperbolic Tangent Function - 

  • The hyperbolic tangent function is another important activation function used in deep learning.

  • It is commonly known as the tanh function.

  • It maps any input value x to a value between -1 and 1, using the formula (e^x - e^(-x)) / (e^x + e^(-x)).

  • In TensorFlow, the tanh function can be applied using the tf.tanh function.

Code

import tensorflow as tf

# Apply the tanh function element-wise to a sample tensor
a = tf.constant([22, 11, 45, 80.9, 4])
tf.tanh(a)

Output

<tf.Tensor: shape=(5,), dtype=float32, numpy=
array([1.       , 1.       , 1.       , 1.       , 0.9993292], dtype=float32)>
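
The tanh and sigmoid functions are closely related: tanh(x) = 2 * sigmoid(2x) - 1. As a small sketch, the identity can be checked numerically with the same tensor:

import tensorflow as tf

a = tf.constant([22, 11, 45, 80.9, 4])
print(tf.tanh(a))
print(2 * tf.sigmoid(2 * a) - 1)   # same values as tf.tanh(a), up to float rounding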

 

ReLU Function - 

  • ReLU stands for Rectified Linear Unit.

  • It maps any input value x to max(0, x).

  • In simple words, it sets negative values to zero and leaves positive values unchanged.

  • In TensorFlow, the ReLU function can be applied using the tf.nn.relu function.

Code

import tensorflow as tf

# Apply ReLU element-wise: negative values become 0, positive values pass through
a = tf.constant([22, -11, 45, 80.9, -4])
tf.nn.relu(a)

Output

<tf.Tensor: shape=(5,), dtype=float32, numpy=array([22. ,  0. , 45. , 80.9,  0. ], dtype=float32)>
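
Because ReLU is simply max(0, x), the same result can also be obtained with tf.maximum; a short sketch:

import tensorflow as tf

a = tf.constant([22, -11, 45, 80.9, -4])
print(tf.nn.relu(a))
print(tf.maximum(0.0, a))   # element-wise max(0, x), identical to tf.nn.relu(a)

In practice, ReLU is most often specified directly as a layer's activation argument rather than called by hand.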