How to Understand Sigmoid Function in Artificial Neural Networks?

Swapnil Vishwakarma · Last Updated: 12 Oct, 2024
8 min read

Introduction

The sigmoid function is a fundamental component of artificial neural networks and is crucial in many machine learning applications. This blog post dives deep into the sigmoid function and explores its properties, applications, and implementation in code.


First, let’s start with the basics. The sigmoid function is a mathematical function that maps any input value to a value between 0 and 1, making it useful for binary classification and logistic regression problems. Its graph is often described as an “S” shape: the output rises slowly for very negative inputs, climbs steeply around zero, and levels off as it approaches 1.

In artificial neural networks, the sigmoid function is commonly used as an activation function in the neurons. It introduces non-linearity into the model, allowing the neural network to learn more complex decision boundaries. The function is particularly useful in feedforward neural networks, which are used in different applications like image recognition, natural language processing, and speech recognition.

Learning Objectives:

In this blog post, we will discuss the properties of the sigmoid function, its use in artificial neural networks, and how to implement it in code. We will also explore different applications of sigmoid and its limitations. So, let’s get started and explore the exciting world of the sigmoid function in artificial neural networks.

This article was published as a part of the Data Science Blogathon.

Understanding the Sigmoid Function

The sigmoid function is defined mathematically as f(x) = 1/(1 + e^(-x)), where x is the input value and e is Euler’s number, approximately 2.718. The function maps any real input to a value between 0 and 1, making it useful for binary classification and logistic regression problems. The range of the function is (0, 1), and the domain is (-infinity, +infinity).

One of the key properties of the sigmoid function is its “S” shape. As the input value increases, the output starts with a slow increase, then rapidly approaches 1, and finally levels off. This property makes the sigmoid a valuable function for modeling decision boundaries in binary classification problems.

Another useful property of the sigmoid is its derivative, which is used heavily in training neural networks. The derivative of the function is f'(x) = f(x)(1 - f(x)), where f(x) is the output of the function. Because the derivative can be computed directly from the function’s output, it allows the network to adjust the weights and biases of the neurons efficiently during training.
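As a quick check on this identity, here is a minimal NumPy sketch (the test points and step size are arbitrary) that compares the analytic derivative f(x)(1 - f(x)) against a numerical finite-difference estimate:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def sigmoid_derivative(x):
    # Uses the identity f'(x) = f(x) * (1 - f(x))
    fx = sigmoid(x)
    return fx * (1 - fx)

x = np.array([-2.0, 0.0, 2.0])
h = 1e-5
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)  # central difference
print(sigmoid_derivative(x))  # [0.10499359 0.25       0.10499359]
print(numeric)                # agrees to roughly 10 decimal places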

It’s also worth mentioning that the sigmoid function has some limitations. For example, the output of the sigmoid is always between 0 and 1, which can cause problems when the network’s output should be greater than 1 or less than 0. Other activation functions like ReLU and tanh can be used in such cases.

Visualizing the sigmoid function using graphs can help to understand its properties better. Its graph will show the “S” shape of the function and how the output value changes as the input value changes.

Python Code:

import numpy as np
import matplotlib.pyplot as plt

# Evaluate the sigmoid on a grid of inputs from -10 to 10
x = np.arange(-10, 10, 0.1)
y = 1 / (1 + np.exp(-x))

# Plot the "S"-shaped curve with a dotted reference line at y = 0.5
plt.plot(x, y, 'r')
plt.grid(True)
plt.axhline(y=0.5, color='gray', linestyle=':')
plt.xlabel('Input')
plt.ylabel('Output')
plt.title('Graph of the Sigmoid Function')
plt.show()

What Does the Sigmoid Function Do?


The sigmoid function is commonly used as an activation function in artificial neural networks. In feedforward neural networks, the sigmoid function is applied to each neuron’s output, allowing the network to introduce non-linearity into the model. This nonlinearity is important because it allows the neural network to learn more complex decision boundaries, which can improve its performance on specific tasks.

Advantages:

  • Produces output values between 0 and 1, which can be helpful for binary classification and logistic regression problems.
  • It is differentiable, meaning its derivative can be calculated, which makes it straightforward to optimize the network by adjusting the weights and biases of the neurons.

Disadvantages:

  • It can produce output values very close to 0 or 1 (saturation), which can cause problems for the optimization algorithm.
  • The gradient of the sigmoid function becomes very small near output values of 0 or 1 (the vanishing gradient problem), which makes it difficult for the optimization algorithm to adjust the weights and biases of the neurons, as the sketch below demonstrates.
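To see the vanishing gradient concretely, here is a minimal NumPy sketch (the sample inputs are arbitrary) that prints the sigmoid output and its gradient f(x)(1 - f(x)) as the input moves away from zero:

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

for x in [0.0, 2.0, 5.0, 10.0]:
    fx = sigmoid(x)
    grad = fx * (1 - fx)  # gradient shrinks as the output saturates
    print(f"x={x:5.1f}  sigmoid={fx:.6f}  gradient={grad:.6f}")

# x=  0.0  sigmoid=0.500000  gradient=0.250000
# x=  2.0  sigmoid=0.880797  gradient=0.104994
# x=  5.0  sigmoid=0.993307  gradient=0.006648
# x= 10.0  sigmoid=0.999955  gradient=0.000045

At x = 10 the gradient is already thousands of times smaller than at x = 0, which is why learning stalls for saturated neurons.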

Here is a comparison table of sigmoid, ReLU, and tanh activation functions in terms of performance and optimization:

Activation Function | Performance | Optimization
Sigmoid | Good for binary classification and logistic regression problems. | Can have issues with optimization near output values of 0 or 1.
ReLU | Computationally efficient, but can suffer from the “dying ReLU” problem. | Can have issues with optimization for neurons that output 0.
tanh | Computationally efficient; its range is centered around zero, which can be helpful for specific problems. | Can have issues with optimization near output values of -1 or 1.
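To make the comparison tangible, here is a minimal sketch (the input values are arbitrary) showing the three functions’ output ranges side by side:

import numpy as np

x = np.array([-2.0, 0.0, 2.0])
print("sigmoid:", 1 / (1 + np.exp(-x)))  # outputs in (0, 1)
print("tanh:   ", np.tanh(x))            # outputs in (-1, 1)
print("ReLU:   ", np.maximum(0, x))      # outputs in [0, +inf)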

Implementing Sigmoid in Code

In the previous section, we saw how to implement the sigmoid function in Python using NumPy. However, when working with deep learning frameworks like TensorFlow and PyTorch, it’s often more convenient to use their built-in functions to calculate the sigmoid.

Here is an example of how to implement the sigmoid function in TensorFlow:

import tensorflow as tf
# Define a tensor
x = tf.constant([-1.0, 0.0, 1.0])
# Apply the sigmoid function
y = tf.math.sigmoid(x)
print(y)

The output of the above code will be 0.26894143, 0.5, and 0.7310586, which are the sigmoid values of -1.0, 0.0, and 1.0, respectively.

Here is an example of how to implement the sigmoid function in PyTorch:

import torch
# Define a tensor
x = torch.tensor([-1.0, 0.0, 1.0])
# Apply the sigmoid function
y = torch.sigmoid(x)
print(y)

The output of the above code will be 0.2689, 0.5000, and 0.7311, which are the sigmoid values of -1.0, 0.0, and 1.0, respectively.

It’s worth mentioning that when working with deep learning frameworks, you must be mindful of the data types and shapes of the tensors you’re working with. For example, if you’re working with a large dataset, it is more efficient to apply the sigmoid to an entire batch of input values at once than to loop over them one at a time. Additionally, it’s essential to consider the memory and computational requirements of the sigmoid function when working with large models.
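For instance, here is a minimal PyTorch sketch (the tensor shape and values are illustrative) showing that a single call handles a whole batch:

import torch

# A hypothetical batch of 4 samples with 3 pre-activation values each
logits = torch.randn(4, 3)

# torch.sigmoid is applied element-wise, so one call covers the
# entire batch without any explicit Python loop
probs = torch.sigmoid(logits)
print(probs.shape)  # torch.Size([4, 3])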

In terms of best practices, ensuring that your implementation of the sigmoid function is efficient and numerically stable is essential. This means that you should avoid using explicit loops to calculate the sigmoid and instead use vectorized operations or built-in functions. Additionally, you should be mindful of the range of the input values and make sure that the sigmoid function is not producing NaN or Inf values; the sketch below shows one way to guard against overflow.
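Here is a minimal NumPy sketch of one common stabilization trick (an illustration, not the only approach): branch on the sign of the input so that np.exp is never called on a large positive number.

import numpy as np

def stable_sigmoid(x):
    # For x >= 0, 1 / (1 + e^(-x)) is safe because e^(-x) <= 1.
    # For x < 0, rewrite as e^x / (1 + e^x) so that e^x <= 1 and
    # we never exponentiate a large positive number.
    x = np.asarray(x, dtype=float)
    out = np.empty_like(x)
    pos = x >= 0
    out[pos] = 1 / (1 + np.exp(-x[pos]))
    exp_x = np.exp(x[~pos])
    out[~pos] = exp_x / (1 + exp_x)
    return out

print(stable_sigmoid(np.array([-1000.0, 0.0, 1000.0])))  # [0.  0.5 1. ]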

Applications of Sigmoid Function

  • The sigmoid is a mathematical function that maps input values to a value between 0 and 1, making it useful for binary classification and logistic regression problems.
  • It is commonly used as an activation function in artificial neural networks, particularly in feedforward neural networks. This is because it allows the network to introduce non-linearity into the model, which allows the neural network to learn more complex decision boundaries.
  • For example, in image classification tasks, the sigmoid can be used to convert the output of the linear model into a probability. This probability can be used as the prediction for the binary classification problem (whether the image contains a cat or a dog).
  • The sigmoid function is also commonly used in logistic regression problems, where the goal is to predict a binary outcome given a set of independent variables. The sigmoid function’s output, a value between 0 and 1, can be interpreted as the probability of the sample belonging to the positive class, as the sketch after this list illustrates.
  • Another example is natural language processing. In sentiment analysis, the sigmoid can be used to estimate the probability that a given statement is positive; multi-class cases (positive, negative, or neutral) are typically handled with softmax instead.
  • In other words, the function is useful when the output is binary or when the output is a probability of some event happening.
  • It’s worth mentioning that other activation functions like ReLU, tanh, or LeakyReLU can also be used in cases where the sigmoid function is not suitable or it may not perform as well.
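To make the logistic-regression use concrete, here is a minimal sketch (the weights, bias, and features are made-up values, not a trained model):

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical learned parameters of a logistic regression model
w = np.array([0.8, -0.4, 1.2])
b = -0.5

# One input sample with three features
x = np.array([1.0, 2.0, 0.5])

# Linear score, then sigmoid squashes it into a probability
z = np.dot(w, x) + b   # 0.8 - 0.8 + 0.6 - 0.5 = 0.1
p = sigmoid(z)         # ~0.525
print(f"P(positive class) = {p:.3f}")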

Derivative of the Sigmoid Function in Neural Networks

The sigmoid function is a mathematical function that takes an input value and outputs a value between 0 and 1. It is shaped like an S curve, with a steep slope in the middle and flat slopes at the top and bottom.

The derivative of a function tells us how much the function changes in response to a change in its input. The derivative is calculated as follows:

d/dx sigmoid(x) = sigmoid(x) * (1 - sigmoid(x))

  • The derivative of the sigmoid function is used during backpropagation, which is a process that allows neural networks to learn from their mistakes and improve their performance over time.
  • During backpropagation, the derivative of the sigmoid function is used to calculate the error gradient of the network. The error gradient is a measure of how much the network’s output differs from the desired output.
  • The error gradient is then used to update the weights of the network in such a way that the network’s output will be closer to the desired output in the future.
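Here is a minimal sketch of one gradient-descent step for a single sigmoid neuron with a squared-error loss (the data point, initial parameters, and learning rate are made up for illustration):

import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Hypothetical single neuron: one weight, one bias
w, b = 0.5, 0.0
x, target = 1.0, 1.0
lr = 0.1

# Forward pass
z = w * x + b
y = sigmoid(z)

# Backward pass: the chain rule routes the error through the
# sigmoid derivative y * (1 - y)
error = y - target            # derivative of 0.5 * (y - target)^2
grad_z = error * y * (1 - y)
w -= lr * grad_z * x
b -= lr * grad_z
print(f"prediction={y:.4f}, updated w={w:.4f}, b={b:.4f}")

Repeating this step over many examples nudges the weights so the neuron’s output moves closer to the desired target.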

Why Is the Sigmoid Function Important in Neural Networks?

Here are the main reasons it is important:

  • Non-linearity: The sigmoid function is non-linear, which allows a network that uses it to learn more complex relationships in the data than a purely linear model could.
  • Easy to compute: It is relatively cheap to compute, which makes it efficient for training neural networks.
  • Well-behaved mathematically: It is smooth and differentiable everywhere, with a simple derivative, which makes networks that use it easier to analyze and debug.

Conclusion

In conclusion, the sigmoid function is an essential component in artificial neural networks, particularly in the context of binary classification and logistic regression. It allows for the introduction of non-linearity into the model, which enables the neural network to learn more complex decision boundaries. The sigmoid function is also commonly used in deep learning frameworks like TensorFlow and PyTorch, and it can be easily implemented in code using NumPy, TensorFlow, and PyTorch.

Key Takeaways:

  • The function is an essential component in artificial neural networks, particularly in the context of binary classification and logistic regression.
  • It allows for the introduction of non-linearity into the model, which enables the neural network to learn more complex decision boundaries.
  • The function is commonly used in deep learning frameworks and can be easily implemented in code.
  • Experimenting with the function and comparing its performance to other activation functions can help you understand how artificial neural networks work and why activation functions matter in deep learning.

FAQs

Q1. What is the difference between the logistic function and the sigmoid function?

In machine learning, the two terms are usually used interchangeably: the logistic function 1/(1 + e^(-x)) is the most common example of a sigmoid (“S”-shaped) function. Strictly speaking, “sigmoid” refers to the whole family of S-shaped curves, which also includes tanh (whose output lies between -1 and 1), while the logistic function specifically outputs values between 0 and 1.

Q2. Is the sigmoid function a loss function?

No, the sigmoid function is an activation function, not a loss function. It takes a number as input and outputs a new number between 0 and 1. Loss functions, on the other hand, compare the model’s predictions to the actual data to see how well the model is performing.
In short, activation functions transform inputs, while loss functions measure performance.

Q3. What are some other activation functions that can be used instead of the sigmoid function?

There are a number of other activation functions that can be used instead of the sigmoid function, such as the tanh function, the ReLU function, and the leaky ReLU function. These activation functions have different advantages and disadvantages, so the best choice for a particular application will depend on the specific problem being solved.

If you liked this blog, consider following me on Analytics Vidhya, GitHub, and LinkedIn.

The media shown in this article is not owned by Analytics Vidhya and is used at the Author’s discretion.

