{"id":19559,"date":"2021-06-30T08:01:05","date_gmt":"2021-06-30T08:01:05","guid":{"rendered":"https:\/\/www.askpython.com\/?p=19559"},"modified":"2021-06-30T08:01:07","modified_gmt":"2021-06-30T08:01:07","slug":"activation-functions-python","status":"publish","type":"post","link":"https:\/\/www.askpython.com\/python\/examples\/activation-functions-python","title":{"rendered":"4 Activation Functions in Python to know!"},"content":{"rendered":"\n<p>Hello, readers! In this article, we will be focusing on <strong>Python Activation functions<\/strong>, in detail.<\/p>\n\n\n\n<p>So, let us get started!! \ud83d\ude42<\/p>\n\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">What is an Activation function?<\/h2>\n\n\n\n<p>In the world of neural networks and deep learning with convolutional models, Python plays a significant role when it comes to the modeling and analysis of data. <\/p>\n\n\n\n<p>An activation function is a mathematical model that enables us to control the output of a neural network. 
That is, it helps us decide whether a neuron should be activated (fired) or not, based on that neuron's contribution to the model's output.<\/p>\n\n\n\n<p>Some of the prominent activation functions are:<\/p>\n\n\n\n<ol class=\"wp-block-list\"><li><strong><a href=\"https:\/\/www.askpython.com\/python\/examples\/relu-function\" target=\"_blank\" rel=\"noreferrer noopener\">ReLu function<\/a><\/strong><\/li><li><strong><a href=\"https:\/\/www.askpython.com\/python\/examples\/relu-function\" data-type=\"post\" data-id=\"19314\">Leaky ReLu function<\/a><\/strong><\/li><li><strong><a href=\"https:\/\/www.askpython.com\/python\/examples\/neural-networks\" data-type=\"post\" data-id=\"17284\">Sigmoid function<\/a><\/strong><\/li><li><strong><a href=\"https:\/\/www.askpython.com\/python\/examples\/calculating-softmax\" data-type=\"post\" data-id=\"18458\">Softmax function<\/a><\/strong><\/li><li><strong><a href=\"https:\/\/www.askpython.com\/python\/examples\/linear-regression-from-scratch\" data-type=\"post\" data-id=\"11152\">Linear function<\/a><\/strong>, etc.<\/li><\/ol>\n\n\n\n<p>Having understood what an activation function is, let us now have a look at the above activation functions in the upcoming sections.<\/p>\n\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">1. ReLu function<\/h2>\n\n\n\n<p>The ReLu (Rectified Linear Unit) function is an activation function that is widely used in convolutional neural networks. It introduces non-linearity into the network while remaining very cheap to compute. <\/p>\n\n\n\n<p>The ReLu function states that when the input is negative, it returns zero. 
Else, for a non-negative input, it returns the input value itself.<\/p>\n\n\n\n<p><strong>Example<\/strong>:<\/p>\n\n\n\n<p>Here, we have implemented a user-defined function that applies the ReLu condition using the <a href=\"https:\/\/www.askpython.com\/python\/built-in-methods\/python-max-method\" data-type=\"post\" data-id=\"4345\">max() function<\/a> in Python.<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\ndef ReLu(ar):\n    return max(0.0, ar)\n\nar = 1.0\nprint(ReLu(ar))\nar1 = -1.0\nprint(ReLu(ar1))\n<\/pre><\/div>\n\n\n<p><strong>Output<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\n1.0\n0.0\n<\/pre><\/div>\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">2. Leaky ReLu function<\/h2>\n\n\n\n<p>The gradient, i.e. the derivative of the ReLu function, is zero for every negative input. This means that the weights of neurons that keep receiving negative inputs are never updated by the learning algorithm. <\/p>\n\n\n\n<p>To overcome this gradient issue of the ReLu function, the Leaky ReLu function was introduced.<\/p>\n\n\n\n<p>The Leaky ReLu function multiplies every negative input by a small constant (a small linear component) instead of mapping it to zero. 
As a result, the gradient for these negative inputs becomes a small non-zero value, so the corresponding weights keep getting updated.<\/p>\n\n\n\n<p><strong>Example<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\ndef leaky_ReLu(x):\n    if x &gt; 0:\n        return x\n    else:\n        return 0.001 * x\n\nx = -1.0\nprint(leaky_ReLu(x))\n<\/pre><\/div>\n\n\n<p><strong>Output<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\n-0.001\n<\/pre><\/div>\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">3. Sigmoid function<\/h2>\n\n\n\n<p>The Sigmoid activation function is based on the sigmoid mathematical formula below:<\/p>\n\n\n\n<div class=\"wp-block-image is-style-default\"><figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"453\" height=\"158\" src=\"https:\/\/www.askpython.com\/wp-content\/uploads\/2021\/06\/image-3.png\" alt=\"Sigmoid formula\" class=\"wp-image-19563\" srcset=\"https:\/\/www.askpython.com\/wp-content\/uploads\/2021\/06\/image-3.png 453w, https:\/\/www.askpython.com\/wp-content\/uploads\/2021\/06\/image-3-300x105.png 300w\" sizes=\"auto, (max-width: 453px) 100vw, 453px\" \/><figcaption><strong>Sigmoid formula<\/strong><\/figcaption><\/figure><\/div>\n\n\n\n<p>Since the denominator is always greater than one, the output of this activation function always lies between 0 and 1.<\/p>\n\n\n\n<p><strong>Example<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\nimport numpy as np\n\ndef sigmoid(num):\n    return 1 \/ (1 + np.exp(-num))\n\nnum = -1.0\nprint(sigmoid(num))\n<\/pre><\/div>\n\n\n<p><strong>Output<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\n0.2689414213699951\n<\/pre><\/div>\n\n\n<hr 
class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">4. Softmax function<\/h2>\n\n\n\n<p>The softmax activation function is a mathematical model that accepts a vector of numeric values as input and then normalizes the data.<\/p>\n\n\n\n<p>That is, it <a href=\"https:\/\/www.askpython.com\/python\/examples\/normalize-data-in-python\" data-type=\"post\" data-id=\"9108\">normalizes<\/a> (scales) the values into a probability distribution, wherein the probability assigned to every value is proportional to the exponential of that value relative to the other values present in the vector.<\/p>\n\n\n\n<p>As a result, all the output values will lie in the range 0 &#8211; 1. Also, the summation of all the output values will be equal to 1, as they are interpreted as probabilities.<\/p>\n\n\n\n<p><strong>Example<\/strong>:<\/p>\n\n\n\n<p>As a minimal sketch (assuming NumPy, as used in the sigmoid example above), the softmax function can be implemented as follows. Subtracting the maximum value before exponentiating is a common trick to avoid numerical overflow and does not change the result.<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\nimport numpy as np\n\ndef softmax(vec):\n    exp = np.exp(vec - np.max(vec))  # shift for numerical stability\n    return exp \/ exp.sum()\n\nvec = np.array([1.0, 2.0, 3.0])\nprint(softmax(vec))\n<\/pre><\/div>\n\n\n<p><strong>Output<\/strong>:<\/p>\n\n\n<div class=\"wp-block-syntaxhighlighter-code \"><pre class=\"brush: python; title: ; notranslate\" title=\"\">\n[0.09003057 0.24472847 0.66524096]\n<\/pre><\/div>\n\n\n<hr class=\"wp-block-separator\"\/>\n\n\n\n<h2 class=\"wp-block-heading\">Conclusion<\/h2>\n\n\n\n<p>By this, we have come to the end of this topic. Feel free to comment below in case you come across any questions.<\/p>\n\n\n\n<p>For more such posts related to Python programming, stay tuned with us.<\/p>\n\n\n\n<p>Till then, Happy Learning!! \ud83d\ude42<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Hello, readers! In this article, we will be focusing on Python Activation functions, in detail. So, let us get started!! \ud83d\ude42 What is an Activation function? In the world of Neural networking and deep learning with convolutional models, Python has been playing a significant role when it comes to modeling and analysis of data. 
Activation [&hellip;]<\/p>\n","protected":false},"author":4,"featured_media":19566,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[9],"tags":[],"class_list":["post-19559","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-examples"],"blocksy_meta":[],"_links":{"self":[{"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/posts\/19559","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/users\/4"}],"replies":[{"embeddable":true,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/comments?post=19559"}],"version-history":[{"count":0,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/posts\/19559\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/media\/19566"}],"wp:attachment":[{"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/media?parent=19559"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/categories?post=19559"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.askpython.com\/wp-json\/wp\/v2\/tags?post=19559"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}