Last updated on September 19th, 2017

# What is the derivative of ReLU?

## 8 thoughts on “What is the derivative of ReLU?”

### Questions/comments? If you just want to say thanks, consider sharing this article or following me on Twitter!



Technically, I think the derivative at x=0 does not exist. If you compute the left-sided derivative at x=0, you get f'−(0)=0, while the right-sided derivative is f'+(0)=1. Since f'−(0) != f'+(0), the derivative does not exist there. That's why defining f'(0) is a matter of convention.

Also notice that the input of ReLU (when used in convolutional neural nets) is usually the result of a number of summed products, so the probability of it being exactly 0 is really low 😉

good article btw 😉

Agreed that f'(0) does not exist, and so we typically define f'(0)=0. Thanks for the explanation!
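To make the convention concrete, here is a minimal NumPy sketch (function names `relu` and `relu_grad` are my own, not from the article) that implements ReLU and its derivative with the usual choice f'(0)=0:

```python
import numpy as np

def relu(x):
    # ReLU: f(x) = max(0, x), applied elementwise
    return np.maximum(0.0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, 0 where x < 0.
    # Not defined at x = 0, so by convention we take f'(0) = 0,
    # which (x > 0) gives us for free.
    return (x > 0).astype(float)

xs = np.array([-2.0, -1.0, 0.0, 1.0, 3.0])
print(relu(xs))       # [0. 0. 0. 1. 3.]
print(relu_grad(xs))  # [0. 0. 0. 1. 1.]
```

Note that `relu_grad(0.0)` returns 0 rather than raising an error, which matches how most deep learning frameworks handle the point x=0.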

really helpful! great visuals, thanks

Thanks!

Wouldn’t the derivative be 0 when x = 0? I see you have a slope, and the derivative keeps getting larger and larger, i.e. when x is 3, you have the derivative at 3. Shouldn’t it just be a flat line, i.e. when x >= 0, {(0,1), (1,1), (2,1)}, and when x < 0, {(-1,0), (-2,0), …, (-x,0)}?

Hi Brendan,

The bottom coloured plot I showed is confusing and should probably be updated. You are correct that the derivative should be a flat line, where y=1 when x > 0 and y=0 when x <= 0. That plot is showing f(x), but the colours are showing f'(x): the green means f'(x) = 1, and the blue means f'(x) = 0. Hope that helps to clarify.
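The flat-line behaviour described above can be checked with a few sample points (a small sketch of my own, using the convention f'(0)=0):

```python
def relu_grad(x):
    # Derivative of ReLU as a piecewise-constant ("flat line") function:
    # 1 for x > 0, 0 for x < 0, and f'(0) = 0 by convention.
    return 1.0 if x > 0 else 0.0

# The derivative never grows with x; it is constant on each side of 0.
for x in [-2, -1, 0, 1, 2, 3]:
    print(x, relu_grad(x))
```

So at x = 3 the derivative is 1, not 3: the slope of f(x) = x is always 1, no matter how large x gets.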

fantastic article

Fantastic!