# What is the derivative of ReLU?

Posted in categories: deep learning, math

## 8 thoughts on “What is the derivative of ReLU?”

1. matkrak says:

Technically, I think the derivative at x=0 does not exist. The left-sided derivative at x=0 is f’-(0)=0, while the right-sided derivative is f’+(0)=1. Since f’-(0) != f’+(0), the derivative does not exist there. That’s why defining f'(0) is a matter of convention.

Also notice that the input to ReLU (when used in convolutional neural nets) is usually the result of a number of summed products, so the probability of it being exactly 0 is really low 😉

good article btw 😉

1. Jeremy says:

Agreed that f'(0) does not exist, and so we typically define f'(0)=0. Thanks for the explanation!
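To see matkrak's point numerically, here is a minimal sketch (my own illustration, not code from the article) that computes the one-sided difference quotients of ReLU at x = 0 and shows they disagree:

```python
def relu(x):
    # ReLU: max(x, 0)
    return max(x, 0.0)

h = 1e-6
# Left-sided difference quotient at x = 0: (f(0) - f(-h)) / h
left = (relu(0.0) - relu(-h)) / h   # → 0.0
# Right-sided difference quotient at x = 0: (f(h) - f(0)) / h
right = (relu(h) - relu(0.0)) / h   # → 1.0
print(left, right)
```

Since the two limits differ (0 vs. 1), the derivative at 0 is undefined, and frameworks simply pick a convention such as f'(0) = 0.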

2. Anonymous says:

really helpful! great visuals, thanks

3. Kumar says:

Thanks!

4. BrendanR says:

Wouldn’t the derivative be 0 when x = 0? I see you have a slope, and the derivative keeps getting larger and larger, i.e. when x is 3, you have the derivative at 3. Shouldn’t it just be a flat line, i.e. when x >= 0 {(0,1), (1,1), (2,1)} and when x < 0 {(-1,0), (-2,0), …, (-x,0)}?

1. Jeremy says:

Hi Brendan,
The bottom coloured plot I showed is confusing and should probably be updated. You are correct that the derivative should be a flat line, where y = 1 when x > 0 and y = 0 when x <= 0. That plot shows f(x), but the colours show f'(x): green means f'(x) = 1, and blue means f'(x) = 0. Hope that helps to clarify.
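To make the flat-line picture concrete, here is a minimal sketch (my own illustration, not the article's code) of the derivative values Brendan lists, using the f'(0) = 0 convention:

```python
def relu_grad(x):
    # Derivative of ReLU: 1 for x > 0, 0 for x < 0,
    # and 0 at x = 0 by convention.
    return 1.0 if x > 0 else 0.0

points = [-2, -1, 0, 1, 2, 3]
print([(x, relu_grad(x)) for x in points])
# → [(-2, 0.0), (-1, 0.0), (0, 0.0), (1, 1.0), (2, 1.0), (3, 1.0)]
```

Plotted, these pairs form exactly the two flat segments described above: y = 0 on the left and y = 1 on the right.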

5. Xin Belter says:

fantastic article

6. Anonymous says:

Fantastic!