Rectified linear units

The logic behind a rectified linear unit (ReLU) layer is very simple: it replaces every negative value in its input with 0. This keeps the CNN numerically well behaved by avoiding negative values.

In this layer, the size of the image is not altered: the output has the same dimensions as the input, and only the negative values are replaced by 0.
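As a minimal sketch of this operation, assuming NumPy and a small hypothetical feature map (the array values here are placeholders, not data from the book), the element-wise ReLU can be written as follows:

import numpy as np

def relu(feature_map):
    # Replace every negative value with 0, leaving positive values unchanged
    return np.maximum(feature_map, 0)

# Hypothetical 4x4 feature map containing some negative values
feature_map = np.array([[ 1.0, -2.0,  3.0, -0.5],
                        [-1.5,  0.0,  2.2, -3.1],
                        [ 0.7, -0.2, -4.0,  1.1],
                        [ 2.5,  1.3, -0.8,  0.0]])

activated = relu(feature_map)

print(activated)
print(feature_map.shape == activated.shape)  # True: the output size matches the input

Running this prints the same 4x4 array with every negative entry set to 0, which illustrates both points above: the negatives are removed and the shape is unchanged.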