This class implements a layer that calculates the ReLU
activation function for each element of a single input.
Here is the default formula of the activation function:
f(x) = 0 if x <= 0
f(x) = x if x > 0
You can also set an upper cutoff threshold for the function. If you do, the function is calculated according to the formula:
f(x) = 0 if x <= 0
f(x) = x if 0 < x < threshold
f(x) = threshold if threshold <= x
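The following is a minimal reference sketch of this elementwise formula, not the layer's actual implementation; the function name ThresholdedRelu and the convention that a non-positive upperThreshold means "no threshold set" are assumptions made for illustration only.

#include <algorithm>

// Illustrative elementwise computation of the (optionally thresholded) ReLU described above.
// upperThreshold <= 0 is treated here as "no upper threshold".
float ThresholdedRelu( float x, float upperThreshold )
{
    float y = std::max( x, 0.0f );          // f(x) = 0 for x <= 0, f(x) = x for x > 0
    if( upperThreshold > 0.0f ) {
        y = std::min( y, upperThreshold );  // f(x) = threshold for x >= threshold
    }
    return y;
}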
void SetUpperThreshold( float threshold );
Sets the upper threshold for the value of the function. By default there is no threshold, and the function is not bounded from above.
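A possible usage sketch is shown below; it assumes a layer object named relu of this class has already been created and added to the network, which is not part of this description.

// Hypothetical usage: relu is assumed to be an existing instance of this layer.
// Capping the activation at 6 gives the common "ReLU6" behavior.
relu->SetUpperThreshold( 6.0f );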
There are no trainable parameters for this layer.
There is only one input, which accepts a blob of any size.
There is only one output, which returns a blob of the same size as the input blob. Each element of the output contains the value of the activation function calculated on the corresponding element of the input.
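As a sketch of this input/output behavior, the helper below applies the ThresholdedRelu function from the earlier example to every element of a buffer, producing an output of the same size; the function name ApplyElementwise and the use of std::vector in place of a blob are illustrative assumptions.

#include <vector>

// Illustrative only: applying the activation elementwise yields an output
// of the same size as the input, one value per input element.
std::vector<float> ApplyElementwise( const std::vector<float>& input, float upperThreshold )
{
    std::vector<float> output( input.size() );  // same size as the input
    for( size_t i = 0; i < input.size(); ++i ) {
        output[i] = ThresholdedRelu( input[i], upperThreshold );
    }
    return output;
}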