ReLU Activation Function
Implement the ReLU (Rectified Linear Unit) function in PyTorch.
Problem
Write a function relu(x) that takes a 1D tensor and replaces all negative values with 0.
- Input: A tensor x of shape (n,).
- Output: A tensor of shape (n,) with negative entries replaced by 0.
Example
x = torch.tensor([-2., -1., 0., 1., 2.])
print(relu(x))
# Expected: tensor([0., 0., 0., 1., 2.])
Solution Sketch
Use torch.clamp(x, min=0) or torch.maximum(x, torch.tensor(0.)). ReLU is widely used in neural nets as the activation ReLU(x) = max(0, x).
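A minimal sketch of a full solution, assuming the torch.clamp route mentioned above; the type hints and the final check against the example are illustrative additions, not part of the original problem.

import torch

def relu(x: torch.Tensor) -> torch.Tensor:
    # Clamp each entry from below at 0: negatives become 0, non-negatives pass through.
    return torch.clamp(x, min=0)

x = torch.tensor([-2., -1., 0., 1., 2.])
print(relu(x))  # tensor([0., 0., 0., 1., 2.])

An equivalent alternative is torch.maximum(x, torch.tensor(0.)); PyTorch also provides torch.relu and torch.nn.functional.relu, which compute the same thing.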