ML Katas

ReLU Activation Function

Difficulty: easy (<10 mins) · Tags: tensors, activation, nn

Implement the ReLU (Rectified Linear Unit) function in PyTorch.

Problem

Write a function relu(x) that takes a 1D tensor and replaces all negative values with 0.

  • Input: A tensor x of shape (n,).
  • Output: A tensor of shape (n,) with negative entries replaced by 0.

Example

import torch

x = torch.tensor([-2., -1., 0., 1., 2.])
print(relu(x))
# Expected: tensor([0., 0., 0., 1., 2.])

Solution Sketch

Use torch.clamp(x, min=0) or torch.maximum(x, torch.tensor(0.)). ReLU is one of the most common activation functions in neural networks, defined elementwise as f(x) = max(0, x).
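
A minimal sketch of the clamp-based approach described above; the type hint and the quick check at the bottom are illustrative additions, not requirements of the kata:

import torch

def relu(x: torch.Tensor) -> torch.Tensor:
    # Clamp from below at 0: negative entries become 0, the rest pass through.
    return torch.clamp(x, min=0)

# Equivalent alternative: elementwise maximum against a zero scalar,
# keeping the input's dtype.
# def relu(x):
#     return torch.maximum(x, torch.tensor(0., dtype=x.dtype))

x = torch.tensor([-2., -1., 0., 1., 2.])
print(relu(x))  # tensor([0., 0., 0., 1., 2.])

Both variants are elementwise and preserve the input shape, so they also work unchanged on tensors with more than one dimension.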