ML Katas

Numerical Gradient Verification

medium (<1 hr) · debugging · gradient · calculus · numerical-stability · backpropagation
this month by E

Understanding and correctly implementing backpropagation is crucial in deep learning. A common way to debug a backpropagation implementation is numerical gradient checking: approximate the gradient of a function numerically and compare it to the analytically derived gradient.

Your task is to implement a function `numerical_gradient(f, x, h)` that computes the numerical gradient of a function `f` at point `x`. The numerical gradient for a single dimension $i$ is approximated by:

$$\frac{\partial f}{\partial x_i} \approx \frac{f(x + h \cdot e_i) - f(x - h \cdot e_i)}{2h}$$

where $e_i$ is the unit vector with a 1 at position $i$ and 0 elsewhere.
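As a quick sanity check of the formula (using a one-dimensional example of my own choosing): for $f(x) = x^2$ at $x = 3$ with $h = 10^{-4}$, the central difference gives $\frac{(3.0001)^2 - (2.9999)^2}{2 \cdot 10^{-4}} = \frac{9.00060001 - 8.99940001}{0.0002} = 6$, which matches the analytical derivative $2x = 6$ exactly (for quadratics the central difference has no truncation error).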

Implementation Details:

1. Implement `numerical_gradient(f, x, h)` which takes:
   * `f`: A function that takes a NumPy array `x` and returns a scalar.
   * `x`: A NumPy array (vector) representing the point at which to compute the gradient.
   * `h`: A small scalar value (e.g., 1e-4) for the finite difference.
2. The function should return a NumPy array of the same shape as `x`, containing the numerical gradient. A sketch of one possible implementation follows this list.
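A minimal sketch, assuming a plain Python loop over coordinates is acceptable (the default value for `h` and the use of `ndarray.flat` for indexing are implementation choices, not part of the task statement):

```python
import numpy as np

def numerical_gradient(f, x, h=1e-4):
    """Approximate the gradient of the scalar-valued function f at x
    using central differences with step size h."""
    x = np.asarray(x, dtype=float)
    grad = np.zeros_like(x)
    for i in range(x.size):
        # Build x + h*e_i and x - h*e_i by perturbing only the i-th coordinate.
        x_plus = x.copy()
        x_minus = x.copy()
        x_plus.flat[i] += h
        x_minus.flat[i] -= h
        # Central-difference approximation of the i-th partial derivative.
        grad.flat[i] = (f(x_plus) - f(x_minus)) / (2.0 * h)
    return grad
```

Working on a fresh copy of `x` for each perturbation avoids mutating the caller's array and keeps one coordinate's perturbation from leaking into the next.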

Verification:

1. Define a simple function, e.g., $f(x) = \sum_i x_i^2$. Its analytical gradient is $2x$.
2. Implement the analytical gradient function for $f(x) = \sum_i x_i^2$.
3. Generate a random NumPy array `x`.
4. Compute both the numerical gradient and the analytical gradient.
5. Compare the results. The difference between the numerical and analytical gradients should be very small (e.g., less than 1e-9). Use `np.linalg.norm(numerical_grad - analytical_grad)` to check the difference.
6. Consider testing with a slightly more complex function, like $f(x) = \sum_i \sin(x_i)$. A sketch of such a verification script is shown below.
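One possible verification script, assuming the `numerical_gradient` sketch above (the helper names, random seed, and array size are arbitrary choices for illustration):

```python
import numpy as np

def f_sum_squares(x):
    # f(x) = sum_i x_i^2; its analytical gradient is 2x.
    return np.sum(x ** 2)

def analytical_gradient_sum_squares(x):
    return 2.0 * x

rng = np.random.default_rng(42)      # arbitrary seed for reproducibility
x = rng.standard_normal(5)           # random test point

numerical_grad = numerical_gradient(f_sum_squares, x, h=1e-4)
analytical_grad = analytical_gradient_sum_squares(x)

# The norm of the difference should be tiny (well below 1e-9 for this function).
print(np.linalg.norm(numerical_grad - analytical_grad))

# Slightly more complex check: f(x) = sum_i sin(x_i), analytical gradient cos(x).
numerical_grad_sin = numerical_gradient(lambda v: np.sum(np.sin(v)), x, h=1e-4)
print(np.linalg.norm(numerical_grad_sin - np.cos(x)))
```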