Numerical Gradient Verification
Understanding and correctly implementing backpropagation is crucial in deep learning. A common way to debug a backpropagation implementation is numerical gradient checking: approximate the gradient of a function numerically and compare it to the analytically derived gradient.
Your task is to implement a function `numerical_gradient(f, x, h)` that computes the numerical gradient of a function $f$ at a point $x$. The numerical gradient for a single dimension $i$ is approximated by the central difference

$$\frac{\partial f}{\partial x_i}(x) \approx \frac{f(x + h\,e_i) - f(x - h\,e_i)}{2h},$$

where $e_i$ is a vector with 1 at position $i$ and 0 elsewhere.
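As a quick sanity check of the formula, take the one-dimensional case $f(x) = x^2$ at $x = 3$ with $h = 10^{-4}$:

$$\frac{f(3 + 10^{-4}) - f(3 - 10^{-4})}{2 \cdot 10^{-4}} = \frac{12 \cdot 10^{-4}}{2 \cdot 10^{-4}} = 6 = f'(3).$$

Central differences are exact for quadratics; in general their truncation error is $O(h^2)$, which is why they are preferred here over the one-sided difference $\big(f(x + h\,e_i) - f(x)\big)/h$, whose error is only $O(h)$.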
Implementation Details:
1. Implement `numerical_gradient(f, x, h)`, which takes:
   * `f`: A function that takes a NumPy array `x` and returns a scalar.
   * `x`: A NumPy array (vector) representing the point at which to compute the gradient.
   * `h`: A small scalar value (e.g., `1e-4`) for the finite difference.
2. The function should return a NumPy array of the same shape as `x`, containing the numerical gradient (see the sketch after this list).
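One way to implement this is a minimal sketch along the following lines: perturb each coordinate in place, evaluate `f` on both sides, and restore the original value before moving on. It assumes `x` is a float array (perturbing an integer array would truncate `h`).

```python
import numpy as np

def numerical_gradient(f, x, h=1e-4):
    """Approximate the gradient of f at x with central differences."""
    grad = np.zeros_like(x, dtype=float)
    for i in range(x.size):
        orig = x.flat[i]
        x.flat[i] = orig + h                 # evaluate f(x + h * e_i)
        f_plus = f(x)
        x.flat[i] = orig - h                 # evaluate f(x - h * e_i)
        f_minus = f(x)
        x.flat[i] = orig                     # restore x to its original value
        grad.flat[i] = (f_plus - f_minus) / (2 * h)
    return grad
```

Iterating via `x.flat` keeps the sketch shape-agnostic, so the same loop works for vectors and higher-dimensional arrays alike.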
Verification:
1. Define a simple function, e.g., $f(x) = \sum_i x_i^2$. Its analytical gradient is $\nabla f(x) = 2x$.
2. Implement the analytical gradient function for $f$.
3. Generate a random NumPy array `x`.
4. Compute both the numerical gradient and the analytical gradient.
5. Compare the results. The difference between the numerical and analytical gradients should be very small (e.g., less than `1e-9`). Use `np.linalg.norm(numerical_grad - analytical_grad)` to check the difference.
6. Consider testing with a slightly more complex function, like $f(x) = \sum_i \sin(x_i)$, whose gradient is $\cos(x)$ elementwise; both checks are sketched below.
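Putting the steps together, a verification script could look as follows, reusing the `numerical_gradient` sketch above. The two test functions match the reconstructed examples; treat them as one plausible choice rather than the exercise's required ones.

```python
import numpy as np

def f_simple(x):
    return np.sum(x ** 2)            # f(x) = sum_i x_i^2

def grad_simple(x):
    return 2 * x                     # analytical gradient: 2x

def f_complex(x):
    return np.sum(np.sin(x))         # slightly more complex test case

def grad_complex(x):
    return np.cos(x)                 # analytical gradient: cos(x) elementwise

x = np.random.randn(5)               # random test point

for f, grad_f in [(f_simple, grad_simple), (f_complex, grad_complex)]:
    num = numerical_gradient(f, x, h=1e-4)
    ana = grad_f(x)
    diff = np.linalg.norm(num - ana)
    print(f"{f.__name__}: ||numerical - analytical|| = {diff:.2e}")
```

For the quadratic function the central difference is exact up to floating-point rounding, so the norm should land comfortably below `1e-9`; for a non-polynomial function like the sine example the $O(h^2)$ truncation error dominates, so a difference on the order of `1e-9` to `1e-8` is still consistent with a correct implementation.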