Neural Style Transfer
Implement Neural Style Transfer. Given a content image and a style image, generate a new image that combines the content of the former with the style of the latter. Use a pre-trained VGG network to extract features; the network's weights stay frozen, and optimization is performed directly on the pixels of the generated image. The loss function is a weighted combination of a content loss (e.g., L2 distance between feature maps at a chosen layer) and a style loss (e.g., L2 distance between Gram matrices of feature maps at several layers).
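The loss terms above can be sketched in plain NumPy before wiring them into a VGG pipeline. This is a minimal illustration, not the full implementation: the feature maps would really come from VGG activations, and the layer choices and the `alpha`/`beta` weights are placeholder assumptions.

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (C, H, W) feature map: channel-wise correlations.

    Normalized by the number of elements so the scale is layer-independent.
    """
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    return f @ f.T / (c * h * w)

def content_loss(gen_feat, content_feat):
    """L2 distance between feature maps at the chosen content layer."""
    return np.mean((gen_feat - content_feat) ** 2)

def style_loss(gen_feats, style_feats):
    """Sum of L2 distances between Gram matrices over the style layers."""
    return sum(np.mean((gram_matrix(g) - gram_matrix(s)) ** 2)
               for g, s in zip(gen_feats, style_feats))

def total_loss(gen_feat, content_feat, gen_style_feats, style_feats,
               alpha=1.0, beta=1e3):
    """Weighted combination; alpha/beta values here are illustrative only."""
    return (alpha * content_loss(gen_feat, content_feat)
            + beta * style_loss(gen_style_feats, style_feats))
```

In the actual implementation, these losses would be computed on VGG activations of the generated image, and an optimizer (commonly L-BFGS or Adam) would backpropagate the total loss into the image pixels.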
Verification: The resulting image should visually resemble the content image's structure while exhibiting the stylistic appearance (colors, textures, strokes) of the style image. Use matplotlib or PIL to visualize the output.
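For the visualization step, one option is a small helper that converts the optimized image back to a PIL image. This sketch assumes the result is a float (H, W, 3) array in [0, 1]; the `generated` name in the usage comment is hypothetical.

```python
import numpy as np
from PIL import Image

def to_image(array):
    """Convert a float (H, W, 3) array in [0, 1] to a PIL RGB image."""
    arr = np.clip(array, 0.0, 1.0)  # optimization can push pixels out of range
    return Image.fromarray((arr * 255).astype(np.uint8))

# Hypothetical usage once the generated image is available:
# to_image(generated).save("stylized.png")
```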