ML Katas

Neural Style Transfer

Difficulty: hard (>1 hr) · Tags: style transfer, vgg, loss, image
this month by E

Implement Neural Style Transfer. Given a content image and a style image, generate a new image that combines the content of the former with the style of the latter. Use a pre-trained VGG network to extract features. The loss function will be a combination of a content loss (e.g., L2 distance between feature maps) and a style loss (e.g., L2 distance between Gram matrices of feature maps).

$\mathcal{L}_{\text{content}} = \lVert F_G - F_C \rVert_2^2 \qquad \mathcal{L}_{\text{style}} = \sum_l w_l \lVert G_l(F_G) - G_l(F_S) \rVert_F^2$

where $F_G$, $F_C$, $F_S$ are VGG feature maps of the generated, content, and style images, $G_l(\cdot)$ is the Gram matrix of the feature map at layer $l$, and $w_l$ are per-layer style weights.
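In practice the feature maps come from intermediate layers of a pre-trained VGG, but the loss arithmetic itself can be sketched independently in NumPy. This is a minimal sketch; the function names and the Gram-matrix normalization factor are illustrative choices, not a fixed API:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a feature map with shape (channels, height, width).

    Entry (i, j) is the inner product of channels i and j over all spatial
    positions — it captures which features co-occur, i.e. the "style" statistics.
    """
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)  # normalization factor is a common convention

def content_loss(gen_feats, content_feats):
    """Squared L2 distance between generated and content feature maps."""
    return np.sum((gen_feats - content_feats) ** 2)

def style_loss(gen_feats_per_layer, style_feats_per_layer, weights):
    """Weighted sum over layers of the squared Frobenius distance
    between Gram matrices of generated and style feature maps."""
    total = 0.0
    for gen, sty, w in zip(gen_feats_per_layer, style_feats_per_layer, weights):
        total += w * np.sum((gram_matrix(gen) - gram_matrix(sty)) ** 2)
    return total
```

The total objective is then `alpha * content_loss(...) + beta * style_loss(...)`, minimized with respect to the pixels of the generated image while the VGG weights stay frozen.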

Verification: The resulting image should visually resemble the content image's structure while having the stylistic appearance (colors, textures, strokes) of the style image. Use matplotlib or PIL to visualize the output.
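For the visualization step, the optimized image typically has to be un-normalized and clipped back to displayable pixels before being handed to matplotlib or PIL. A minimal sketch, assuming the image was normalized with the standard ImageNet mean and std used by pre-trained VGG (variable names here are illustrative):

```python
import numpy as np

# Standard ImageNet channel statistics (assumption: the generated image
# was normalized with these before optimization, as torchvision's VGG expects).
IMAGENET_MEAN = np.array([0.485, 0.456, 0.406])
IMAGENET_STD = np.array([0.229, 0.224, 0.225])

def deprocess(img):
    """Convert a normalized (3, H, W) float array into a displayable
    (H, W, 3) uint8 image: un-normalize, clip to [0, 1], scale to [0, 255]."""
    img = img.transpose(1, 2, 0)               # channels-first -> channels-last
    img = img * IMAGENET_STD + IMAGENET_MEAN   # undo normalization
    img = np.clip(img, 0.0, 1.0)
    return (img * 255).astype(np.uint8)

# Then display with matplotlib: plt.imshow(deprocess(generated))
# or save with PIL: Image.fromarray(deprocess(generated)).save("stylized.png")
```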