Example-based synthesis models take one or more exemplars as input. A model may directly copy and apply pixels from the source, or, in the case of non-parametric example-based synthesis, grow the output to be locally similar to the input sample: each new output pixel is chosen by finding the exemplar neighborhood most similar to that pixel's already-synthesized neighborhood.
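To make the neighborhood-matching idea concrete, here is a toy 1-D sketch of my own (not EbSynth's actual algorithm): each new sample copies the exemplar sample whose preceding window best matches the tail of the output so far.

```python
import numpy as np

def synthesize_1d(exemplar, n_out, k=3, seed=0):
    """Toy non-parametric synthesis on a 1-D signal: each new sample
    copies the exemplar sample whose preceding k samples best match
    the last k samples of the output so far (sum-of-squares distance)."""
    rng = np.random.default_rng(seed)
    ex = np.asarray(exemplar, dtype=float)
    # Seed the output with a random length-k window from the exemplar,
    # so the first query neighborhood exists.
    s = rng.integers(0, len(ex) - k)
    out = list(ex[s:s + k])
    # All length-k windows over the exemplar; window i predicts ex[i + k].
    windows = np.lib.stride_tricks.sliding_window_view(ex, k)[:-1]
    for _ in range(n_out - k):
        query = np.array(out[-k:])
        dists = ((windows - query) ** 2).sum(axis=1)
        best = int(np.argmin(dists))       # most similar exemplar neighborhood
        out.append(float(ex[best + k]))    # copy the pixel that follows it
    return np.array(out)
```

On a periodic exemplar this continues the pattern indefinitely, which is the whole point: the output stays locally similar to the source even where it is longer than the source.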
I used EbSynth, a non-parametric example-based synthesis model, to generate the animation directly from the source exemplar and the video.
Luckily, Deforum had already output the original video as 291 individual frames, so I used those frames for EbSynth, both for ease and for consistency.
The next step was to hand-paint key frames. Since the model would not merge the source exemplar and the original video for me, I had to do it manually: I imported a few key frames into Procreate and painted over them, emulating the source exemplar's style while matching the original video frame as closely as I could. Below are some of these hand-painted key frames.
Hand-painted key frames 00040, 00140, 00190, and 00240.
After hand-painting the key frames, it was time to move to EbSynth. I created one folder for the keyframes and one for the video frames, and dragged each into its respective slot. I raised the Keyframes Weight to 10.0 to emphasize the style, in part to hopefully better capture the source exemplar. The brackets at the bottom represent which video frames are influenced by which keyframe; I split the frames into 5 segments, one for each keyframe. Then I hit the "Run All" button to generate the output frames. Below is my setup.
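The bracket split can be sketched roughly like this. This is my own illustration, not EbSynth's implementation (EbSynth lets you drag the bracket boundaries by hand), and the midpoint rule plus the fifth keyframe number are assumptions; only frames 00040, 00140, 00190, and 00240 appear above.

```python
def segment_brackets(keyframes, first, last):
    """Assign each range of video frames to one keyframe by cutting
    halfway between adjacent keyframes (a rough stand-in for
    EbSynth's draggable brackets). Returns (start, end, keyframe) tuples."""
    bounds = [first]
    for a, b in zip(keyframes, keyframes[1:]):
        bounds.append((a + b) // 2)  # cut at the midpoint between keyframes
    bounds.append(last)
    return [(bounds[i], bounds[i + 1], kf) for i, kf in enumerate(keyframes)]

# Hypothetical keyframe numbers (90 is a made-up fifth keyframe):
print(segment_brackets([40, 90, 140, 190, 240], 1, 291))
```

Each of the 291 frames then falls into exactly one segment, stylized by that segment's keyframe.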
After the output frames were generated, I clicked "Export to AE" to composite them into a video in Adobe After Effects.
This is the resulting output, which is shown on the Home Page.