If you’ve got kids, chances are you’ve watched The Lego Movie or one of its spin-offs. They were created to look as if everything was filmed in traditional “stop motion”; in reality, though, it was all done on a computer.
Anyway, stop-motion filming has long been used in animations and movies such as Wallace & Gromit, Chicken Run and so on. It’s basically a process where you move a small model, take a photo, move it a little more, take another photo, and continue until your footage is done. Put all those individual photos together and the result is rather lovely, but to save the animator going insane, you’ll usually find that the number of frames (photos) per second is reduced.
However, now – with a completely free bit of code and a sprinkling of AI – you can fill in the intermediate frames. This means that a relatively jerky bit of footage suddenly looks silky smooth.
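To give a feel for what “filling in the intermediate frames” means, here’s a minimal sketch of the simplest possible version: a linear cross-fade between two neighbouring frames. This is purely illustrative and nothing like DAIN itself, which estimates depth and motion to warp pixels along their actual paths; the function name, tiny 2×2 “frames” and pixel values below are all made up for the example.

```python
# Minimal sketch: synthesise an in-between frame by linear blending.
# DAIN uses depth-aware optical flow instead; a plain cross-fade like
# this just illustrates the basic idea of creating a new frame from
# the two frames either side of it.

def interpolate(frame_a, frame_b, t=0.5):
    """Blend two greyscale frames (lists of pixel rows); t=0.5 is the midpoint."""
    return [
        [round((1 - t) * pa + t * pb) for pa, pb in zip(row_a, row_b)]
        for row_a, row_b in zip(frame_a, frame_b)
    ]

# Two tiny 2x2 greyscale "frames": all-black and mid-grey
f0 = [[0, 0], [0, 0]]
f1 = [[100, 100], [100, 100]]

print(interpolate(f0, f1))  # [[50, 50], [50, 50]]
```

In a real pipeline you’d generate one or more of these in-between frames for every pair of originals, turning (say) 15 fps footage into 30 or 60 fps.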
As an example, here’s a Brickfilm animation from LegoEddy using the code…
To grab the code and try it yourself, just head right here. A neural network called DAIN (Depth-Aware Video Frame Interpolation) does the interpolating, and you can grab the source code from GitHub.
To see the difference, here’s the original movie…
…and the enhanced interpolated version…