Having made a handful of tools to support my production pipeline, I've put some of the more recent ones through an experimental pipeline test and made a render of the result. In keeping with some of the past example renders, I'm using the in-built character in my ProxyMan tool to jump through some water.
Now, the water sim itself could do with a bit of work. But that wasn't the point of this exercise. The point was to test my tools that transfer animation data between programs and render out the result, and to see how well they handled the process.
Here is the rendered result:
A keen eye will spot issues. The water dynamics aren't great: the splashes are poor, there's no trailing water coming off the feet, the ripples don't reverberate out far enough, and so on. But this wasn't about the quality of the simulation; it was about testing the pipeline tools.
Here's how it worked: I used my motion capture tool to animate the character, then used my recently developed export tool to send the animated character mesh to another program. There I made the water simulation, and finally used my new import tool to bring the water mesh sequence back into my animation program for rendering.
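For anyone curious about the import side, the general pattern behind a mesh-sequence importer is simple: match the numbered files the simulation program wrote to disk against frame numbers, then swap in the right mesh as the timeline plays. Here's a minimal sketch of the filename-to-frame mapping step. The naming convention (`water_0001.obj` and so on) is my own assumption for illustration, not the actual code of my import tool:

```python
import re

def sequence_frames(filenames):
    """Order exported mesh files like 'water_0001.obj' by frame number.

    Assumes the exporter numbers files with a trailing integer just
    before the extension. Returns the filenames in frame order.
    """
    pattern = re.compile(r"(\d+)\.\w+$")
    frames = {}
    for name in filenames:
        match = pattern.search(name)
        if match:
            frames[int(match.group(1))] = name
    return [frames[f] for f in sorted(frames)]

# Files rarely arrive in order from a directory listing, so sort by
# the parsed frame number rather than by raw filename.
ordered = sequence_frames(["water_0003.obj", "water_0001.obj", "water_0002.obj"])
```

At render time, frame N's mesh is simply `ordered[N - start_frame]`; a frame-change handler in the animation program would hide the previous mesh and show the current one.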
The results show that what I'm after is possible: exchanging data between programs during the post-production phase of a project. The tools worked the way I intended them to, and did their job pretty well, albeit in a simple scenario. I'm sure things will get more complicated as the project becomes more flashy.
For those interested, the total frame count is 440, with each frame taking about 20 seconds to render.
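That works out to a fairly tame total render time. A quick back-of-the-envelope check:

```python
# Total render time from the numbers above: 440 frames at ~20 s each.
frames = 440
seconds_per_frame = 20

total_seconds = frames * seconds_per_frame
hours, remainder = divmod(total_seconds, 3600)
minutes = remainder // 60
print(f"{total_seconds} s, about {hours} h {minutes} m")  # 8800 s, about 2 h 26 m
```

So roughly two and a half hours for the whole sequence, before any re-renders.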
Another box ticked.
I now need to work on getting something done that's a little closer to a real-world project. I'm not quite sure what this will be yet. Maybe something like a camera-tracked shot with an animated character composited into it. I'll come up with something.
