Tuesday, October 25, 2016
Did a couple of sing-alongs with Dead Bob here to get a feel for how the Vuppets perform. I've embraced the glitchiness of the system, and at times perhaps took it too far with some effects. Eh, it's a music video, it works. I realize I'll need to get better at controlling it. In most conditions I can get the Vuppet to stay stable and not glitch out; I'll just have to get a feel for how to stop the glitches when they start. This certainly was fun to make. Stay tuned for better performances as I practice.
Friday, October 7, 2016
Thursday, October 6, 2016
So if you've followed this blog, you know I've been journaling this hobby project: how to create a virtual Muppet system. Think Henson's Kermit the Frog, or a sock puppet. Something that can be used to generate puppet-style animation as quickly as you could shoot live-action. Only with the obvious benefits of CG.
Well, I'm proud to say that I've gotten to a point where I have a working proof-of-concept prototype!
It's a bit raw, but I believe that I can fine-tune it to create a system that'll be entertaining to watch, and a joy to perform.
This is my first recorded test to see how it feels.
I'll tighten him up, build a set, and see about working out a sketch. Something to really run him through a production.
It's a bit glitchy, but I'm willing to embrace that, as it's a byproduct of the capture system. I'm sure it can be fine-tuned, and that's also part of what this next round of tests will address.
The character itself is interchangeable. It can be swapped for any fully rigged, enveloped, textured, and shaded asset, with as many blend shapes for facial controls as you want to attempt to puppeteer.
There is a setup that can be implemented to allow for OSC, MIDI, or keyboard input, but at this point I'm not concerned with that level of fidelity in the face. I think a few basic shapes are enough, and as you can see here, he can be quite emotive.
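To give a rough idea of what that input layer could look like, here's a tiny sketch of mapping a few key presses to blend-shape weights. Everything here is hypothetical: the keys, the shape names like "jawOpen" and "smile", and the presets are placeholders, not the rig's actual controls or anything I've wired up yet.

```python
# Hypothetical sketch: a handful of keys, each mapped to target
# blend-shape weights (0.0 to 1.0). Shape names are placeholders.
SHAPE_PRESETS = {
    "a": {"jawOpen": 1.0, "smile": 0.0},   # mouth wide open
    "s": {"jawOpen": 0.3, "smile": 1.0},   # grin
    "d": {"jawOpen": 0.0, "smile": 0.0},   # neutral
}

def blend_targets(key, current, rate=0.5):
    """Step current shape weights partway toward the preset for `key`,
    so the face eases into a pose instead of snapping."""
    targets = SHAPE_PRESETS.get(key, {})
    return {
        shape: weight + (targets.get(shape, 0.0) - weight) * rate
        for shape, weight in current.items()
    }

weights = {"jawOpen": 0.0, "smile": 0.0}
weights = blend_targets("a", weights)
print(weights)  # halfway to the "a" preset: jawOpen 0.5, smile 0.0
```

Per frame you'd feed the latest key into `blend_targets` and push the resulting weights onto the character's face; the `rate` parameter controls how snappy the pose change feels.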
Anyway, I feel this is a great success at this stage, and I'm rejuvenated and inspired to move on to the next stage: performance testing.
I hope you enjoy.
Monday, April 4, 2016
Wednesday, March 23, 2016
I've been playing with the Leap Motion for a few days now, and today I had some pretty successful results!! This is just the head motion and rotation, with some limited jaw rotation incorporated.
What is exciting is even at this early stage, I can get some expressive results! It's so much fun to play with, and so responsive! It's a joy!
I want to get his body in there, and perhaps some arm/hand control. Also, introducing his eye shapes will go a long way.
Monday, March 21, 2016
Today Grifu was kind enough to share his latest build of PTS with me, which incorporates LeapMotion into it!
After talking with him, and after our testing with the Kinect, it makes sense that the Leap Motion may offer a bit better control for a 'Virtual Muppet' setup. Using this early version, I am able to read various rotation and position channels directly from the Leap itself, rescale them, and output them to Unity using 'Remote-Control'.
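The rescaling step is just a linear remap from the Leap's raw range into whatever range the puppet expects on that channel. A minimal sketch, with made-up ranges (the real ones depend on the rig and how high I hold my hand over the controller):

```python
def rescale(value, in_min, in_max, out_min, out_max):
    """Linearly remap a raw Leap channel into the puppet's range,
    clamping so a tracking spike can't fling the character offscreen."""
    t = (value - in_min) / (in_max - in_min)
    t = max(0.0, min(1.0, t))
    return out_min + t * (out_max - out_min)

# e.g. hand height of roughly 100-400 mm above the controller,
# mapped onto a -1..1 vertical channel for Unity (ranges are guesses):
print(rescale(250.0, 100.0, 400.0, -1.0, 1.0))  # 0.0 (mid-height)
```

One function like this per channel (position or rotation), each with its own input/output ranges, is enough to tune how much of the character's motion a given hand movement produces.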
First tests show that using one channel of rotation or position works great! However, when they are combined, as seen here, the rotation channel takes a performance hit. It's exciting to see it working at this stage, as it's a huge step forward for me. I have XYZ position data in Unity driven by my hand! The rotation data is being read, but it seems steppy.
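One way I might tame that steppy rotation later is a simple exponential smoothing filter on the incoming samples. This is just a sketch of the idea, not something wired into PTS or Unity yet; the sample values and the `alpha` setting are made up.

```python
def smooth(samples, alpha=0.3):
    """Exponential moving average: each output leans mostly on the
    previous smoothed value, so stepped input becomes a ramp.
    Lower alpha = smoother but laggier."""
    out = []
    prev = None
    for s in samples:
        prev = s if prev is None else prev + alpha * (s - prev)
        out.append(prev)
    return out

steppy = [0.0, 0.0, 30.0, 30.0, 30.0]  # rotation arriving in jumps
print([round(v, 1) for v in smooth(steppy)])  # [0.0, 0.0, 9.0, 15.3, 19.7]
```

The trade-off is latency: the smoother the curve, the more the puppet lags behind the hand, so `alpha` would need tuning by feel during performance tests.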
Grifu has opened up all the joints of both the left and right hands, offering a huge range of potential control! It's super exciting!
Very successful first test!
Edit: Here it is with just the Rotation channel isolated. Much more responsive!