Tuesday, October 25, 2016

First Vuppet Test - Experienced.

Did a couple of 'sing-alongs' with Dead Bob here to get a feel for how the Vuppets perform. I've embraced the glitchiness of the system, and at times perhaps took it too far with some effects. Eh, it's a music video; it works. I realize I'll need to get better at controlling it. In most conditions I can get the Vuppet to stay stable and not glitch out; I'll just have to get a feel for how to stop the glitches. This certainly was fun to make. Stay tuned for better performances as I practice.

Friday, October 7, 2016

A quick BTS of the Virtual Muppet system being performed

Wanted to show the technique used to puppeteer the virtual Muppet.  You can see how responsive it is...

Thursday, October 6, 2016

The Virtual Muppet Setup Prototype is Working!

So if you've followed this blog, you know I've been journaling this hobby of creating a virtual Muppet system. Think Henson's Kermit the Frog, or a sock puppet. Something that can be used to generate puppet-style animation as quickly as you could with live action, only with the obvious benefits of CG.

Well, I'm proud to say that I've gotten to a point where I have a working proof-of-concept prototype!
It's a bit raw, but I believe that I can fine-tune it to create a system that'll be entertaining to watch, and a joy to perform.

This is my first recorded test to see how it feels.
I'll tighten him up, build a set, and see about working out a sketch.  Something to really run him through a production.

It's a bit glitchy, but I'm willing to embrace that, as it's a byproduct of the capture system. I'm sure it can be fine-tuned... and that's also part of what this next round of tests will work on.

The character itself is interchangeable. It can be customized with fully rigged/enveloped, textured/shaded assets, with as many blend-shapes for facial controls as you want to attempt to puppeteer.

There is a setup that can be implemented that'll allow for OSC, MIDI, or keyboard input... but at this point I'm not concerned with that level of fidelity in the face. I think a few basic shapes are enough, and as you can see here, he can be quite emotive.

Anyway, I feel this is a great success at this stage, and I'm rejuvenated and inspired to move on to the next stage: performance testing.

I hope you enjoy.

Wednesday, March 23, 2016

Improving the Virtual Muppet Test!

I've been playing with the Leap Motion for a few days now, and today I had some pretty successful results!!  This is just the head motion and rotation, with some limited jaw rotation incorporated.

What is exciting is even at this early stage, I can get some expressive results!  It's so much fun to play with, and so responsive!  It's a joy!

I want to get his body in there, and perhaps some arm/hand control. Also, introducing his eye shapes will go a long way.

More soon.

Monday, March 21, 2016

Using the Leap Motion with PTS to drive Unity - First Test

Today Grifu was kind enough to share his latest build of PTS with me, which incorporates the Leap Motion!

After talking with him, and after our testing with the Kinect, it makes sense that the Leap Motion offers a bit better control for a 'Virtual Muppet' setup. Using this early version, I am able to read various rotation and position channels directly from the Leap itself, rescale them, and output them to Unity using 'Remote-Control'.
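The rescaling step is just a linear remap from the raw capture range into whatever range the Unity rig expects. Here's a minimal sketch of that idea in Python; the channel and the numeric ranges are assumptions for illustration, not PTS's actual values.

```python
def remap(value, in_min, in_max, out_min, out_max, clamp=True):
    """Linearly rescale a raw capture value into a target range."""
    if clamp:
        # keep stray sensor readings from driving the rig out of range
        value = max(in_min, min(in_max, value))
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# e.g. map an assumed palm height of roughly 80-400 mm onto a 0-1 channel
channel = remap(240, 80, 400, 0.0, 1.0)  # -> 0.5
```

Clamping before remapping is a design choice: it trades a little expressive range at the extremes for a puppet that never snaps outside its rig limits.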

First tests show that using one channel of rotation or position works great! However, when they are combined, as seen here, the rotation channel takes a performance hit. It's exciting to see it working at this stage, as it's a huge step forward for me. I have XYZ position data in Unity driven by my hand! The rotation data is being read, but it seems steppy.
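One common way to soften steppy channel data like this is a per-channel exponential smoothing filter. This is a hedged sketch of the general technique, not anything PTS actually does; the alpha value is an assumption you'd tune by feel (lower alpha = smoother but laggier).

```python
class Smoother:
    """Exponential smoothing for a single capture channel."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # 0..1: how much of each new sample to trust
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample  # first sample: no history to blend with
        else:
            self.value += self.alpha * (sample - self.value)
        return self.value

s = Smoother(alpha=0.5)
print([round(s.update(v), 3) for v in [0, 10, 10, 10]])  # [0, 5.0, 7.5, 8.75]
```

The step from 0 to 10 gets spread over several frames, which is exactly the trade-off: smoother rotation at the cost of a few frames of lag in the performance.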

Grifu has opened up all the joints of both the left and right hands, offering a range of potential control! It's super exciting!

Very successful first test!

Edit: Here it is with just the Rotation channel isolated.  Much more responsive!

Wednesday, March 9, 2016

Leap Motion Dexterity Test

I got my hands on a Leap Motion today, and wanted to see how it stacked up against the Kinect in terms of dexterity and responsiveness as a muppet-input device.

Seen here is the Leap, running on my 2010 MacBook Pro, inside of Unity 5. I must say, overall I'm impressed. It was quick to set up, and rather responsive. The best results I've tested so far.

The next test will be to drive my virtual muppet rig with this setup, and see some results!

Wakka wakka wakka!


First Pull-The-Strings Test!

Seen here is real-time capture of my arm using the Kinect, remapped/rescaled to work with the temporary IK chain within Unity, on the fly!!

Grifu has been kind enough to release his "Pull-The-Strings" OSC middleware software in alpha, and I got to play with it.

So I got I/O's "ArmTracker" software and piped it into "PTS". Then, using Grifu's "Stringless" remote-control script for Unity, I was able to read in the data and remap it to my own temporary IK chain.
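The general pattern in a remote-control script like this is dispatching incoming OSC-style messages to handlers by address, so each tracking channel ends up driving its own rig target. This is an illustrative Python sketch of that pattern under my own made-up addresses; it is not the actual Stringless API.

```python
handlers = {}

def on(address):
    """Register a handler function for one OSC-style address."""
    def register(fn):
        handlers[address] = fn
        return fn
    return register

targets = {}  # stand-in for the IK targets the rig would read

@on("/arm/wrist")
def set_wrist(x, y, z):
    # in Unity this would move the wrist IK goal transform
    targets["wrist"] = (x, y, z)

def dispatch(address, *args):
    """Route one incoming message to its handler, ignoring unknown addresses."""
    fn = handlers.get(address)
    if fn:
        fn(*args)

dispatch("/arm/wrist", 0.1, 1.2, 0.4)
print(targets["wrist"])  # (0.1, 1.2, 0.4)
```

Unknown addresses are silently dropped, which keeps the puppet running even if the tracker sends channels the rig doesn't use yet.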

The performance is a bit sluggish, but I'm sure it'll improve over time. That, and we're running off an old Mac laptop.

The next step will be to incorporate a real Virtual Muppet character test into this proof-of-concept workflow.


Saturday, February 27, 2016

Driving Blendshapes in Unity with TouchOSC

Started playing with some of Grifu's amazing OSC plugins for Unity. He's doing impressive stuff with toolsets for digital puppetry, and I'm eager to dive into them.

I started by driving blendshapes inside of Unity with MIDI input from my iPhone, using an app called TouchOSC. This video demonstrates my tests, based on Grifu's tutorial here:

Remote Control for Unity ( control Blend-shapes )

Here are my results from the testing. It's pretty cool... it basically allows me to drive any attribute remotely out of, or into, Unity via the OSC protocol on various network ports.
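Under the hood, the mapping involved is small: MIDI-style controls send values in 0-127, while Unity's blendshape weights run 0-100. A minimal sketch of that conversion, assuming a plain 0-127 controller value (TouchOSC faders can also send 0.0-1.0 floats, which would skip the division):

```python
def midi_to_blendshape(cc_value):
    """Map a 0-127 MIDI controller value onto a 0-100 blendshape weight."""
    cc_value = max(0, min(127, cc_value))  # guard against out-of-range input
    return cc_value / 127.0 * 100.0

print(midi_to_blendshape(127))  # 100.0
print(midi_to_blendshape(64))   # roughly half weight
```

On the Unity side, the resulting weight would go into something like SetBlendShapeWeight on the character's skinned mesh, once per incoming OSC message.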