Over the pandemic, like many people, our senior designer Katie got into streaming art on Twitch. They found it incredibly fun to share their passion with an audience – but one thing that held them back for a while was not feeling comfortable having their webcam on. Fortunately, it didn’t take them long to discover the Vtubing community.
What is Vtubing, I hear you ask?
Vtubing – the V standing for ‘virtual’ and the ‘tubing’ from ‘YouTuber’ – is a community of people who use 2D or 3D avatars in place of a webcam. This lets viewers associate your video or stream with a character instead of your true-to-life self. The techniques used in Vtubing span quite an incredible technological spectrum, ranging from ‘PNG/GIF tubing’ – which, as the name suggests, switches between at least two images to show when a person is talking – all the way to avatars capable of full facial and body tracking.
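The simple end of that spectrum is easy to picture in code. A PNG-tuber setup just shows the ‘talking’ image when the microphone is loud enough, and the ‘idle’ image otherwise. The sketch below is a hypothetical illustration of that logic – the file names and threshold are made-up assumptions, not taken from any real tool:

```python
# Minimal PNG-tubing logic: pick which of two images to show based on
# the current microphone amplitude. File names and threshold are
# illustrative assumptions only.

TALK_THRESHOLD = 0.05  # normalised mic level above which we count as "talking"

def choose_frame(amplitude: float,
                 talking_img: str = "talking.png",
                 idle_img: str = "idle.png") -> str:
    """Return which image to display for this audio reading."""
    return talking_img if amplitude > TALK_THRESHOLD else idle_img

# A real PNG-tuber would poll the microphone in a loop and hand the
# chosen image to the streaming software; here we simulate a few readings.
readings = [0.01, 0.12, 0.08, 0.02]
frames = [choose_frame(a) for a in readings]
print(frames)  # idle, talking, talking, idle
```

Everything beyond that – eye tracking, head turns, full-body rigs – is essentially more sophisticated versions of the same idea: sensor input in, avatar state out.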
Upon discovering this, Katie’s designer brain said “Neat! How do I make one?”
After some research, they ended up circling back to their Adobe suite: Adobe Character Animator.
Katie first played around with CC Character Animator when it was in its early stages, and was pleasantly surprised to see it had been considerably upgraded since then! The devs make incredible tutorials on their YouTube page, with many helpful tips and tricks. Katie was able to pull together a puppet with facial tracking in no time at all.
The process is similar to creating a puppet in After Effects for use with DUIK (but that’s another blog post entirely). You can use layered PSD or AI files, and follow Character Animator’s naming conventions so that your puppet is recognised as soon as you import it. “The main difference for me,” Katie commented, “was that the arms needed to be one object, instead of being separated into the bicep and the forearm.”
“Another thing was lip syncing. Character Animator has some good default mouth poses you can use with your character to get you started. I wanted to draw my own to match my puppet style, but was very grateful to have them there as a reference.”
“After that I just needed to get my live feed out of Character Animator and into my streaming software, OBS. Once again, the Character Animator devs had made a fantastic video on how to do this, which worked perfectly.”
So, Katie had a puppet which could track their face in place of using a webcam. But they wanted it to feel a little more alive than a talking portrait – they wanted it to emote as they did, which is where recording takes came in.
Character Animator can record both its tracking data and any manual movements you make by dragging limbs across the screen. Through a process of trial and error, Katie combined physical ‘acting’ with manual animation to create stock animations to play on stream whenever they were emoting – for instance, whenever they laughed.
Another neat feature of Character Animator is that you can connect these recorded animations to a ‘trigger’ key. This means you can press the number ‘3’ on your keyboard while inside Character Animator and trigger an animation you have connected to that key. However, Katie wanted to trigger the animation from OUTSIDE Character Animator too, while they were drawing on stream. Luckily, the program lets you connect a MIDI device (for anyone lost, MIDI stands for ‘musical instrument digital interface’ – a standard that lets keyboards, pads and other controllers send control signals to a computer). They were able to set this up and trigger their animations, no matter what program they had open at the time!
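Under the hood, a MIDI controller sends tiny messages – a type such as ‘note on’ plus a note number – and the receiving software maps those to actions. Character Animator does this mapping for you in its Triggers panel, but the idea can be sketched like this (the note numbers and trigger names below are invented for illustration):

```python
# Sketch of mapping MIDI note messages to named animation triggers.
# Note numbers and trigger names are hypothetical; Character Animator
# handles the real mapping when you assign a MIDI key to a trigger.
from typing import Optional

NOTE_TO_TRIGGER = {
    60: "laugh",      # e.g. middle C on the controller
    62: "wave",
    64: "surprised",
}

def handle_midi_message(msg_type: str, note: int) -> Optional[str]:
    """Return the trigger to fire for this message, or None to ignore it."""
    if msg_type != "note_on":
        return None  # ignore note_off, control changes, etc.
    return NOTE_TO_TRIGGER.get(note)

print(handle_midi_message("note_on", 60))   # laugh
print(handle_midi_message("note_off", 60))  # None
```

Because MIDI input reaches Character Animator at the system level rather than through keyboard focus, it keeps working whichever program is in the foreground – which is exactly what made it useful while Katie was drawing on stream.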
The long and short of it is: Katie presses the button, it moves the puppet, they become a cartoon character who can emote on stream.
So that’s how our designer likes to use their Vtubing puppet – but there are so many other potential uses for Character Animator. For instance, you could create your own conference call filter, make videos for social media, or even use it for explainer videos. Also, with the new update, Character Animator has its very own Puppet Maker built in – so you can make your own custom puppet without having to draw a thing!
Would you use Character Animator’s facial or body tracking in any of your projects? Have you already? Either way, we’re eager to see what’s next for Character Animator and other Vtubing software, and how it can be applied to the motion graphics industry.