My Portfolio Website: [ Link ]
If anyone with Mocopi sensors is looking for a way to do mocap animations without relying on Live Link, Rokoko, etc., feel free to contact me.
For the last 2-3 weeks, I've been working on a new pipeline for my Sony Mocopi sensors, Unreal Engine, and MotionBuilder.
I recorded a "simple" (I can't dance, don't judge) animation using the Mocopi sensors and their companion app. From there, I brought the app's fairly ancient .BVH files into MotionBuilder, where I was able to characterize the Mocopi skeleton pretty quickly.
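For anyone who wants to script this step, here's a minimal MotionBuilder Python (pyfbsdk) sketch of importing a BVH and characterizing it. The file path and the Mocopi joint names in the mapping are assumptions on my part; check them against the hierarchy in your own BVH export.

```python
# Minimal pyfbsdk sketch: import a Mocopi BVH and characterize the skeleton.
# The path and joint names below are assumptions; verify against your file.
from pyfbsdk import *

FBApplication().FileImport(r"C:/mocap/mocopi_take_001.bvh", False)

# MotionBuilder's 15 required characterization slots -> assumed Mocopi joints
SLOT_TO_JOINT = {
    "Hips": "root",
    "LeftUpLeg": "l_up_leg",  "LeftLeg": "l_low_leg",      "LeftFoot": "l_foot",
    "RightUpLeg": "r_up_leg", "RightLeg": "r_low_leg",     "RightFoot": "r_foot",
    "Spine": "torso_1",       "Head": "head",
    "LeftArm": "l_up_arm",    "LeftForeArm": "l_low_arm",  "LeftHand": "l_hand",
    "RightArm": "r_up_arm",   "RightForeArm": "r_low_arm", "RightHand": "r_hand",
}

char = FBCharacter("MocopiSource")
for slot, joint in SLOT_TO_JOINT.items():
    model = FBFindModelByLabelName(joint)
    if model:
        char.PropertyList.Find(slot + "Link").append(model)

# True = biped characterization; returns False if the mapping is incomplete
print("Characterized:", char.SetCharacterizeOn(True))
```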
Then I turned my attention to the MetaHuman skeleton. I exported it out of Unreal to MotionBuilder as an .fbx file and characterized the most important joints driving the entire MetaHuman system. MetaHumans come with a lot of extra joints (twist bones, for example), so they can be fairly daunting to navigate.
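If you'd rather automate the export than right-click through the Content Browser, a rough Unreal Python sketch might look like the following; the asset path is hypothetical and depends on which MetaHuman you're using.

```python
# Rough Unreal Python sketch: export a MetaHuman body skeletal mesh to FBX.
# The asset path is hypothetical; locate your own body mesh in the browser.
import unreal

mesh = unreal.EditorAssetLibrary.load_asset(
    "/Game/MetaHumans/MyHuman/Body/m_med_nrw_body")

task = unreal.AssetExportTask()
task.object = mesh
task.filename = "C:/mocap/metahuman_body.fbx"
task.automated = True
task.replace_identical = True
task.options = unreal.FbxExportOption()  # default FBX export settings

unreal.Exporter.run_asset_export_task(task)
```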
From there, I was able to merge both files, retarget the animation onto the MetaHuman skeleton, and bake it down.
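In script form, the retarget-and-bake step could be sketched like this in pyfbsdk, assuming both skeletons have already been characterized under the character names used below (both names are my placeholders).

```python
# pyfbsdk sketch: drive the MetaHuman character with the Mocopi character,
# then bake (plot) the retargeted motion onto the MetaHuman skeleton.
# "MocopiSource" and "MetaHumanTarget" are assumed character names.
from pyfbsdk import *

def find_character(name):
    for c in FBSystem().Scene.Characters:
        if c.Name == name:
            return c
    return None

source = find_character("MocopiSource")
target = find_character("MetaHumanTarget")

target.InputCharacter = source
target.InputType = FBCharacterInputType.kFBCharacterInputCharacter
target.ActiveInput = True  # MetaHuman now follows the Mocopi motion

opts = FBPlotOptions()
opts.PlotOnFrame = True
opts.PlotPeriod = FBTime(0, 0, 0, 1)   # one key per frame
opts.UseConstantKeyReducer = False     # keep every baked key
target.PlotAnimation(FBCharacterPlotWhere.kFBCharacterPlotOnSkeleton, opts)
```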
I then imported it back into Unreal and paired the new skeleton with my MetaHuman mesh.
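The import back into Unreal can be scripted too. Here's a hedged sketch of importing the baked FBX as an animation on the MetaHuman skeleton; the skeleton asset path, FBX path, and destination folder are all placeholders.

```python
# Unreal Python sketch: import the baked FBX as an AnimSequence targeting
# the MetaHuman skeleton. All paths below are placeholders.
import unreal

options = unreal.FbxImportUI()
options.import_mesh = False
options.import_animations = True
options.automated_import_should_detect_type = False
options.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
options.skeleton = unreal.EditorAssetLibrary.load_asset(
    "/Game/MetaHumans/Common/metahuman_base_skel")

task = unreal.AssetImportTask()
task.filename = "C:/mocap/metahuman_dance.fbx"
task.destination_path = "/Game/Animations"
task.automated = True
task.replace_existing = True
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```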
I was then able to use Unreal's animation editing tools to fix up a couple of things: correcting the skeleton's spine for more proper posture, adjusting the hands (since I have no hand-tracking solution), and cleaning up the foot placement. I'm still learning motion editing in general, but I'm super proud of my results here.