Mobile AR Experience and Body Tracked Avatar
I was hired to create a mobile AR experience using SLAM tracking: a crystal bounces on the floor, then activates and becomes attracted to a human figure once human presence is detected.
Activation would cause the hex crystal to orbit the person, triggering electric and magical reactions, smoke, and glowing veins. This would transition into a complete takeover in which the person would control an avatar via body tracking. Here are a few iterations.
This was an interesting project because most proprietary body tracking software imposes limitations such as tracking markers (like ping pong balls or QR codes) and requires a completely still camera in a specific orientation. I wrote custom software that allows a markerless, handheld phone camera with full motion, including tilting. I solved the tilting problem by "subtracting" the camera orientation quaternion, gathered via SLAM tracking, from the avatar's root orientation (in quaternion terms, multiplying the root by the camera orientation's inverse). Here is a demo.
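The tilt compensation can be sketched in a few lines. This is a minimal illustration, not the production code: it assumes unit quaternions in (w, x, y, z) order, and the function names (`compensate_tilt`, etc.) are mine, chosen for clarity. "Subtracting" one rotation from another means left-multiplying by its inverse, which for a unit quaternion is simply its conjugate.

```python
import numpy as np

def q_mul(a, b):
    """Hamilton product of two quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    """Conjugate; for unit quaternions this equals the inverse."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def compensate_tilt(camera_q, avatar_root_q):
    """Remove the camera's SLAM-tracked orientation from the avatar's
    root orientation, so the avatar stays upright however the phone tilts."""
    return q_mul(q_conj(camera_q), avatar_root_q)

# Example: the phone is tilted 30 degrees about its forward (z) axis.
theta = np.radians(30)
camera_q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])

# The avatar root as observed in the tilted camera frame.
upright = np.array([1.0, 0.0, 0.0, 0.0])  # identity orientation
observed = q_mul(camera_q, upright)

# Compensation cancels the tilt and recovers the upright orientation.
corrected = compensate_tilt(camera_q, observed)
```

Because the camera quaternion comes from SLAM every frame, applying this per-frame keeps the avatar's root stable even while the phone is moving and tilting freely.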
Something to note is the position of the feet. Most proprietary realtime body tracking software available today exhibits some amount of "foot sliding," because the avatar is centered on the hips as its pivot and the feet merely dangle near the ground. Here is a short lecture from 2021 detailing the problem and how NVIDIA is working on solving it. I ended up approximating a fix with code that loops through the bones, finds the lowest one, and offsets all the other bones relative to that point, essentially making one of the feet the pivot instead of the hips, as seen in the videos above.
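The lowest-bone re-pivot described above can be sketched as follows. This is a simplified illustration under my own assumptions: bone positions are world-space points in a dict, y is up, and the bone names in the example pose are placeholders, not the actual rig.

```python
import numpy as np

def ground_to_lowest_bone(bone_positions, ground_y=0.0):
    """Re-pivot the skeleton: find the lowest bone (typically a foot)
    and shift every bone by the same amount so that bone rests on the
    ground plane, instead of letting the hips dictate the height.

    bone_positions: dict mapping bone name -> np.array([x, y, z]).
    Returns a new dict with the offset applied to all bones.
    """
    # Loop through the bones and find the one with the smallest y.
    lowest = min(bone_positions, key=lambda name: bone_positions[name][1])
    offset = bone_positions[lowest][1] - ground_y
    # Offset every bone uniformly so the lowest one touches the ground.
    return {name: pos - np.array([0.0, offset, 0.0])
            for name, pos in bone_positions.items()}

# Example: a hips-centered pose where both feet float above the floor.
pose = {
    "hips":       np.array([0.0, 1.00, 0.0]),
    "left_foot":  np.array([-0.1, 0.12, 0.0]),
    "right_foot": np.array([0.1, 0.05, 0.0]),
}
grounded = ground_to_lowest_bone(pose)
# right_foot is the lowest bone, so it now sits exactly on y = 0,
# and every other bone keeps its pose relative to it.
```

Running this per frame makes whichever foot is planted act as the pivot, which is what hides the sliding: the planted foot is pinned to the ground while the rest of the body moves around it.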