Engineers at Northwestern University have announced the development of a full-body motion capture app that runs on everyday mobile devices.
Created by a team at the Illinois university’s McCormick School of Engineering, MobilePoser leverages sensors already present in devices such as smartphones, smartwatches and wireless earbuds to accurately track body movement and poses in real time.
Karan Ahuja, who led the study and directs Northwestern’s Sensing, Perception, Interactive Computing & Experiences (SPICE) Lab, said, “Running in real time on mobile devices, MobilePoser achieves state-of-the-art accuracy through advanced machine learning and physics-based optimisation, unlocking new possibilities in gaming, fitness and indoor navigation without needing specialised equipment. This technology marks a significant leap toward mobile motion capture, making immersive experiences more accessible and opening doors for innovative applications across various industries.”
The solution uses inertial measurement units (IMUs), which measure movement and orientation with a combination of sensors – accelerometers, gyroscopes and magnetometers – embedded within mobile devices. Their fidelity is enhanced by a custom-built, multi-stage AI algorithm that the team trained on synthesised IMU measurements generated from a publicly available set of motion capture data. Captured data is fed through the algorithm, and MobilePoser estimates joint positions and rotations, walking speed and direction, and contact between the subject’s feet and the ground. A physics-based optimiser then refines the predicted movements, filtering out physically impossible motions.
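To make that pipeline concrete, here is a minimal Python sketch of the sense–estimate–refine loop. All names, the skeleton size and the foot-skate heuristic are illustrative assumptions for this article, not MobilePoser’s released code.

```python
import numpy as np

def read_imu(num_devices: int) -> np.ndarray:
    """Hypothetical IMU sample: each device contributes 9 channels
    (3 accelerometer + 3 gyroscope + 3 magnetometer)."""
    return np.random.randn(num_devices, 9)

def estimate_pose(imu_window: np.ndarray) -> dict:
    """Stand-in for the learned multi-stage model: maps a window of
    IMU frames to joint rotations, root velocity and foot contacts."""
    num_joints = 24  # e.g. an SMPL-style skeleton (assumption)
    return {
        "joint_rotations": np.zeros((num_joints, 3)),  # axis-angle per joint
        "root_velocity": np.zeros(3),                  # walking speed/direction
        "foot_contacts": np.array([1.0, 1.0]),         # left/right contact prob.
    }

def physics_refine(pose: dict) -> dict:
    """Toy physics-based pass: if a foot is predicted to be planted,
    zero the root velocity to suppress foot skating. The real
    optimiser is far more sophisticated; this is only the idea."""
    if pose["foot_contacts"].max() > 0.5:
        pose = dict(pose, root_velocity=np.zeros(3))
    return pose

# One tick of the pipeline: sense -> estimate -> refine.
frames = np.stack([read_imu(num_devices=2) for _ in range(30)])  # ~1 s window
raw_pose = estimate_pose(frames)
refined = physics_refine(raw_pose)
print(refined["root_velocity"])
```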
The system has a tracking error of just 8 to 10 centimetres, and subjects remain free to roam while it is in use. “The accuracy is better when a person is wearing more than one device, such as a smartwatch on their wrist plus a smartphone in their pocket,” Ahuja commented. “But a key part of the system is that it’s adaptive. Even if you don’t have your watch one day and only have your phone, it can adapt to figure out your full-body pose.”
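One plausible way to picture that adaptivity is zero-filling the input slots of absent devices so a single model can serve any device subset; the sketch below is a hypothetical illustration, not the app’s actual input format.

```python
import numpy as np

DEVICE_SLOTS = ["phone_pocket", "watch_wrist", "earbuds_head"]  # illustrative

def build_input(readings: dict) -> np.ndarray:
    """Pack whatever devices are present into a fixed-size input,
    zero-filling absent slots so one model handles any subset."""
    channels = []
    for slot in DEVICE_SLOTS:
        if slot in readings:
            channels.append(readings[slot])  # 9 IMU channels per device
        else:
            channels.append(np.zeros(9))     # missing device -> zeros
    return np.concatenate(channels)

# Phone only today: the watch and earbud slots are simply zeroed.
x = build_input({"phone_pocket": np.random.randn(9)})
print(x.shape)  # (27,)
```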
Pre-trained models, data pre-processing scripts and model training code have been released as open-source software, and Ahuja said the app will soon be available for iPhone, AirPods and Apple Watch.