Full-body tracking

Overview

Full-body tracking systems have many use cases, such as virtual reality and rigging for CGI video productions. Several existing techniques deal with tracking; however, current methods have severe limitations that we aim to address.

The first and most common body-tracking method is visual tracking, where the user wears several reflective markers that are seen by an external capture device such as a camera. The capture device takes the location of each marker and calculates the location of the body from it. Despite being widely used, this method has major limitations, including a steep initial cost in setup, time, and labor, as well as a limited capture space. This is compounded by occlusion, a broken line of sight between the capture device and the person, which creates the need for even more capture devices.

The second method uses several very accurate inertial measurement units (IMUs) to estimate body movement. This has an advantage over visual tracking in that the user does not have to consider the location of capture devices. However, its core disadvantage is the accumulation of error over time: to obtain the position of the body, the measured acceleration must be integrated twice, which causes positional drift after around 5 to 10 minutes. Some companies have used high-precision IMUs to address this issue, but such devices are expensive and still prone to error given enough time.

Our project plan is to improve on both solutions by using ultra-wideband (UWB) trackers. Specifically, we plan to use UWB emitters and receivers to create a closed-loop system, that is, a system that uses multiple sources of data to implement robust error correction, minimizing the IMU error buildup and greatly extending the possible capture time. This also allows a user to operate in any space without prior setup or the associated costs.
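To make the idea concrete, below is a minimal Python sketch of this kind of closed-loop correction: IMU dead reckoning drifts on its own, and each absolute UWB fix pulls the estimate back toward the true position. The complementary blend factor and the assumption of world-frame acceleration are illustrative simplifications, not our final fusion design.

import numpy as np

class DriftCorrectedTracker:
    """Toy closed-loop tracker: IMU dead reckoning corrected by UWB fixes.

    Illustrative sketch only; the blend factor and the world-frame
    acceleration input are assumptions, not our actual algorithm.
    """

    def __init__(self, alpha=0.98):
        self.alpha = alpha        # trust placed in the IMU prediction
        self.pos = np.zeros(3)    # metres, world frame
        self.vel = np.zeros(3)    # metres per second

    def imu_update(self, accel_world, dt):
        # Double-integrate acceleration; error accumulates without correction.
        self.vel += accel_world * dt
        self.pos += self.vel * dt

    def uwb_update(self, uwb_pos):
        # Complementary blend: the absolute UWB fix pulls the drifting
        # IMU estimate back, bounding the accumulated error.
        self.pos = self.alpha * self.pos + (1.0 - self.alpha) * uwb_pos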

Foreword

For this project, we are grateful to collaborate with UCSD's research group, Wireless Communications, Systems and Networking (WCSNG), and to have the opportunity to work with their UWB system, ULoc. In addition, our implementation of full-body tracking uses PE-DLS from the work of Zeng, Q., Zheng, G., and Liu, Q., whose code was generously provided in their code repository.

Technical details

Our implementation combines UWB and IMU data through triangulation of UWB distance vectors and sensor fusion between the triangulated position and the IMU data. Both measurements come from the tag, a PCB that contains an MPU6050 and a Decawave module. UWB and IMU data are handled in separate pipelines that eventually lead to the host computer, which aggregates the data.

For UWB data, the tag’s Decawave module emits a UWB pulse at a set interval. The UWB access points, each containing an array of Decawave receiver modules in the horizontal and vertical directions, use the phase difference across the array to determine a distance vector toward the tag. Each of the four access points sends its receiver data to its own Raspberry Pi, which calculates this vector and then sends it to the host computer. The host computer then uses the distance vectors from the four access points to triangulate the tag’s position, given the known distances between the access points.
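To illustrate the triangulation step on the host, here is a minimal Python sketch that finds the least-squares intersection of the bearing rays reported by the access points. The anchor coordinates in the example are placeholders, not our lab layout.

import numpy as np

def triangulate(anchor_positions, directions):
    """Least-squares intersection of bearing rays from the access points.

    anchor_positions: (N, 3) known positions of the UWB access points.
    directions:       (N, 3) vectors from each access point toward the tag.
    Returns the point minimizing the summed squared distance to every ray.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(anchor_positions, directions):
        d = d / np.linalg.norm(d)
        # Projector onto the plane perpendicular to the ray direction.
        P = np.eye(3) - np.outer(d, d)
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Example with four placeholder access points near the ceiling of a room.
anchors = np.array([[0, 0, 2.5], [5, 0, 2.5], [5, 5, 2.5], [0, 5, 2.5]], float)
tag_truth = np.array([2.0, 3.0, 1.0])
print(triangulate(anchors, tag_truth - anchors))   # ~ [2.0, 3.0, 1.0]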

Compared with the complex pipeline for UWB data, collecting IMU data from the tag is much simpler. The ESP32 on the tag sends the rotation and acceleration readings from the MPU6050 IMU directly to the host through a UDP socket.
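For reference, a minimal sketch of the host side of that UDP link is shown below. The port number and the comma-separated packet format are assumptions rather than the actual firmware protocol.

import socket

# Placeholder port; the real value is set in the ESP32 firmware.
HOST, PORT = "0.0.0.0", 4210

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))

while True:
    packet, addr = sock.recvfrom(1024)
    # Assumed packet layout: "qw,qx,qy,qz,ax,ay,az" sent as ASCII by the tag.
    fields = [float(v) for v in packet.decode().strip().split(",")]
    quaternion, accel = fields[:4], fields[4:7]
    print(addr[0], quaternion, accel)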

At the moment, three tags are used to locate the position of the lower body, which is processed by the PE-DLS implementation to find the body pose. We are currently facing challenges in sensor fusion, where we need to account for three different cases:

Equipment details

The equipment needed for this project is as follows:

Team Contributions

Everyone:

Anh Le:

Branson Beihl:

Danny Vo:

Michael Shao:

Repository organization

The code for this project is separated into multiple small repositories. If you are interested, please request access using the form at the following link: WCSNG @ UC San Diego

Our group’s modifications to PE-DLS live in the ULoc-IMU_Fusion repository, with further modified code in ULoc_ESP_UDP and ULoc_Data_Collect_Sync.

Works Cited

Minghui Zhao, Tyler Chang, Aditya Arun, Roshan Ayyalasomayajula, Chi Zhang, and Dinesh Bharadia. 2021. ULoc: Low-Power, Scalable and cm-Accurate UWB-Tag Localization and Tracking for Indoor Applications. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 5, 3, Article 140 (Sept 2021), 31 pages. https://doi.org/10.1145/3478124

Zeng, Q., Zheng, G. & Liu, Q. PE-DLS: a novel method for performing real-time full-body motion reconstruction in VR based on Vive trackers. Virtual Reality 26, 1391–1407 (2022). https://doi.org/10.1007/s10055-022-00635-5