Previous thoughts
Step 1: How to Access Sensor Data on Your iPhone with CoreMotion
Step 2: Storage and Visualization with Python
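CoreMotion itself runs on the device, so the Python side only sees samples after they have been exported from the phone. Below is a minimal sketch of the storage half of this step, assuming accelerometer samples have already been streamed off the device as `(timestamp, ax, ay, az)` tuples; the sample format and function names are my own illustration, not the project's actual code:

```python
import csv
import math

# Hypothetical sample format: (timestamp, ax, ay, az) tuples, roughly what
# CoreMotion's CMAccelerometerData yields once streamed to the desktop.
def save_samples(samples, path):
    """Persist accelerometer samples to a CSV file for later analysis."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "ax", "ay", "az"])
        writer.writerows(samples)

def load_samples(path):
    """Read samples back as a list of float tuples."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the header row
        return [tuple(map(float, row)) for row in reader]

# Synthetic demo data: a 1 Hz sine wave on the x axis, gravity on z.
samples = [(t / 100, math.sin(2 * math.pi * t / 100), 0.0, -1.0)
           for t in range(100)]
```

From here the CSV can be loaded into any plotting tool (matplotlib, pandas, or rerun) for the visualization half of the step.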
I have implemented real-time visualization of the captured three-dimensional spatial data on an on-screen 3D coordinate system, along with a rather inelegant transformation from three dimensions to two. However, the results have been mediocre, so I have paused the project for now. If you are interested, feel free to email me and I can share GitHub access with you.
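For the 3D-to-2D transformation, one standard alternative to a hand-rolled projection is to flatten the trajectory onto the plane spanned by its two dominant principal components (PCA via SVD), which is least-squares-optimal for near-planar motion. This is a sketch of that idea, not the projection the paused project actually used:

```python
import numpy as np

def project_to_2d(points):
    """Project an (N, 3) trajectory onto the plane spanned by its two
    dominant principal components."""
    points = np.asarray(points, dtype=float)
    centered = points - points.mean(axis=0)
    # SVD of the centered cloud; rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T  # (N, 2) coordinates in the best-fit plane
```

For pen-trace-like data that is roughly planar, this preserves the in-plane shape exactly, so a tilted square in 3D comes out as a square in 2D.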
This article inspired my approach to compressing 3D trajectories into 2D characters. Since I haven't paid much attention to multi-view problems in 3D vision, I didn't realize that pre-trained models exist for this kind of task, despite the popularity of pre-training and fine-tuning in NLP.
I've just discovered some robust research dating back to 2000:

The SmartQuill digital pen [28] uses tilt sensors to digitize the pen's ink trail. Fitzmaurice augments a palmtop device with a six degree-of-freedom tracker to create a virtual window into a 3D information space.
https://arxiv.org/pdf/2311.15421
They did not release their code.
rerun: a Rust library for visualizing spatial and time-series data