
FR: XYZ displacement export to CSV #51

Open
jgeerds-zz opened this issue Oct 24, 2023 · 3 comments

@jgeerds-zz

In addition to the yaw, pitch and roll IMU data, it would be great to make use of the XYZ accelerometer data to calculate the temporary displacement (from vibration, bumps, etc.) in Cartesian coordinates.
For 360 footage, this would allow plugging the XYZ data into the camera coordinates and moving the camera within a sphere to compensate for e.g. a bump. Additional bonus points for adding that feature to the OFX plugin as well.
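
Purely as an illustration of the request (not an existing Gyroflow feature or format), a per-sample XYZ displacement export to CSV could look something like the Rust sketch below; the column names and the `export_displacement_csv` function are hypothetical.

```rust
use std::fs::File;
use std::io::{BufWriter, Write};

// Hypothetical sketch: one CSV row per IMU sample (or per video frame) with a
// timestamp and XYZ displacement in meters. The column layout is an assumption.
fn export_displacement_csv(path: &str, rows: &[(f64, f64, f64, f64)]) -> std::io::Result<()> {
    let mut w = BufWriter::new(File::create(path)?);
    writeln!(w, "timestamp_s,dx_m,dy_m,dz_m")?;
    for (t, x, y, z) in rows {
        writeln!(w, "{t},{x},{y},{z}")?;
    }
    Ok(())
}

fn main() -> std::io::Result<()> {
    // Toy data: two samples 2.5 ms apart (400 Hz IMU).
    let rows = vec![(0.0, 0.0, 0.0, 0.0), (0.0025, 0.0001, -0.0003, 0.0)];
    export_displacement_csv("displacement.csv", &rows)
}
```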

@AdrianEddy
Contributor

Do you know how to do it? As far as I know, accelerometer data is not suitable for tracking 3D position; it's too inaccurate. It may be fine for short and simple movements, but it will drift very quickly.
Unless there's some technique I'm not aware of, I haven't seen 3D tracking from only an IMU anywhere; everyone uses visual tracking with multiple cameras and infrared beacons.

@jgeerds-zz
Author

jgeerds-zz commented Oct 25, 2023

If we consider using this feature as a bump detector, it would start with setting a threshold across the three axes, either manually or automatically. At the beginning of the bump we can assume v = 0 and then calculate d = v·t + 0.5·a·t² for each sample (basically a simple integration; on the next sample v = v₀ + a·t, and so on), until the bump's acceleration waveform has dropped below abs(threshold) again.

The tricky part is aligning the actual video frame with the proper peak in the IMU data for systems that are not synced by default. But theoretically this could be used as a manual calibration tool, kind of similar to how Kolor used the pixel acceleration data from a 360 twist to calculate the camera sync point: the algorithm would just add up the displacement until the next frame lines up (e.g. 400 Hz IMU data vs. 29.97 fps video means ~13.35 samples per video frame). How this would apply in quaternion or FFT math is far above my high-school math/physics skills.

Theoretically this could also be used to determine the direction of a constant vibration, and then a similar technique could be applied to eliminate the vibration.
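
A minimal Rust sketch of the threshold-gated integration described above, assuming gravity-compensated single-axis accelerometer samples at a fixed rate; the `integrate_bump` function and the toy data are made up for illustration, not anything from Gyroflow:

```rust
/// Threshold-gated double integration of one accelerometer axis (m/s^2),
/// assuming the samples are already gravity-compensated.
fn integrate_bump(accel: &[f64], sample_rate_hz: f64, threshold: f64) -> Vec<f64> {
    let dt = 1.0 / sample_rate_hz;
    let mut v = 0.0;          // assume v = 0 at the start of the bump
    let mut d = 0.0;          // accumulated displacement (m)
    let mut in_bump = false;
    let mut displacement = Vec::with_capacity(accel.len());

    for &a in accel {
        if !in_bump && a.abs() > threshold {
            in_bump = true;   // bump starts: reset and begin integrating
            v = 0.0;
            d = 0.0;
        }
        if in_bump {
            // d = v*t + 0.5*a*t^2 for this sample, then v = v0 + a*t for the next one
            d += v * dt + 0.5 * a * dt * dt;
            v += a * dt;
            if a.abs() < threshold {
                in_bump = false;  // waveform dropped below abs(threshold): stop integrating
            }
        }
        displacement.push(d);
    }
    displacement
}

fn main() {
    // 400 Hz IMU vs. 29.97 fps video -> ~13.35 IMU samples per video frame
    let samples_per_frame = 400.0 / 29.97;
    println!("IMU samples per video frame: {:.2}", samples_per_frame);

    // Toy example: a short spike above a 0.5 m/s^2 threshold
    let accel = vec![0.0, 0.1, 2.0, 3.0, 1.5, 0.2, 0.0];
    let d = integrate_bump(&accel, 400.0, 0.5);
    println!("Per-sample displacement (m): {:?}", d);
}
```

Extending this to three axes and resampling the per-sample displacement to the 29.97 fps frame timestamps would give the per-frame XYZ values the request asks to export.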

@jgeerds-zz
Author

Not sure if this is already used, but the IMU should "see" the 1 g pointing down in most cases, and a quaternion derived from that gravity vector could be used to level the 360 footage (at least a good portion of the 1 g; the range in real life is probably 0.4 - 1 g).
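
A minimal sketch (not Gyroflow's actual implementation) of deriving such a leveling rotation from a single accelerometer reading: assuming the reading is dominated by gravity, the shortest-arc quaternion between the measured gravity direction and the world down axis levels the horizon. The axis convention, sign convention, and sample values are assumptions.

```rust
// Build a quaternion that rotates the measured gravity direction onto the
// world "down" axis, so it can be used to level the horizon. Assumes the
// accelerometer reading is dominated by gravity (camera roughly static).

fn normalize(v: [f64; 3]) -> [f64; 3] {
    let len = (v[0] * v[0] + v[1] * v[1] + v[2] * v[2]).sqrt();
    [v[0] / len, v[1] / len, v[2] / len]
}

fn cross(a: [f64; 3], b: [f64; 3]) -> [f64; 3] {
    [
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    ]
}

fn dot(a: [f64; 3], b: [f64; 3]) -> f64 {
    a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
}

/// Shortest-arc quaternion (w, x, y, z) rotating unit vector `from` onto unit vector `to`.
/// Degenerate when `from` and `to` are exactly opposite; ignored in this sketch.
fn quat_from_to(from: [f64; 3], to: [f64; 3]) -> [f64; 4] {
    let c = cross(from, to);
    let q = [1.0 + dot(from, to), c[0], c[1], c[2]];
    let len = (q[0] * q[0] + q[1] * q[1] + q[2] * q[2] + q[3] * q[3]).sqrt();
    [q[0] / len, q[1] / len, q[2] / len, q[3] / len]
}

fn main() {
    // Example accelerometer reading (m/s^2): mostly gravity, slightly tilted.
    // The sign convention for "down" varies by sensor; this is an assumption.
    let accel = [0.5, -9.6, 1.2];
    let measured_down = normalize(accel);
    let world_down = [0.0, -1.0, 0.0]; // assumed world "down" axis

    let leveling_quat = quat_from_to(measured_down, world_down);
    println!("Leveling quaternion (w, x, y, z): {:?}", leveling_quat);
}
```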
