
About getting results in meters unit #37

Closed
nightheronry opened this issue Aug 4, 2020 · 9 comments

Comments

@nightheronry

@ranftlr Thank you for the work. I'm trying to apply it with a Myriad X VPU.
So I would like to ask whether the unknown scale and shift mentioned in #36 are linear parameters.
For example, in each frame I can fit a linear equation like "P = D * scale + shift" to map the values of the depth map "D" to physical absolute measurements "P", using a ruler of known length placed in the view, right?

@ranftlr
Collaborator

ranftlr commented Aug 4, 2020

Yes, they are linear parameters. When doing alignment, consider that P and D are inverse depth so you need to convert your absolute measurements to inverse depth first.
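
In code, that alignment might look like the minimal sketch below (illustrative only; the variable names and helper are hypothetical, not part of MiDaS). It assumes a handful of pixels whose metric distance is known, e.g. from a ruler visible in the frame:

import numpy as np

# Hypothetical helper: fit scale/shift so that prediction * scale + shift ~= 1 / meters.
# prediction: MiDaS output (relative inverse depth), known_px: flat pixel indices of
# the reference points, known_meters: their measured distances in meters.
def fit_scale_shift(prediction, known_px, known_meters):
    d = prediction.reshape(-1)[known_px]        # predicted inverse depth at the known points
    p = 1.0 / np.asarray(known_meters)          # convert meters to inverse depth
    A = np.stack([d, np.ones_like(d)], axis=1)
    (scale, shift), *_ = np.linalg.lstsq(A, p, rcond=None)
    return scale, shift

# scale, shift = fit_scale_shift(prediction, known_px, known_meters)
# metric_depth = 1.0 / (prediction * scale + shift)   # back to meters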

@nightheronry
Author

@ranftlr Thank you for your kind reply!! :)

@rodrigoGA

If I understand correctly...
P = inverse depth = 1 / physical distance in meters
D = inverse depth = result of the inference. It does not need to be inverted; the model already predicts inverse depth.
scale and shift = estimated with least squares from known points in the image

Can I calibrate "scale" and "shift" once and use the model in an environment with the same lighting? Or is it necessary to calibrate in each frame?

@mahavir-GPI

@rodrigoGA Did you get the correct scale and shift values based on the approach you mentioned?

@rodrigoGA

According to my tests, the model is very inaccurate for measuring distance (without calibrating in each frame).
It's a shame, because it has very good runtimes and is compatible with TensorFlow Lite.
From what I have read, there are approaches that take the previous frame into account and achieve better results.

@mahavir-GPI

Oh, to confirm: by uncalibrated, do you mean you didn't calculate the scale and shift values? Did you use the model's predicted values directly?

@puyiwen

puyiwen commented Feb 10, 2023

> Yes, they are linear parameters. When doing alignment, consider that P and D are inverse depth so you need to convert your absolute measurements to inverse depth first.

Hi, sorry to bother you. I use a RealSense D435i depth camera, but its depth map is bad, so I want to use MiDaS to improve the quality of the depth map. I wrote a script to align MiDaS with the absolute depth map.
I need to confirm something. The alignment formula is y = kx + b, where y is the inverted absolute depth in meters, x is the MiDaS prediction, k is the scale and b is the shift. Am I right?
The alignment code is:
import numpy as np

realsense_depth_array_invert = 1 / realsense_depth_array  # inverted absolute depth in meters
x = midas_depth_array.copy().flatten()  # MiDaS depth
y = realsense_depth_array_invert.copy().flatten()  # RealSense inverted depth
A = np.vstack([x, np.ones(len(x))]).T
s, t = np.linalg.lstsq(A, y, rcond=None)[0]
midas_depth_aligned_invert = midas_depth_array * s + t
midas_depth_aligned = 1 / midas_depth_aligned_invert
midas_depth_aligned should then be the absolute depth in meters. Am I right?
If I am right, I find the aligned depth has a large error. I don't know why; can you help me? Thank you very much!
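
One detail worth checking with this kind of alignment: RealSense depth maps usually contain zeros at invalid pixels, and 1 / 0 becomes inf, which can corrupt the least-squares fit. A minimal sketch of masking those pixels out before fitting (same variable names as the snippet above, purely illustrative):

import numpy as np

# Sketch only: drop invalid (zero-depth) RealSense pixels before fitting,
# since 1 / 0 produces inf and can corrupt the least-squares solution.
valid = realsense_depth_array > 0
x = midas_depth_array[valid].ravel()             # MiDaS inverse depth at valid pixels
y = 1.0 / realsense_depth_array[valid].ravel()   # metric depth -> inverse depth
A = np.vstack([x, np.ones(len(x))]).T
s, t = np.linalg.lstsq(A, y, rcond=None)[0]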

@putrasto

putrasto commented Jun 18, 2023

@rodrigoGA I think you need to calculate "scale" and "shift" for each image; they cannot be reused for other images because "D" is normalized.
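
A per-frame variant could look roughly like the sketch below (hypothetical names; it assumes some trusted metric depths, e.g. from a depth sensor, are available in every frame, and re-estimates scale and shift each time):

import numpy as np

# Hypothetical per-frame alignment: scale and shift are re-estimated for every frame,
# since the network output is only defined up to an unknown scale and shift.
def align_frame(prediction, metric_depth_m, valid_mask):
    # prediction: MiDaS inverse depth for this frame
    # metric_depth_m: metric depth in meters at trusted pixels
    # valid_mask: boolean mask selecting those pixels
    d = prediction[valid_mask]
    p = 1.0 / metric_depth_m[valid_mask]         # meters -> inverse depth
    A = np.stack([d, np.ones_like(d)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, p, rcond=None)
    return 1.0 / (prediction * s + t)            # metric depth in meters for this frame

# Called once per frame, e.g.:
# metric = align_frame(prediction, sensor_depth, sensor_depth > 0)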

@Shubhamkumarroy

> @ranftlr Thank you for the work. I'm trying to apply it with a Myriad X VPU. So I would like to ask whether the unknown scale and shift mentioned in #36 are linear parameters. For example, in each frame I can fit a linear equation like "P = D * scale + shift" to map the values of the depth map "D" to physical absolute measurements "P", using a ruler of known length placed in the view, right?

And can we use the same scale and shift for all images if I have calculated them from two images? If not, do I have to calculate the scale and shift for every image, and how can that be done?
