Extrinsic calibration from two trajectories #2
Comments
Hi Sandeep, yes, you should be able to get the extrinsic calibration between them if you replace the synthetic trajectories loaded in the example script with your own. One thing to note is that this algorithm assumes that one of the sensors' trajectories has an unknown scale, which mainly matters if that scale is very far from 1. The MATLAB code in our repo does not make this unknown-scale assumption and therefore will be slightly more accurate for your application. We will eventually be adding this functionality to the Python code as well; my apologies for the inconvenience.
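A minimal sketch of what "replacing the synthetic data" might look like, assuming each camera's ORB-SLAM output is available as a time-synchronized list of 4x4 NumPy pose matrices (the function and variable names below are illustrative, not part of the repo):

```python
import numpy as np

def absolute_to_relative(poses):
    """Convert a list of 4x4 absolute poses T_0, T_1, ... into the relative
    motions inv(T_i) @ T_{i+1} between consecutive timesteps."""
    return [np.linalg.inv(T_prev) @ T_next
            for T_prev, T_next in zip(poses[:-1], poses[1:])]

# poses_cam_a, poses_cam_b: time-synchronized lists of 4x4 poses exported
# from ORB-SLAM for the two cameras (placeholder identity poses shown here).
poses_cam_a = [np.eye(4) for _ in range(5)]
poses_cam_b = [np.eye(4) for _ in range(5)]
rel_a = absolute_to_relative(poses_cam_a)
rel_b = absolute_to_relative(poses_cam_b)
# The pairs (rel_a[i], rel_b[i]) would take the place of the synthetic
# relative transforms generated in the example script.
```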
@mattgiamou
You're welcome! The distance between the two cameras cannot be incorporated into this method; constraining their squared distance with that information would be an interesting extension of the traditional extrinsic calibration problem!
Thank you @mattgiamou
@mattgiamou However, I am getting a huge error from the MATLAB code after using the relative poses. I am sharing the data I used here: https://drive.google.com/drive/folders/1WxmmWQNFvussY7jqRe6WSGSLPMO7iN93?usp=sharing Some guidance would be really helpful.
It looks like I misunderstood - the MATLAB code was suggested because I thought you had two separate stereo cameras! For the case you've described, the Python code is actually appropriate, but only if you have the distance between the monocular cameras (you need to divide the translation in the extrinsic calibration by this distance to resolve the scale). I tried to run the Python code below on your data to make sure that we're on the same page. I'm also getting rotation extrinsics that are very different from the actual extrinsics you posted, so there is probably some other issue. It may be a problem with the code, which I will try to investigate sometime in the next week or so.
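The script attached to the original comment is not reproduced in this extract. One possible reading of "divide the translation by this distance to resolve the scale" is to rescale the up-to-scale translation returned by the solver so that its norm equals the measured inter-camera distance; a small sketch under that assumption (names and values are illustrative):

```python
import numpy as np

def resolve_translation_scale(t_estimated, known_distance):
    """Rescale an up-to-scale extrinsic translation so that its norm equals
    the measured distance between the two cameras (in metres).

    Returns the rescaled translation and the scale factor that was applied.
    """
    t_estimated = np.asarray(t_estimated, dtype=float)
    scale = known_distance / np.linalg.norm(t_estimated)
    return scale * t_estimated, scale

# Example with an illustrative up-to-scale estimate and a 0.5 m baseline:
t_metric, s = resolve_translation_scale([0.1, 0.02, 0.0], known_distance=0.5)
```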
From just playing with the data, it looks like the rotation estimates are being affected by scale drift: computing the distance between the two estimated absolute trajectories at each timestep returns values that vary, which should be constant if the scale were not drifting. Thanks for your patience, and I'm sorry this isn't working out of the box for you. I appreciate you sharing these problems, as I would like to improve the code and cover more use cases in the near future.
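A sketch of the check described above, assuming both estimated absolute trajectories are available as equal-length lists of 4x4 pose matrices; the quantity computed is the per-timestep distance between the two estimated camera positions (see the caveat in the next comment):

```python
import numpy as np

def inter_trajectory_distances(poses_a, poses_b):
    """Per-timestep distance between the camera positions of two equal-length
    lists of 4x4 absolute poses."""
    pa = np.array([T[:3, 3] for T in poses_a])
    pb = np.array([T[:3, 3] for T in poses_b])
    return np.linalg.norm(pa - pb, axis=1)

# d = inter_trajectory_distances(poses_cam_a, poses_cam_b)
# print(d.min(), d.max(), d.std())   # a large spread hints at scale drift
```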
Actually, my point about constant distance between the estimated absolute trajectories isn't true. I'll keep thinking about this some more.
Thank you @mattgiamou for explaining the scale drift issue. Are you saying the scale drift is not an issue?
@mattgiamou As per the synthetic datasets, I see that the data loaded from the dataset are a list of transformation matrices and the extrinsic calibration (see certifiable-calibration/python/extrinsic_calibration/andreff/utils.py, line 5 at commit ee67728).
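For reference, turning a SLAM trajectory export (translation plus quaternion per timestamp) into such a list of transformation matrices could look like the following sketch; the argument order is an assumption about the trajectory file, and nothing here reflects the repo's actual loading code:

```python
import numpy as np

def pose_from_quaternion(tx, ty, tz, qx, qy, qz, qw):
    """Build a 4x4 homogeneous transform from a translation and a unit
    quaternion (x, y, z, w), e.g. one line of a SLAM trajectory export.
    Check the convention of your own trajectory file before using this order."""
    q = np.array([qw, qx, qy, qz], dtype=float)
    q /= np.linalg.norm(q)          # guard against slightly non-unit quaternions
    w, x, y, z = q
    R = np.array([
        [1 - 2 * (y * y + z * z), 2 * (x * y - w * z),     2 * (x * z + w * y)],
        [2 * (x * y + w * z),     1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
        [2 * (x * z - w * y),     2 * (y * z + w * x),     1 - 2 * (x * x + y * y)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [tx, ty, tz]
    return T
```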
I have a system of two non-overlapping stereo cameras, and I am running ORB-SLAM to get the trajectories for both of them. Can I use the solver mentioned in this repo (see certifiable-calibration/python/extrinsic_calibration/example_hand_eye.py, line 2 at commit ee67728) to get the extrinsic calibration between the non-overlapping cameras?