Making Basalt generate a consistent map for single rooms in realtime
While Basalt VIO can run in real time, it needs an offline pass from the mapper to avoid drift and produce globally consistent trajectories.
The current integration of Basalt with Monado for VR tends to drift slowly after some time of use.
In the context of VR, it's reasonable to assume that the data that gets fed to Basalt is always produced in the same room.
With this constraint in mind, do you have any recommendations on a good approach for generating a globally consistent, real-time trajectory from Basalt?
Some possible routes I've thought of for approaching this are:
- Run the mapper in a separate thread and execute it at a very low frequency. I think ORB-SLAM3 does something similar, but the problem with that approach for VR is that the correction seems to get applied very abruptly, so when you are inside the headset you feel constant "jumps". Maybe keeping two separate trajectories and applying a very slow exponential smoothing from the VIO one toward the mapper one could improve this.
- Add static keyframes to the VIO. Right now Basalt uses a window of the last keyframes (7 by default) for the VIO optimization. Maybe it would be reasonable to add additional "static keyframes" on top of those 7 "dynamic keyframes" to get good coverage of the room's features and help achieve global consistency. Initially this could be a manual process, like manually taking pictures of the room; that could then be automated.
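To make the first route concrete, the exponential smoothing I have in mind could look roughly like this. This is only a minimal sketch under my own assumptions: `smooth_correction`, the 5 cm drift value, and the per-frame factor `alpha` are all made up for illustration, and only the translation part of the correction is smoothed (rotations would need slerp on SO(3)):

```python
import numpy as np

def smooth_correction(applied, target, alpha=0.01):
    """Move the currently applied translation correction a small step
    toward the mapper's target correction; alpha is the fraction
    applied per frame, so the jump is spread over many frames."""
    return applied + alpha * (target - applied)

# Hypothetical usage: the mapper reports a 5 cm drift on x; instead of
# snapping, the headset pose absorbs it gradually.
applied = np.zeros(3)
target = np.array([0.05, 0.0, 0.0])  # assumed mapper-vs-VIO offset
for _ in range(300):  # e.g. ~3 s of frames at 100 Hz
    applied = smooth_correction(applied, target)
# by now roughly 95% of the correction has been applied
```

The idea is just first-order exponential decay of the VIO-to-mapper offset, so the user never perceives a discrete jump.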
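For the second route, the automated selection of static keyframes could start from a simple spatial-coverage heuristic, sketched below. All names here are hypothetical (this is not Basalt's API), and a real implementation would also need to check rotation coverage and feature overlap, not just translation distance:

```python
import numpy as np

def should_add_static_keyframe(position, static_positions, min_dist=0.5):
    """Propose a new static keyframe whenever the camera is at least
    min_dist meters away from every existing static keyframe
    (rotation coverage omitted for brevity)."""
    if not static_positions:
        return True
    dists = [np.linalg.norm(position - p) for p in static_positions]
    return min(dists) >= min_dist

# Hypothetical usage: walking 2 m in a straight line in 10 cm steps
# ends up dropping a static keyframe roughly every 0.5 m.
static = []
path = [np.array([0.1 * i, 0.0, 0.0]) for i in range(20)]
for pos in path:
    if should_add_static_keyframe(pos, static):
        static.append(pos)
```

This mimics the "manually taking pictures of the room" process: keyframes accumulate until the room is covered, after which the set stays static and can anchor the VIO window against drift.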
It's very likely that there are better ways, but I'm not very experienced in the area, so it would be great to hear your thoughts on this.
And of course, I would love to try to upstream any work related to this if I get to implement it and you consider it a good fit for the Basalt project.