Any news about this issue?
I can confirm this: the perceived distance is definitely smaller.
I built a model of a room in my apartment to its real measurements and used the double-tap passthrough feature of the Oculus Quest 2 to visually compare where the corner of my room appears in the simulation versus through the cameras. In the simulation the corner was ~60 cm closer at a viewing distance of ~4 meters, even though another corner and the room orientation were aligned. The simulated room corner landed where the middle of my door actually is.
When comparing the Oculus see-through mode with real life, the images align nearly perfectly, so I can rule out a scale distortion on the camera side.
When examining objects up close, the perceived size is correct, but objects far away are displayed closer than they should be, at least on the Oculus Quest 2 via the “send to OQ2” feature.
I’m not sure whether it’s the field of view or something else. It’s really great to be able to experience the model immersively, but it would be very helpful to be able to calibrate the view options so that distances look realistic.