Relative Scale in VR

Hi-
I am new to SketchVR and so far it seems like a really impressive product.

I have a weird feeling that the relative view scale of my model is not correct (I feel like it may be something like 90% of full scale). Is there a way to adjust this? It all feels a bit small (i.e., my head is too close to doors, counters seem too low, etc.).

Hi, thanks for the comments!

The “wrong scale” is a feeling that I suspect is not real. You should measure it exactly: for example, add to your model a line that is exactly 2 meters long, somewhere on the floor; then, in VR, look at the line and place one controller at each end; then, outside VR, measure the real distance between the controllers. Maybe the problem comes from the ground not being at the right height in VR?
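If that measurement does show a discrepancy, the effective scale factor is just the ratio of the real-world distance to the modeled length. A minimal sketch of the arithmetic (the 1.8 m reading below is a hypothetical example, not a real measurement):

```python
# Compute the effective VR scale factor from a calibration measurement.
# The model contains a reference line of known length; the real-world
# distance is measured between the two controllers placed at its ends.

def scale_factor(measured_real_m: float, model_length_m: float = 2.0) -> float:
    """Ratio of perceived (real) size to modeled size; 1.0 means exact 1:1."""
    return measured_real_m / model_length_m

# Hypothetical reading: controllers end up 1.8 m apart for a 2 m model line.
print(f"scale = {scale_factor(1.8):.0%}")  # -> scale = 90%
```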

Of course it may be the case that there is really a systematic and measurable difference. If so—and while I don’t understand why there should be one—I could think about adding a configuration parameter.

EDIT: ah, maybe adjusting the distance between the lenses (so that it matches the distance between your eyes) would have an effect on the perceived distances, too.

Note that it might also simply be that the real floor’s level has been misconfigured. Check whether the apparent VR floor is at the same level as the real floor (check in VR by bringing the controllers down until they hit the real floor). If not, redo the room setup.

I have this same issue! Everything is slightly downsized, making all my home designs feel slightly too small. All my clients see the issue right away when previewing the designs. I have accurately set the floor height, and when measuring in VR and in real life at 1:1 scale, the numbers do not add up. Please advise on a potential fix…

(Received a private mail and replied there first. I will post here if we manage to identify where the problem comes from.)

Any news about this issue?

I can confirm: the perceived distances are definitely smaller.

I modeled a room in my apartment to exact measurements and used the double-tap (passthrough) feature of the Oculus Quest 2 to visually compare where a corner of my room appears in the simulation versus through the cameras. In the simulation it was ~60 cm closer at a viewing distance of ~4 meters, while another corner and room orientation was aligned. The simulated room corner appeared where the middle of my door actually is…
When comparing the Oculus see-through mode with real life, the images align nearly perfectly, so I can rule out scale distortion on the camera side.

When examining objects up close, the perceived size is correct, but objects far in the distance are displayed closer than they should be, at least on the Oculus Quest 2 via the “send to OQ2” feature.

I’m not sure if it’s the view angle or something else. It’s really great to be able to experience immersion in the model, but it would be very helpful to be able to calibrate the view options so that the distances look realistic.

Hi Alexander,

Yes, I finally confirmed that this effect is caused by the inter-pupillary distance (IPD) being set wrong in the headset. This is a setting you need to configure on your headset; it depends on the model, but typically there is a physical slider or knob on the headset itself. If it doesn’t match the actual distance between your eyes, nothing else looks obviously wrong, so it is common to have a mismatch here. The only bad effect is that the depth is bogus, making objects appear nearer or farther than they are. The rest is correct: for example, if you close one eye, then everything you see (and measure) is correct.
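To see why a wrong IPD distorts depth but nothing else, consider the simplified stereo geometry: the renderer produces a binocular disparity proportional to IPD/depth, so when the render IPD differs from your actual eye separation, your visual system interprets the same disparity at a different depth. A minimal sketch of this simplified (small-angle) model; the IPD values below are illustrative, not taken from any specific headset:

```python
def perceived_depth(true_depth_m: float,
                    ipd_actual_mm: float,
                    ipd_render_mm: float) -> float:
    """Depth the viewer perceives when the renderer assumes a different IPD.

    The renderer creates a disparity proportional to ipd_render / z; eyes
    separated by ipd_actual interpret that same disparity as a depth of
    z * ipd_actual / ipd_render (small-angle approximation).
    """
    return true_depth_m * ipd_actual_mm / ipd_render_mm

# Illustrative numbers: if the headset renders for a 68 mm IPD but your
# eyes are 61 mm apart, a wall 4 m away appears at about 3.6 m.
print(round(perceived_depth(4.0, ipd_actual_mm=61, ipd_render_mm=68), 2))
```

Note that the error grows linearly with distance, which matches the observation above that nearby objects look right while distant ones appear too close; and with one eye closed there is no disparity at all, so nothing is distorted.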

I will try to add a way in VR Sketch to help. At the very least we could give an explanation, and possibly a scene where the error can be seen clearly. I’m thinking about a wall or a line on the floor that goes straight, but that you see curving when this setting is wrong. You can already see this effect a bit by looking at the warehouse’s walls, but it can probably be made more apparent.

Hi Arigo,
Is it really? I tried this on an Oculus Rift, and my wall heights as well as cabinet heights all seemed too short compared to reality. I don’t think curvature had anything to do with it. I think even if the pupil distance is incorrect, your sense of the horizon would still be correct.

Thanks Arigo,

I’m not sure it’s connected with the IPD; I tried with different IPD settings and also with one eye closed.

I’m trying to get a reproducible scenario.

I found a way to see a blend of the simulation and Passthrough+ on the Quest 2 by placing the headset at the edge of the guardian boundary. For some angles the calibration is perfect and the room blends like in that ’80s A-ha video, and for some angles there seems to be a discrepancy… I will recheck my measurements and try to record it with my phone camera.

It’s a pity that the Passthrough+ API is not public yet; an AR-blended view of the simulation and passthrough could be a very nice feature…

@Benjamin_Cox: your problem sounds like a wrong floor height. Even if the floor is off by less than 5 centimeters, it can visibly change the apparent height of tables. Try putting the physical controller on the floor, and check that the virtual controller appears to rest on the virtual ground. If it is instead inside the ground (or floating above it), then that’s the problem.

@Alexander_H: the passthrough should not be relied upon for that. I’ve noticed that distances appear bogus in the passthrough (for me, when seeing the physical world and then putting on the headset in passthrough mode, the distances are somehow wrong). I think that’s one reason the passthrough API is not public: it’s really not meant to be used that way. The image is reconstructed from the four on-headset cameras, which are not spaced like your eyes.