Moving the world improvements

Definitely,

For me, I like things to be 1:1 most of the time, because it’s surreal to design something like an arcade cabinet and walk around it. But I will scale to zoom in if I need to model a little detail.

I feel like zooming in should be more of a deliberate movement, just as rotating is a deliberate movement. It’s highly unlikely that you would mean to do both at the same time.

Same with movement.

You should see if you can play the demo of the game Sports Bar VR. They actually do the world movement thing exactly like you have it here, except that to move the world forward and back you need to hold both grips and make a pull or push movement, kind of like rowing a boat.

Moving the world with both hands as well as rotation is easier because those two movements are different enough that they don’t interfere with each other in an unexpected way.

Then again, maybe no changes are required to the gestures; as you said, maybe it’s just the detection of them. Gripping with both hands and pulling them apart should be a clearly intentional movement.

When I do the rotation move it’s nowhere close to what I’m doing when I’m trying to scale in and out. There must be a way to make it so that the zoom detection is based on more intentional movement, if you know what I mean.

As for the world moving on the Z axis when I’m trying to move around: the two-handed push or pull might work better for moving forward and back in general. Honestly, I don’t know, I’m not a gesture expert.

All I know is it’s so close to perfect if you could solve this minor issue.

Ah, maybe a good first step would be that when we press both grips, we could either do a rotation or a scaling, but not both? That should be easy to understand for the user…

Yeah, it’s a start. The more intuitive you can make the controls, the less I would be tempted to sneak back to the desktop app to do real editing.

Thought about it some more. I think moving and rotation should be allowed with two hands (grips), and can occur at the same time. But two-hand movement forward and back would be restricted to the X-Y plane, and rotation would be about the Z axis.
This would feel very natural without being out of control.

One-hand world movement could function as-is, so movement in all directions.

Scaling would also be on two grips, same as now, but as you said, once you pick moving or rotating it would only move or rotate, and if you make a zoom gesture it would only scale.

In order for this to feel natural, the gesture for scaling needs to detect what your intention is more accurately. Which shouldn’t be hard, because scaling is a very specific movement.
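For what it’s worth, here is a rough sketch of what I mean (invented names and thresholds, nothing from the actual VR Sketch code): check how much of the hands’ relative motion lies along the line between them. Pulling apart or pushing together scores high; orbiting each other (a rotation) scores near zero.

```python
import math

def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _norm(v):
    return math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])

def _dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def is_scaling_gesture(left_prev, left_now, right_prev, right_now,
                       min_speed=0.05, axis_ratio=0.8):
    """True if the hands are deliberately moving apart or together along
    the line between them (a 'zoom'), rather than orbiting each other
    (a rotation).  Positions are 3D points per frame; both thresholds
    are made-up example values."""
    axis = _sub(right_prev, left_prev)          # line between the hands
    axis_len = _norm(axis)
    if axis_len == 0:
        return False
    axis = (axis[0] / axis_len, axis[1] / axis_len, axis[2] / axis_len)

    # Motion of the right hand relative to the left hand this frame.
    rel = _sub(_sub(right_now, right_prev), _sub(left_now, left_prev))
    speed = _norm(rel)
    if speed < min_speed:                       # too slow to be deliberate
        return False

    # Fraction of the relative motion lying along the inter-hand axis.
    return abs(_dot(rel, axis)) / speed >= axis_ratio
```

So a straight pull-apart counts as scaling, but slow drift or a twisting motion doesn’t.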

What do you think?

Great! I think it would be a good improvement, while at the same time not appearing to change much, so it shouldn’t be an issue for most existing users. Wrote it in our internal issue tracker. Thank you!


Sorry, I tried doing this and it feels like a restriction. Moreover, it adds some three-state logic that you need to keep in mind now (“rotation mode”, “scaling mode”, “not decided yet”). When we press both grips and start moving, it’s not that obvious which mode the user really wants, so the user will sometimes mean the opposite of the one that was picked, and the reason why does not feel obvious (to me as a user).

Moreover (maybe because I’m used to how it works now), it feels like an arbitrary restriction to not be able to scale and rotate the model freely. When I want to move and rescale the model so that it is exactly where I need it, I can currently do it without thinking, but in this new version it takes several steps. That’s a big drawback already, and if the logic detects the wrong mode in one of these steps, it goes from bad to worse.
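To show why it feels this way, here is a simplified sketch of the three-state logic (hypothetical class and thresholds, not the real implementation): the mode stays undecided until one of the two measures crosses its threshold first, and from then on the other gesture is ignored.

```python
import math

UNDECIDED, ROTATING, SCALING = "undecided", "rotating", "scaling"

class TwoGripMode:
    """Lock into rotation or scaling depending on which gesture first
    exceeds its threshold.  Threshold values are arbitrary examples."""

    def __init__(self, angle_threshold=math.radians(10), scale_threshold=0.1):
        self.state = UNDECIDED
        self.angle_threshold = angle_threshold   # radians of hand-axis rotation
        self.scale_threshold = scale_threshold   # relative inter-hand distance change

    def update(self, accumulated_angle, distance_ratio):
        """accumulated_angle: rotation of the inter-hand axis since gripping;
        distance_ratio: current hand distance / distance when gripping."""
        if self.state == UNDECIDED:
            if abs(accumulated_angle) >= self.angle_threshold:
                self.state = ROTATING
            elif abs(math.log(distance_ratio)) >= self.scale_threshold:
                self.state = SCALING
        return self.state            # once decided, the state never changes
```

The problem is visible right in `update()`: during the first frames both measures creep upward, whichever crosses its threshold first wins the race, and the user is then stuck with it even if they meant the other one.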

If you want to try it out, I can send you a version of VR Sketch with this.

Maybe I should play with alternatives too, like: when we use a single-hand grip and the move is roughly horizontal, make it exactly horizontal, possibly with visual feedback that it is doing so?
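That snapping idea could look something like this (a sketch with a made-up tolerance; Z is “up” here, as in SketchUp):

```python
import math

def snap_horizontal(move, tolerance_deg=15.0):
    """If a one-hand world drag is within `tolerance_deg` of horizontal,
    flatten it onto the horizontal plane and report that it snapped, so
    the UI can draw a 'horizontal' hint.  The tolerance is an invented
    example value."""
    dx, dy, dz = move
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    if length == 0.0:
        return move, False
    # Angle between the move and the horizontal (X-Y) plane.
    elevation = math.degrees(math.asin(abs(dz) / length))
    if elevation <= tolerance_deg:
        return (dx, dy, 0.0), True    # True -> show the visual feedback
    return move, False
```

A mostly-flat drag gets its small vertical component zeroed out, while a clearly vertical drag passes through unchanged.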

Thanks for the effort, I appreciate it.

Sure, I would love to try it out and provide some feedback.

OK, here it is (along with many other changes which will soon be inside version 12):

https://vrsketch.eu/download/baroquesoftware_vrsketch_v12.0.a1.rbz

Before I give my feedback: I was having some weirdness when uninstalling the older version, and I’m not sure if I’m using the right version.

The file says version 12, but inside SketchUp it says version 11; is that right?

Is Quest hand tracking unique to this version? Because my mind was blown when I saw my actual hands interacting in 3D space.

Unsure about your questions. Minimal hand tracking in the Quest was added in VR Sketch 11.1 already. But the present “12.0.a1” is a file with the .rbz extension, meaning that you need to use it for PC only, not for the Quest—unless you use the tethered Oculus Link with the Quest, but then as far as I know you don’t see your hands. So maybe you’re still running VR Sketch 11.1 directly on your Quest? If so, do you need a Quest version of this new “12.0.a1”, or can you try with the Link (i.e. with a cable)?

Ah, and yes, although the file name is “12.0.a1” the version numbers inside haven’t been updated yet and are still “11.1.0”. Sorry for the confusion it creates.

Sorry for the confusion, I was using the Quest wirelessly, and when I put it on I noticed the hands.

I didn’t realize the VR implementation was different for the Quest, so yes, I guess I would need that version of 12.

BTW, I modelled my office to scale (a long time ago, not just for this -_-) and lined it up in VR 1:1, so things are physically in the real world where they are in VR.

What I noticed, which was amazing, is that if you peek outside the boundary, the passthrough mode fades in.

At about 50% opacity I was basically seeing AR!

When Oculus makes passthrough mode available to devs, I wonder if you could make everything passthrough except the models in the scene.

I usually model arcade cabinets and stuff, and it would be even cooler to build them in my actual room in AR.

For hand tracking, is your intention to make putting your thumb to each of your other fingers perform up to four different functions?

My intention with hand tracking is not to add much, because I feel it’s already a bit past the limit of how it can reliably recognize your gestures. I regard the current hand tracking as more a (very cool!) technological experiment than a really useful thing to do actual work.


Note that about “AR”, it’s also not really useful for actual work with the current Quest. The “reality” part is rebuilt, as a black-and-white image that is—most importantly—not precisely at your eye’s position but only nearby. This makes the “real” part of the image a bit off. I don’t think displaying that for a long period of time is a good idea, and I certainly see why Oculus decided to not give developers the APIs needed to access that.

In both cases, they are cool “previews” of what the next generations of VR headsets will certainly be able to do better :-)


Here it is: https://vrsketch.eu/download/baroquesoftware_vrsketch_v11.9.1.apk

I’ve literally been waiting for this day. Oculus has now made Passthrough mode available to devs by introducing the Passthrough API!!!

I can’t wait to build furniture in my actual environment. Please make this dream come true. I don’t care if it’s black and white. That’s actually better, since it allows you to focus more on your models whilst still seeing how they fit into your actual space.


What Dilmer is doing here is what I want to be doing in VR Sketch.

Yes, that would be great, if only this video showed what the guy really sees in the Quest—what this video shows instead is a montage, made using older techniques. It makes you think that this is what it looks like in the Quest, and that’s a lie.

Look at what some creative guy just achieved with the Passthrough API. Plus, with the rumored Quest Pro/3 possibly having improved color passthrough cameras, it couldn’t hurt to experiment.