Freehand notes is an interesting tool! I’m wondering if it’s possible to put notes in “2D”? I mean, lock one of the axes and then write on a surface, like a blackboard for example.
There are two tools: “notes” and “freehand”. I’m not sure which one you’re talking about.
The “notes” tool is the one doing colorful drawings. They cannot be snapped to any plane and don’t end up as regular SketchUp geometry. They are useful to draw very rough sketches, or things like crosses and arrows as reminders to “look here” later.
The “freehand” tool, on the other hand, draws many small regular segments as regular SketchUp geometry. You can’t change colors, just as you can’t draw regular line segments in various colors. However, you can use snapping rules, including snapping inside a plane.
Sorry, I was talking about the notes tool, but even with freehand I can’t lock an axis…
OK, what do you mean “lock an axis” exactly? I meant that it will snap to a real rectangle that you have drawn previously: as long as you’re drawing near that rectangle, the lines you’re making will snap exactly inside the rectangle.
Do you have in mind something like an option in the context menu (the menu you get by pressing the “menu” button)? This could be added. In general the “freehand” tool is not entirely satisfying, in that even the existing snapping rules don’t always work like you’d want. I can try to look into it in more detail.
Alternatively, I can also think about a way to have directly the “notes” tool constrained to a plane.
The tool I was thinking of is your alternative: a way to have the “notes” tool directly constrained to a plane.
It would be perfect!
Do you also plan to add voice recognition in the future? It could be useful to select a plane/object or something else and then add a note. It would save time, without having to write the comment by hand.
Note that you can’t get the full “writing on a blackboard/whiteboard” experience in VR.
With a real pen on a real whiteboard, you write by touching the whiteboard and moving the pen around. But the pen is physically stopped by the whiteboard, and can’t go “through” it. The limit between “touching the whiteboard” and “floating above” is clear. You know that if the pen is now touching the whiteboard, then it only needs to be pulled a centimeter away from the whiteboard to float above it. That all sounds obvious, but it needs to be said because it can’t easily be reproduced in VR.
The VR experience would need to be done differently. The drawing-versus-not-drawing limit should probably not be based on the position alone, because the 3d position of the VR pen can be “inside” the whiteboard, and if you pull back one centimeter it is still “inside”. You may still have to press and release a button in order to start and stop drawing.
I know about some approaches that require a real, physical wall or table to be identified; you then draw in VR, but at the exact position of the real wall or table. While interesting, I think these approaches are not flexible enough.
Why am I explaining all this? Because the risk is that you, or other people, think “just like a whiteboard, in VR” when I really only said “drawing notes constrained on a plane”. That’s not the same thing! I only had in mind something like “the same as the notes tool, but if you happen to be near a face when you press the trigger, then the line snaps to the face”. It’s not even clear that it helps a lot drawing letters.
If the goal is really to leave text annotations around while we are in VR, then maybe we should think about alternatives. You’ve already mentioned one: voice recognition. Maybe a more straightforward one would be to leave audio comments, which could be replayed both in VR and inside SketchUp, and which could be represented as “cassette tape” icons. Or, we could have text notes where you really use a virtual keyboard—slowish to type, but it might be faster than drawing letters by pressing and releasing the trigger all the time, and it avoids problems with hard-to-read notes. It’s to this kind of text note that I think voice recognition might be added later.
Do I make sense to you? And which of these solutions would make the most sense to you? …the real answer might be “try them all and let’s see in practice”!
I have seen some apps that use the controller’s vibration to give a sort of feedback when your hand goes through a wall. Maybe that could be used?
There is already haptic feedback. Maybe I’m having a hard time explaining what I mean. It’s similar to another tool: put on a VR headset and start VR Sketch with a big wall. Pick the “pencil” tool. When you move your hand towards the wall, it will stick and un-stick to the wall depending on the exact position. There is haptic feedback when it sticks. So imagine that the pencil would draw on the wall whenever it is stuck to the wall. That’s probably the best we can achieve. I think it’s not good enough, though, because you need to pull off the wall by a varying amount before the pencil un-sticks (before it stops “drawing”).
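To make the stick/un-stick behaviour above concrete, here is a small hypothetical sketch of the hysteresis involved (the class, thresholds, and structure are made up for illustration, not VR Sketch’s actual code):

```python
# Hypothetical sketch: stick/un-stick hysteresis for a VR pencil near
# a wall. The pencil "sticks" when it comes within STICK_DIST of the
# wall, but only un-sticks once it is pulled further than UNSTICK_DIST
# away. This asymmetry is exactly the "varying pull-off distance"
# described above: once stuck, small hand jitter doesn't release it.

STICK_DIST = 0.01    # metres: snap to the wall when closer than this
UNSTICK_DIST = 0.03  # metres: release only when further than this

class PencilSnap:
    def __init__(self):
        self.stuck = False  # are we currently "drawing" on the wall?

    def update(self, distance_to_wall):
        """Update the stuck state from the pencil's current distance."""
        if not self.stuck and distance_to_wall < STICK_DIST:
            self.stuck = True   # close enough: snap (haptic pulse here)
        elif self.stuck and distance_to_wall > UNSTICK_DIST:
            self.stuck = False  # pulled far enough away: release
        return self.stuck
```

Note the band between the two thresholds: a pencil at 2 cm stays stuck if it was stuck, and stays free if it was free—which is why the pencil keeps “drawing” until you deliberately pull away.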
Thank you for the explanations. I understand now the difficulty/impossibility of drawing on a blackboard. I’m not sure the virtual keyboard would be useful, as it would be quite slow to select each letter with the trigger (I wouldn’t say that if it were AR). I found an interesting possibility in the demo video here: http://meetinvr.net/ but I’m waiting to test it and see how it works. That kind of possibility (placing virtual post-its and ‘drawing/writing’) could also allow doing more than designing with SketchUp/VR Sketch. Mixing all of these, as I’m an assistant professor, I can imagine giving a class by discussing a theory, then taking the model, showing it, modifying it as I want, then continuing on my diagram, etc. Not sure if it’s clear for you or only in my mind! But for day-to-day meetings with clients, I think voice recordings, and then voice recognition, could be a great tool!