This piece of software is the result of a crash course in developing for virtual reality headsets. In this case, we were using the latest Oculus release at the time, the last fully tethered headset they pushed out before moving to the Quest series. We were given tasks by our instructor at the university describing what he wanted the end product to be, and then were left to figure it out on our own. The video shown above is the finished product, with all the pieces placed in a virtual replica of our classroom.
The first thing interacted with in the video is the pickup command. This involved tracking where the user's controller was; once it intersected the box's collider, the box lit up red. Then, with a press of the back trigger on the controller, the box would be linked to the controller until the trigger was released. As shown, the user could not interact with the box using the second controller while it was held by the first.
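For the curious, the core of that pickup behaviour fits in a single Unity component. The sketch below is not the project's actual code, just a minimal reconstruction: it assumes each controller carries a trigger collider on an object tagged "Controller", and the `TriggerHeld` stub stands in for whatever input query the headset exposes.

```csharp
using UnityEngine;

// Minimal reconstruction of the pickup command, not the original project code.
// Assumes the box has a collider, each controller has a trigger collider on an
// object tagged "Controller", and one side has a Rigidbody so trigger events fire.
public class Grabbable : MonoBehaviour
{
    Renderer rend;
    Color normalColor;
    Transform holder; // controller currently holding us, if any

    void Start()
    {
        rend = GetComponent<Renderer>();
        normalColor = rend.material.color;
    }

    void OnTriggerEnter(Collider other)
    {
        // Light up red when a controller's collider intersects the box.
        if (other.CompareTag("Controller") && holder == null)
            rend.material.color = Color.red;
    }

    void OnTriggerStay(Collider other)
    {
        if (!other.CompareTag("Controller")) return;

        if (TriggerHeld(other.transform) && holder == null)
        {
            // Link the box to this controller; the null check above is what
            // stops the second controller from stealing it mid-grab.
            holder = other.transform;
            transform.SetParent(holder, true);
        }
        else if (!TriggerHeld(other.transform) && holder == other.transform)
        {
            // Trigger released: unlink and restore the normal color.
            holder = null;
            transform.SetParent(null, true);
            rend.material.color = normalColor;
        }
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Controller") && holder == null)
            rend.material.color = normalColor;
    }

    // Placeholder for the platform's back-trigger query; wire this to the
    // actual input API (the original project used Unity's VR input handling).
    bool TriggerHeld(Transform controller) => false;
}
```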
Immediately after, we see the movement control our instructor came up with, which involved actually shrinking the entire game world when both side triggers were held and the controllers were brought together. Because the world was shrunk, the dragging control that many other VR projects use could cover a much larger distance with a much smaller movement. Doing the reverse motion, bringing the controllers apart while holding down the triggers, would grow the world again. As of the recording, I don't believe we had a cap on how far the world could be shrunk or grown, so you could get right down close to the pieces in the world if you wanted to.
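The scaling trick itself is just a little transform math: scale a world root around the midpoint between the hands by the ratio of the current hand distance to last frame's. Here's a sketch under the assumption that everything except the player rig is parented under one `worldRoot` transform, with `GripHeld` again a stand-in for the real input query:

```csharp
using UnityEngine;

// Sketch of the shrink-the-world locomotion, assuming the whole scene
// (except the player rig) is parented under `worldRoot`.
public class WorldScaler : MonoBehaviour
{
    public Transform worldRoot;
    public Transform leftHand, rightHand;

    float lastDistance;
    bool scaling;

    void Update()
    {
        bool bothGrips = GripHeld(leftHand) && GripHeld(rightHand);
        float distance = Vector3.Distance(leftHand.position, rightHand.position);

        if (bothGrips && scaling)
        {
            // Hands moving together -> factor < 1 -> the world shrinks.
            // Hands moving apart -> factor > 1 -> the world grows again.
            // (As noted above, the recorded build had no cap on either.)
            ScaleAround(worldRoot,
                        (leftHand.position + rightHand.position) * 0.5f,
                        distance / lastDistance);
        }

        scaling = bothGrips;
        lastDistance = distance;
    }

    static void ScaleAround(Transform target, Vector3 pivot, float factor)
    {
        // Scale about the pivot so the point between the hands stays put.
        target.localScale *= factor;
        target.position = pivot + (target.position - pivot) * factor;
    }

    // Placeholder for the side (grip) trigger query on a given hand.
    bool GripHeld(Transform hand) => false;
}
```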
The next piece worked with was the slider, which was an interesting puzzle. It needed the previous grabber controls, but applied so that the slider holder was one grabbable object and the slider handle was a separate one. Then the math had to be worked out to keep the slider on its intended path regardless of how the slider holder was angled. As seen in the demonstration, the pieces movable by the player don't currently obey collision, but that shouldn't be terribly difficult to add should the need arise.
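The trick that makes the slider orientation-proof is doing the math in the holder's local space: convert the hand position into the holder's coordinates, keep only the component along the slide axis, clamp it to the track, and convert back. A minimal sketch, assuming the track runs along the holder's local X axis; the names and the `handTarget` hookup are illustrative, not the project's actual API:

```csharp
using UnityEngine;

// Sketch of the slider constraint: however the holder is rotated, the handle
// only moves along the holder's local slide axis, clamped to the track length.
public class SliderHandle : MonoBehaviour
{
    public Transform holder;        // the grabbable slider body
    public Transform handTarget;    // controller currently holding the handle
    public float halfLength = 0.2f; // half the track length in metres

    void Update()
    {
        if (handTarget == null) return;

        // Express the hand position in the holder's local space, keep only
        // the component along the slide axis (local X here), and clamp it.
        Vector3 local = holder.InverseTransformPoint(handTarget.position);
        float t = Mathf.Clamp(local.x, -halfLength, halfLength);

        // Convert back to world space so the handle stays on the track
        // no matter how the holder is angled.
        transform.position = holder.TransformPoint(new Vector3(t, 0f, 0f));
        transform.rotation = holder.rotation; // handle follows the holder's tilt
    }
}
```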
The final interactable shown in the video is similar to the slider in that it needed two separate parts, with math handled so the dial could spin regardless of how it was angled. The colored cube was put in place just to show that these objects can affect other objects in the environment.
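The dial works the same way but with an angle instead of a distance: project the hand's offset onto the plane perpendicular to the dial's axis and measure a signed angle. A sketch, assuming the knob spins about its base's local up axis; the resulting `angle` is the value you would forward to whatever the dial drives, like the colored cube:

```csharp
using UnityEngine;

// Sketch of the dial: the knob spins about the base's local up axis no matter
// how the base is angled.
public class DialKnob : MonoBehaviour
{
    public Transform dialBase;   // the grabbable mount the knob sits on
    public Transform handTarget; // controller currently twisting the knob

    void Update()
    {
        if (handTarget == null) return;

        // Flatten the hand's offset onto the dial's plane, then measure the
        // signed angle from the base's forward direction around its axis.
        Vector3 toHand = handTarget.position - dialBase.position;
        Vector3 flat = Vector3.ProjectOnPlane(toHand, dialBase.up);
        float angle = Vector3.SignedAngle(dialBase.forward, flat, dialBase.up);

        // Spin the knob about the base's axis by that angle; `angle` is also
        // what you would expose to other objects reacting to the dial.
        transform.rotation = dialBase.rotation * Quaternion.Euler(0f, angle, 0f);
    }
}
```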
After the final interactable, we move on to the different types of the grabber controller. Switching between these grabber modes is as simple as changing the value of the enum that holds their names. First up is the line grabber, which lets the user interact with objects at a distance. The blue line is a visual assistant for the user, with the white sphere appearing wherever the blue line intersects an interactable object. As shown, the user once again cannot interact with the same object using both controllers.

The final variant of the grabber is the eye-to-target grabber, or, more affectionately, the scoped grabber. This fires an invisible ray from the user's eye, in this case the right one, and whenever the user lines that ray up with an object, the object becomes selectable. Since the line is invisible to the user, it simply looks like they are scoping down the white orb on their controller to grab an object in the distance.
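Here's a sketch of the line grabber's guts, reusing the `Grabbable` component from the pickup sketch above: a physics raycast out of the controller, a LineRenderer for the blue guide, and the white reticle sphere parked at the hit point. The scoped grabber is essentially the same raycast fired from the eye's transform with the line left invisible. All names here are illustrative, not the project's actual API:

```csharp
using UnityEngine;

// Sketch of the line grabber variant: a ray from the controller, a blue guide
// line, and a small white sphere shown wherever the ray hits a grabbable.
[RequireComponent(typeof(LineRenderer))]
public class LineGrabber : MonoBehaviour
{
    public float maxDistance = 10f;
    public Transform reticle; // the white sphere
    LineRenderer line;

    void Start()
    {
        line = GetComponent<LineRenderer>();
        line.positionCount = 2;
    }

    void Update()
    {
        Vector3 end = transform.position + transform.forward * maxDistance;
        bool hitGrabbable = false;

        if (Physics.Raycast(transform.position, transform.forward,
                            out RaycastHit hit, maxDistance))
        {
            end = hit.point;
            // Only show the reticle on objects the grabber can pick up.
            hitGrabbable = hit.collider.GetComponent<Grabbable>() != null;
            if (hitGrabbable) reticle.position = hit.point;
        }

        reticle.gameObject.SetActive(hitGrabbable);
        line.SetPosition(0, transform.position);
        line.SetPosition(1, end);
    }
}
```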
Overall, this is a fairly dated control setup for VR now, but it has served me well in the few personal projects I've built in VR since, as it was built to be a drop-in package for any new VR project. This lets a developer jump straight into building the experience without having to worry about setting up the control scheme. The controller buttons are also not hard-coded into the scripts, so the bindings can be swapped to any other button, freeing up the buttons a developer wants for their own use. Alongside that, the entire project was made using Unity's VR handler, which I'm sure has changed dramatically since then, but it meant you could plug in any headset the handler recognized, and the proper controllers would show up with their inputs mapped automatically.
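As a final illustration of the not-hard-coded bindings, the grab button can live in a serialized field rather than in the script body, so it's rebindable from the Inspector without touching code. A hypothetical sketch; the enum values and the platform stub are mine, not the project's:

```csharp
using UnityEngine;

// Hypothetical sketch of a rebindable grab button: the binding is data set
// in the Inspector, not a hard-coded trigger inside the grabber script.
public class GrabBinding : MonoBehaviour
{
    public enum VrButton { BackTrigger, SideTrigger, FaceButtonA, FaceButtonB }

    [SerializeField] VrButton grabButton = VrButton.BackTrigger;

    // Grabber scripts ask this instead of naming a trigger themselves,
    // so the developer can reassign the binding per project.
    public bool GrabHeld() => ButtonHeld(grabButton);

    // Placeholder: map each binding to the headset's input query through
    // Unity's VR input handling.
    bool ButtonHeld(VrButton button) => false;
}
```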