You're reading from Enhancing Virtual Reality Experiences with Unity 2022

Product type: Book
Published in: Nov 2023
Reading level: Beginner
Publisher: Packt
ISBN-13: 9781804619537
Edition: 1st
Author: Steven Antonio Christian

Steven Christian is an Augmented Reality Developer, Animator, and Medical Student. He's also the Owner of Iltopia Studios, where he creates comics and cartoons that explore the Black experience in America. He is a Unity Certified 3D Artist and Instructor, and his goal is to improve equity in arts and technology through content creation and community-focused education.

Working with Inputs and Interactions

Making realistic interactions in VR is key to making memorable immersive experiences.

In VR, we don’t just rely on joysticks and buttons; we also apply natural gestures to interact with the world. We kneel to pick things up, we clench our fists to hold objects, we wave our arms to swing swords, and so on. The act of punching in real life is similar to performing a punching action in VR. We must acknowledge how we interact in VR and incorporate that into our development process.

It is in our best interest to explore this area of development because users are attracted to VR to explore digital worlds and free themselves of the limitations of the real world. Incorporating interactions that make users feel free creates a framework for memorable experiences in VR.

Part of the development process is testing the limits of the medium and pushing it beyond what is expected.

Since we set up our basic VR scene in the VR setup section in...

Why do interactions matter?

Interactions are important because we’re trying to mimic real-world experiences and phenomena in VR. The reason people are attracted to VR is because they’re able to have that physical experience in a virtual world and do things that go beyond what is physically possible in the real world. The vital part of the interactions that we have in VR is that they mimic actions of our daily lives. If the interactions we have in VR don’t resonate with users, then we inadvertently reduce the incentive to return to those experiences because they feel “off.” Therefore, our goal as developers is to create the most believable and relevant interactions possible so that when a user engages in the experiences we build, they can have something that resonates with them.

The possibilities that interactions allow are limitless. Think about being able to build a virtual scene in any part of the universe, whether it’s fictional or possible...

Setting up a demo scene using primitive shapes

The first thing we’re going to do before we get everything set up for interactions is turn our base scene into a working demo scene. A demo scene is an environment that is used to showcase the features and functionality of an experience. This is important because, with demo scenes, you’re able to rapidly prototype the ideas that you have without having to waste time on all the other details of the environment you are testing in. We’re not at the point where we need to build worlds and explore specific interaction use cases – we’re just testing what is possible with our basic knowledge of VR. That includes walking around and engaging with different objects. Our demo scene is going to focus on making simple interactions possible.

First, we are going to create an environment we can use to test all of the interactions. We will create it using primitive shapes, which we will then learn how to replace with...
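As a sketch of what this environment looks like in code, the script below builds a test area entirely from primitive shapes at runtime. The names, sizes, and positions are illustrative assumptions, not values from the book; in practice, you would most likely create these primitives through the GameObject menu in the editor.

```csharp
using UnityEngine;

// Builds a minimal demo environment from primitive shapes at runtime.
// Attach to an empty GameObject in the scene. All names and transforms
// below are illustrative assumptions.
public class DemoSceneBuilder : MonoBehaviour
{
    void Start()
    {
        // A flattened cube serves as the floor.
        GameObject floor = GameObject.CreatePrimitive(PrimitiveType.Cube);
        floor.name = "Floor";
        floor.transform.localScale = new Vector3(10f, 0.1f, 10f);

        // A table-height cube to place interactable objects on.
        GameObject table = GameObject.CreatePrimitive(PrimitiveType.Cube);
        table.name = "Table";
        table.transform.position = new Vector3(0f, 0.5f, 1.5f);

        // A cylinder we can later make grabbable.
        GameObject cylinder = GameObject.CreatePrimitive(PrimitiveType.Cylinder);
        cylinder.name = "Cylinder";
        cylinder.transform.position = new Vector3(0f, 1.2f, 1.5f);
        cylinder.transform.localScale = new Vector3(0.1f, 0.15f, 0.1f);
        cylinder.AddComponent<Rigidbody>(); // let it respond to physics
    }
}
```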

Setting up the locomotion system

With our base demo complete, let's set up our locomotion. By locomotion, I am referring to adding the ability for the user to move the VR rig around the VR scene using the joystick inputs on the hand controllers. The locomotion system enables the user to move around the virtual environment, allowing for a more immersive experience and the ability to interact with the environment more naturally. It is also important that the player feels comfortable and does not experience motion sickness in VR.

I like to work non-destructively, meaning I always save and back up my projects so that I can recover anything if I make mistakes. Instead of creating a new scene, let’s duplicate the demo scene and name it 01_Locomotion_Setup. Duplicating keeps all the scene data for us to change and modify while preserving a backup. Now, to establish our locomotion system, we must follow these steps:

  1. In the locomotion setup scene,...
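As a rough code equivalent of this setup, the snippet below configures locomotion providers from a script, assuming the action-based components from XR Interaction Toolkit 2.x attached to the XR Origin. The book performs these steps in the Inspector; the speed and turn values here are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch of a locomotion setup done from code rather than the Inspector.
// Assumes XR Interaction Toolkit 2.x action-based providers; attach to
// the XR Origin GO.
public class LocomotionSetup : MonoBehaviour
{
    void Awake()
    {
        // The LocomotionSystem mediates provider access to the XR Origin.
        var locomotion = gameObject.AddComponent<LocomotionSystem>();

        // Smooth joystick-driven movement.
        var move = gameObject.AddComponent<ActionBasedContinuousMoveProvider>();
        move.system = locomotion;
        move.moveSpeed = 1.5f; // metres per second; tune for comfort

        // Snap turning tends to cause less motion sickness than smooth turning.
        var turn = gameObject.AddComponent<ActionBasedSnapTurnProvider>();
        turn.system = locomotion;
        turn.turnAmount = 45f; // degrees per snap
    }
}
```

Note that the move and turn providers still need their input actions bound (for example, from the XRI default input actions) before the joysticks will drive them.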

Adding interactor components

In this section, we will delve into the topic of object interactions in VR. Object interactions refer to the ability of users to grab and manipulate virtual objects as if they were real-life objects. To enable object interactions in VR, developers must use interactor components.

Several types of interactor components can be used to create engaging VR experiences, as shown in Figure 3.14:

Figure 3.14 – List of available interactor components


In this section, we will cover four types of interactor components: ray interactors, direct interactors, gaze interactors, and socket interactors.

Ray interactors allow users to interact with objects by pointing at them with a virtual ray emitted from a controller. Direct interactors enable users to directly grab and manipulate objects using hand controllers. Gaze interactors allow users to interact with objects simply by looking at them, without the need for any physical input...
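A minimal interactor/interactable pairing might be sketched as follows, assuming XRI 2.x. In practice, these components are usually added through the Inspector on the controller and object GOs; the field names here are assumptions for illustration.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Minimal pairing of an interactor (on the controller) and an
// interactable (on the object). Assumes XR Interaction Toolkit 2.x.
public class InteractionSetup : MonoBehaviour
{
    public GameObject rightController; // hand controller GO (assumed reference)
    public GameObject cylinder;        // object to make grabbable

    void Start()
    {
        // The ray interactor lets the controller point at distant objects.
        // It reads input from the controller component on the same GO.
        rightController.AddComponent<XRRayInteractor>();

        // XRGrabInteractable marks the cylinder as something interactors
        // can hover, select, and pick up; it requires a Collider.
        cylinder.AddComponent<XRGrabInteractable>();
    }
}
```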

Multiple object interactions

To enable multiple object interactions on a single VR object, you can add multiple interactor components to the object. For example, you can add a Ray Interactor component, a Direct Interactor component, and a Gaze Interactor component to the same VR object. This will allow the object to be interacted with using different methods, such as pointing and clicking with a hand controller, walking up to the object, and looking at the object for a certain period.

You can also set up different events for each interactor component, such as changing the color of the object when it is interacted with using the Ray Interactor component and playing a sound when it is interacted with using the Direct Interactor component. It’s also important to keep in mind that the order in which the interactors are added to the object and the order of the events on each interactor component will determine which interaction takes precedence when multiple interactions are possible...
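The event hookups described above can also be made in code, as in the sketch below (assuming XRI 2.x event arguments; the sound field and color choice are illustrative). The same listeners can be wired through the Interactable Events section of the Inspector instead.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Wires different responses to different interaction events on one
// object. Assumes XR Interaction Toolkit 2.x.
[RequireComponent(typeof(XRGrabInteractable))]
public class InteractionResponses : MonoBehaviour
{
    public AudioSource grabSound; // assumed to be assigned in the Inspector

    void Awake()
    {
        var interactable = GetComponent<XRGrabInteractable>();
        interactable.hoverEntered.AddListener(OnHoverEntered);
        interactable.selectEntered.AddListener(OnSelectEntered);
    }

    void OnHoverEntered(HoverEnterEventArgs args)
    {
        // Tint the object when a ray or direct interactor hovers over it.
        GetComponent<Renderer>().material.color = Color.yellow;
    }

    void OnSelectEntered(SelectEnterEventArgs args)
    {
        // Play a sound when the object is actually grabbed.
        if (grabSound != null) grabSound.Play();
    }
}
```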

Adding haptic feedback to our VR controllers

Haptic feedback, also known as haptics, is the use of vibrations or other tactile sensations to communicate information or provide a sense of touch in a virtual or remote environment. In VR, haptic feedback can be used to simulate the feeling of holding or touching objects and add an extra layer of realism to the experience. For example, when you pick up an object in VR, the controller you are holding may vibrate to give you the sensation of grasping something.

In our demo scene, we will use XR Interaction Toolkit’s built-in haptic feedback system to add haptic feedback to our objects that can be interacted with. This system allows us to add haptic vibrations to specific events, such as when an object is picked up or when a button is pressed. We can also adjust the strength and duration of the haptic feedback to suit the specific interaction.

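A grab-triggered vibration along these lines might be sketched as follows, assuming XRI 2.x, where controller-based interactors expose their controller and `SendHapticImpulse(amplitude, duration)` drives the motor. The amplitude and duration values are illustrative defaults.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sends a short vibration to whichever controller grabs this object.
// Assumes XR Interaction Toolkit 2.x.
[RequireComponent(typeof(XRGrabInteractable))]
public class GrabHaptics : MonoBehaviour
{
    [Range(0f, 1f)] public float amplitude = 0.5f; // vibration strength
    public float duration = 0.1f;                  // seconds

    void Awake()
    {
        GetComponent<XRGrabInteractable>().selectEntered.AddListener(OnGrab);
    }

    void OnGrab(SelectEnterEventArgs args)
    {
        // Only controller-based interactors have a motor to rumble.
        if (args.interactorObject is XRBaseControllerInteractor interactor)
            interactor.xrController.SendHapticImpulse(amplitude, duration);
    }
}
```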

Adding attach points to virtual objects

To configure attach points on VR objects, we must first understand what attach points are and why they are important. An attach point is a designated location on a VR object that other objects can be attached to. This allows for more complex interactions within a virtual environment, such as holding a gun or a tool or placing an object on a shelf.

To set up an attach point, we need to select the VR object we want to add it to. Within the object’s hierarchy, we must create an empty GO and position it at the desired location for the attach point. This empty GO will act as the actual attach point.

Once the empty GO is in place, we will need to add it to the Attach Transform reference slot on the XR Interactor component. We will test this feature out with our Cylinder GO.

Let’s duplicate the 11_HapticFeedback scene and rename it 12_AttachPoints.

To configure an attach point on the Cylinder GO, go through the following...
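The manual steps can also be mirrored in code, as in this sketch (assuming XRI 2.x; the child name and local offset are illustrative):

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Creates an empty child GO and assigns it as the attach point of the
// grab interactable on this object. Assumes XR Interaction Toolkit 2.x.
[RequireComponent(typeof(XRGrabInteractable))]
public class AttachPointSetup : MonoBehaviour
{
    void Awake()
    {
        var interactable = GetComponent<XRGrabInteractable>();

        // Empty child acting as the attach point (the "handle").
        var attachPoint = new GameObject("AttachPoint").transform;
        attachPoint.SetParent(transform, false);
        attachPoint.localPosition = new Vector3(0f, 0.1f, 0f);

        // The object snaps to the hand at this transform when grabbed.
        interactable.attachTransform = attachPoint;
    }
}
```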

Adding socket interactors to our demo scene

Socket interactors are a type of interactor that allows you to place objects in specific locations with precision and feedback. This can be achieved by providing a socket or a designated location where the object can be placed. This socket can be visualized as a preview and can provide feedback when the object is placed correctly or incorrectly.

The object being placed can also snap to the socket to ensure it is in the correct location. This can greatly enhance the interactivity and realism of the VR experience, especially in scenarios where objects need to be placed in specific locations, such as puzzles or problem-solving experiences. Additionally, since the objects are not affected by physics when being placed in the socket, attaching them can prevent unwanted movement or collision of objects, providing a more seamless and polished experience for the user. We will place socket interactors on the table to serve as place markers for our...
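A socket along these lines might be set up as follows, assuming XRI 2.x; the trigger radius is an illustrative value. A trigger collider is needed so the socket can detect interactables held near it.

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Turns this GO into a socket that objects can snap into.
// Assumes XR Interaction Toolkit 2.x.
public class SocketSetup : MonoBehaviour
{
    void Awake()
    {
        // Trigger volume that detects objects brought near the socket.
        var trigger = gameObject.AddComponent<SphereCollider>();
        trigger.isTrigger = true;
        trigger.radius = 0.1f;

        // The socket interactor shows a hover preview mesh and snaps the
        // selected interactable to its attach transform.
        var socket = gameObject.AddComponent<XRSocketInteractor>();
        socket.showInteractableHoverMeshes = true;
    }
}
```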

Extending XR Interaction Toolkit

Unity provides not only standard tools for VR development but also some experimental packages and features. As mentioned in Chapter 2, in the VR setup section’s Installing XR Interaction Toolkit subsection, you can download these from their repositories and the Package Manager to test new developer tools and extensions before their official release. We won’t delve too deeply into these features because they are experimental, and Unity regularly updates and changes them. Keeping up with the documentation can be quite challenging due to the frequency of their updates.

Nevertheless, Unity offers a sample project that utilizes the most up-to-date features of XR Interaction Toolkit. These will be included in the latest build of the toolkit. Among these are a scene with starter assets for VR development right out of the box, a VR device simulator that allows you to simulate different head-mounted displays and controllers, and presets for...

Summary

This chapter provided a comprehensive understanding of the various interactions and locomotion features that can be used in VR experiences. By setting up a demo scene, we were able to explore different forms of interactions, such as ray, direct, gaze, snap, continuous, and teleportation. We also looked at how to incorporate haptic feedback, attach points, and socket interactors to enhance the realism and immersion of our VR experience. We now have a foundation we can build on that includes interacting with objects from a distance, interacting with them directly, changing the way we navigate virtual spaces, and much more. These are the core interactions that every VR experience should have, and at this point, you should have a better understanding of how to use them within your own experiences. Remember that the key to VR development is a mix of planning features, adding components to GOs within the scene, and testing in the editor. With the basic foundations in place...
