Unity 2020 Virtual Reality Projects - Third Edition
Interacting with Your Hands

When we're in a virtual world with all this cool stuff, it is in our nature to try to reach out and touch something. While the gaze-based selection that we used in the previous chapter is a good first step for interacting with virtual scenes, most people intuitively want to use their hands. Most VR devices provide a hand controller to select, grab, and interact with virtual objects in the scene.

In this chapter, we introduce practices for capturing user input in Unity, illustrating how to use them in a simple VR scene. Everyone loves balloons, so in this project, we will make balloons. We may even pop a few. We will continue with our work in the previous chapter, using C# programming for basic scripting, and exploring several software design patterns for user input. We will discuss the following topics:

  • Polling for input device button presses
  • Invoking and subscribing...

Technical requirements

To implement the projects and exercises in this chapter, you will need the following:

  • A PC or Mac with Unity 2019.4 LTS or later, the XR Plugin for your device, and the XR Interaction Toolkit installed
  • A VR headset supported by the Unity XR platform

You can access or clone the GitHub repository for this book (https://github.com/PacktPublishing/Unity-2020-Virtual-Reality-Projects-3rd-Edition-) to optionally use assets and completed projects for this chapter as follows:

  • Asset files for you to use in this chapter are located in UVRP3Files/Chapter-05-Files.zip.
  • All completed projects in this book are in a single Unity project at UVRP3Projects.
  • The completed assets and scenes for this chapter are in the UVRP3Projects/Assets/_UVRP3Assets/Chapter05/ folder.

Setting up the scene

To begin our exploration of input mechanisms, let's set up our scene. The plan is to let players create balloons. Everyone loves balloons!

For this scene, you could start with a new scene (File | New Scene) and then add an XR Rig from the GameObject | XR menu. Instead, I've decided to start with the Diorama scene used in the previous chapter and remove all but the GroundPlane and PhotoPlane, as follows:

  1. Open the Diorama scene.
  2. Remove all the objects, except for the XR Rig, XR Interaction Manager, Directional Light, GroundPlane, and PhotoPlane.
  3. Position the XR Rig a few feet from the scene origin, at Position (0, 0, -1).
  4. Select File | Save Scene As and give it a name, such as Balloons.

Now that the scene stage is set, we are first going to define a balloon game object, make it a prefab, and then add an empty controller object to the hierarchy with a script that will instantiate the balloon prefab...

Using an Input Manager button

Unity's legacy input system includes a standard Input Manager for accessing traditional game controller, keyboard, mouse, and mobile touchscreen input, including specific button presses and joystick axes. It also supports input from VR and AR controllers mapped to logical input axes.

The Input Manager provides an abstraction layer over the physical input devices. You can define logical inputs, such as the Fire1 button to fire a gun, which may be mapped to a physical finger trigger button. Unity has a collection of preset inputs that are available when you create a new project. You can review and edit these settings in your project in the Input Manager settings window (Edit | Project Settings | Input Manager). For a general overview and details of the Unity Input Manager, see https://docs.unity3d.com/Manual/ConventionalGameInput.html.
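For instance, here is a minimal polling sketch that reads logical inputs defined in the Input Manager each frame (Fire1 and Horizontal are among Unity's default preset axes; a VR-specific axis name, such as the XRI_Right_TriggerButton name used later in this chapter, must first be defined in that settings window):

using UnityEngine;

public class InputPollingExample : MonoBehaviour
{
    void Update()
    {
        // GetButtonDown is true only on the frame the button goes down
        if (Input.GetButtonDown("Fire1"))
        {
            Debug.Log("Fire1 pressed");
        }

        // GetAxis returns a float between -1 and 1 for analog inputs
        float horizontal = Input.GetAxis("Horizontal");
        if (Mathf.Abs(horizontal) > 0.1f)
        {
            Debug.Log("Horizontal axis: " + horizontal);
        }
    }
}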

The hand controllers for VR often have a lot of different buttons and axes, and it can...

Controlling balloons with the input trigger

Now we're ready to implement the meat of our game. When the trigger is pressed, the controller script creates a new balloon by instantiating the Balloon prefab in the scene. When the trigger is released, the balloon object is permitted to float up into the sky. And while the button is held, we'll grow (inflate) the balloon's scale. Let's begin.

Creating balloons

The BalloonController.cs script should now create a new balloon when the trigger button gets pressed. In your code editor, change the Update function to the following:

void Update()
{
    if (Input.GetButtonDown("XRI_Right_TriggerButton"))
    {
        CreateBalloon();
    }
}

We need to write this CreateBalloon() function. It will reference the Balloon prefab that we created earlier in this chapter and create a new instance of it in the scene. So first, declare a public GameObject variable named balloonPrefab...
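Putting these pieces together, here is a minimal sketch of how the whole BalloonController might look. The GrowBalloon and ReleaseBalloon bodies are my assumptions based on the press/hold/release mechanic described earlier, not the book's final code:

using UnityEngine;

public class BalloonController : MonoBehaviour
{
    public GameObject balloonPrefab;   // assign the Balloon prefab in the Inspector

    private GameObject balloon;        // the balloon currently being inflated

    void Update()
    {
        if (Input.GetButtonDown("XRI_Right_TriggerButton"))
        {
            CreateBalloon();
        }
        else if (Input.GetButtonUp("XRI_Right_TriggerButton"))
        {
            ReleaseBalloon();
        }
        else if (balloon != null)
        {
            GrowBalloon();
        }
    }

    public void CreateBalloon()
    {
        // Instantiate a new balloon from the prefab at this object's position
        balloon = Instantiate(balloonPrefab, transform.position, Quaternion.identity);
    }

    public void GrowBalloon()
    {
        // Inflate gradually while the trigger is held (the growth rate
        // is an assumed tuning value)
        balloon.transform.localScale += balloon.transform.localScale * 0.5f * Time.deltaTime;
    }

    public void ReleaseBalloon()
    {
        if (balloon == null)
            return;

        // Let the balloon float away; assumes the prefab has a Rigidbody
        Rigidbody rb = balloon.GetComponent<Rigidbody>();
        if (rb != null)
        {
            rb.useGravity = false;
            rb.velocity = Vector3.up;
        }
        balloon = null;
    }
}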

Using Unity events for input

Events allow the decoupling of the source of an event from its consumers. Essentially, events are a messaging system: one object invokes an event, and any other object in the project can listen for it by subscribing a specific function to be called when the event occurs.

Events are a very rich topic, and we can only introduce them here. We will be using the event pattern in various contexts throughout this book, including UI, collisions, and XR interactions. For more information on using Unity events, there are a lot of good references online, including the Unity tutorials at https://learn.unity.com/tutorial/events-uh and https://learn.unity.com/tutorial/create-a-simple-messaging-system-with-events.

For this example, we'll create a separate input controller that reads the input button's state and invokes an event, and then we'll modify the BalloonController to subscribe to that event. The following diagram...
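A minimal sketch of this decoupling, using UnityEvent (the class name ButtonInputEvents and the inputEvents reference are assumed for illustration):

using UnityEngine;
using UnityEngine.Events;

// The input side: polls the trigger each frame and invokes events,
// knowing nothing about balloons.
public class ButtonInputEvents : MonoBehaviour
{
    public UnityEvent ButtonDownEvent = new UnityEvent();
    public UnityEvent ButtonUpEvent = new UnityEvent();

    void Update()
    {
        if (Input.GetButtonDown("XRI_Right_TriggerButton"))
            ButtonDownEvent.Invoke();
        if (Input.GetButtonUp("XRI_Right_TriggerButton"))
            ButtonUpEvent.Invoke();
    }
}

The BalloonController then subscribes its own functions, so it no longer touches the Input class at all (inputEvents would be a public reference to the component above):

void OnEnable()
{
    inputEvents.ButtonDownEvent.AddListener(CreateBalloon);
    inputEvents.ButtonUpEvent.AddListener(ReleaseBalloon);
}

void OnDisable()
{
    inputEvents.ButtonDownEvent.RemoveListener(CreateBalloon);
    inputEvents.ButtonUpEvent.RemoveListener(ReleaseBalloon);
}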

Tracking your hands

To start taking advantage of the positional tracking of your hands, we simply need to parent the balloon prefab to the hand model. Our scene includes an XR Rig that contains not only the Main Camera, which is positionally tracked with the player's head-mounted display, but also LeftHand Controller and RightHand Controller objects that are tracked with the player's hand controllers. Any game object that is a child of a hand controller object will be tracked along with it. To implement this, we will first modify the CreateBalloon function so that new balloons are attached to your hand controller and move with it as you move your hands. As we'll see, this introduces a new problem where the balloons are not necessarily positioned upright, so we'll fix that as well.

Parenting the balloon to your hand

The Balloon Controller will need to know which hand pressed the button and parent the balloon to that controller object. Specifically...
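For instance, CreateBalloon could take the hand's Transform as a parameter and parent the new balloon under it. This sketch assumes the balloon and balloonPrefab variables from earlier, and includes the upright-rotation fix mentioned above:

public void CreateBalloon(Transform parentHand)
{
    // Instantiate the balloon as a child of the hand controller so it
    // moves with the hand
    balloon = Instantiate(balloonPrefab, parentHand);
    balloon.transform.localPosition = Vector3.zero;

    // Keep the balloon upright regardless of the controller's tilt
    balloon.transform.rotation = Quaternion.identity;
}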

Interacting with a balloon gun

For this part of the project, I'm going to show you how to grab and use interactable objects. Let's make a balloon gun. You'll pick up the gun, and when you pull the trigger, a balloon comes out! Haha!

The implementation uses a different approach to building interactions. Rather than a main central controller script that reads the user input and directs the actions, we are going to use a more object-oriented interactor/interactable paradigm provided by the XR Interaction Toolkit (XRI). With this toolkit, we'll create a grabbable balloon gun by making it an interactable object, and then use the toolkit to respond to the interactor's Activate events (caused by pulling the trigger) to create, inflate, and release the balloons.

Introducing the XRI Interactor/Interactable architecture

The XR Interaction Toolkit implements an object-oriented interaction system that couples interactor objects, such as your...
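For a taste of how this looks in code, here is a sketch of a component on the balloon gun that subscribes to its XRGrabInteractable events. This assumes the early XRI releases that shipped alongside Unity 2019/2020, which expose onActivate/onDeactivate events passing the interactor; later toolkit versions renamed these to activated/deactivated:

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: reacting to XRI Activate events on a grabbable object.
public class BalloonGun : MonoBehaviour
{
    void Start()
    {
        XRGrabInteractable interactable = GetComponent<XRGrabInteractable>();
        interactable.onActivate.AddListener(OnActivate);
        interactable.onDeactivate.AddListener(OnDeactivate);
    }

    private void OnActivate(XRBaseInteractor interactor)
    {
        // The player pulled the trigger while holding the gun:
        // create and start inflating a balloon here
    }

    private void OnDeactivate(XRBaseInteractor interactor)
    {
        // The player released the trigger: release the balloon here
    }
}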

Popping balloons

Do you want to pop some balloons? "No", said no one ever! Let's add that feature and make a little game out of this. First, we'll make the balloons poppable with collision detection and an explosion. Then we'll add a ball to the scene that you can throw at a balloon to pop it. And after the ball is thrown, we'll fetch it back by resetting it to its original position after a short delay.

Making the balloons poppable

The Unity physics engine can detect when two objects collide. To do this, each object must have a Collider component attached. You can then have the collision trigger an event, and we can subscribe to that event to make something else happen, such as playing an explosion effect. This is set up on the balloon prefab (a sketch of the popping script follows these steps). Let's do that now:

  1. In the Project Prefabs/ folder, open your Balloon prefab for editing by double-clicking it.
  2. Select Component | Physics | Sphere Collider.
  3. ...

Summary

In this chapter, we explored a variety of software patterns for handling user input for your VR projects. The player uses a controller button, the trigger, to create, inflate, and release balloons into the scene. First, we tried the standard Input class for detecting logical button clicks, such as XRI_Right_TriggerButton, and implemented it using a polling design pattern. Then we replaced polling with Unity events, decoupling our BalloonController script from the input itself. Later, this was even more important when we used the XR Interaction Toolkit's Interactor events to implement the same mechanic.

We learned about the XR Interaction Toolkit and its Interactor/Interactable design pattern. We saw how the XR Rig's hand controllers are the Interactors in the scene. We also created Interactables, including the balloon gun and the ball projectile, that you can grab, activate, and throw. We learned how to wire into the Interaction...
