Canvasing the World Space UI

In the previous chapter, we discovered how to interact with game objects in the world space scene. Not only can these objects be balls and toys or tools and weapons, but they can also be buttons you interact with and other graphical user interface (GUI)—or just UI—widgets. Furthermore, Unity includes UI canvas components and an event system for building menus and other UIs.

UI usually refers to onscreen, two-dimensional graphics, which overlay the main gameplay and present information to the user with status messages, gauges, and input controls, such as menus, buttons, sliders, and so on. In Unity, UI elements always reside on a canvas. The Unity manual describes the Canvas component as follows:

The Canvas component represents the abstract space in which the UI is laid out and rendered. All UI elements must be children of a GameObject that has a Canvas component attached...

Technical requirements

To implement the projects and exercises in this chapter, you will need the following:

  • A PC or Macintosh with Unity 2019.4 LTS or later, XR Plugin for your device, and the XR Interaction Toolkit package installed
  • A VR headset supported by the Unity XR platform

You can access or clone the GitHub repository for this book (https://github.com/PacktPublishing/Unity-2020-Virtual-Reality-Projects-3rd-Edition-) to optionally use the assets and completed projects for this chapter, as follows:

  • Asset files for you to use in this chapter are located in UVRP3Files/Chapter-06-Files.zip.
  • All completed projects in this book are in a single Unity project at UVRP3Projects.
  • The completed assets and scenes for this chapter are in the UVRP3Projects/Assets/_UVRP3Assets/Chapter06/ folder.

Let's talk about VR design principles in the next section.

Studying VR design principles

Before we get into the implementation details, I want to introduce the topic of designing three-dimensional UIs and VR experiences. A lot of work has been carried out in these areas over the past few decades, even more so in the past few years.

With consumer VR devices so readily available and powerful development tools such as Unity at hand, it's not surprising that a lot of people are inventing and trying new things, continuously innovating and producing really excellent VR experiences. You are probably one of them. However, today's VR progress cannot be viewed in a vacuum. There is a long history of research and development that feeds into present-day work. The following are some examples:

  • 3D User Interfaces: Theory and Practice (Bowman et al.), for example, is a classic academic survey of three-dimensional user interaction for consumer, industrial, and scientific applications...

Making a reusable default canvas

A Unity canvas is a two-dimensional planar surface that is a container for UI graphics, such as menus, toolbars, and information panels. In conventional applications, canvases are commonly rendered in screen space, overlaying the scene's gameplay graphics and stretching to conform to a huge variety of screen sizes, aspect ratios, and orientations (landscape versus portrait). In contrast, in VR, we never use screen space because the VR "screen" has no edges and differs for the left and right eyes. Instead, in VR, we use a world space canvas that floats (albeit still as a two-dimensional surface) in the same three-dimensional space as all your other Scene objects.
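To make the distinction concrete, here is a minimal sketch (my own illustration, with assumed sizes and positions rather than values from this chapter's project) that configures a world space canvas from a script; in the exercises, we'll do the equivalent in the Editor:

using UnityEngine;
using UnityEngine.UI;

// Sketch: configure a Canvas for VR world space at runtime.
public class WorldSpaceCanvasSetup : MonoBehaviour
{
    void Start()
    {
        Canvas canvas = gameObject.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;   // never screen space in VR
        canvas.worldCamera = Camera.main;            // event camera for UI raycasts
        gameObject.AddComponent<GraphicRaycaster>(); // needed for interactive UI

        // A world space canvas is sized in world units, so shrink its
        // RectTransform from pixel scale down to roughly a 1m-wide panel.
        RectTransform rect = canvas.GetComponent<RectTransform>();
        rect.sizeDelta = new Vector2(640, 480);      // assumed pixel dimensions
        rect.localScale = Vector3.one * 0.0015f;     // assumed world scale
        rect.position = new Vector3(0f, 1.5f, 2f);   // eye height, 2m ahead
    }
}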

Unity's UI canvas provides many options and parameters to accommodate the kinds of graphical layout flexibility that we have come to expect not only in games but also from websites and mobile apps. With this flexibility comes additional complexity...

Implementing a HUD

The term HUD originates from its use in aircraft, where a pilot is able to view information while looking forward rather than down at their instrument panels. In Unity, a HUD may be implemented as a canvas-based UI floating in your field of view, overlaying the gameplay scene. Typically, a HUD is more about displaying information than providing interactable buttons or controls. In this section, we'll test two different variations of HUDs: what I characterize as a visor HUD and a windshield HUD. We'll start with the visor HUD and then add a little script that gracefully hides the panel with a fadeout when we want it to disappear.
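As a preview of the fade behavior, a minimal sketch might drive a CanvasGroup's alpha down over time (the class and parameter names here are my own illustration, not necessarily those used later in the chapter):

using System.Collections;
using UnityEngine;

// Sketch: fade out a HUD panel via its CanvasGroup, then deactivate it.
public class HudFader : MonoBehaviour
{
    public float fadeDuration = 1.5f;   // assumed duration in seconds
    private CanvasGroup canvasGroup;

    void Awake()
    {
        canvasGroup = GetComponent<CanvasGroup>();
    }

    public void FadeOut()
    {
        StartCoroutine(FadeRoutine());
    }

    private IEnumerator FadeRoutine()
    {
        float elapsed = 0f;
        while (elapsed < fadeDuration)
        {
            elapsed += Time.deltaTime;
            canvasGroup.alpha = 1f - (elapsed / fadeDuration);
            yield return null;   // wait one frame between steps
        }
        gameObject.SetActive(false);   // hide completely once faded
    }
}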

Creating a visor HUD

For a visor HUD, the UI canvas is attached to the camera object in the scene, so when you move your head, the canvas doesn't appear to respond to your head movement. Rather, it seems to be stuck to your face (haha)! To describe it more charitably, suppose you're wearing a helmet...

The in-game world space UI

Game objects in your game world that use UI elements might include billboards, scoreboards, control panels, handheld menu palettes, puzzles, and so on. What all of these have in common is that they are objects in the scene that are meant to convey some information and/or indicate that the user should interact with them to perform some operation. They are better served if they can dynamically update with runtime information, so a pre-saved texture image or sprite will not be sufficient. In this section, we will try a couple of different scenarios: a scoreboard and an info bubble. We will also introduce the powerful TextMesh Pro (TMP) tools, which are built into Unity and give greater control over your text graphics. We'll start with the scoreboard game element example, and then implement an info bubble.
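To see why dynamic text matters here, consider how a score display might be refreshed at runtime with a TMP component. This is a sketch only (the field and method names are mine); the chapter's actual scoreboard script may differ:

using TMPro;
using UnityEngine;

// Sketch: update a world space scoreboard using TextMesh Pro.
public class Scoreboard : MonoBehaviour
{
    public TMP_Text scoreText;   // assign a TextMeshPro Text (UI) in the Inspector
    private int score;

    public void AddScore(int points)
    {
        score += points;
        scoreText.text = $"Score: {score}";   // re-rendered immediately by TMP
    }
}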

Making a scoreboard

When Ethan gets killed in the diorama scene from Chapter 4, Using Gaze-Based Control, the score...

The reticle cursor

A variant of the visor HUD is a reticle or crosshair cursor, which, for example, is essential in first-person shooter games. The analogy here is to imagine you're looking through a gunsight or an eyepiece (rather than a visor), so your head moves in unison with the gun or turret itself. You can do this with a regular game object (for example, a Quad with a texture image), but this chapter is about UI, so we'll use a world space canvas. Then, we'll re-implement the reticle using XRI toolkit components instead, first as part of the interactor hand controller, and then as a HUD reticle.
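Conceptually, a gaze reticle just parks the canvas along the camera's forward ray every frame. The following bare-bones sketch conveys the idea (it is not the chapter's XRI-based implementation, and the distance and scale values are assumptions):

using UnityEngine;

// Sketch: place a world space reticle canvas along the camera's gaze ray.
public class GazeReticle : MonoBehaviour
{
    public Transform reticle;          // the world space reticle canvas
    public float defaultDistance = 5f; // where to park it when nothing is hit

    void Update()
    {
        Transform cam = Camera.main.transform;
        Ray ray = new Ray(cam.position, cam.forward);

        // Snap to the surface we're looking at, or sit at a default distance.
        float distance = Physics.Raycast(ray, out RaycastHit hit)
            ? hit.distance
            : defaultDistance;

        reticle.position = cam.position + cam.forward * distance;
        reticle.rotation = Quaternion.LookRotation(cam.forward); // face the camera
        reticle.localScale = Vector3.one * distance * 0.02f;     // constant apparent size
    }
}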

Adding a canvas reticle to gaze-based interaction

The first step in creating a canvas reticle is to add a crosshair graphic to a canvas. I've included a sprite image named GUIReticle.png with the files for this book that you can use, or you can find another. (If you are importing your own image, be sure to first set its Import Settings...

Building an interactive dashboard

Up to now, we have been using the canvas primarily as a container for display-only information. However, the canvas can also contain interactive UI elements, including Button, Toggle, Slider, and Dropdown lists. In this section, we will build an in-game interactive dashboard or control panel that is integrated into the game environment itself.
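As a taste of how these elements get wired to scene behavior, a script can subscribe to a UI event at runtime, as in this sketch (the hose-related names are hypothetical placeholders, not the chapter's actual objects):

using UnityEngine;
using UnityEngine.UI;

// Sketch: wire a dashboard Toggle to scene behavior at runtime.
public class DashboardController : MonoBehaviour
{
    public Toggle hoseToggle;          // assign the UI Toggle in the Inspector
    public ParticleSystem waterHose;   // hypothetical hose effect

    void Start()
    {
        hoseToggle.onValueChanged.AddListener(OnHoseToggled);
    }

    private void OnHoseToggled(bool isOn)
    {
        if (isOn) waterHose.Play();
        else waterHose.Stop();
    }
}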

Earlier in this chapter, we discussed windshield HUDs. Dashboards are pretty much the same thing. One difference is that the dashboard may be more obviously part of the level environment and not simply an auxiliary information display or a menu. A typical in-game scenario is an automobile or a spaceship, where you are seated in a cockpit. In VR, dashboards are also familiar from home environments, such as the Oculus Home menu.

In this part of our project, we'll operate a water hose in the scene...

Direct interaction with UI elements

Up to now in this chapter, we've been using the XR ray interactor on hand controllers to interact with the UI. In the previous chapter, we learned that there are other kinds of interactors, including a direct interactor. To implement the ability to reach out and directly touch an object in your scene, rather than casting a ray, you can use a direct interactor instead of a ray interactor.

This implementation may change if XRI adds support for UI objects with the direct interactor. For now, what follows is a workaround.

Let's see what it will take to change our UI to support direct interactions with the toggle button. The direct interactor works by using physics colliders to detect when the hand is touching an interactable object. Let's switch to that now and see what happens. (We'll work on a copy so it's easy to switch back if you want.)

  1. In Hierarchy, select RightHand Controller (under XR Rig...
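The essence of this workaround can be sketched as a trigger handler on the button object that flips the Toggle when a hand collider enters it (the "Hand" tag here is a hypothetical convention for illustration, not necessarily what the chapter uses):

using UnityEngine;
using UnityEngine.UI;

// Sketch: a touch-to-toggle workaround. A trigger collider on the button
// fires the Toggle when the hand's collider enters it.
public class TouchToggle : MonoBehaviour
{
    public Toggle toggle;   // the UI Toggle to flip

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand"))   // assumed tag on the hand's collider
        {
            toggle.isOn = !toggle.isOn;
        }
    }
}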

Building a wrist-based menu palette

Some VR applications that are designed for two-handed setups give you a virtual menu palette attached to the player's wrist while the other hand selects buttons or items from it. Let's see how that is done. This scenario will assume you have a two-hand controller VR system. Converting our dashboard control panel into a wrist palette is not too difficult. We just need to scale it appropriately and attach it to the hand controller.

We'll duplicate and re-purpose the Dashboard object to use it on your left wrist:

  1. In Hierarchy, right-click on Dashboard and choose Duplicate.
  2. Rename the new object Palette.
  3. Disable the old Dashboard object.
  4. Drag the Palette object so that it is a child of the LeftHand Controller object (under XR Rig/Camera Offset).
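If you prefer to do the re-parenting from code, a minimal sketch like the following, with assumed offsets and scale, accomplishes the same as step 4 plus the scaling we're about to do:

using UnityEngine;

// Sketch: attach a palette canvas to the left hand controller at startup.
public class WristPalette : MonoBehaviour
{
    public Transform leftHandController;   // e.g. XR Rig/Camera Offset/LeftHand Controller

    void Start()
    {
        // Keep local coordinates so the palette snaps onto the wrist.
        transform.SetParent(leftHandController, worldPositionStays: false);
        transform.localPosition = new Vector3(0f, 0.1f, 0f);      // assumed: just above the wrist
        transform.localRotation = Quaternion.Euler(90f, 0f, 0f);  // assumed: face up toward the user
        transform.localScale = Vector3.one * 0.0005f;             // assumed: palm-sized
    }
}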

Now, we'll modify the Palette graphics, as follows. Feel free to change the settings for what works for you:

...

Summary

In Unity, UIs that are based on a Canvas object and the Event System include buttons, text, images, sliders, and input fields, which can be assembled and wired to objects in the scene. At the start of the chapter, I reviewed some of the UI design principles to consider when building your own UIs for VR.

We took a close look at various world space UI techniques and how they can be used in VR projects. We considered the ways in which UI for VR differs from UI for conventional video games and desktop applications; in particular, we never use screen space UI in VR. We implemented over a half-dozen of these techniques, learning how each can be constructed, coded, and used in our own projects. They range from passive HUD info panels and dynamic in-game displays to interactive control panels with buttons. We learned about using TMP to make text information a truly graphical element. Our C# scripting got a little more advanced, probing deeper into the Unity Engine API and modular coding techniques...
