What made you feel like you were truly immersed in a game world for the first time? Was it graphics that looked impressively realistic, ambient noise that perfectly captured the environment and mood, or the way the game's mechanics just started to feel like a natural reflex? Game developers constantly strive to replicate scenarios that are as real and as emotionally impactful as possible, and they've never been as close as they are now with the advent of virtual reality (VR).
VR has been a niche pursuit since the early 1950s, often failing to evoke the meaningful sense of presence that the concept hinges on---that is, until the first Oculus Rift prototype was designed in 2010 by Oculus founder Palmer Luckey. The Oculus Rift proved that modern rendering and display technology had reached a point where immersive VR could be achieved, and that's when the new era of VR development began.
Today, VR development is as accessible as ever, comprehensively supported in the most popular off-the-shelf game development engines such as Unreal Engine and Unity3D. In this book, you'll learn all of the essentials that go into a complete VR experience, and master the techniques that will enable you to bring any idea you have into VR.
This chapter will cover everything you need to know to get started with VR, including the following points:
The concept of VR
The importance of intent in VR design
Common limitations of VR games
The anatomy of a VR headset
The importance of input
Meeting required frame rates
Installing the Oculus runtime
Configuring your Oculus Rift
Stepping into Oculus Home
VR has taken many forms and formats since its inception, but this book will be focused on modern VR experienced with a Head-Mounted Display (HMD). HMDs such as the Oculus Rift are typically treated like an extra screen attached to your computer (more on that later), but with some extra components that enable it to capture its own orientation (and position, in some cases). This essentially amounts to a screen that sits on your head and knows how it moves, so it can mirror your head movements in the VR experience and enable you to look around the virtual world, making you feel like you're actually there:
Depth perception is another big principle of VR. Because the display of the HMD is always positioned right in front of the user's eyes, the rendered image is typically split into two images, one per eye, with each individual image rendered from the position of that eye.
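To make the per-eye rendering concrete, here is a minimal sketch (plain Python, not engine code) of how a virtual camera position might be derived for each eye: the head position is offset by half the eye separation along the head's right vector. The function name, the default 0.064 m eye separation, and the yaw-only head model are illustrative assumptions, not part of any SDK.

```python
# Illustrative sketch (not engine code): deriving per-eye camera positions
# from a head position, a yaw angle, and an eye separation. All names and
# the 0.064 m default are assumptions for illustration only.

import math

def eye_positions(head_pos, yaw_radians, eye_separation=0.064):
    """Offset each eye half the eye separation along the head's right vector."""
    # Right vector of a head yawed around the vertical (y) axis.
    right = (math.cos(yaw_radians), 0.0, -math.sin(yaw_radians))
    half = eye_separation / 2.0
    left_eye = tuple(h - half * r for h, r in zip(head_pos, right))
    right_eye = tuple(h + half * r for h, r in zip(head_pos, right))
    return left_eye, right_eye

left, right = eye_positions((0.0, 1.7, 0.0), yaw_radians=0.0)
# With zero yaw, the eyes sit a few centimeters left and right of the head.
print(left, right)
```

Rendering the scene once from each of these two positions is what produces the stereoscopic depth effect described next.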
You can observe the difference between normal rendering and VR rendering in the following two images. This first image is how normal 3D video games are rendered to a computer screen, created based on the position and direction of a virtual camera in the game world:
This next image shows how VR scenes are rendered, using a different virtual camera for each eye to create a stereoscopic depth effect:
While VR provides the ability to immerse a player's senses like never before, it also creates some new, unique problems that must be addressed by responsible VR developers.
VR headsets are meant to make you feel like you're somewhere else, and it only makes sense that you'd want to be able to explore that somewhere. Unfortunately, common game mechanics such as traditional joystick locomotion are problematic for VR. Our inner ears are accustomed to sensing inertia while we move from place to place, so if you were to push a joystick forward to walk in VR, your body would get confused when it sensed that you're still stationary.
Typically, when there's a mismatch between what we're seeing and what we're feeling, our bodies assume that a nefarious poison or illness is at work, and they prepare to rid the body of the culprit; that's the motion sickness you feel when reading in a car, standing on a boat, and yes, moving in VR. This doesn't mean that we have to prevent users from moving in VR; we just might want to be more clever about it---more on that later.
The primary cause of nausea with traditional joystick movement in VR is acceleration and smooth movement: your brain gets confused easily when picking up speed or slowing down, and even constant smooth motion can cause nausea (car sickness, for instance). Rotation is even more problematic, because rotating smoothly creates discomfort almost immediately. Some developers get around this by using hard increments instead of gradual motion, such as rotating in 30-degree "snaps" once per second instead of rotating smoothly.
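The snap-turning idea above can be sketched in a few lines. This is a hedged illustration rather than any particular engine's API: the function names, the stick threshold, and the assumption that the caller handles input debouncing are all ours.

```python
# Hedged sketch of comfort-oriented "snap" turning: instead of rotating
# smoothly with the stick, apply one fixed increment per discrete input.
# Function and parameter names are illustrative, not from any SDK.

SNAP_DEGREES = 30.0  # the hard increment discussed above

def snap_turn(current_yaw, stick_x, threshold=0.5):
    """Return a new yaw in degrees, turning one snap only when the stick
    is pushed past the threshold; the caller is assumed to debounce input
    (for example, one snap per second)."""
    if stick_x > threshold:
        current_yaw += SNAP_DEGREES
    elif stick_x < -threshold:
        current_yaw -= SNAP_DEGREES
    return current_yaw % 360.0

print(snap_turn(0.0, 1.0))   # one snap right
print(snap_turn(0.0, -1.0))  # one snap left, wrapped into [0, 360)
```

Because the view jumps instantly between orientations, the inner ear never senses a conflicting smooth rotation, which is why this technique tends to be more comfortable.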
One of the potentially clumsiest aspects of VR is getting your hands where they need to be without being able to see them. Whether you're using a gamepad, keyboard, or motion controller, you'll probably need to use your hands to interact with VR, which you can't see with an HMD sitting over your eyes. It's good practice to centralize input around resting positions (that is, the buttons naturally closest to your thumbs on a gamepad or the home row of a computer keyboard), but shy away from anything that requires complex precise input, such as writing sentences on a keyboard or hitting button combos on a controller.
Some VR headsets, such as the HTC Vive, have a forward-facing camera (sometimes called a passthrough camera) that users can choose to view in VR, enabling basic interaction with the real world without taking the headset off. The Oculus Rift doesn't feature a built-in camera, but you could still display the feed from an external camera on any surface in VR (we'll play with that idea later in the book).
You may not have thought about it before, but looking around in a traditional first-person shooter (FPS) is quite different from looking around using your head. The right analog stick is typically used to direct the camera and make adjustments as necessary, but in VR, players actually move their head instead of using their thumb to move their virtual head. Don't expect players in VR to make the same snappy pivots and 180-degree turns on a dime that are simple in a regular console game.
Another limitation to consider when designing your VR game is what's called a vergence-accommodation conflict. This is what happens when your vergence distance (the apparent distance of the object your eyes converge on in VR) is notably different from your accommodation distance, which is where your eyes actually focus: the screen sitting right in front of them.
This image from a research article in the Journal of Vision demonstrates the conflicting difference:
Forcing the user to focus on objects that are too close or too far away for extended periods of time can cause symptoms of eye fatigue, including sore eyes or headaches. Therefore, it's important to consider placement of the pieces of your game that will draw a lot of attention.
The full article, titled Vergence-accommodation conflicts hinder visual performance and cause visual fatigue by David M. Hoffman, Ahna R. Girshick, Kurt Akeley, and Martin S. Banks, is available at http://jov.arvojournals.org/article.aspx?articleid=2122611. It is a valuable resource in avoiding depth cues that may cause eye fatigue.
Earlier in this chapter, we briefly mentioned that HMDs sometimes monitor positional movement as well as rotational movement. There are a few different methods of tracking an HMD positionally, but so far, every solution involves an external component not connected to the headset. As the hardware improves, we can expect positional tracking to become part of the headset itself (referred to as inside-out tracking).
The Oculus Rift's solution to positional tracking is called constellation tracking. It uses an infrared camera that faces the user to detect small infrared LED markers, invisible to the naked eye, and extrapolate positional movement values based on the number of pixels those markers move in a frame.
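As a rough intuition for how pixel movement maps to physical movement, consider a simple pinhole-camera approximation: a marker at a known depth that shifts by some number of pixels on the sensor implies a proportional lateral movement. The real constellation system solves a full pose from many markers at once; this sketch, with its made-up function name and assumed focal length, only illustrates the pixels-to-meters relationship.

```python
# Simplified, hypothetical model of optical positional tracking. Under a
# pinhole-camera approximation, a marker at a known depth that moves by
# pixel_shift pixels implies a lateral movement of:
#     dx = depth * pixel_shift / focal_length_px
# The real tracker estimates a full 6-DOF pose from many markers; this
# is only the basic geometric relationship.

def lateral_movement(depth_m, pixel_shift, focal_length_px):
    """Lateral movement (meters) implied by a pixel shift at a given depth."""
    return depth_m * pixel_shift / focal_length_px

# A marker 1.5 m from the camera moving 10 px on a sensor with an
# (assumed) 1000 px focal length implies 15 mm of lateral movement.
print(lateral_movement(1.5, 10, 1000))
```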
Here's what the tracking camera of the Oculus Rift looks like:
As long as the Rift is in view of this camera, the user can move their head laterally and the HMD's display will update to reflect it; this can be used for mechanics such as leaning in to inspect something or sticking your head out from behind a corner. The constellation tracker is capable of tracking the Rift in a seated or standing experience, which means you could even engage the player in limited full-body movement. The Oculus Touch controllers, shipping in late 2016, will include an additional camera to further improve tracking quality.
This is an image of an early Oculus Rift prototype that shows the exposed IR trackers covering the outside of the device:
The consumer version of the Oculus Rift has these markers embedded in the strap on the back of the headset as well, so there are markers that can help the constellation system track you no matter which direction you're facing.
Generally, the more realistic a VR experience is, the more the user forgets about the world outside of it. Positional tracking adds a lot of realism to the feeling of looking around in VR, so it's a good idea to design your game in a way that will take full advantage of it.
What you choose as the primary input device for your VR experience will have a lot of bearing on how it feels. Out of the box, you have three possible solutions for controlling your game: the Xbox controller, the Oculus remote, or your keyboard and mouse. Once the Oculus Touch controllers mentioned in the last section are released at the end of 2016, you'll have a much more immersive way of interacting with VR experiences; their implementation will be similar to the Xbox controller's, but with the added ability to track the user's hands just as the Rift headset itself is tracked.
The Xbox controller, included with the Oculus Rift and pictured here, is ergonomic but still has numerous unique inputs, meaning it's a decent choice for a game with complex controls:
For simpler games that emphasize observation and exploration over mechanical input, the Oculus remote is the perfect simple solution. It only has a few buttons, making it easy to use in tandem with the headset, and lending itself to head-based interfaces (which we'll cover in a later chapter).
This is an image of the Oculus remote, also included with the Oculus Rift:
If all else fails, you can use your keyboard and mouse, but because the keyboard is complicated and somewhat restrictive, you should always look to simpler input methods unless you have a strong, specific reason for using the keyboard.
Drops in frame rate are much more acceptable outside of VR than in VR; if a computer is good enough to run a game at 30 frames per second but not at 60, some players are able to push through it and get accustomed to the lower frame rate. After all, we watch movies at frame rates below 30 frames per second (traditionally 24), so it's not unreasonable to think that a game at 30 frames per second would still be enjoyable.
Suboptimal frame rates are a much bigger problem in VR. Since the HMD takes over the entirety of what our eyes see, it needs to update the world virtually as quickly as our eyes do. If the world we're perceiving doesn't update as fast as we look around it, our brain starts to get confused again and cues the nausea. This is sometimes referred to as VR sickness, and not only does it decrease the feeling of immersion the player gets, but it can also leave them feeling ill even after removing the headset.
With the hardware in the Oculus Rift, we can update the display with a new frame up to 90 times per second. While 120 times per second would be even more ideal, a steady frame rate of 90 will be adequate in mitigating the vast majority of nausea.
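It helps to think of a target frame rate as a per-frame time budget: everything your game does each frame (simulation, rendering, submission to the headset) must fit inside it. A quick back-of-the-envelope calculation shows how much tighter 90 FPS is than the rates traditional games target:

```python
# Per-frame time budget at a given target frame rate. At 90 FPS, all
# per-frame work must finish in roughly 11 ms, versus 33 ms at 30 FPS.

def frame_budget_ms(target_fps):
    """Milliseconds available per frame at the given frame rate."""
    return 1000.0 / target_fps

for fps in (30, 60, 90, 120):
    print(f"{fps} FPS -> {frame_budget_ms(fps):.2f} ms per frame")
```

At 90 FPS, the budget is about 11.11 ms per frame, a third of what a 30 FPS console game gets, which is why the performance discipline described next matters so much.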
Complex, asset-dense games that require a large number of calculations every frame can start to monopolize a computer's hardware, meaning your VR experience may start to drop frames. This should be avoided at all costs, because sporadic frame choppiness, known as judder, is one of the fastest ways to induce VR sickness.
The good news regarding the high performance cost of the Oculus Rift is that you're not without help. Asynchronous Timewarp (ATW) is a rendering technique that helps fill delays in rendering with a calculated intermediary frame.
In essence, ATW warps an image based on the user's head movement, giving the appearance of multiple rendered frames but actually only modifying one while the rest are generated. All rendered images need to be warped at a baseline level so they don't appear skewed when viewed through the lenses, so adding an extra step to the warping process is relatively inexpensive for your hardware.
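The core idea can be approximated in a few lines. Real ATW warps each pixel through the lens-distortion pass mentioned above; this deliberately crude sketch treats the whole image as a single plane and asks how far a small un-rendered yaw rotation would shift it, using an assumed focal length. The function name and numbers are ours, purely for illustration.

```python
# Hedged sketch of the idea behind rotational timewarp: when a new frame
# isn't ready in time, re-show the last frame shifted to compensate for
# the head rotation that happened since it was rendered. Real ATW warps
# per pixel through the lens-distortion mesh; this approximates the whole
# image as one flat plane with an assumed focal length.

import math

def timewarp_shift_px(delta_yaw_radians, focal_length_px):
    """Horizontal pixel shift approximating a small yaw rotation."""
    return focal_length_px * math.tan(delta_yaw_radians)

# One degree of un-rendered head rotation on an (assumed) 800 px focal
# length corresponds to roughly a 14 px horizontal shift of the old frame.
print(round(timewarp_shift_px(math.radians(1.0), 800)))
```

Because this shift is a cheap 2D operation compared to re-rendering the scene, the headset can keep the view locked to head rotation even when a frame is late.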
ATW can go a long way in making up for lost time in your VR experience, but it's important not to rely too heavily on it, because like every other rendering trick, it has its limits. For instance, with purely rotational ATW, the user can experience positional judder, which causes objects close to the user to appear noticeably blurry or doubled.
This screenshot of a submarine interior, provided by Oculus, demonstrates the perceived effect of positional judder:
ATW calculations are relatively simple because they only take rotation into account; lateral movement is an entirely different problem. In 2016, Oculus announced work on a complementary feature called Asynchronous Spacewarp, which will perform predictions for head position as well. We'll see Asynchronous Timewarp at work in a later chapter, when we focus on performance and optimization of VR experiences.
Now that we've covered the basics of VR in theory, it's time for some practice. The first thing you'll have to do is download the Oculus runtime, which is the background process responsible for handling all of the activity in your Rift.
Open a web browser and navigate to http://www.oculus.com/setup. This page will provide a download link to the Rift setup package, which includes the runtime you need and Oculus Home, the central hub for browsing and starting all of your Oculus software.
Setting up the Rift requires an Internet connection and about 30-60 minutes of setup time. The setup utility will guide you through the process step by step on screen.
After all of the software required to run the Rift is installed, you'll run through a quick hardware calibration step that will set up your constellation tracker, remote, and the headset itself. It will also help you set values such as height and interpupillary distance (IPD).
IPD is the distance between the pupils of your eyes, and even though everyone's IPD may seem roughly similar, significant inaccuracies can affect the way we perceive scale. Scale is everything in VR, especially when you're trying to use size to convey a mood or purpose, so ensuring your IPD is accurate is important to experiencing content exactly as it was meant to be experienced.
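The link between IPD and scale can be sketched with simple stereo geometry. If the virtual cameras are separated by the configured IPD but the user's real IPD differs, the world appears scaled by roughly the ratio of the two. This is a rough geometric approximation, not an exact perceptual model, and the function name and sample values are ours.

```python
# Rough stereo-geometry approximation of why a wrong IPD setting distorts
# perceived scale: eyes closer together than the virtual cameras make the
# world feel smaller, and eyes farther apart make it feel larger.

def apparent_world_scale(real_ipd_mm, configured_ipd_mm):
    """Approximate perceived world scale when the configured IPD differs
    from the user's real IPD (1.0 means correct scale)."""
    return real_ipd_mm / configured_ipd_mm

# A 58 mm setting on a user whose real IPD is 64 mm makes the world feel
# roughly 10% larger than intended.
print(round(apparent_world_scale(64.0, 58.0), 2))
```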
In this chapter, we approached the topic of virtual reality from a fundamental level. The HMD is the crux of modern VR simulation, and it uses motion tracking components as well as peripherals such as the constellation system to create immersive experiences that transport the player into a virtual world.
Now that we've covered the hardware, development techniques, and use cases of virtual reality---particularly the Oculus Rift---you're probably beginning to think about what you'd like to create in virtual reality yourself. In the next chapter, we'll be diving into implementations using the Unity3D engine, and you'll take your first step into VR development.