Best practices for VR


Now that we've talked a bit about immersion and presence, let's look at a few specific practices we can follow to keep our users comfortable and avoid breaking immersion. Don't consider any of these to be set in stone (except the requirements to maintain frame rate and to leave the user's head alone): VR is still a very new medium, and there's a lot of room to experiment and find new things that work. Just because someone says a thing can't be done doesn't mean it can't. That said, the following recommendations represent our current best understanding of what works in VR, and it's usually a good idea to follow them.

Maintain framerate

Are you sensing a pattern here? You absolutely must maintain frame rate. High latency will pull the user right out of immersion, and it's a leading trigger for simulator sickness. Consider the work you're asking the renderer to do in VR and you'll see why this is a challenge. The HTC Vive Pro displays a 2,880 x 1,600 image (1,440 x 1,600 per eye), while the original Vive and the Oculus Rift display 2,160 x 1,200 (1,080 x 1,200 per eye), and all of them require this to happen 90 times per second, leaving the renderer about 11 milliseconds to prepare each frame. The Oculus Go displays 2,560 x 1,440 pixels (1,280 x 1,440 per eye) 72 times per second, giving the renderer about 13 milliseconds per frame. The Unreal Engine renderer is blazingly fast, but even so, this is a lot to render and not a lot of time in which to draw it. You're going to have to make some compromises to reach your target, and we'll talk about ways to do this throughout this book.

Here's a list of headsets currently on the market and their rendering demands.

Tethered headsets

HMD device            | Resolution                                                 | Target frame rate
Oculus Rift           | 2,160 x 1,200 (1,080 x 1,200 per eye)                      | 90 FPS (11 ms)
HTC Vive              | 2,160 x 1,200 (1,080 x 1,200 per eye)                      | 90 FPS (11 ms)
HTC Vive Pro          | 2,880 x 1,600 (1,440 x 1,600 per eye)                      | 90 FPS (11 ms)
Windows Mixed Reality | Varies; most display 2,880 x 1,440 (1,440 x 1,440 per eye) | 90 FPS (11 ms)

Standalone headsets

HMD device   | Resolution                             | Target frame rate
Gear VR      | Varies with the phone used             | 60 FPS (16 ms)
Oculus Go    | 2,560 x 1,440 (1,280 x 1,440 per eye)  | 72 FPS (13 ms)
Oculus Quest | 2,880 x 1,600 (1,440 x 1,600 per eye)  | 72 FPS (13 ms)

Bear in mind as well that you should aim to render a bit faster than these budgets require, so an occasional hitch doesn't push you past the deadline and cause major discomfort.

VR hardware does a bit of work to reduce perceived latency when the frame rate drops and a new frame isn't ready when the headset needs to display it, but it does this through a bit of trickery. In these cases, the hardware re-renders the last frame and adjusts it to fit the user's current head movement, so what the user sees isn't an exactly correct frame; it's just better than dropping the frame altogether. (Oculus calls this process Asynchronous Time Warp (ATW), and on the Vive it's called Asynchronous Reprojection.) Don't use time warp or reprojection as a crutch, though. They exist to keep the user comfortable when your application hitches, but the result is still a degraded experience. Don't let your application miss its target frame rate for extended periods.

Also be sure to test your application on the minimum spec hardware you intend to support, and give your users ways to scale the rendering demands so they can meet the frame rate target on the hardware they're running.
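If you want to expose a render-scale option to players, a convenient lever in UE4 is the vr.PixelDensity console variable, which scales the HMD render target relative to the headset's recommended resolution. The following is a minimal sketch (the function name and clamp range are our own, so verify the variable's behavior on your engine version). Unreal's stat fps and stat unit console commands are useful for checking whether you're actually staying inside your frame-time budget.

#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

// Hypothetical helper: apply a user-selected VR render scale.
// 1.0 renders at the headset's recommended resolution; lower values
// trade sharpness for frame time.
void ApplyVRRenderScale(float PixelDensity)
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("vr.PixelDensity")))
    {
        CVar->Set(FMath::Clamp(PixelDensity, 0.5f, 2.0f));
    }
}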

Never take control of the user's head

Beyond dropping frames, the next most common cause of simulator sickness is the sensory conflict we mentioned earlier—a mismatch between motion perceived visually and motion felt in the inner ear. There are two major types of motion you're going to need to accommodate in VR:

  • Movement of the player's avatar (walking around, teleporting, or piloting a vehicle)
  • Movement of the player's head relative to their avatar

Movement of the player's avatar is handled by the locomotion system you implement for your experience. You really don't have a choice here—you're going to have to create movement that isn't happening in real life, but there are things you can do to make this less of a problem and we'll talk about them shortly.

Note

The word avatar originated in Sanskrit and referred to the embodiment of a deity in human form. In its current usage, it extends this metaphor to refer to the embodiment of a human user in a virtual world. You'll hear the term commonly used to refer to a character in a simulated world under the control of a human player. Its companion term, agent, refers to a character under the control of an AI routine.

You should never interfere, however, with movement of the player's head.

What this means in practice is this: never move the camera in a way the user didn't cause through their own actions. If you're making a game and the user's avatar dies, don't leave the camera bolted to the head as the body falls; you will almost certainly make users sick if you do. Consider cutting to a third-person view instead, or handle the action in some other way. Never move the camera to force the user to look around in a cinematic, and don't apply a walking bob or camera shake. The user should always be in control of their head.

Note

Never move the camera separately from the user's head, and never fail to move the camera when the user's head is moving. You should always maintain a 1:1 correlation between head movement and camera movement relative to the user's avatar.

This applies both ways. If the player moves their head, the camera must move, even if the game is paused or loading. Never stop tracking.
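In Unreal, the simplest way to guarantee this 1:1 relationship is to let the engine drive the camera from the HMD rather than moving it yourself. The following is a minimal sketch, assuming a pawn class (AVRPawn is our own illustrative name) whose header declares VRRoot and Camera as UPROPERTY components:

#include "Camera/CameraComponent.h"
#include "Components/SceneComponent.h"
#include "HeadMountedDisplayFunctionLibrary.h"

AVRPawn::AVRPawn()
{
    // A plain scene component marks the center of the play space; the camera
    // hangs off it and is positioned by head tracking, not by game code.
    VRRoot = CreateDefaultSubobject<USceneComponent>(TEXT("VRRoot"));
    RootComponent = VRRoot;

    Camera = CreateDefaultSubobject<UCameraComponent>(TEXT("Camera"));
    Camera->SetupAttachment(VRRoot);
    Camera->bLockToHmd = true; // the HMD drives this transform every frame
}

void AVRPawn::BeginPlay()
{
    Super::BeginPlay();
    // Track from the floor so the user's real standing height carries into the world.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}

When you need to relocate the player, move VRRoot (or the pawn as a whole) and leave the camera itself to the tracking system.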

If you need to teleport the user to a new location or change cameras for any reason, consider using a fast fade to black or white to cover the transition. People instinctively blink when they turn their heads quickly, and it's a good idea to mimic this behavior.
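One straightforward way to implement this "blink" in Unreal is APlayerCameraManager::StartCameraFade. Here's a rough sketch, again on our hypothetical AVRPawn; the fade duration is illustrative and worth tuning, and production code should guard against the pawn being destroyed before the timer fires.

#include "Camera/PlayerCameraManager.h"
#include "Kismet/GameplayStatics.h"
#include "TimerManager.h"

void AVRPawn::BlinkTeleport(const FVector& Destination)
{
    APlayerCameraManager* CamMgr = UGameplayStatics::GetPlayerCameraManager(this, 0);
    if (!CamMgr)
    {
        return;
    }

    const float FadeTime = 0.1f; // a quick blink, not a slow cinematic fade

    // Fade to black and hold it there while we move.
    CamMgr->StartCameraFade(0.f, 1.f, FadeTime, FLinearColor::Black, false, true);

    FTimerHandle Handle;
    GetWorldTimerManager().SetTimer(Handle, [this, CamMgr, Destination, FadeTime]()
    {
        SetActorLocation(Destination); // relocate while the view is dark
        CamMgr->StartCameraFade(1.f, 0.f, FadeTime, FLinearColor::Black, false, false);
    }, FadeTime, false);
}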

In-game cut scenes need to be handled differently in VR than they would be on a traditional flat screen, for the same reasons. Ordinarily, in authoring a cut scene, you would take control of the camera, moving and cutting from shot to shot, but you can't do this in VR. You can't control where your user is going to look, and you need to be careful moving them around. This leaves you with a few options. First, if your scenes are pre-rendered, then you really have no choice but to map them onto a screen in the virtual environment. This breaks immersion, but is no more difficult for the user than watching a movie in real life. If you're doing them in-engine, you need to think about how you're going to handle the player's point of view.

For a first-person point of view, it's probably best to stage the cinematic scene around the user and allow them to look and move freely within it. You can't cut away to another shot when doing this, and you can't guarantee that your user will be looking where you want them to look when a key moment occurs, but it's the most immersive approach you can take.

Cut scenes can also be handled in the third person, in which case you pull the user's viewpoint out of their body and allow them to watch the scene unfold, but you need to do this carefully: the out-of-body experience can be disorienting for your player and can weaken immersion and the player's identification with the character.

For film-making in VR, be very careful of the ways you move the camera. Even very small moves may induce sickness. Users will tolerate forward movement more easily than side-to-side or rotational movement and seem to tolerate movement more easily if it's justified by a visible vehicle or some other way of explaining why it's happening.

Thinking about how to use the camera in VR isn't just about managing user discomfort either. This is new territory, and the rules you learned from film and gaming work differently here. You're designing to recreate the user's eyes, not a camera, and this has far-reaching implications for your compositions. How does the user move? Do they know where you want them to look? What do they see when they look at their hands? What about a mirror? How does surrounding the user with a world (instead of making them watch it through a window) change their relationship to it? All of these factors require a conscious choice as you develop your work.

Do not put acceleration or deceleration on your camera

Depending on the type of application you're creating, you're probably going to need to give your users a way to change their location, either by teleporting or moving smoothly. (We'll dig into this in depth in a later chapter.) If you do choose to implement a smooth movement method though, don't accelerate or decelerate as the player starts and stops moving. Start the movement at full speed, or if you opt to smooth your starts and stops at all, keep them very short. (And, of course, never do a start-moving or stop-moving animation that takes control of the user's camera.)
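If your locomotion runs through a UCharacterMovementComponent, one simple way to remove the ramp-up and ramp-down is to set acceleration and braking high enough that speed changes are effectively instantaneous. This is a sketch under that assumption, with illustrative values:

#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"

void ConfigureInstantStartStop(ACharacter* Character)
{
    UCharacterMovementComponent* Move =
        Character ? Character->GetCharacterMovement() : nullptr;
    if (!Move)
    {
        return;
    }

    Move->MaxAcceleration = 100000.f;            // reach MaxWalkSpeed almost immediately
    Move->BrakingDecelerationWalking = 100000.f; // and stop just as quickly
    Move->BrakingFrictionFactor = 1.f;
}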

Do not override the field of view, manipulate depth of field, or use motion blur

We mentioned a moment ago that VR mimics the user's eyes, not a camera. For this reason, don't do things in your simulation that eyes don't do in real life. Never change the focal length of the camera lens or its depth of field. The eyes' focal lengths don't change the way cinematic zoom lenses do, and you're very likely to make your user sick if you change this.

Manipulating the depth of field isn't a good idea in current-generation VR, as we don't yet have a reliable way to know what the user is actually looking at within the view. In the future, as eye-tracking improves, this will likely change, but for now, don't make this choice for your user.

Motion blur shouldn't be applied to your camera or to objects in the scene. Motion blur is an artifact of the way film exposes each frame over a fixed period of time, smearing any motion that occurs within that exposure, but that's not how eyes work, and it will look unnatural in VR.

While we're on the topic, steer clear of other camera-mimicking effects, such as lens flares and film grain. Again, these mimic the behavior of film, not the eyes, and we're not trying to mimic film in VR. Filmic effects such as these can also cause unwanted physical side effects in the user, contributing to simulator sickness if the effects don't line up between the eyes, and they cost precious frame time to render. Don't use them.
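In Unreal, these effects mostly live in post-process settings, so you can turn them off explicitly on your VR camera (or in a global post-process volume). A sketch follows; the property names come from UE4's FPostProcessSettings, but double-check them against your engine version:

#include "Camera/CameraComponent.h"

void DisableFilmicEffects(UCameraComponent* Camera)
{
    FPostProcessSettings& PP = Camera->PostProcessSettings;

    PP.bOverride_MotionBlurAmount = true;
    PP.MotionBlurAmount = 0.f;                   // no motion blur

    PP.bOverride_LensFlareIntensity = true;
    PP.LensFlareIntensity = 0.f;                 // no lens flares

    PP.bOverride_GrainIntensity = true;
    PP.GrainIntensity = 0.f;                     // no film grain

    PP.bOverride_DepthOfFieldFocalDistance = true;
    PP.DepthOfFieldFocalDistance = 0.f;          // leave depth of field alone (0 disables it)
}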

Minimize vection

Have you ever looked out the window of a car sitting still, and watched a large vehicle such as a truck or bus moving, and felt as if you were moving in the opposite direction instead? This phenomenon is called vection, and it refers to the illusion of self-movement produced by optical flow patterns. If a large portion of your view is moving, this can produce sensations of movement in your body, and as we discussed earlier, sensations of movement that don't match the signals from the inner ear can trigger simulator sickness.

Note

Vection is the illusion of movement produced when large parts of your field of view move. Optical flow, or optic flow, refers to the pattern of movement of the contents of your view, and it's these patterns of movement that cause vection.

What this means in practice is that, if a big chunk of your user's view is moving, you're at risk of inducing simulator sickness. We've talked about this already with regard to moving the user's head (don't do it), and we've touched on some of the ways we can handle this in your locomotion system, but you'll also want to be aware of other circumstances that can cause vection.

Be aware of moving patterns that fill large parts of the frame—whether or not they're part of your locomotion system, they can still create an illusion of motion, which may be a problem for your users.

 

Several games and applications have experimented with a tunnel vision effect to reduce vection when users need to move quickly through the environment—when the player's avatar runs, an iris closes in from the edges of the view to reduce peripheral vision.

Users seem to be much more tolerant of forward movement than they are of strafing—moving side-to-side. This may in part be because, in real life, we move forward far more than we move sideways, but it may also be because the optical flow the user sees when moving forward still has a relatively fixed point at the center, whereas in sideways movement, everything in the view moves.

Note

When you're trying to figure out whether a particular movement in VR is likely to cause simulator sickness, it can be useful to think about the kind of optic flow that movement is going to create. Optic flows with relatively fixed reference points, such as the horizon when running forward, may be fine, while flows that move everything in the view, such as sideways movements, may not.

Rotating the player's view is especially problematic. It moves pretty much everything in the view, and the vestibular system is especially tuned to detecting rotation. Be very careful here. Smooth rotations are generally not a good idea, but developers have found that snapping the user to a new rotation works well to reorient the user without making them sick. It turns out that the brain is very good at filling in interruptions in perception, so snapping to a new rotation or "blinking" the view during a large movement can be very effective at disrupting the perception of motion without distracting the user.
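A snap turn is easy to sketch: rotate the whole pawn by a fixed increment rather than interpolating. The 30-degree increment below is an illustrative starting point; you'd normally bind this to a thumbstick with a small cooldown so holding the stick doesn't spin the player continuously.

void AVRPawn::SnapTurn(float Direction) // Direction: +1.0 for right, -1.0 for left
{
    const float SnapAngleDegrees = 30.f;
    AddActorWorldRotation(FRotator(0.f, SnapAngleDegrees * FMath::Sign(Direction), 0.f));
}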

Many developers have also found that giving users a visible vehicle that moves with them, such as an aircraft cockpit, can mitigate the effect of vection when rotating. Whether this is an appropriate solution for you depends on the type of experience you're creating, but the takeaway here should be that users seem to be less prone to simulator sickness if they're given fixed points of reference in their view. Where this is appropriate, consider factoring it into your design; where it isn't, and you have to make large smooth movements, consider other ways of breaking the optic flow, such as blinking or snapping.

Avoid stairs

If you're allowing your user to move smoothly through your environment, be aware that certain environment features can provoke simulator sickness when users navigate them. Stairs are especially bad, and stairs that provide collision for every step, so the view bounces as the user climbs them, are even worse. Features like these, which create a sense of vertical movement when traversed, are difficult because the inner ear is very sensitive to changes in altitude.

Avoid stairs if you can. If you can't avoid them, be conscious of how steep they are and how fast you're letting your user move over them. You'll have to test a bit to get it right.

Use dimmer lights and colors than you normally would

Be careful of using bright lights and strong contrasts in your scene. Bright lights contribute to simulator sickness in some users, and strong contrasts can increase the user's sense of vection as the world moves across their view. Also, with current hardware, bright lights can often create a flare on the headset's fresnel lenses, which can pull users out of immersion by reminding them of the hardware they're wearing. In general, it's recommended that you use cooler shades and dimmer lights than you normally would.

Keep the scale of the world accurate

VR communicates the scale of objects in the world in ways that flat screens simply do not. Each of us sees the world in stereo vision through a pair of eyes that are a fixed distance apart. This distance, called Interpupillary Distance (IPD), contributes to our sense of how large or small objects in the world appear. Most VR headsets can be adjusted to match the interpupillary distance of their user and should be adjusted correctly to minimize eyestrain.

Note

The distance between the pupils of the user's eyes is called the interpupillary distance and is a major contributor to a user's sense of how large or small objects in the world are.

What this means for you as a developer is that the scale of objects in your world matters. On a flat screen, the user is limited to comparing the size of an object to another object to determine how large it is, but in VR, the user's IPD drives an absolute sense of scale. An object that's too large or too small on a flat screen will still appear normal if it's alone on the screen. The same object in VR, even if there's nothing to which it can be compared, will look wrong to a viewer in stereo 3D.

Some users may be prone to simulator sickness if the scale of the world is off, and even those who aren't will still likely sense that the world feels "wrong" without necessarily knowing why.

Make sure objects in your world are scaled correctly. In Unreal, by default, one Unreal Unit (UU) is equal to one centimeter.
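The engine's world-to-meters ratio is what the VR runtime uses to map real head and hand movement into Unreal units, so it's worth confirming it matches the scale your art was authored at. A small sketch (the helper function is our own, for illustration):

#include "Engine/World.h"
#include "GameFramework/WorldSettings.h"

void LogWorldScale(UWorld* World)
{
    const AWorldSettings* Settings = World ? World->GetWorldSettings() : nullptr;
    if (Settings)
    {
        // The default of 100 means 100 UU per real-world meter, i.e. 1 UU = 1 cm.
        UE_LOG(LogTemp, Log, TEXT("WorldToMeters = %.1f"), Settings->WorldToMeters);
    }
}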

Be conscious of physical actions

Your users in VR are moving around the real world wearing electric blindfolds. Respect this, and be careful what you ask them to do in VR. Take care when asking users to swing their arms, run, or strafe, as they can easily run into obstacles or walls in the real world. For headsets with cables, don't ask users to turn repeatedly in the same direction and tangle themselves in the cable. Be conscious as well of asking users to reach for objects on the floor or outside their normal reach area—this may not be easy or possible in their real-world physical environment. As mentioned earlier, avoid shifting the horizon in ways that could cause your user to lose balance. Remember that nearly all of the user's information about the world is coming from the VR simulation while they're in it—be conscious of how this information lines up with or contradicts what's in the invisible physical world around them.

Manage eyestrain

The eyes use muscles to focus on objects and to orient themselves, and these muscles, like any others, can become fatigued. We call this eyestrain. Symptoms of eyestrain can include headaches, fatigue, and blurred or double vision. As a designer, there are things you can do to minimize eyestrain in your users, and understanding a little about what causes it will help you do this.

First, eyestrain can be caused by flickering. We've already talked a lot about the importance of keeping latency low; this is another reason to make it a priority. Don't create deliberately flickering content either, as it not only produces eyestrain but can also trigger photosensitive seizures.

Note

Flickering caused by high latency can cause eyestrain. Keep your latency low.

Second, the eyes need to do some physical work to focus on an object in 3D space. They have to adjust the shape of their lenses to focus on the object (this is called accommodation), and they need to aim themselves so their lines of sight converge at the object (this is called vergence). We naturally have a reflex that correlates these two actions, so the eyes want to converge at a depth that matches the distance to which their lenses are focusing, and the lenses want to focus at the distance where the eyes are converging. The problem in VR is that the actual images the eyes are seeing are a fixed distance away, but the content of those images exists at a variety of virtual depth planes, so the eyes still have to rotate to converge on the objects they're looking at. This creates a conflict: the focal depth the lenses are accommodating to doesn't match the depth at which the eyes are converging, and this can cause eyestrain.

Note

Eyestrain can be caused by two factors in VR: flickering, which can be managed by keeping your latency low, and the conflict between the fixed distance at which the eyes' lenses need to focus to see the headset screen and the changing distances at which the eyes need to converge to see objects in stereo depth. This is commonly called the vergence-accommodation conflict, and you can manage it by keeping important objects in the virtual world about 1 m away, so the vergence and accommodation demands mostly line up.

You can manage this when designing your world by keeping these two demands in mind. The Fresnel lenses in the HMD make the headset screen appear to be about 1 m from the eyes, allowing the eyes' lenses to accommodate to a focal plane about 1 m away. The user's eyes, then, will naturally find it easier to focus on objects in the virtual world that appear to be about that far away. In practice, objects are most easily viewed at a range of 0.75 m to 3.5 m, with 1 m seeming to be ideal. Avoid making users look for long periods at objects less than half a meter from the eye.

Note

Put objects you know your user will be fixating on for long periods at least half a meter away from the camera, and ideally around 1 m, to minimize eyestrain.

Don't force your user to be an eyeball contortionist to view your user interface. Attaching a GUI to the user's face is usually a bad idea—as they turn their head to view a UI element, it appears to "run away" because it's attached to the same head that's turning to try to look at it, so users have to turn their eyeballs alone to focus on it. Don't do this to them. It's irritating to users, fatiguing, and has no real-world analogue. Put your UI in the world so your users can focus on it from comfortable viewing angles and at a comfortable distance. Attaching UI elements to the user's body, such as a wrist, can work well as it allows users to bring it into view when they want to interact with it. Putting GUI elements into a cockpit or vehicle can work well too. UI elements can be placed around the world and revealed when the user looks at them.
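A world-space panel in Unreal is typically a UWidgetComponent displaying a UMG widget. Here's a rough sketch, again on our hypothetical AVRPawn; MenuWidgetClass is assumed to be a TSubclassOf<UUserWidget> property set to your own UMG widget, and the 100 UU offset puts the panel about a meter away, in the comfortable range discussed above:

#include "Components/WidgetComponent.h"

void AVRPawn::CreateWorldSpaceMenu()
{
    UWidgetComponent* Menu = NewObject<UWidgetComponent>(this, TEXT("MenuWidget"));
    Menu->SetupAttachment(VRRoot);              // anchored to the play space, not the camera
    Menu->RegisterComponent();

    Menu->SetWidgetSpace(EWidgetSpace::World);  // a real object in the scene
    Menu->SetWidgetClass(MenuWidgetClass);      // your UMG widget class (assumed UPROPERTY)
    Menu->SetDrawSize(FVector2D(800.f, 600.f));
    Menu->SetRelativeLocation(FVector(100.f, 0.f, 140.f)); // ~1 m ahead, roughly eye height
}

Attach the component to a motion controller instead of VRRoot if you want the wrist-mounted variant described above.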

If you do wind up attaching GUI elements to the user's head, keep them within the ideal range we discussed and at an angle that allows them to be read without straining.

Try to avoid creating situations that force the user to change focal distance rapidly and often. If you're making a shooter, for example, that puts critical information on a nearby UI element while the enemies are in the distance, you may be creating a situation that will force your user to change focus frequently to check the UI and focus on enemies in the field. In a flat-screen game, this wouldn't be a problem, but in VR, it will tire them out. Design your UI in such a way that the user can get critical information without focusing on it—easy-to-read graphical elements, for example, or consider putting UI elements over the enemies' heads.

GUI elements can be occluded by objects in the world that are nearer to the camera than the UI element is. Don't try to use tricks from 2D gaming space to change this. In 2D game design, it's common to draw a UI element over a 3D element even if that element would really block the player's view of it. If you do this in VR, however, you'll create a confusing stereo image that won't be at all comfortable to look at. Accept the reality that your UI exists as a physical object in the world and follows the same rules as other physical objects.

Make conscious choices about the content and intensity of your experience

Presence, when it's achieved in VR, produces strong reactions. It's an intimate experience, a visceral experience, and sometimes a fear-inducing experience. Be conscious of what you're doing as you craft experiences—you can easily trigger a fight-or-flight response in some users. This might be exactly what you intend, and we're not suggesting that you shy away from whatever it is you're trying to create. But be aware that you can be playing with strong stuff here and make intentional choices. VR is much more capable of triggering phobias than its flat-screen predecessors because the user is immersed in the space and not being constantly reminded by their peripheral vision that what they're seeing isn't true. Be on the lookout for circumstances that can induce vertigo, claustrophobia, fear of the dark, fear of snakes, spiders, or other phobias. Remember also that users will react more strongly to threats within their personal space.

Note

For those of you deliberately playing with fear in VR, whether you're making horror experiences or therapeutic experiences to treat PTSD, there are meaningful distinctions between film and VR: the user exists inside the scene in VR, which isn't the case in film. Users also bring an instinctive sense of personal space that you can use to great effect, and film doesn't have this either. In film, an object that's supposed to seem close is just big on the screen, but it's still whatever distance away from the user the screen actually is. In VR, this space is real. "It's right behind you" in VR really means that it's right behind you.

Let players manage their own session duration

VR puts demands on the user's body, eyes, and mind that other media don't. They're wearing a device on their head, and often standing or moving physically. Design your experience to let them exit whenever they want to or need to and resume later on. Let them take breaks as they need them.
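One simple way to honor this in Unreal is to checkpoint often enough that quitting is never costly. The sketch below assumes a hypothetical USaveGame subclass, UVRCheckpointSave, declaring a PlayerLocation vector and a LevelName; the slot name is also illustrative:

#include "Kismet/GameplayStatics.h"

void AVRPawn::SaveCheckpoint()
{
    UVRCheckpointSave* Save = Cast<UVRCheckpointSave>(
        UGameplayStatics::CreateSaveGameObject(UVRCheckpointSave::StaticClass()));
    if (!Save)
    {
        return;
    }

    Save->PlayerLocation = GetActorLocation();
    Save->LevelName = FName(*GetWorld()->GetMapName());

    // Writes to a named slot; load it on startup to drop the player back where they left off.
    UGameplayStatics::SaveGameToSlot(Save, TEXT("VRCheckpoint"), 0);
}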

Keep load times short

In contrast to games and applications on flat screens, users in VR can't do anything else while they're waiting for the application to load. Optimize to keep your load times short. Remember as well that, even during a load, your application must be responding to the user's head tracking.
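One way to keep tracking responsive during a transition is to stream the next area in the background instead of doing a blocking map load. A sketch, with an illustrative sublevel name:

#include "Engine/LatentActionManager.h"
#include "Kismet/GameplayStatics.h"

void AVRPawn::BeginStreamingNextArea()
{
    FLatentActionInfo LatentInfo;
    LatentInfo.CallbackTarget = this;

    // bShouldBlockOnLoad = false keeps the game thread (and head tracking) running
    // while the level loads; combine with a fade or a small loading environment as needed.
    UGameplayStatics::LoadStreamLevel(this, TEXT("NextArea"),
        /*bMakeVisibleAfterLoad=*/ true, /*bShouldBlockOnLoad=*/ false, LatentInfo);
}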

Question everything we just told you

VR is in its infancy as a medium and an art form. It's far too early to pretend we know what its rules are really going to turn out to be. In the early days of film, actors were always filmed in full-frame, because the conventional wisdom at the time was that audiences wouldn't pay to see half an actor. Be equally willing to question the guidelines and advice you receive in VR design. These represent the current best understanding of what seems to work, but that doesn't mean that there aren't other ways to do things that haven't been tried. Be open to them. This is part of the reason why these guidelines were each presented with information about why they exist—so you can understand where they're coming from and make your own choices and try your own experiments. You're on a frontier in VR, part of the creation of an entirely new means of communication. Don't be afraid to explore.
