What in the virtual-world is going on here?
Ever wonder how Virtual Reality (VR) mimics your perception of the world? Do you wonder how designers create a believable, immersive experience? Your crash course is here! In this blog, learn how designers are working to bridge the separation between the viewer and the virtual with visual and audio engineering.
As real as the eye can see.
Fully immersive virtual reality works to mimic our field of view; however, the human visual field, the span of visual angle a person can see while the eyes are stationary, is not yet fully covered by current virtual reality technologies. Popular headsets like VRgineers' 180-degree-FoV headset cover about 85.7% of the horizontal arc of the human visual field, which spans roughly 210 degrees according to the work of Hans Strasburger and Ernst Pöppel in the Encyclopedia of Neuroscience.
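That coverage figure is just a ratio of the two arcs. A quick sketch makes the arithmetic explicit (the 210-degree default is the figure cited above; the function name is illustrative):

```python
# Fraction of the horizontal human visual field covered by a headset's
# horizontal field of view. The 210-degree human arc is the figure from
# Strasburger and Pöppel cited in the text.
def fov_coverage(headset_fov_deg: float, human_fov_deg: float = 210.0) -> float:
    """Return coverage as a percentage of the human horizontal visual field."""
    return 100.0 * headset_fov_deg / human_fov_deg

print(round(fov_coverage(180.0), 1))  # a 180-degree headset covers ~85.7%
```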
The content presented in the field of view changes in response to a user navigating it. As a user moves their head, the display must refresh and respond with the visual the viewer expects of the simulated environment. How often the display can swap in the next rendered image is the refresh rate, which is measured in hertz (Hz). A device running at 60 Hz presents a new frame on the display up to 60 times every second.
The refresh rate works hand in hand with the frame rate, the speed at which the graphics hardware produces new frames, measured in frames per second (FPS). For example, a monitor with a 60 Hz refresh rate can optimally display 60 frames per second. If the hardware renders 144 FPS on that 60 Hz monitor, the extra frames are lost because the display cannot show them. Conversely, a 144 Hz monitor can display more than 60 FPS, but if those frames are never rendered, there is nothing extra to show. When both rates are high together, the displayed motion is quite smooth.
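The relationship between the two rates boils down to a simple cap, sketched below (the function name is illustrative, not from any real graphics API):

```python
# Toy model of how refresh rate (Hz) caps the frames a viewer actually sees:
# the display can't show more frames than it refreshes, and it can't show
# frames that were never rendered.
def displayed_fps(refresh_hz: float, rendered_fps: float) -> float:
    """Frames per second the viewer actually sees."""
    return min(refresh_hz, rendered_fps)

print(displayed_fps(60, 144))   # 60 -> extra rendered frames are lost
print(displayed_fps(144, 60))   # 60 -> fast display, but only 60 frames arrive
```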
Motion of the ocean… and the virtual reality user.
High-performing visual rates are important for making virtual reality feel like the real world. In simulations, reducing latency, the delay between the viewer's action and the device's reaction, improves the navigation experience and the perceived responsiveness of the virtual world. The image should update as fast as the viewer turns their head to navigate. If the image lags behind the viewer's head movement, it can make them feel sick.
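One way to see why high refresh rates help with latency is to look at the per-frame time budget. The sketch below is a toy calculation; the often-quoted ~20 ms motion-to-photon target in the comment is an industry rule of thumb, not a figure from this post:

```python
# Per-frame time budget implied by a display's refresh rate. Tracking,
# rendering, and display scan-out must all fit inside this window if the
# image is to keep pace with the viewer's head movement (a ~20 ms total
# motion-to-photon latency is a commonly cited comfort target).
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available per frame at the given refresh rate."""
    return 1000.0 / refresh_hz

print(round(frame_budget_ms(60), 2))  # 16.67 ms per frame at 60 Hz
print(round(frame_budget_ms(90), 2))  # 11.11 ms per frame at 90 Hz
```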
Head and body movement allow the user to navigate and interact with virtual worlds. Environment responses aligned with our natural real-world gestures make for a more immersive experience and reduce the controls a user must learn to participate. The gesture tools a user can employ are both semantic and responsive. Semantic gestures are common real-world movements such as walking to a point or nodding one's head in agreement. Responsive gestures are how the user interacts with the environment, such as picking up objects or pushing a button. Environments are designed to replicate cues from the real world, such as opening a door with a doorknob rather than an abstract button.
Ongoing innovations may further improve realism by responding to the viewer's visual attention. For example, eye tracking can reveal where people are looking and hint at how they are processing what they see. That understanding can be used to trigger VR responses, such as a character walking over when the user looks at them. Identifying functional controls and recognizable real-world cues, and incorporating them as design features in the virtual world, helps optimize the visual experience.
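As a hypothetical sketch of that eye-tracking idea, a dwell-time trigger might look like this; every name and threshold below is illustrative, not taken from any real VR SDK:

```python
# Hypothetical gaze-dwell trigger: fire a VR response once the user's gaze
# rests on a target long enough. Samples are (timestamp_s, looked_at) pairs.
DWELL_THRESHOLD_S = 0.5  # assumed dwell time before the world reacts

def gaze_dwell_trigger(gaze_samples, target, threshold_s=DWELL_THRESHOLD_S):
    """Return True if gaze stayed on `target` for at least `threshold_s`."""
    dwell_start = None
    for t, obj in gaze_samples:
        if obj == target:
            if dwell_start is None:
                dwell_start = t
            if t - dwell_start >= threshold_s:
                return True  # e.g. the character starts walking over
        else:
            dwell_start = None  # gaze left the target; reset the timer
    return False

samples = [(0.0, "npc"), (0.3, "npc"), (0.6, "npc"), (0.9, "door")]
print(gaze_dwell_trigger(samples, "npc"))  # True: 0.6 s of sustained gaze
```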
Not just a pretty face (covered by a VR headset).
To be about more than just looks, VR technology uses elements of 3D audio to emulate the audio experience we expect of the real world. An assembly of sounds guides VR explorers through the virtual landscape, orienting them with cues about how they relate to their environment.
By adjusting the frequency, timing, and amplitude of sounds, designers can build 3D spatial images like those in the real world. Without this sound design, audio reaches both ears at the same time and level, diminishing the sense of depth. With it, listeners gain information and can infer where a sound is coming from and what it might be.
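The timing half of that trick can be sketched with Woodworth's classic approximation of interaural time difference, the delay between a sound reaching the near and far ear. This is a toy model rather than a full HRTF renderer, and the head radius is an assumed average:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
HEAD_RADIUS = 0.0875    # m, assumed average head radius

def interaural_time_difference(azimuth_deg: float) -> float:
    """Woodworth's approximation of ITD in seconds for a source at the
    given azimuth (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source straight ahead arrives at both ears simultaneously; one at
# 90 degrees reaches the far ear roughly 0.66 ms later.
print(interaural_time_difference(0.0))
print(round(interaural_time_difference(90.0) * 1000, 2))
```

Delaying (and slightly attenuating) the far-ear channel by this amount is one of the cues a spatial audio engine uses to place a sound in the listener's world.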
Feedback in the environment can supplement interactions to confirm user actions, such as a sound effect marking a door opening or, in an example from the 99designs blog, a ding triggered by the cash register in an e-commerce setting. This kind of feedback is less disruptive to the user experience than, say, a pop-up screen announcing that the order is being processed.
These world-building techniques are employed by devices such as the HTC Vive, Oculus Rift, and Google Cardboard to transport users into worlds and situations, some real and others unimaginable. The result of all these design features is truly out of this world.
If you are interested in seeing VR in action, come on out to Carroll Community College for an informal networking hour and VR demo experience. Enjoy conversation with folks from other businesses over local beer, wine, and hors d'oeuvres, and experience what VR can do for you and workforce training.
Then, join us for a small group discussion on how to leverage VR in training scenarios and, later on, at the County Workforce Development offices to hear a panel discuss and debate how workforce training is changing. We will explore the capabilities and possibilities of VR, consider how people might respond, and connect it all to the goal of ensuring our workforce has the skills they need to succeed.
Videos to go down a virtually simulated rabbit hole on VR:
RealSpace 3D Audio Demo – this is incredible and will blow your mind (wear headphones for a better experience).
Immersive Soundscape – demonstrates how sound sources can be designed to mimic where a user would expect to find them in the real world (wear headphones).
The Tucker Zone (A 3D Sound Experience) – I’ve been listening to this for ages; it’s super weird (keep those headphones on).