How Intel Optimized the User Experience for VR e-Sports


November 26th, 2018



VR, as a new form of interaction between humans and the virtual world, immerses users by making them feel they are experiencing the simulated reality. However, this is not easy to achieve, considering that the frame rendering budget for VR is 11.1ms per frame (90fps) and the entire scene must be rendered twice (once for each eye). In this article, we focus on both performance and user experience optimizations for VR games, introducing techniques used in the arena Premium VR game "Code51" to minimize motion sickness and increase user playtime in VR, as well as the optimizations and differentiations made in Code51 to improve the experience of both players and audience throughout the game.


Code51 is the first worldwide mech arena VR game supporting Oculus Rift, HTC Vive, PSVR, and Pico VR. It allows up to four-versus-four combat among players worldwide and is specifically tailored for VR e-sports, with a nausea-minimized gameplay design and a built-in VR spectator mode. Code51 has already been released in over 3,000 VR arcades and experience centers in China (ZamerVR & VRLe), and is targeted for release on the PlayStation Store, Oculus Store, and Steam in Q2'18.

Intel worked closely with Smellyriver to optimize the user experience and performance of the game. Moreover, Intel and Smellyriver added richer visual and audio enhancements enabled by Intel Core i7 processors, including features like 3D audio, object destruction, enhanced CPU particles, and additional background objects.

In this article, we’ll describe seven design points in Code51 that can help increase both the immersion and user experience of VR games.

Design Points of Immersive VR Games

Moving immersively in VR

Currently there are four classes of immersive motion tracking systems to drive player movements in the virtual world of a game, which are:

  • Teleport + Six DoF Tracking (e.g. Robo Recall)
  • Virtual Cockpit (e.g. EVE: Valkyrie)
  • Locomotion Simulator (e.g. Virtuix Omni)
  • Large Scale Tracking System (e.g. OptiTrack)

All of these solutions have different pros and cons. For Code51, the virtual cockpit was chosen as the means of movement in VR for the following reasons:

  1. Being able to move continuously is important for immersion in VR since it matches our real-world experience, and the virtual cockpit is the only approach that allows continuous movement in VR without extra hardware and cost.
  2. Considering that current sales of Premium VR headsets are not high enough, a movement method compatible with 3DoF VR headsets helps reduce the engineering effort of porting the game to those devices.
  3. Code51 is an arena VR game in which a player sits in the cockpit of a mech and fights with others in VR. This kind of "sitting down" experience perfectly matches the virtual cockpit movement model and increases immersion.
  4. Sitting and using a gamepad is less tiring than standing and using motion controllers, allowing users to play longer in VR[1]. Fig. 1 shows users playing Code51 with various VR headsets.

Fig. 1. Code51 supports various VR headsets. Left: HTC Vive (6 DoF). Right: Pico VR (3 DoF).

Alleviating motion sickness in VR

Motion sickness[2] is one of the main factors preventing users from experiencing VR for a relatively long period of time. There are several factors causing it, including:

  • Visual stress caused by the vergence-accommodation conflict of the viewers[3]
  • A VR scene without directional cues or references (i.e., no hints about the current direction of movement)
  • Low FPS or high MTP latency
  • Acceleration mismatch between what you see and what your body feels
  • Angular velocity
  • Zoom in & zoom out
  • VR blur

To address the factors leading to dizziness, several approaches were adopted in Code51 to minimize these effects from various angles:

  • UI Design
    • Place the rendered control panel at least 1 meter away from the user to avoid frequently changing the user's vergence distance
  • Level Design
    • Provide clear directional cues in the scene so that the user consistently knows which direction he/she is moving. This prepares the brain for the movement and reduces dizziness. In practice, avoid adding objects with plain textures that might block the user's whole view of the scene, since they can interrupt the visual sensation of the moving direction and increase the perceived variation in speed.
  • Rendering Performance
    • Optimize performance so that Code51 renders stably at 90fps, minimizing MTP (motion-to-photon) latency.
  • Reduce Acceleration
    • Avoid adding acceleration in Code51 as much as possible to reduce dizziness; velocity should change between levels instantly. For example, apply acceleration only for a short period to initiate an action, then maintain constant speed during jumping or landing.
  • Reduce Angular Velocity
    • Reduce the ability to rotate (angular velocity) at high speed; the cockpit acts as a reference and helps reduce perceived optical flow.
  • Dynamically reduce the field of view (FOV)
    • Restricting the FOV in VR helps reduce motion sickness[4][5]. This method was adopted in Code51, where the depth buffer is used to calculate the instantaneous velocities at the four corners of the screen, and the vertices are then warped inward accordingly in the stencil buffer to subtly reduce the FOV during movement. The faster the movement, the narrower the FOV, as shown in Fig. 2. This reduces the equivalent magnitude of optical flow perceived by end users.

Fig. 2. Dynamic FOV adopted in Code51 to eliminate the optical flow at the edges of the screen. It helps reduce the motion sickness induced in end users.
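As a rough illustration of the speed-to-FOV mapping described above, the sketch below maps the instantaneous movement speed to an FOV scale factor that could drive the stencil-mask warping. This is a hypothetical simplification: the thresholds and minimum scale are illustrative values, not Code51's actual parameters.

```cpp
#include <cassert>

// Map speed (m/s) to an FOV scale. Below minSpeed the view is
// unrestricted (scale = 1.0); above maxSpeed the FOV is at its
// tightest (scale = minScale); in between, blend linearly.
// All numeric defaults are illustrative assumptions.
double fovScaleForSpeed(double speed,
                        double minSpeed = 5.0,   // start narrowing here
                        double maxSpeed = 30.0,  // fully narrowed here
                        double minScale = 0.7)   // tightest FOV fraction
{
    if (speed <= minSpeed) return 1.0;
    if (speed >= maxSpeed) return minScale;
    double t = (speed - minSpeed) / (maxSpeed - minSpeed); // 0..1
    return 1.0 - t * (1.0 - minScale); // linear blend toward minScale
}
```

A smoothstep curve instead of the linear blend would make the narrowing less noticeable at the transition points, at negligible extra cost.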

Minimizing network latency

For VR applications, it's also important to minimize network latency to maintain fluent gameplay without lag, since unoptimized latency can induce dizziness. In Code51, all mech motion is predicted locally and then refined through low-frequency synchronization with the server to avoid frequent interruptions under poor network conditions. If synchronization fails, the client extrapolates the previous trajectory to obtain the new position.
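The predict-then-reconcile pattern above can be sketched as follows. This is a minimal one-dimensional illustration of the general technique (dead reckoning plus smoothed correction); the struct, function names, and blend factor are hypothetical, not Code51's actual implementation.

```cpp
#include <cassert>

// Last server-confirmed state of a mech (1D for brevity).
struct MechState {
    double position;
    double velocity;
};

// Predict where the mech should be dt seconds after the last update,
// assuming it keeps its last known velocity (dead reckoning).
double extrapolate(const MechState& s, double dt)
{
    return s.position + s.velocity * dt;
}

// When a fresh server snapshot arrives, blend the local prediction
// toward it at a low rate (alpha) so corrections don't visibly snap.
double reconcile(double predicted, double serverPos, double alpha)
{
    return predicted + alpha * (serverPos - predicted);
}
```

If a synchronization packet never arrives, the client simply keeps calling `extrapolate` with a growing `dt`, which matches the fallback behavior described above.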

Enhancing spectator viewing experience for e-Sports

To create a better viewing experience for the e-Sports audience, a built-in VR spectator mode was implemented in Code51. The spectator mode acts like a client in the game and allows the audience to watch a live battle in either stereo or mono mode. Spectators can use VR HMDs or not, and can view from any angle and position they want via keyboard, mouse, or gamepad control, as shown in Fig. 3.

Fig. 3. VR spectator mode in Code51. End users can view a live battle in VR or non-VR mode.

Maintaining the sharpness of the rendered scene

Since the PPD (pixels per degree) of current VR HMDs is still low compared with conventional displays, it's critical for VR apps to preserve the sharpness of the rendered scene as much as possible to minimize VR blur and reduce dizziness. Currently there are three types of anti-aliasing (AA) approaches available to UE4 developers:

  • Temporal AA (TAA)
    • TAA and its derivatives offer good AA quality for static environments at moderate computational cost. If TAA is used in VR apps, it must be computed in an additional pass for non-opaque objects, since no depth information is available for them. It's also better to adopt conservative parameter settings for TAA in VR to reduce the blur generated during head movement or in dynamic scenes.
  • Multi-Sample AA (MSAA)
    • MSAA is available only in the forward rendering pipeline; it's adopted in Code51 to minimize VR blur generated by AA. 4x MSAA is good enough for VR.
  • Screen Space AA (SSAA)
    • SSAA has the highest computational cost but achieves the best quality; 1.4x SSAA is also used in Code51.

CPU performance optimization

CPU optimization is also critical to the VR experience, ensuring consistent and fast submits to the graphics pipe and preventing stalls in VR scene rendering. To minimize the CPU-boundedness of a DX11 VR game made with UE4, reduce the render thread workload as much as possible[6]. The RHI thread for DX11 in UE4 (4.20+) helps reduce CPU render thread overhead through the D3D11 deferred rendering context[7], and should be adopted whenever possible.

In addition, there are various approaches to optimize the render thread workload in UE4[8], where some of them are deployed in Code51:

  • Reduce draw calls (ideally fewer than 2,000 for the forward rendering pipeline on Premium VR in our experience; the more the GPU is optimized, i.e. the less GPU computation time per frame, the larger this budget can be)
  • Optimize visibility culling (InitView)
    • Modify assets to reduce dynamic occlusion culling (hardware occlusion queries or hierarchical z-buffer culling), since it is computationally inefficient to calculate visibility culling for an object of which only a small portion shows on screen. An example of this optimization is shown in Fig. 4.
    • Use precomputed visibility culling[9] to reduce the number of primitives that must be processed by dynamic occlusion culling at runtime. However, it's not very efficient in Code51 because mechs can jump and fly in the game. The statically occluded primitives (~300) account for 20% of the total occluded primitives in Code51 when precomputed visibility culling is enabled.
    • Masked Occlusion Culling (GitHub/Paper) is an occlusion culling approach implemented on the CPU with SIMD acceleration (SSE, AVX, and AVX-512). It is an alternative to hierarchical z-buffer culling that can be parallelized and computed efficiently on modern multi-core CPUs.
  • Reduce dynamic lighting; turn off shadow projection for dynamic lights, or use capsule shadows instead when the budget allows
  • Use light baking as much as possible
  • Use LOD & HLOD; keep the triangle count < 2.5M for Premium VR
  • Use particle LODs and don't use non-opaque particles at LOD0

Fig. 4. An asset modified from the left version to the right to avoid culling calculations for objects behind it, which could be seen through gaps in the original asset.

Differentiation and deepening the immersive experience

Last but not least, one key approach to optimizing the user experience of VR apps is to utilize all available computational resources on a hardware platform in order to deliver the best experience on that platform. Well-optimized CPU compute and higher-thread-count CPUs enable developers to employ the CPU to deepen the VR immersion. For example, a user with an Intel Core i7-7700K has more CPU threads and resources than a user with only an Intel Core i5-4590 (the minimum CPU spec for Oculus Rift without using ASW). Thus the experience of a VR app can be improved by adding extra CPU-intensive visual and audio features that consume those extra resources.

In Code51, we implemented several CPU enhanced features to better utilize the CPU resources available on high end Intel Core i7 CPUs. These features include 3D audio, object destruction, CPU particle enhancement and additional background effects.

Most of the CPU computation of the visual and audio enhancements is offloaded to the worker threads or the audio simulation thread in UE4, including physics simulation performed by PhysX and ray tracing for physically based sound propagation performed by the Steam Audio plugin (occlusion and environmental audio). These features significantly increase the immersion of Code51 on high-end CPUs without a performance drop, since most of the computation is offloaded to idle CPU cores and the impact on the critical rendering path (the CPU render thread and the GPU) is minimized.
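The offloading pattern described above can be sketched in a few lines: kick expensive, latency-tolerant work (a particle or audio simulation step, say) to a worker thread early in the frame, leave the render-critical path free, and collect the result when it's needed. This sketch uses plain `std::async` rather than UE4's task graph so it stays self-contained; the workload is a stand-in.

```cpp
#include <future>
#include <numeric>
#include <vector>
#include <cassert>

// Launch a heavy, non-render-critical computation on a worker thread.
// The accumulate call is a placeholder for a real simulation step.
std::future<double> startBackgroundWork(std::vector<double> samples)
{
    return std::async(std::launch::async,
                      [s = std::move(samples)]() {
                          return std::accumulate(s.begin(), s.end(), 0.0);
                      });
}

// On the main/render thread: call startBackgroundWork() at frame start,
// do the render-critical work, then call .get() on the future only when
// the result is actually consumed, so idle cores absorb the cost.
```

In UE4 the same intent is usually expressed with `AsyncTask` or the task graph, which reuse pooled worker threads instead of potentially spawning one per call.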

Fig. 5 shows the corresponding frame rate data for Code51. The game runs smoothly on both the Intel Core i7-7700K and Intel Core i7-7820HK with the enhanced features on, but drops frames significantly on the Intel Core i5-4590 with the same settings, implying that the i5 CPU cannot handle all the enhanced features within 11.1ms per frame. Users with min-spec CPUs can still maintain a good experience by turning off all the enhanced CPU features (the Low quality setting). According to the performance data, the Intel Core i7-7700K shows a 27% fps (frames per second) benefit over the Intel Core i5-4590 when all CPU-enhanced features are turned on in Code51.

Fig. 5. The frame rate of Code51 running on different CPUs (Intel Core i5 and  i7 processors) and quality settings, where Ultra High is the one with CPU enhanced features and Low is the one without. Test systems:  Intel Core i5-4590, NVIDIA GTX1080, 2x4GB DDR3-1600, Windows 10 version 1703; Intel Core i7-7700K, NVIDIA GTX1080, 4x4GB DDR4-2400, Windows 10 version 1703; Intel Core i7-7820HK, NVIDIA GTX1080, 4x4GB DDR4-2400, Windows 10 version 1703.

Figs. 6 and 7 show GPUView screenshots of Code51 in Ultra High quality running on an Intel Core i7-6800K with the CPU cores set to 4C4T and 6C12T respectively (via msconfig), at the same CPU frequency. It's clear from the charts that in the 4C4T configuration (a proxy for 4C4T Intel Core i5 processors), a frame cannot be rendered within 11.1ms and frames are dropped, while the game runs smoothly in the 6C12T configuration.

Here are the corresponding percentage increases in CPU computation on the different threads of Code51, running on the same Intel Core i7-6800K in the two configurations (6C12T vs. 4C4T), as observed in Windows Performance Analyzer (WPA):

Total CPU workload:       43%↑

Render thread:                44%↑

Driver thread:                  10%↑

Game thread:                  13%↑

Worker threads:               89%↑

A significant amount of the enhanced features' CPU work was offloaded to the worker threads in Code51, leading to the largest computation increase on those threads.

Fig. 6. A GPUView screenshot of Code51 on Ultra High setting running on Intel Core i7-6800K and GTX1080. To demonstrate the scalable experience behavior of Code51, the CPU cores were set to 4C4T by msconfig in this case. Test system:  Intel Core i7-6800K, NVIDIA GTX1080, 4x4GB DDR4-2400, Windows 10 version 1703.

Fig. 7. A GPUView screenshot of Code51 on Ultra High setting running on Intel Core i7-6800K and GTX1080. There is no modification on CPU cores (6C12T) in this case, and Code51 employs the additional CPU compute resources to deliver a richer immersive experience. Test system:  Intel Core i7-6800K, NVIDIA GTX1080, 4x4GB DDR4-2400, Windows 10 version 1703.


Making VR users feel comfortable and alleviating motion sickness are two critical factors in the success of an e-Sports VR game; only games with long playtime and strong user engagement can succeed in e-Sports. Code51 adopted the various approaches described in this article to reduce motion sickness and deepen the immersive experience, while also creating a spectator mode that improves the viewing experience for the audience. For performance optimization, the render thread is usually the main CPU bottleneck in UE4 DX11 VR apps according to GPUView and WPA; standard game optimization methods and offloading tasks from the render thread to worker threads help here. It's also beneficial to leverage all available CPU resources to make your game stand out from the crowd, by adding CPU-intensive features such as those implemented in Code51.

Click here to join the Intel® Game Dev program for free tools, resources, and opportunities to help you bring the best game experience to the biggest worldwide audience.


Ashutosh Gupta