HTC Vive Pro Eye Preview – Necessary step in the evolution of VR


Although much has happened in virtual reality in recent years, with great strides in hardware, image quality and software, many barriers to widespread adoption remain. Comfort, ease of use, the range of available software and the field of view are a few of them, but let's not forget a very simple one: the price. If you want the best VR experience, you have to pay a lot of money for a high-end headset and connect it to a very powerful PC. And since headset resolutions are only expected to increase in the coming years, the required computing power will rise further as well.

The problem of computing power also has a solution, or at least something that can ease the pain. Currently, VR graphics are rendered rather naively: every part of the screen gets the same amount of detail, while the human eye does not perceive equal detail across its entire field of view. In fact, only the center of the field of view needs to be sharp; the edges can be shown with less detail. The rendering technique based on this principle is called foveated rendering, and it requires VR headsets to have built-in eye tracking, so that the image is always sharp at the point where the user is looking.
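The core idea can be sketched in a few lines: each frame, pick a shading rate per screen region based on how far it is from the tracked gaze point. The function name, the region radii and the rate labels below are illustrative assumptions, not HTC's actual parameters.

```python
import math

def shading_rate(px, py, gaze, fovea_radius=0.10, mid_radius=0.25):
    """Pick a shading rate for a pixel based on its distance from the
    gaze point (all coordinates normalized to [0, 1]). The radii are
    illustrative values, not the headset's real configuration."""
    dist = math.hypot(px - gaze[0], py - gaze[1])
    if dist < fovea_radius:
        return "full"      # full detail where the user is looking
    elif dist < mid_radius:
        return "half"      # reduced detail in the near periphery
    return "quarter"       # coarse shading at the edges

# The gaze point would come from the headset's eye tracker each frame.
print(shading_rate(0.5, 0.5, gaze=(0.5, 0.5)))  # full
print(shading_rate(0.9, 0.9, gaze=(0.5, 0.5)))  # quarter
```

Because the gaze point moves every frame, the high-detail region follows the eyes rather than sitting fixed in the center of the display.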

During previous editions of CES we already saw prototype VR headsets that support this, and we know, for example, that Oculus is working on a new version of the Rift with eye tracking. However, the first major manufacturer to bring this technique to the masses is HTC. During CES, the manufacturer announced a new variant of the Vive Pro that knows exactly where the user's eyes are pointed, and we were able to try it out.

Little changed

The Vive Pro Eye is a Vive Pro with eye tracking. From the outside you hardly see any difference, and in terms of operation and fit you would not realize as a user that you are dealing with a new headset. Looking inside, you'll see tiny openings around the lenses, presumably to shine infrared light onto the user's eyes, which is how most eye-tracking solutions we've seen so far work. The exact workings remain a guess for now, because HTC does not want to go into details about the technology at this time. More important, of course, is that it works, and that is certainly the case. We were able to try three demos in total, each of which showed some aspect of what is possible when software developers have information about the position of the eyes.

More beautiful for less

The most interesting demo was developed by the company Zerolight and let users virtually experience a BMW. In the demo it was possible to choose the color and actually get into the car to take a closer look at the interior and play with the various controls, buttons and options. Built in the popular Unity engine, this demo used foveated rendering for higher image quality: the part of the image right in front of the user's eyes was rendered at nine times the normal resolution, using an Nvidia technique called variable rate shading. That high-resolution image is then scaled back down, and thanks to the high degree of supersampling, the end result suffers less from jagged edges.
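Nine times the pixel count corresponds to a 3×3 grid of samples per output pixel. The scaling-back step can then be as simple as averaging each 3×3 block, which is what smooths out the jagged edges. The box filter below is a minimal sketch of that idea, not Zerolight's actual downscaling code.

```python
def downsample_3x3(hi_res):
    """Average each 3x3 block of a supersampled image (a list of rows
    of brightness values) down to one pixel -- a simple box filter
    standing in for the scaling step described above."""
    h, w = len(hi_res), len(hi_res[0])
    out = []
    for y in range(0, h, 3):
        row = []
        for x in range(0, w, 3):
            block = [hi_res[y + dy][x + dx] for dy in range(3) for dx in range(3)]
            row.append(sum(block) / 9)  # average the nine samples
        out.append(row)
    return out

# Nine samples along a jagged edge collapse into one softened pixel.
print(downsample_3x3([[9, 9, 9], [9, 0, 9], [9, 9, 9]]))  # [[8.0]]
```

A real renderer would do this filtering on the GPU, but the principle is the same: many samples in, one anti-aliased pixel out.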

The portion of the image rendered this way is between a quarter and an eighth of the total, Zerolight technical director Chris O'Connor told us. Around it is a small area calculated at four times the normal resolution, and outside of that the image is simply rendered at the native resolution of the panel in the Vive headset. These three zones of different resolutions could be visualized during the demo with an overlay that marked in color which part of the image was drawn at which sharpness. It was interesting to see those colored areas constantly move along with the user's gaze.

Before trying it ourselves, we could watch on a screen how someone else experienced the demo, so we knew how the system worked and where any hard transitions in resolution should be visible. When we put the headset on ourselves, however, there was no trace of them, nor could we notice any latency in the system. During the demo, the software switched between normal and foveated rendering a number of times, and it was clearly visible that part of the image was sharper in the latter mode. It was noticeable, though, that the limited field of view of current headsets still limits the benefit of this technique: because of the lenses used, the image in VR headsets quickly becomes blurry when you look out of the corner of your eye, which trains you to turn your head toward whatever you want to see.

According to O'Connor, this technique delivers the extra resolution at about a third of the computing power you would need to render the complete image at that high resolution. Developers can also choose to keep the image quality in the center the same but reduce it at the edges, where the difference is not noticeable. That way the perceived image quality stays the same while the required computing power drops significantly, which is especially interesting for mobile VR, where the available computing power is very limited.
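That "about a third" claim is easy to sanity-check with the article's own figures: a foveal region of between an eighth and a quarter of the image at 9× shading cost, a small ring at 4×, and the rest at 1×, compared against rendering the whole frame at 9×. The exact area split below is our assumption within the quoted range.

```python
# Illustrative area split within the ranges the article quotes.
fovea, ring = 1 / 6, 1 / 12        # foveal region and surrounding ring
rest = 1 - fovea - ring            # the remainder at native resolution

foveated_cost = fovea * 9 + ring * 4 + rest * 1
full_res_cost = 9                  # rendering the entire frame at 9x

print(f"{foveated_cost / full_res_cost:.0%}")  # 29%
```

That lands just under a third of the full-resolution cost, consistent with O'Connor's estimate.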

Useful for training

A second demo put us in the cockpit of a Lockheed Martin aircraft, where we had to go through a start-up procedure following audio instructions. That involved operating various switches in the correct order, which in this demo was done simply by looking at them. It was not necessary to move the whole head, as was required until now; moving the eyes was enough. We pushed the limits of the system by turning our head sharply away from the object we had to look at while turning our eyes the other way. Even then, the system followed our gaze just fine. According to the makers, eye tracking can be very useful in this type of training simulation, because instructors not only see what students do, but also where their attention is.

The last demo was the least impressive and used eye tracking simply to navigate menus. In VR software you sometimes have to select a button by looking at it for a few seconds, and with current headsets that means turning your entire head so that the button sits in the center of the field of view. This demo showed that eye tracking makes the same interaction far more subtle.
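Both the cockpit switches and the menu buttons rely on the same pattern: a dwell timer that fires once the gaze has rested on a target long enough. The class below is a minimal sketch of that pattern; the names and the two-second threshold are our assumptions, not the demos' actual implementation.

```python
class DwellSelector:
    """Fire a target once the gaze has stayed on it for `dwell` seconds."""

    def __init__(self, dwell=2.0):
        self.dwell = dwell
        self.target = None   # target currently being gazed at
        self.since = 0.0     # timestamp when the gaze landed on it

    def update(self, gazed_target, now):
        """Call once per frame with the currently gazed-at target (or
        None) and a timestamp; returns the target when it is selected."""
        if gazed_target != self.target:
            # Gaze moved: restart the dwell timer on the new target.
            self.target, self.since = gazed_target, now
            return None
        if gazed_target is not None and now - self.since >= self.dwell:
            self.target = None   # reset so the selection fires only once
            return gazed_target
        return None

sel = DwellSelector(dwell=2.0)
sel.update("start_button", 0.0)                 # gaze lands on the button
print(sel.update("start_button", 2.5))          # start_button
```

With head-based pointing the same timer exists, but the "gaze" is just the headset's forward vector; eye tracking only swaps the input, which is why the interaction feels so much lighter.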

Conclusion

Judging by the demos we experienced, the Vive Pro Eye does exactly what you expect it to do. As an end user you notice nothing of what happens under the hood, while software developers can use the eye-tracking data to increase image quality or lower system requirements. As always, these kinds of innovations start at the top end of the market, and the Vive Pro Eye is clearly aimed at companies and the most enthusiastic end users. The fact that HTC continues to sell the regular Vive Pro suggests that the Eye variant will be priced above it, making it not inconceivable that it will cost more than a thousand euros.

Although the impact of this particular model on the industry will be small, it demonstrates that foveated rendering works and can be beneficial. Hopefully, the technique will soon find its way to more affordable headsets, because that is where the biggest gains can be made right now. Those who really can't wait can get their hands on a Vive Pro Eye sometime in the second quarter of this year.
