There’s a problem with virtual reality.
For all its promise, it’s still pretty clunky; in order to get a high-quality experience, a user must be tethered to a server or PC — a limitation that, by its very nature, diminishes the quality of the experience. Mobile hardware and wireless networks are not yet advanced enough to “sever the tether,” or, as Purdue University's Y. Charlie Hu puts it, “today’s mobile hardware and wireless networks are about 10 times too slow for high-quality, immersive VR.”
So researchers are proposing a software solution in the form of a platform called Furion, which allows high-quality VR games to be played untethered on a smartphone. As presented at the ACM MobiCom 2017 conference, Furion’s quality of experience (QoE) passes the acceptability mark for user satisfaction: a per-frame rendering time of 16 milliseconds, or roughly 60 frames per second.
Google’s Pixel XL smartphone, by contrast, needs 111 milliseconds to render a frame on its own. Rendering every frame on the server and transmitting it to the smartphone takes even longer: at the highest Wi-Fi data rate that current smartphones can support, that works out to roughly 200 milliseconds per frame. The higher the number, the clunkier the experience.
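The relationship between per-frame time and frame rate is simple arithmetic; a quick check of the numbers above:

```python
def fps(frame_time_ms: float) -> float:
    """Convert a per-frame rendering time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

print(round(fps(16), 1))   # target budget: ~62.5 fps, just above the 60 fps bar
print(round(fps(111), 1))  # Pixel XL rendering on its own: ~9 fps
print(round(fps(200), 1))  # server rendering plus Wi-Fi transfer: 5 fps
```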
The heavy computational workload of VR apps is partially due to the constant need to render updates to the virtual world’s background environment. However, that background environment is largely unchanged from frame to frame — and the necessary changes revolve primarily around the user’s position. That’s where Furion comes in.
“The user’s position doesn’t change randomly,” Hu points out. “You move continuously and in a very predictable way. That means we can predict how the background will change based on the user’s position, and pre-render the background.”
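A minimal sketch of what such prediction could look like, assuming simple linear extrapolation from the two most recent position samples (the function and the model here are illustrative, not Furion's actual predictor):

```python
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

def predict_position(samples: List[Vec3], dt: float) -> Vec3:
    """Linearly extrapolate the user's next position from the two most
    recent position samples, which are spaced dt seconds apart."""
    (x0, y0, z0), (x1, y1, z1) = samples[-2], samples[-1]
    # Estimate velocity from the last two samples.
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    # Assume motion continues smoothly for the next dt seconds.
    return (x1 + vx * dt, y1 + vy * dt, z1 + vz * dt)

# A user walking steadily along the x-axis: the next position is
# extrapolated one frame interval (16 ms) ahead.
print(predict_position([(0.0, 0.0, 0.0), (0.1, 0.0, 0.0)], dt=0.016))
```

Because the predicted position is known one frame ahead, the corresponding background can be pre-rendered and pre-fetched before the user arrives there.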
Furion splits up the rendering: the background is rendered on the PC or server, while the less computationally heavy foreground is rendered on the mobile device. In contrast to current VR systems, which “fetch” rendered frames from the server as needed, Furion “pre-fetches” the background, anticipating fetch commands ahead of time. The background is rendered as a panoramic image and split into four pieces, each of which can be decoded by one of the smartphone’s four microprocessor cores. The result can then be cropped automatically to match the user’s changing viewing angle.
“You may suddenly turn your head, so if we render the whole panoramic frame for a location in the virtual world, it can simply be cropped properly to match wherever you are looking,” Hu explains.
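A rough sketch of that pipeline, treating the panorama as a simple 2-D pixel grid and handing each quarter to its own worker, one per core (the decode step is a placeholder; Furion's actual codec and scheduling are not shown):

```python
from concurrent.futures import ThreadPoolExecutor

def decode_tile(tile):
    """Placeholder for hardware-decoding one quarter of the panorama."""
    return tile  # a real implementation would decompress the tile here

def render_view(panorama, view_start, view_width):
    """Decode a panoramic frame as four parallel tiles, then crop the
    columns that match the user's current viewing angle."""
    rows, cols = len(panorama), len(panorama[0])
    quarter = cols // 4
    # Split the frame into four vertical strips, one per core.
    tiles = [[row[i * quarter:(i + 1) * quarter] for row in panorama]
             for i in range(4)]
    with ThreadPoolExecutor(max_workers=4) as pool:
        decoded = list(pool.map(decode_tile, tiles))
    # Stitch the decoded strips back together, then crop to the view.
    full = [sum((t[r] for t in decoded), []) for r in range(rows)]
    return [row[view_start:view_start + view_width] for row in full]

# A toy 2x8 "panorama"; crop a 4-column window starting at column 2.
pano = [[c for c in range(8)], [c + 10 for c in range(8)]]
print(render_view(pano, view_start=2, view_width=4))
# → [[2, 3, 4, 5], [12, 13, 14, 15]]
```

Because the whole panorama for a location is already decoded, a sudden head turn only changes which columns are cropped, not what has to be fetched.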
This “cooperative rendering” approach cuts frame-rendering time to 14 milliseconds on the Pixel XL, enough to meet the QoE bar for high-quality VR.
Furion was tested with three popular, high-quality VR games: Viking Village, Corridor and Nature. The Purdue Research Foundation’s Office of Technology Commercialization has filed a provisional patent for the platform.