
Oculus VR News | December 10, 2019


Facebook Open Sources 'DeepFocus' AI-Powered Rendering System

Image courtesy of: Oculus

Staff Writer

Oculus’ parent company Facebook has announced that it is releasing the code behind its AI-powered rendering system, DeepFocus, which creates realistic, natural-looking real-time blur for a more true-to-life visual experience in virtual reality.

The DeepFocus software was developed to work with Oculus’ latest prototype headset, ‘Half Dome’, which Facebook Reality Labs (FRL) unveiled to the public earlier this year. Unlike the Oculus Rift, ‘Half Dome’ uses eye tracking and a varifocal display system that physically moves the display panels inside the headset depending on where the user is looking, in order to accurately reproduce focal depth for nearby objects in the virtual world.

Even though Half Dome’s hardware is very advanced, the prototype headset still requires innovative software to mimic how we naturally see the world. That’s where DeepFocus comes into play: it recreates the feeling of depth in VR by generating a realistic, gaze-contingent defocus effect that runs in real time. This defocus effect essentially makes one part of an image appear much sharper than the rest.
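DeepFocus itself uses a trained neural network, but the underlying idea of depth-dependent, gaze-contingent blur can be illustrated with a classical thin-lens model. The sketch below is a toy illustration only, not the DeepFocus method; all parameter values (focal length, aperture, blur radius) are illustrative assumptions:

```python
import numpy as np

def circle_of_confusion(depth, focus_dist, focal_len=0.017, aperture=0.004):
    # Thin-lens circle-of-confusion diameter (metres) at each pixel's depth.
    # focal_len and aperture are illustrative values, not DeepFocus parameters.
    return aperture * focal_len * np.abs(depth - focus_dist) / (
        depth * (focus_dist - focal_len))

def defocus_blur(image, depth, focus_dist, max_radius=4):
    # Toy gaze-contingent defocus: a per-pixel box blur whose radius grows
    # with the circle of confusion. O(h*w*r^2) -- a sketch, not real-time.
    coc = circle_of_confusion(depth, focus_dist)
    scale = coc.max() or 1.0          # avoid divide-by-zero if all in focus
    radius = np.rint(coc / scale * max_radius).astype(int)
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    for y in range(h):
        for x in range(w):
            r = radius[y, x]
            out[y, x] = image[max(0, y - r):y + r + 1,
                              max(0, x - r):x + r + 1].mean()
    return out
```

Pixels whose depth matches the focus distance keep a zero-radius kernel and stay sharp, while pixels at other depths are increasingly averaged with their neighbours, mimicking the sharp-versus-blurry split the article describes. A real system like DeepFocus replaces this brute-force loop with a learned network fast enough to run per frame.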

Oculus DeepFocus

“Our eyes are like tiny cameras: When they focus on a given object, the parts of the scene that are at a different depth look blurry,” says Marina Zannoli, a vision scientist at Facebook Reality Labs who worked on the DeepFocus project, in a blog post from the Oculus team. “Those blurry regions help our visual system make sense of the three-dimensional structure of the world, and help us decide where to focus our eyes next. While varifocal VR headsets can deliver a crisp image anywhere the viewer looks, DeepFocus allows us to render the rest of the scene just the way it looks in the real world: naturally blurry.”

One of the other key potential benefits of DeepFocus is that it provides a much more comfortable experience in VR overall. “This is about all-day immersion,” says Douglas Lanman, FRL’s Director of Display Systems Research. “Whether you’re playing a video game for hours or looking at a boring spreadsheet, eye strain, visual fatigue and just having a beautiful image you’re willing to spend your day with, all of that matters.”

The Oculus team hopes that by making the DeepFocus source code and training data available to the wider community, including engineers developing new VR systems, vision scientists, and other researchers studying perception, it will accelerate development in this area and benefit the industry as a whole.