Devlog 02: Should VR be Reality-Based?

Willem Helmet Pickleman ’18

The sense of “presence,” or immersion, that a user feels in a virtual reality experience arises from many factors. A high frame-refresh rate, high-pixel-density displays, precise refractive lenses, and accurate 3D audio all contribute to immersion by creating a virtual world that mimics our physical reality. If the VR design falls short in any of these areas, the experience can quickly turn sour. Low frame rates delay the perspective update after the user’s head turns, which can lead to ‘simulator sickness.’ The absence of normal visual cues, such as being able to see part of your own nose or to look down and see your legs, can also create malaise. The subconscious has to be tricked in all the right ways to create presence; fail at this, and it becomes confused, leaving the user feeling sick or uneasy. Virtual reality developers thus face the challenge of keeping the user’s subconscious grounded and secure while still creating visual environments that could never exist in real life.
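The frame-rate pressure described above can be made concrete as a per-frame time budget. A minimal Python sketch, using common display refresh rates (90 Hz is a widely cited VR headset target; these figures are general industry numbers, not measurements from this project):

```python
# Per-frame render budget at common display refresh rates.
# A higher refresh rate means a smaller gap between head motion and
# the updated image, reducing the risk of simulator sickness.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render a single frame."""
    return 1000.0 / refresh_hz

for hz in (60, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.1f} ms per frame")
# 60 Hz -> 16.7 ms per frame
# 90 Hz -> 11.1 ms per frame
# 120 Hz -> 8.3 ms per frame
```

The takeaway is that a 90 Hz headset leaves barely 11 ms to render two eye views; miss that budget and the delayed head-tracking response is exactly what triggers discomfort.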

The dev team has considered multiple approaches to capturing the Gaspee story in an immersive way.

One major consideration in creating content for our project is fidelity to both the historical record and physical reality. The spaces we build need to emulate the real environments that Rhode Island revolutionaries inhabited. With 360 video this is easy: find or dress a location that mimics the original environment and set up a 360 camera. For example, we can shoot video that captures the real geography (only slightly altered after more than 200 years) at Gaspee Point, or capture the inside of a tavern that retains its old-fashioned construction. In real-time VR, however, it’s not so simple. Since every object is a 3D model, assets need to be built at a 1-to-1 scale with their real-world counterparts. Users can easily recognize when an object’s dimensions are even slightly off, and that recognition breaks total immersion.
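Keeping assets at 1-to-1 scale mostly comes down to consistent unit conversion, since Unity’s convention is that one world unit equals one meter while period sources tend to record dimensions in feet. A small Python sketch of such a conversion helper (the doorway measurement below is an illustrative placeholder, not an actual Gaspee-era figure):

```python
# Convert recorded dimensions to engine units, assuming the common
# Unity convention of 1 world unit = 1 meter. Historical sources
# typically give measurements in feet.

FEET_TO_METERS = 0.3048

def to_engine_units(feet: float) -> float:
    """Convert a measurement in feet to meters, rounded to the mm."""
    return round(feet * FEET_TO_METERS, 3)

# Hypothetical example: a tavern doorway recorded as 6.5 ft tall.
print(to_engine_units(6.5))  # 1.981 (meters)
```

Routing every measurement through one helper like this is a cheap way to avoid the slightly-off dimensions that the paragraph above warns will pull users out of the experience.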

Navigating the uncanny valley, where man-made objects look almost, but not quite, real enough to be convincing, will be a challenge for the Gaspee team. To create assets rapidly while avoiding the pitfalls of photorealism, we’ve decided to build all of our assets in Google’s Tilt Brush application, a 3D drawing program whose finished objects can be exported into Unity with the aid of an SDK developed by Google. Tilt Brush is fast and intuitive, letting the entire class quickly create unique, professional-looking models for the VR experience. By combining stylized designs from Tilt Brush with true-to-life imagery from 360 location shoots, the team plans to create a diverse, fully immersive Gaspee experience.