Capture GPS and "True North" Meta-Data for VR Geotours

This Case Study explores how to shoot a VR “Geotour” with consistent camera orientation and GPS markers

Calibrating multiple cameras with GPS and orientation metadata to create a VR Video Geotour

VR video can be a very effective and compelling way to create a “VR geotour,” an immersive experience that strongly conveys the physical space of a specific location. VR geotours might be used to help trainees find their way within a factory or warehouse, to present a commercial or residential real estate opportunity, or for travel and tourism experiences.

In this case study, we review the importance of capturing GPS and relative camera orientation data using on-camera markers, much like the clapperboard “slate” that has been used in film production for 100+ years to provide a visual reference for sound sync.

Using a “True North” Marker On Location

For each shot in the VR video geotour, Cristina can be seen in the opening frames holding a visual marker. This footage will later be trimmed out of the shot, along with the usual head and tail footage of the camera operator walking in to manually start or stop the camera. This video shows one complete camera take from the shoot, where you can see the heads-and-tails activity around the camera.

Step 1: Marker

For each shot, Cristina used her iPhone’s Compass application to identify true geographic north. She then held a small blue plastic globe as a visual identifier to mark “north” in the shot.

Step 2: North Offset

This north marker can now be used in post-production either (a) to offset the yaw of the stitched VR video so that north lies at the center of the frame, or (b) to measure the marker’s offset from the center of the image in degrees east or west, which can then be stored as the shot’s “north offset” when creating the geotour.
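As a rough illustration of option (b), assuming the stitched output is a standard equirectangular frame whose horizontal axis maps linearly to yaw (center column = 0 degrees), the north offset can be read straight off the marker’s pixel position. The function name and parameters below are hypothetical, not part of any SPIN Studio workflow:

```python
def north_offset_degrees(marker_x: float, frame_width: float) -> float:
    """Yaw offset of the "true north" marker from the center of an
    equirectangular frame, in degrees (east positive, west negative).

    Assumes the frame spans a full 360 degrees horizontally, with the
    center column corresponding to 0 degrees of yaw.
    """
    offset = (marker_x / frame_width) * 360.0 - 180.0
    # Wrap into the range [-180, 180)
    return (offset + 180.0) % 360.0 - 180.0


# Example: a marker seen 3/4 of the way across a 4096-pixel-wide frame
# sits 90 degrees east of the frame center.
print(north_offset_degrees(3072, 4096))  # 90.0
```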

Step 3: Normalized North

Here is the same shot, this time with north at the center of the video. This normalization is critical for establishing geospatial consistency between shots, which will keep viewers from becoming disoriented and enhance the feeling of immersion.
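In practice you would apply this correction in your stitching or editing tool, but as a sketch of what the normalization does, it amounts to a simple horizontal roll of the equirectangular frame. The NumPy-based helper below, including its sign convention, is an illustrative assumption rather than the tool’s actual implementation:

```python
import numpy as np

def normalize_north(frame: np.ndarray, north_offset_deg: float) -> np.ndarray:
    """Horizontally rotate an equirectangular frame (H x W x C) so that
    true north lands at the center column.

    A positive north_offset_deg means the marker was seen east (right)
    of center, so the image is shifted left by that many degrees.
    """
    width = frame.shape[1]
    shift_px = int(round(north_offset_deg / 360.0 * width))
    # np.roll with a negative shift moves columns to the left.
    return np.roll(frame, -shift_px, axis=1)
```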

Adding Interactive Markers

Now that we have identified absolute “north” for each shot, we can add interactive markers that connect each camera position on the map to the others. Because we know which way north is in every shot, we can maintain the viewer’s gaze direction as we take them between camera locations.

This consistency puts the viewer in control of the experience and lets transitions between cameras respect the viewer’s actual body and head orientation, greatly enhancing the sense of presence in the VR geotour.
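As a sketch of the underlying arithmetic (not SPIN Studio’s actual implementation), preserving gaze across a teleport amounts to converting the viewer’s yaw into a compass bearing using the source shot’s north offset, then converting back using the destination shot’s offset. If both shots have already been normalized so north sits at center, the yaw simply carries over unchanged:

```python
def teleport_yaw(viewer_yaw_deg: float,
                 src_north_offset_deg: float,
                 dst_north_offset_deg: float) -> float:
    """Yaw to apply in the destination shot so the viewer keeps gazing
    at the same compass bearing after a teleport.

    Yaw is measured from the center of each shot's video (east positive);
    each shot's north offset is where true north appears relative to
    that same center.
    """
    bearing = viewer_yaw_deg - src_north_offset_deg   # compass bearing of the gaze
    new_yaw = bearing + dst_north_offset_deg          # re-express in the destination shot
    return (new_yaw + 180.0) % 360.0 - 180.0          # wrap to [-180, 180)


# With both shots normalized (north offset = 0), the viewer's yaw is preserved.
print(teleport_yaw(-45.0, 0.0, 0.0))  # -45.0
```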

Incorrect Teleport

Here we see what can happen when the VR videos are NOT normalized to a common “north.” When the viewer teleports from Shot 3 to Shot 5, they are gazing 45 degrees west of “north” (measured from the center of the video’s yaw). This effectively makes the viewer feel as if they have been rotated 90 degrees to the right, which is very disorienting and breaks the feeling of presence.

What the Viewer Should See!

Here we see the 90-degree offset. The viewer’s POV has been turned 90 degrees to the right, and instead of facing the Greenlake Boathouse, they are facing northeast. With proper normalization of Shots 1 and 5 we can establish correspondence, and then automatically provide the correct angle of gaze after the teleportation.

Corrected Gaze

With the normalized “north” offsets added to both Shots 1 and 5, the viewer starts off looking northwest from Shot 1 towards the Greenlake Boathouse. After they click the “hotspot” to teleport to Shot 5, they arrive gazing in the same direction, which greatly enhances the sense of presence.

Watch the Greenlake VR Experience

This VR video experience, which combines seven videos captured with GPS coordinates at Green Lake in Seattle, WA, was shot in May 2017.

Using Pixvana SPIN Studio, we were able to edit the clips together, add hotspot markers, and “VR Cast” the experience to viewers in a headset. The resulting experience provides an immersive tour of the Green Lake boathouse area, and took just 15 minutes to shoot in a rapid “run-and-gun” fashion. After uploading the footage to SPIN Studio’s servers, we built the entire experience using SPIN Studio’s Story tools.

The interactive elements of this VR experience require SPIN Play for viewing, which can be downloaded for free from leading VR app stores such as Oculus, Steam, and Google Play for Daydream.

A link to watch this experience in SPIN Play on your own VR headset is coming soon…