Capture GPS and "North" Meta-Data for VR Geotours

This case study explores how to shoot a VR “Geotour” with consistent camera orientation and GPS markers.

Calibrating multiple cameras with GPS and “orientation” meta-data for VR Video Geotours

VR video can be an effective and compelling way to create a “VR geotour”: an immersive experience that strongly conveys the orientation and physical space of a specific location. These VR geotours might be used to orient new employees within a factory or warehouse, to showcase a commercial or residential real estate opportunity, or to offer travel and tourism “getting away in VR” experiences.

In this case study we review the importance of capturing GPS and relative camera orientation information using on-camera markers, much as a clapper-board “slate” has been used in film production for 100+ years as a visual reference for sound sync.

Using a “North Marker” On Location

For each shot in the VR video geotour, Cristina can be seen in the opening frames holding a visual marker.  This footage is of course trimmed out of the shot, along with the tail footage, which often shows camera operators returning to the camera to manually start/stop it.  This video shows one complete camera take from the shoot, where you can see the heads-and-tails activity around the camera.

Step 1: Marker

For each shot, Cristina used her iPhone’s “compass” application to identify true north.  She then held up a small blue plastic globe as a visual identifier, marking “north” in the shot.

Step 2: North Offset

This north marker can now be used in post-production in one of two ways: (a) offset the yaw of the stitched VR video so that north lies at the center of the frame, or (b) measure the marker’s offset from the center of the image in degrees east or west, and store that value as the “north offset” variable when creating the geotour.
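As a sketch of option (b): in an equirectangular frame, yaw maps linearly to the horizontal axis, so the north offset in degrees follows directly from the marker’s pixel column. The function and variable names here are illustrative, not part of any particular tool:

```python
def north_offset_degrees(marker_x: int, frame_width: int) -> float:
    """Convert the north marker's pixel column in an equirectangular
    frame into a yaw offset in degrees from image center.
    Negative values are west of center, positive are east."""
    # Fraction of the frame width, measured from the center (-0.5 .. 0.5)
    fraction_from_center = marker_x / frame_width - 0.5
    # A full equirectangular frame spans 360 degrees of yaw
    return fraction_from_center * 360.0

# Example: marker appears 3/4 of the way across a 3840-pixel-wide frame
print(north_offset_degrees(2880, 3840))  # 90.0 (degrees east of center)
```

Recording this single number per shot is enough to normalize or relate all of the shots later.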

Step 3: North Normalized

Here is the same shot “normalized”, with north at the center of the video.  This normalization is critical for geospatial consistency between shots, which greatly enhances immersion for viewers.
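One way to apply this normalization in post is a simple horizontal rotation of each equirectangular frame. A minimal sketch with NumPy, operating on a single frame (a real pipeline would apply the same shift to every frame of the clip):

```python
import numpy as np

def normalize_north(frame: np.ndarray, north_offset_deg: float) -> np.ndarray:
    """Horizontally rotate an equirectangular frame so the direction at
    `north_offset_deg` (degrees east of center) lands at the center.
    frame shape: (height, width, channels)."""
    width = frame.shape[1]
    # A positive offset means north sits east (right) of center, so we
    # roll the image left by the corresponding number of pixels.
    shift = -int(round(north_offset_deg / 360.0 * width))
    return np.roll(frame, shift, axis=1)

# Example: a tiny 1x8 "frame" with a marker at column 6 (90 deg east of center)
frame = np.zeros((1, 8, 1), dtype=np.uint8)
frame[0, 6, 0] = 255
normalized = normalize_north(frame, 90.0)
# The marker now sits at the center column (index 4)
```

Because an equirectangular image wraps around at its left and right edges, a plain circular roll is all that is needed; no pixels are lost.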

Adding Interactive Markers

Now that we have identified absolute “north” for each shot, we can add interactive markers that connect one location on the map to the other camera positions.  Because we know which way is north in every shot, we can maintain a correct field of view as we take the viewer between camera locations.

This consistency preserves the viewer’s “agency”: they remain in control of the experience, and transitions between cameras respect their actual body and head orientation, which *greatly* enhances their sense of “presence” in the VR geotour.

Incorrect Teleport

Here we see what can happen when the VR videos are NOT normalized to a common “north”.  When the viewer teleports from Shot 3 to Shot 5, they arrive gazing -45 degrees west (as measured from “north”, which is the center of the video’s yaw).  This effectively makes the viewer feel as if they have been rotated 90 degrees to the right, which is very disorienting and breaks presence.

What Viewer Should See!

Here we see the 90-degree offset.  The viewer’s POV has been turned 90 degrees to the right, and instead of facing the Greenlake Boathouse, they are facing NE.  By properly normalizing Shots 1 and 5 we can achieve correspondence, and then automatically provide the correct angle of gaze after the teleportation.

Corrected Gaze

With the normalized “north” offsets added to both Shots 1 and 5, the viewer starts out looking NW from Shot 1 towards the Greenlake Boathouse.  After they click the “hotspot” that teleports them to Shot 5, they arrive gazing in the same direction, which *greatly* enhances the sense of presence.

Watch the Greenlake VR Experience

This VR video experience, which combines the 7 videos captured with GPS coordinates at Greenlake in Seattle, WA, was shot in May 2017.

Using Pixvana SPIN Studio we were able to edit the clips together, add hotspot markers, and “VR Cast” the experience to viewers in a headset.  The result provides an immersive tour of the Greenlake Boathouse area and took just 15 minutes to shoot in rapid run-and-gun fashion.  After uploading the footage to SPIN Studio’s servers, the entire experience was created with SPIN Studio’s Story tools.

The interactive elements of this VR experience require SPIN Play for viewing, which can be downloaded for free from leading VR app stores such as Oculus, Steam, and Google Play for Daydream.

A link to watch this experience in SPIN Play on your own VR headset is coming soon…