Editor’s Note: In this technical post we explain how Seurat, a developer tool from Google, brings high-end graphics to mobile VR systems.
If students could see almost any place in the world with the high quality graphics of Google Earth VR, it would create new opportunities for exploration and learning. For this to work, it would need to be based on a system that’s accessible and easy to use for schools. Using a new tool called Seurat, we were able to do just that. We recently launched support for Google Earth scenes in Expeditions, which lets millions of students around the world experience some of the most beautiful places on the planet.
Mobile VR is easy to use, but it offers far less computing power than PC-based systems. To hit the required framerates for Google Earth VR content on mobile hardware, we need techniques like occlusion culling. Google Earth has one of the largest 3D datasets available today, and we’re constantly updating it with new scans and with reconstruction algorithms running in our data centers. However, this constant pace of updates makes it challenging to apply recent research advances to the data ahead of time.
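Occlusion culling means skipping the draw calls for objects that are entirely hidden behind other geometry. A minimal, illustrative sketch of one conservative variant, a depth-buffer test against an object's screen-space bounds (the buffer, bounds, and depths here are toy values, not Google Earth's actual pipeline):

```python
def is_occluded(depth_buffer, bbox_pixels, bbox_near_depth):
    """Conservatively test whether an object's screen-space bounding
    rectangle is fully hidden behind already-rendered geometry.

    depth_buffer: 2D list of nearest depth per pixel (smaller = closer)
    bbox_pixels: (x0, y0, x1, y1) inclusive pixel bounds of the object
    bbox_near_depth: depth of the object's closest point to the camera
    """
    x0, y0, x1, y1 = bbox_pixels
    for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
            # If any covered pixel's occluder is farther away than the
            # object's nearest point, the object may be visible there.
            if depth_buffer[y][x] > bbox_near_depth:
                return False
    return True  # every pixel already has closer geometry: skip the draw

# Toy 4x4 depth buffer: occluders everywhere at depth 2.0.
buf = [[2.0] * 4 for _ in range(4)]
print(is_occluded(buf, (0, 0, 3, 3), 5.0))  # True: object is behind everything
buf[1][2] = 9.0                             # open a "hole" in the occluders
print(is_occluded(buf, (0, 0, 3, 3), 5.0))  # False: object may show through
```

Real engines use hierarchical and hardware-accelerated versions of this idea, but the principle is the same: spend a cheap test to avoid an expensive draw.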
Fortunately, Seurat can help deal with scenes that are heavily occluded. We announced this tool at Google I/O as a solution for bringing high-quality graphics to mobile VR:
Seurat is a scene simplification technology that optimizes the geometry and textures in your scene for a defined viewing volume (the cube on the left above). It takes layered depth images (RGBA + depth) as input and generates a textured mesh, targeting a configurable triangle count and texture size. Control over the triangle count is valuable for Google Earth data, where the quality of the vertex data varies between locations. Hitting consistent framerates is critical in VR, and the more predictable the scene geometry’s rendering time, the easier it is to allocate the per-frame budget for other elements, such as menus and overlays. Below is an example of generating Seurat input from Google Earth data.
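To hold up anywhere inside the viewing volume, the scene is captured from multiple camera positions within it, and each capture produces an RGBA + depth image pair. A hedged sketch of one simple sampling scheme, the corners plus the center of a cubic volume (the actual sampling Seurat uses may be denser or differently distributed):

```python
from itertools import product

def headbox_sample_positions(center, half_size):
    """Return capture camera positions inside a cubic viewing volume:
    its center plus all 8 corners. Illustrative only; a real capture
    pipeline may sample many more positions."""
    cx, cy, cz = center
    positions = [center]
    for dx, dy, dz in product((-half_size, half_size), repeat=3):
        positions.append((cx + dx, cy + dy, cz + dz))
    return positions

# A 1 m wide viewing volume centered at the origin -> 9 capture positions.
cams = headbox_sample_positions((0.0, 0.0, 0.0), 0.5)
print(len(cams))  # 9
```

Rendering RGBA and depth from each of these positions yields the layered depth images that Seurat consumes.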
As seen below, the mesh looks good inside the viewing volume. As soon as we move outside it, we can quickly see the effects of Seurat’s occlusion optimization. In this example, we used millions of triangles and over 1000 MB of texture data to generate the Seurat input. The output is a 20,000-triangle mesh (50:1 compression) with 5 MB of texture data (200:1 compression).
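The ratios quoted above follow directly from the input and output sizes (taking "millions of triangles" as roughly one million for the arithmetic):

```python
input_triangles = 1_000_000    # input: assumed ~1M ("millions of triangles")
output_triangles = 20_000      # output: 20k-triangle Seurat mesh
input_texture_mb = 1000        # input: 1000+ MB of texture data
output_texture_mb = 5          # output: 5 MB of texture data

print(input_triangles // output_triangles)    # 50  -> 50:1 geometry compression
print(input_texture_mb // output_texture_mb)  # 200 -> 200:1 texture compression
```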
At the time of writing, mobile VR headsets support only 3DoF (rotation-only) tracking. 360-degree images, both mono and stereo, have been used in this environment with great success.
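In a 3DoF renderer, only the head's rotation is applied to the view; since position is not tracked, the translation part of the view transform stays fixed. A minimal sketch of building such a rotation-only view matrix (pure Python, illustrative, not any particular SDK's API):

```python
def view_matrix_3dof(rotation3x3):
    """Build a 4x4 view matrix from head rotation only. The translation
    column is left at zero because a 3DoF headset cannot track the
    user's positional movement."""
    m = [[0.0] * 4 for _ in range(4)]
    for i in range(3):
        for j in range(3):
            m[i][j] = rotation3x3[i][j]
    m[3][3] = 1.0  # homogeneous row; translation stays (0, 0, 0)
    return m

identity_rot = [[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]]
print(view_matrix_3dof(identity_rot)[0])  # [1.0, 0.0, 0.0, 0.0]
```

This is also why pre-rendered 360-degree imagery works well on 3DoF devices: with no positional movement to account for, a capture from a single point remains correct for the whole session.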
Seurat is a powerful tool for bringing immersive experiences with high-fidelity graphics to mobile VR. Hear what our friends at ILM had to say about it, and if you want to check out these Expeditions with Earth VR imagery for yourself, download the Google Expeditions app on iOS or Android and enjoy them in your Cardboard or Daydream View. Seurat is still in development, but stay tuned for future updates!