Project full title
3D Geomodels in Virtual Reality
Nature of deliverable
Scheduled delivery date
31st August 2020
Gwénaël Caravaca, Nicolas Mangold, Stéphane Le Mouélic
The scope of this deliverable is to provide an integrated virtual reality (VR) application featuring the Kimberley outcrop and its local vicinity (Gale crater, Mars, traversed by the MSL rover Curiosity in 2014). The working area (2700 m by 2800 m) was defined in deliverable 5.1 (Caravaca et al., 2019a), and the corresponding 3D models were produced and provided in deliverable 5.2 (Caravaca et al., 2019b).
In the VR application, users can visualize, explore and take measurements of the Kimberley outcrop and its region, reconstructed using both orbital (DEM) and ground-based (DOM) data. We present several innovative tools designed for the geological characterization of this outcrop in VR, allowing the user(s) to make a wide range of field measurements in the virtual environment, as if they were actually there in the field. We also highlight several examples of original results obtained at Kimberley using the newly developed VR tools and the virtual environment.
This VR application demonstrates the possibilities offered by VR technology, and also serves the PlanMap effort by allowing a finer and more accurate geological mapping and characterization of the Kimberley area.
MAHLI: MArs Hand Lens Imager
MSL: Mars Science Laboratory
ChemCam: CHEMistry and CAMera
CNRS: Centre National de la Recherche Scientifique
PDS: Planetary Data System
DEM: Digital Elevation Model
DOM: Digital Outcrop Model
SPICE: Spacecraft Planet Instrument orientation (C-matrix) Events
GIS: Geographic Information System
GNSS: Global Navigation Satellite System
HiRISE: High Resolution Imaging Science Experiment
LPG: Laboratoire de Planétologie et Géodynamique
WP5: Work Package 5
Deliverable 5.3, led by CNRS-LPG, is focused on the release of a comprehensive virtual reality application allowing visualization, exploration and geological characterization of an accurate, multiscale and realistic 3D depiction of the Kimberley outcrop area, traversed by the MSL rover Curiosity in 2014.
Using VR for research purposes is a very recent approach, but one that is growing rapidly within the community. This new technique and its development unlock previously underrated possibilities for studying areas that are difficult to access, which is particularly suited to planetary bodies such as the Martian surface. Several tools dedicated to the VR environment have been developed to perform the same actions that a geologist would carry out in the field on Earth, such as measuring distances, angles, strikes and dips, with the high precision afforded by digital tools. The VR environment also offers several exclusive possibilities not available in the field, such as the ability to observe a wide range of spatialized data over the 3D models (e.g. geologic maps, MSL-generated imagery) at various scales using GIS-like functionalities.
Base data used to produce the GIS-like database and virtual environment are issued from the GIS project provided with deliverable 5.1 (Caravaca et al., 2019a). The base 3D models used to recreate the VR environment (the Kimberley outcrop itself and its vicinity) are issued from deliverable 5.2 (Caravaca et al., 2019b). All these data have been integrated into a fully functional VR application, provided with this deliverable (cf. Annex A).
The use of this application allows us to propose a new fine-scale geological characterization of the Kimberley outcrop, demonstrating the great potential of this new technology in the field of planetary geology, where remote access to the actual terrain is always an issue.
Since its landing in Gale crater, Mars, in 2012, the MSL rover Curiosity has explored the diverse sedimentary formations of the northern plain of the crater, Aeolis Palus (Fig. 1a). During its mission, the rover has studied the diversity of rocks encountered at Gale, bringing new insights into the past environments of the planet, but also raising new scientific questions (e.g. Grotzinger et al., 2014; Le Deit et al., 2016; Stack et al., 2016).
As specified in previous PlanMap deliverables 5.1 and 5.2 (Caravaca et al., 2019a and 2019b), WP5's work focuses on an area of Gale crater encompassing Curiosity's traverse between sols 431 and 960, notably including the Kimberley, Cooperstown and Pahrump Hills outcrops. This area corresponds to the ~2700 x 2800 m red box on Figure 1a, more specifically centred on the Kimberley outcrop, traversed by Curiosity between sols 603 and 630 (Fig. 1; Caravaca et al., 2019a and 2019b).
Figure 1: a) View of Aeolis Palus and Aeolis Mons in Gale crater, Mars, and the traverse of MSL rover Curiosity (white line). The red box highlights the working area recreated in the VR environment, including the Cooperstown, Kimberley, and Pahrump Hills outcrops. b) Curiosity rover at Kimberley during sol 595, as seen from orbit by Mars Reconnaissance Orbiter's HiRISE camera. The informally named "Mt. Remarkable" butte is indicated, as well as the approximate limit of the area reconstructed in the DOM provided in deliverable 5.2 (red dashed line; Caravaca et al., 2019b, 2020) and featured in the VR application described in this deliverable. Base image: HiRISE ESP_036128_1755 full colour tile, NASA/JPL/University of Arizona, https://www.uahirise.org/ESP_036128_1755
The Kimberley outcrop mainly consists of a butte about 5 m high (informally named Mt. Remarkable; Fig. 1b) dominating flatter terrain. These features are easily recognized from orbital imagery, as observed in Figure 1b. The outcrop shows siliciclastic series recording fluvial settings, like many other outcrops observed since Bradbury Landing (Fig. 1a; Le Deit et al., 2016; Stack et al., 2016; Rice et al., 2017). Yet, this specific outcrop stands out by the unusually high abundance of potassium (2-5 wt.% K2O) analysed in these clastic sediments (Le Deit et al., 2016), contrasting with other sedimentary rocks of basaltic composition that lack this potassium enrichment (e.g., Mangold et al., 2017).
While the origin of the high potassium abundance is likely detrital (Treiman et al., 2016), the question of the local to regional stratigraphic relationship of this outcrop with the rest of the Gale sedimentary series remains open to further investigation, in the absence of clear continuity between outcrops and of clearly evidenced contacts with other units.
In order to better constrain the structure, geometry and facies of this outcrop, a previous work presented in deliverables 5.1 and 5.2 (Caravaca et al., 2019a and 2019b) focused on gathering all orbital as well as ground-based data available for this outcrop (cf. D.5.1) and on the reconstruction of multiscale, photorealistic 3D models of the Kimberley outcrop area (cf. D.5.2). In this deliverable, the objective is to integrate these spatialized data within a Virtual Reality environment with the purpose of making a new fine-scale mapping and characterization of the sedimentary record of the Kimberley outcrop and its vicinity using dedicated innovative tools.
VR in planetary sciences is not a recent idea (McGreevy, 1993), but current developments in both hardware and software, as well as the democratization of powerful computing units and commercially available head-mounted displays (HMDs), have made it more accessible to our community. Several solutions are available to produce VR experiences using dedicated game engines (e.g., Mat et al., 2014; Murray, 2017), allowing promising applications of VR to research, education and training, but also outreach.
Accordingly, the main objective of this deliverable is to provide an application dedicated to the geologic survey and mapping of the Kimberley outcrop in VR. To that end, a virtual environment has been created to display a multi-scale 3D representation of the outcrop and its regional vicinity, using highly resolved and accurate 3D models reconstructed from orbital and/or ground-based data, provided with previous deliverable 5.2 (Caravaca et al., 2019b). These data need to be handled with a minimum of modifications to ensure their scientific reliability, in the same way they would be used with more classic visualization tools (e.g. GIS software). The largest-scale orbital models are used to visualize the entire study area (a ~3x3 km square centred on Kimberley). They also provide context and background for more localized observations made on ground-based models (DOMs), which cover a smaller area but offer unprecedented resolution to study the specificities of the Kimberley outcrop at a very fine scale (Caravaca et al., 2020).
The entire study area is reproduced in the virtual environment using a 3D model with relief derived from an excerpt of the HiRISE DEM for the Gale crater (Fig. 2; Parker & Calef, 2016; Caravaca et al., 2019b). This DEM offers a vertical resolution of 1 m/pixel that allows us to discriminate most regional- to outcrop-scale geomorphologic structures at Kimberley and within its vicinity.
This model, provided with deliverable 5.2 (Caravaca et al., 2019b), can be textured with three different layers:
- the HiRISE greyscale orthoimage of the entire study area;
- the same greyscale orthoimage with a co-registered extract of the ESP_036128_1755 HiRISE colour orthoimage;
- an excerpt of the regional geomorphological map after Grotzinger et al. (2014).
The different textures proposed are used as part of the entry database for the GIS-like features of the VR application. The orbital 3D model itself is, however, not resolved enough for exploration and fine-scale characterization at real scale, hence the need for a more detailed DOM of the Kimberley outcrop made from ground-based imagery data.
Figure 2: Regional 3D model of the working area, with relief reconstructed using HiRISE DEM, and textured with a) HiRISE greyscale orthoimage, b) HiRISE greyscale orthoimage and a co-registered extract of the ESP_036128_1755 HiRISE colour orthoimage, and c) an excerpt of the regional geomorphological map after Grotzinger et al. (2014). Details show the Kimberley outcrop with each texture (after Caravaca et al., 2019b).
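For readers wishing to reproduce this kind of relief model outside the application, the conversion from a DEM raster to a textured 3D terrain is essentially a heightmap-to-mesh operation. The sketch below (Python/NumPy; function and argument names are illustrative, not those of the actual application) shows the principle:

```python
import numpy as np

def dem_to_mesh(dem, res):
    """Convert a DEM heightmap into mesh vertices and triangle indices.

    dem : (rows, cols) elevation grid in metres.
    res : grid spacing in metres (e.g. 1.0 for the HiRISE DEM used here).
    """
    rows, cols = dem.shape
    ys, xs = np.mgrid[0:rows, 0:cols]
    # One vertex per DEM cell: (x, y) from the grid position, z from the elevation
    vertices = np.column_stack([(xs * res).ravel(), (ys * res).ravel(), dem.ravel()])
    tris = []
    for r in range(rows - 1):
        for c in range(cols - 1):
            i = r * cols + c
            # Split each grid cell into two triangles
            tris.append((i, i + 1, i + cols))
            tris.append((i + 1, i + cols + 1, i + cols))
    return vertices, np.array(tris)
```

Texturing then amounts to mapping the orthoimage onto the mesh with UV coordinates proportional to each vertex's grid position.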
In addition to regional-scale models made from orbital data, the Kimberley outcrop has been reconstructed as a photorealistic Digital Outcrop Model.
The model itself was computed using photogrammetry, based on nearly 2000 images taken by three different imagers onboard Curiosity (Navcam, Mastcam right and left), as detailed by Caravaca et al. (2019b, 2020). This reconstruction results in a DOM representing a useful area of 1625.8 m² (Fig. 3). The high resolution of the mesh allows for a 3D reproduction of the various multi-scale sedimentary structures, sand ripples and scattered blocks present across the outcrop with unprecedented quality, enabling us to discriminate features at the sub-cm scale (Fig. 3). In addition, this mesh is textured using true colours for an immersive rendition in the virtual environment.
The DOM has been provided as part of the deliverable 5.2 (Caravaca et al., 2019b), and a textured low-resolution version is accessible on the Sketchfab public repository using this link: https://skfb.ly/6KNq8.
Figure 3: Photogrammetric DOM of the Kimberley outcrop, as reconstructed using Curiosity imagery data. a) General view of the DOM towards the Mount Remarkable butte. b) Opposite view towards the eastern part of the outcrop, from the Mount Remarkable perspective.
Once all the models have been computed (using orbital and ground-based data), the next step in the creation of the virtual environment itself is the control of their respective scaling and georeferencing prior to their integration. Indeed, all models have to be properly scaled, and their geographic position has to be known in order to enable the various geospatial features of the developed VR tools (e.g. measurement tools, compass).
Scaling and georeferencing are easy to control on the orbital-derived model: both the DEM and the orthoimages are georeferenced within the Mars 2000 system (Seidelmann et al., 2002), allowing the user to be precisely situated when roaming on this model in VR. For the local DOM, scaling and georeferencing are less straightforward. While the Agisoft PhotoScan Professional photogrammetry software (Agisoft LLC, 2018) is able to scale the 3D model using the absolute spatial position of the georeferenced entry images (cf. Caravaca et al., 2019b, 2020), a verification is needed to ensure the correct dimensions of the model before integration with the larger-scale orbital model. The scaling and orientation of the Kimberley DOM were validated in the CloudCompare software (v.2.9, Girardeau-Montaut, 2015) using georeferenced orbital HiRISE products as a reference. This step allowed us to validate both the scaling and the georeferencing of the DOM within the Mars 2000 system, ensuring a common referential for the data and the virtual environment. We were able to make the Kimberley DOM match the HiRISE DEM for most of the mesh, the only noticeable exception being a sandstone bed on the northern flank of Mt. Remarkable (Fig. 4). The residual gap between the DEM and the DOM is due to the difference in vertical resolution between the models (as evidenced by the shadow seen underneath the DOM in Figure 4).
Figure 4: High-resolution full colour DOM of the Kimberley outcrop, scaled and fitted on the georeferenced and scaled HiRISE DEM of the Gale Crater (with HiRISE colour texture draped onto it), showing nearly perfect alignment between the two meshes (with an exception for the northernmost part of Mt. Remarkable). A 1:1 model of the Curiosity rover is placed for scale (after Caravaca et al., 2020).
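The kind of consistency check performed here can be illustrated by computing vertical residuals between DOM vertices and the DEM they should match in the common frame. The sketch below (Python/NumPy, with hypothetical names and a simple nearest-cell lookup rather than CloudCompare's full cloud-to-mesh distance) shows the principle; a well-registered DOM should produce small, unbiased residuals:

```python
import numpy as np

def vertical_residuals(dom_points, dem_grid, dem_origin, dem_res):
    """Vertical offsets between DOM vertices and a georeferenced DEM.

    dom_points : (N, 3) array of (x, y, z) in the common (e.g. Mars 2000-derived) frame.
    dem_grid   : (rows, cols) array of DEM elevations.
    dem_origin : (x0, y0) of the DEM's first cell centre.
    dem_res    : DEM grid spacing in metres (e.g. 1.0 for HiRISE).
    """
    # Nearest DEM cell for each DOM vertex
    cols = np.round((dom_points[:, 0] - dem_origin[0]) / dem_res).astype(int)
    rows = np.round((dom_points[:, 1] - dem_origin[1]) / dem_res).astype(int)
    # Keep only vertices that fall inside the DEM footprint
    ok = (rows >= 0) & (rows < dem_grid.shape[0]) & \
         (cols >= 0) & (cols < dem_grid.shape[1])
    return dom_points[ok, 2] - dem_grid[rows[ok], cols[ok]]
```

Summary statistics of the residuals (mean, standard deviation) then quantify both the registration quality and the resolution-related gap mentioned above.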
Once scaling, orientation and georeferencing had been verified, the different models could be integrated to create the landscape of the virtual environment. That way, the large-scale orbital model serves, when seen "from the ground" in VR, as middle-ground for the more localized Kimberley DOM (Fig. 5a). The background is made by a skybox representing the Gale crater rim and Mount Sharp, in their geographically accurate positions relative to the virtual environment. This integration serves two purposes. First, it allows the immersion of the user within the virtual environment, which is important so that the user can get a real sense of position and relative distance, given the lack of landmarks compared to an Earth-based outcrop (Fig. 5a). Second, this integration has a critical implication for measurements, notably those taken between the Kimberley outcrop and other structures in the vicinity that are represented by the orbital model (e.g. Mt. Christine in Fig. 5b). As the tools should not focus on one specific 3D model or another but rather consider the whole environment, the integration must be fine-tuned to ensure accuracy and relevance of the measurements within the same referential and across various ranges.
Figure 5: a) General view of the Kimberley outcrop within the virtual environment, from a 1:1 scale ground perspective. Foreground high-resolution mesh is Kimberley DOM, middle-ground lower resolution model is the regional model derived from the HiRISE orbital DEM. b) Wider-angle view of the DOM blended within the broader regional model derived from the HiRISE orbital DEM. This integration is useful to provide regional context, and to study the stratigraphic relations of the Kimberley outcrop with its immediate to broader vicinity (e.g. between Mount Remarkable and Mount Christine to the North). The orange line indicates the traverse of Curiosity.
Application of VR technology to geology is a domain that remains to be fully investigated, with nearly unlimited potential for both research and educational purposes. It is also particularly well suited to the exploration of remote and/or inaccessible areas, both on Earth and on other planetary bodies.
While we are able to produce accurate 3D reconstructions of areas of interest (e.g. the Kimberley outcrop and its surroundings), our main goal remains to use these immersive virtual environments in such a manner that it is possible to answer scientific questions and assess parameters such as the spatial distribution of morphologies, structures or facies. For that reason, a set of innovative VR-enabled tools has been designed to make a "field campaign" possible in the virtual environment, with the ability to use a rangefinder, a clinometer or a compass in the same way we would in the field on Earth.
To implement a fully functional and reliable VR application, and to ensure maximum compatibility with most HMDs commercially available at the time this deliverable is released, the development of the VR-enabled, integrated tools described here has been entrusted to a specialized third-party company, VR2Planets. The application has been tested and is fully functional with the commercially available Oculus Rift S and HTC Vive/Vive Pro HMDs.
One of the most acclaimed breakthroughs of VR is the ability to experience the virtual environment as if the user were actually there. This feature is very useful in the geosciences, and particularly to the PlanMap effort, in that we are able to explore digitally reconstructed outcrops that are actually situated several million kilometres away, without the necessity of being there in person (Fig. 5). Using VR, the user therefore has the opportunity to observe and characterize geological features that could not be readily appreciated by other means (e.g. 2D "flat" panorama images). The user is also able to walk freely around an accurately reconstructed DOM (cf. Caravaca et al., 2019b, 2020), at real scale and without spatial limitation (except that of the working area actually rendered in the virtual environment), allowing observations to be made in 3D, as if the user were in the field. This is of particular interest for characterizing lateral variations in thicknesses, facies and/or morphologies, or for correlating strata over distance. Moreover, VR is extremely useful in allowing the user to go anywhere in the virtual environment, including areas that could not be reached in reality, or simply to take some altitude over the terrain to appreciate different and wider points of view (Fig. 6).
While the VR free-roam mode allows the user to experience a virtual rendition of an outcrop at real scale, VR technology also allows stepping up the visualization with multi-scale capabilities. The user can change the scale of the virtual environment at any time, allowing fast travel around the environment (useful for large study areas), but also, and above all, observation of the outcrop and its environment from different points of view and at varying scales, from the regional (Fig. 6a) to the local scale (Figs. 6b and 6c), down to real scale to observe macroscopic features (e.g. the ~25 cm-wide trough cross-stratification at the base of Mount Remarkable, Fig. 6d). This is particularly important where the geographic and/or stratigraphic position of an outcrop within its local to regional environment is in question and needs to be clarified, or to produce local to regional correlations between localities.
This change of scale is performed in-app by pressing the lateral buttons on both controllers and dragging the terrain in or out, as if the user were zooming in/out on a photo on a smartphone or tablet. An onscreen tab appears to inform the user of the current scale (Fig. 6).
Figure 6: Illustration of the multi-scale visualization capabilities of the VR application, allowing observation of the Kimberley outcrop from the regional scale (a), to the local scale (b), to the outcrop scale (c), and to small macroscopic details embedded within the strata, such as the ~25 cm-wide trough cross-stratification at the base of Mount Remarkable (d). An onscreen tab appears during the change of scale to inform the user of the current scale.
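The two-controller drag described above boils down to simple geometry: the terrain is rescaled about a pivot by the ratio of the current to the previous hand separation, exactly like pinch-zooming a photo. A minimal sketch of that computation (Python/NumPy; names and the choice of pivot are illustrative assumptions, not the application's actual implementation):

```python
import numpy as np

def pinch_scale(prev_left, prev_right, cur_left, cur_right, pivot):
    """Scale factor and pivot-anchored transform from a two-controller drag.

    All positions are 3D points. The terrain is scaled about `pivot`
    (e.g. the midpoint between the controllers) by the ratio of the
    current to the previous controller separation.
    """
    d_prev = np.linalg.norm(np.asarray(prev_right, float) - np.asarray(prev_left, float))
    d_cur = np.linalg.norm(np.asarray(cur_right, float) - np.asarray(cur_left, float))
    s = d_cur / d_prev if d_prev > 0 else 1.0

    def transform(point):
        # Move each terrain point away from (or toward) the pivot by factor s
        return np.asarray(pivot, float) + s * (np.asarray(point, float) - np.asarray(pivot, float))

    return s, transform
```

Applying this per frame while the buttons are held gives the continuous zooming behaviour, and the accumulated factor is what the onscreen scale tab would report.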
Navigation within the virtual environment can be done in several ways. The first and foremost way to move within the virtual environment is obviously by physically moving and walking around, provided the user stays within the bounds of the previously defined VR area. When physical displacement is prevented, due to a lack of free space, or to achieve long-range travel across the virtual environment, the user has the possibility to teleport to various areas of the virtual environment. The teleportation functionality proposes two options, whose difference matters when using the online multi-user mode. The first option, as illustrated by Figure 7a, is a "Teleport area" function. It teleports the user and their networked group to a point of the virtual environment selected by pulling the trigger of the controller (the destination is materialized by a yellow beam, Fig. 7a). The second option, illustrated by Figure 7b, allows users to teleport themselves within the range of the group area. This option is useful when convening with other users, to move around the area quickly without disturbing the position of the other users. In this case, the user selects their own destination by pulling the trigger within the designated area (the destination beam being green in this case, Fig. 7b).
The virtual environment in the application also acts as a "model", and it is therefore easy to manipulate it in a similar way to the scaling functionality (see above), creating a relative displacement of the user. By pressing the lateral button of only one of the controllers, the user activates a "Drag'n'move" function (Fig. 7c) that allows the terrain to be moved in every direction. This option is useful when the user wants to move quickly from one area to another, or to observe specific details up close at various scales. Using this functionality, the user may lose their grounding. That is why a transparent circle has been designed to stand at the feet of the user (Fig. 7d). This circle always indicates the position of the true ground in the VR area, and allows the user to move the virtual environment's ground to match the actual physical one they are standing on, as illustrated by Figure 7d.
Finally, a "Reset terrain" function is available (Fig. 7e) to reset the entire virtual environment to its default scale and position.
Figure 7: In-application navigation functionalities; a) the "Teleport area" function teleports the user (and their networked group) to a specific destination pointed at by the yellow beam; b) The "Teleport user" function acts in a similar way, yet it allows the individual teleportation of the user within a limited range (evidenced by a transparent circle), to a destination pointed at by a green beam; c) A "Drag'n'move" function allows the user to manipulate the model to fast-travel around the environment, or bring the model to a closer point of view; d) The ground circle marker recalls the position of the actual ground the user is standing on, allowing real and virtual grounds to be matched; e) The "Reset terrain" function resets the position and scale of the virtual environment.
VR allows us to enhance "classic" field work techniques by adding advanced GIS-like visualization assets, taking advantage of the power of VR technology for displaying and processing spatialized geodata. Compatibility and complementarity with other, more classic tools such as GIS software are ensured by the georeferencing of the virtual environment. All this brings new and useful additional information to the 3D representation, such as composition, facies and much more, improving on our experience of real-world field work.
Perhaps the most important enhancement allowed by a virtual environment over a real field is the opportunity to augment this environment by adding supplementary information to the 3D representation. This information can take several forms, such as geochemical analyses or mineralogical composition displayed on demand at a given location, or several independent layers draped over the 3D model, in the same way a GIS software would do.
Here we take advantage of the spatialized data issued in deliverables 5.1 and 5.2 (Caravaca et al., 2019a, 2019b) and presented above to bring a multi-layer display capacity to the VR application. This allows the user to quickly and easily change the layer draped over the surface of the regional 3D model that forms the base of the virtual environment, in order to have several types of information available for characterization, mapping or correlation of an outcrop and/or its immediate to remote vicinity. The user is able to select, through the application's UI (Fig. 8a), one of the available layers, which are: the HiRISE greyscale orthoimage of the entire study area (Fig. 8b), the same orthoimage with a co-registered extract of the ESP_036128_1755 colour image (Fig. 8c), and finally the geomorphologic map of the MSL study area (after Grotzinger et al., 2014; Fig. 8d). These different layers of georeferenced information are instantly displayed on the 3D regional model and can be viewed from various points of view and scales.
Figure 8: Illustration of the GIS-like multi-layer display functionality of the VR application, allowing to change the type of information displayed over the virtual environment in real time; a) The layer of interest can be chosen “on-the-go” through the application’s UI for quick and easy access; b) HiRISE greyscale orthoimage selected as the display layer of the virtual environment; c) HiRISE greyscale orthoimage and a co-registered extract of the ESP_036128_1755 selected as the display layer of the virtual environment; d) excerpt of the regional geomorphological map after Grotzinger et al. (2014) selected as the display layer of the virtual environment.
During its stay at Kimberley, Curiosity generated a 360° panorama using the Mastcam imagers on sol 610, while parked in front of Mount Remarkable during the Windjana drill campaign. We stitched the panorama and included it in the VR application to offer the user an immersive view from the Windjana drill site (Fig. 9).
Figure 9: View toward the Mount Sharp from Curiosity perspective while parked in front of the Windjana drill site, as seen in the 360° panorama taken by the rover during sol 612 and integrated in the VR application.
While the virtual environment is made from an actual orbital DEM and a photogrammetric DOM, some areas might present inaccuracies due to either process. In that case, ground truth in the form of actual imagery taken by the rover can help discriminate real structures from modelling artefacts. Fortunately, each image taken by the Curiosity rover is associated with a set of parameters called SPICE kernels, which precisely describe the attitude, configuration and position of the rover and all its instruments at every moment an operation is performed (Alexander & Dean, 2015). Using these SPICE kernels, it is therefore possible to know the exact location and orientation of the onboard imagers when an image is taken, and to accurately place them within the virtual environment (Fig. 10).
Figure 10: Illustration of the image projection functionality, allowing to display and project onto the 3D model of the virtual environment each image taken by Curiosity at Kimberley; a) View of the tool allowing to toggle the display of the exact location and orientation of the images taken, sorted by instrument (Mastcam, Hazcam, Navcam and MAHLI); b) Example of the projection of a Navcam image onto the 3D model at the base of Mount Remarkable.
In the VR application, the "Image tools" functionality in the UI displays the exact location and orientation of the images taken, sorted by instrument (Mastcam, Hazcam, Navcam or MAHLI; Fig. 10a). It is then possible to show or hide one, several or all of the sets of images at will. Using the associated selection tool (Fig. 10b), it is therefore possible to display and project onto the 3D terrain the actual images from the rover, matching the virtual environment texture, as illustrated by the Navcam image projected onto the DOM at the base of Mount Remarkable in Figure 10b.
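The geometry behind this projection functionality is essentially a pinhole-camera model: once the imager's position and orientation are known (as derived from the SPICE kernels), each terrain point maps to a pixel of the image, and the image can conversely be draped onto the terrain. A minimal sketch of the forward mapping (Python/NumPy, simplified to a distortion-free camera with a single focal length; names are illustrative and the rotation stands in for a SPICE C-matrix-style pointing):

```python
import numpy as np

def project_to_image(point, cam_pos, cam_rot, focal_px, centre_px):
    """Project a 3D terrain point into pixel coordinates of a pinhole camera.

    point     : 3D point in the site/world frame.
    cam_pos   : camera position in the same frame (from rover localization).
    cam_rot   : 3x3 rotation matrix, world -> camera frame (from the pointing data).
    focal_px  : focal length in pixels; centre_px : principal point (u0, v0).
    Returns (u, v), or None if the point lies behind the camera.
    """
    # Express the point in the camera frame
    p_cam = cam_rot @ (np.asarray(point, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:  # behind the image plane: not visible
        return None
    u = centre_px[0] + focal_px * p_cam[0] / p_cam[2]
    v = centre_px[1] + focal_px * p_cam[1] / p_cam[2]
    return (u, v)
```

Running this test for the vertices of the terrain mesh yields the texture coordinates needed to drape the rover image onto the 3D model, as in Figure 10b.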
What really makes the difference between the current modes of visualization for 3D data (static or dynamic 3D visualization, AR/VR) is the ability to quickly and accurately get a sense of the spatial structure of the observed 3D model. This is obviously easier in VR, given that the user is able to "experience" the 3D data, rather than just observing it from afar or through the "deforming prism" of a flat-screen display. More straightforward and accurate measurements can thus be made if the user is able to "do the measurement" on an actual 3D shape. In that respect, VR is a powerful tool for making such precise measurements.
In this deliverable, we present several tools that have been specifically designed to be handled in VR. They replicate the functionalities of the classical field tools employed by generations of geologists during field campaigns, but are adapted to the realm of VR in their handling. Thanks to the high accuracy of the entry data used to create the virtual environment (cf. Caravaca et al., 2019a, 2019b, 2020), an equivalent level of accuracy can be expected for the measurements produced by the VR tools within the application. This allows us to obtain higher-grade measurements, readily usable for further studies and publications (in the same fashion as data studied using GIS software).
Despite several promising projects (e.g. Gerloni et al., 2018; Billant et al., 2019), there is currently no well-defined standard for measurement tools in VR, which raises the question of their ergonomic design and of the way they should be handled (the "user experience", or UX). Indeed, these tools should be operable by any user, under any working situation in VR, and whatever controller model is actually used in the "real world". Also, while these tools can be very different in their function (a rangefinder is obviously dissimilar to a clinometer), they need to be operated with a single physical controller model. We therefore needed to work out a "universal" UX for all these tools in VR. These questions about the UX were addressed in consultation with the developers at VR2Planets, who offered insights from their previous experience in the matter, while considering the development constraints they could encounter. The foremost question that arose concerned the shape and functionalities of the VR tools: should they ideally replicate their "real-world" counterparts, or should we create a new VR-specific way of taking measurements? It appears that the simplest and most reliable way to make measurements in VR is to use a point-based approach (Fig. 11). Indeed, this UX is easy to handle: the user only has to place one or several points anywhere in the virtual environment to trigger a wide range of measurements, whether of distance, angle, direction, etc. Moreover, this UX is not hardware-dependent: since every commercially available VR controller, even the most basic, has a trigger, the user is able to use it to place a point.
The point-based UX proposes three methods for point designation and placement in the virtual environment, which can be enabled by the user through the in-application UI according to their needs: "laser pointing", "magnetic pointing" and "free-hand pointing". The "laser pointing" tool allows the user to place a point remotely, using a laser beam to designate the position of the measurement point on the 3D model (Fig. 11a). The "magnetic pointing" tool allows the user to place a point locked onto the surface of the 3D model (Fig. 11b). Finally, the "free-hand pointing" tool allows the user to place a point anywhere in the virtual environment, including in the air or even under the surface of the models (Fig. 11c). Of course, if a point is not correctly set, the user can modify its position using any of the three methods: they only need to select the point and drag it toward the desired position (Fig. 11d). If a measurement is no longer needed, the user can delete it at will using the specific "delete" functionality, associated with any of the three pointing methods (Fig. 11e). This UX and handling offer great versatility and freedom for a wide variety of uses and users, while also providing precision and accuracy in the measurements.
Figure 11: Handling of the versatile point-based UX for the various measurement tools in VR; a) Remote placement of a point on the 3D model using the laser pointing tool, at the tip of the yellow beam; b) Placement of a point on the 3D model using the magnetic pointing tool, locking onto the nearest 3D mesh surface; c) Placement of a point anywhere in the virtual environment (including “in the air”) using the free-hand pointing tool; d) Each point’s position can be modified, using any of the three pointing methods, by reselecting a previously placed point and dragging it toward the desired position (a blue vector appears); e) Similarly, any measurement can be deleted from the virtual environment using the “delete” function and any of the three pointing methods.
Even in the actual field, it is not always easy to point out a specific structure, especially from a distance; on Earth, we use a stick or a laser pointer to do so. A similar laser pointer (with a red beam; Fig. 12) was therefore implemented in the VR application, allowing the user to point at a feature of interest.
Figure 12: Illustration of the laser pointing tool, allowing the user to pinpoint a particular feature in the virtual environment.
One of the most important capabilities for a geologist in the field is the ability to measure distances, from the size of a block to the width of a bed in a section, etc. It is therefore the first and foremost tool that was integrated into the VR application.
The VR tool gives a measurement in meters (m, ±0.01 m) between two points placed anywhere in the virtual environment by the user (Fig. 13a). An option also allows taking a measurement along a fixed axis, either vertically (e.g. to measure the height of a bed while logging a section, Fig. 13b), or horizontally (e.g. to measure the wavelength of a cross-bed structure, or the width of the tracks left by Curiosity, Fig. 13c).
Figure 13: Illustration of the distance measurement tool in the virtual environment, giving a measurement in meters (m, ±0.01 m) of the absolute distance between two points placed by the user (a), or along a specific fixed axis, either vertically (b) or horizontally (c).
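The distance measurements described above reduce to simple vector geometry. The application itself is a Unity program; as a language-agnostic sketch (function names are illustrative, points assumed given as (easting, northing, altitude) tuples in meters), the absolute and axis-constrained measurements amount to:

```python
import math

def distance(p1, p2):
    """Absolute (Euclidean) distance between two user-placed points."""
    return math.dist(p1, p2)

def vertical_distance(p1, p2):
    """Distance constrained to the vertical axis (e.g. the height of a bed)."""
    return abs(p2[2] - p1[2])

def horizontal_distance(p1, p2):
    """Distance constrained to the horizontal plane (e.g. a track's width)."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

# Two points placed in the virtual environment, in meters
a, b = (10.0, 20.0, 5.0), (13.0, 24.0, 5.0)
print(round(distance(a, b), 2))  # 5.0, displayed at the tool's ±0.01 m precision
```

Rounding to two decimal places matches the ±0.01 m display precision of the in-application tool.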
The ability to measure angles in the field can be important to characterize various geological features such as angular unconformities, fault planes and intersections, variations in flows, etc.
The VR tool gives a measurement of an angle in degrees (°, ±0.01°) between three points placed anywhere in the virtual environment by the user (Fig. 14a). An option also allows taking a measurement within a fixed plane, either vertical (e.g. to measure the plunge of an angular unconformity; Fig. 14b), or horizontal (e.g. to measure the angle between two sets of interfering ripples, Fig. 14c).
Figure 14: Illustration of the angle measurement tool in the virtual environment, giving a measurement in degrees (°, ±0.01°) of an absolute angle between three points (a), or within a specific fixed plane, either vertical (b) or horizontal (c).
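The three-point measurement amounts to computing the angle at the middle point between the vectors toward the two outer points. A minimal sketch (function name illustrative; the application itself is a Unity program):

```python
import math

def angle(a, b, c):
    """Angle (in degrees) at vertex b, between the rays b->a and b->c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    norm = math.sqrt(sum(x * x for x in v1)) * math.sqrt(sum(x * x for x in v2))
    # Clamp to [-1, 1] to guard against floating-point round-off
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Three points forming a right angle at the origin
print(round(angle((1, 0, 0), (0, 0, 0), (0, 0, 1)), 2))  # 90.0
```

Constraining the measurement to a vertical or horizontal plane simply means projecting the three points onto that plane before the same computation.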
Strike and dip measurements of a surface (contact, unconformity) or of entire formations are of utmost importance for mapping purposes and for the characterization of the intra- and inter-bed relations on an outcrop. Therefore, their integration as a VR tool is critical. To obtain these measurements, the tool relies on the computation of a best-fit plane, which can be created from several inputs: the mesh geometry, or user-placed points. The first option places a best-fit plane computed to match the 3D geometry of the mesh at the specific point the user placed it upon (Fig. 15a). The second option computes the best-fit plane from either three points placed in the virtual environment by the user (Fig. 15b), or from an unlimited number of points (Fig. 15c), the latter allowing the computation of a regionally constrained plane. The extension of any best-fit plane can be modified by the user, allowing them to project the plane across the virtual environment and enabling long-range correlations of measured surfaces and/or units (Fig. 15d).
The strike and dip measurements are given in degrees (°), following the right-hand rule: the strike is given as a positive angle clockwise from North (between 0 and 359°), and the dip is measured as a positive value relative to the horizontal plane, to the right of the strike direction (e.g. for a strike of 80°, the dip is towards the South).
Figure 15: Illustration of the best-fit plane computation tool for strike and dip measurement. The direction is given in degrees (°) from North, and the dip in degrees (°) from the horizontal plane, following the right-hand rule; a) Measurement of the strike and dip of a simple plane matching the geometry of the 3D model at the point it is placed upon; b) Measurement of the strike and dip of a plane computed from three points; c) Measurement of the strike and dip of a plane computed from an unlimited set of points; d) Regional projection of the same plane.
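For reference, the strike and dip of a plane through three user-placed points can be derived from the plane's upward normal: its horizontal component points in the dip direction, and the strike is 90° counter-clockwise from that azimuth so that the plane dips to the right of the strike (right-hand rule). A sketch under the assumption of a z-up (easting, northing, altitude) frame; for the mesh-based and n-point options, the application first fits a plane to the data (e.g. by least squares) before the same conversion applies:

```python
import math

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def strike_dip(p1, p2, p3):
    """Strike and dip (in degrees, right-hand rule) of the plane through
    three (easting, northing, altitude) points."""
    u = tuple(b - a for a, b in zip(p1, p2))
    v = tuple(c - a for a, c in zip(p1, p3))
    n = cross(u, v)
    if n[2] < 0:                        # make the normal point upward
        n = tuple(-x for x in n)
    dip = math.degrees(math.atan2(math.hypot(n[0], n[1]), n[2]))
    dip_dir = math.degrees(math.atan2(n[0], n[1])) % 360.0  # downhill azimuth
    strike = (dip_dir - 90.0) % 360.0   # plane dips to the right of the strike
    return strike, dip

# Plane dipping 45 degrees to the South: strike is East (90 deg)
s, d = strike_dip((0, 0, 0), (1, 0, 0), (0, 1, 1))
print(round(s, 2), round(d, 2))  # 90.0 45.0
```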
On Earth, it is nowadays quite easy to obtain a precise cartographic position for any point of the globe using GNSS coverage. In the VR application, a similar functionality is present, allowing the user to retrieve the precise coordinates (easting and northing, in meters) and altitude (in meters) of any point in the virtual environment.
Figure 16: Illustration of the coordinates and altitude functionality, giving easting, northing and altitude in meters (m).
The compass is an essential tool to measure directions and orientations relative to North, within the horizontal plane. In the VR application, the compass functionality gives this measurement in degrees (°, ±0.01°), as a positive value between 0 and 359°, following the right-hand rule.
Figure 17: Illustration of the compass functionality, giving a direction relative to North in degrees (°), following the right-hand rule.
Another essential tool of the geologist, the clinometer measures the dip or plunge of a surface or a bed relative to the horizontal plane, within the vertical plane. In the VR application, this functionality gives the measurement in degrees (°, ±0.01°), as a positive value between 0 and 90°.
Figure 18: Illustration of the clinometer functionality, giving the dip relative to the horizontal plane in degrees (°, ±0.01°).
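Both the compass and clinometer readings reduce to the azimuth and plunge of the line between two user-placed points. A sketch assuming a z-up (easting, northing, altitude) frame (function names are illustrative):

```python
import math

def compass_direction(p1, p2):
    """Azimuth (degrees clockwise from North, in [0, 360)) of the horizontal
    projection of the line from p1 to p2 (x = easting, y = northing)."""
    de, dn = p2[0] - p1[0], p2[1] - p1[1]
    return math.degrees(math.atan2(de, dn)) % 360.0

def plunge(p1, p2):
    """Plunge (degrees, 0 to 90) of the line from p1 to p2, measured as a
    positive value relative to the horizontal plane."""
    dh = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    return abs(math.degrees(math.atan2(p2[2] - p1[2], dh)))

# A line heading Northeast and plunging 45 degrees
a, b = (0.0, 0.0, 0.0), (1.0, 1.0, -math.sqrt(2.0))
print(round(compass_direction(a, b), 2), round(plunge(a, b), 2))  # 45.0 45.0
```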
Collaboration is essential in field work, where interactions and multiple points of view are key to a better understanding of the characterized features. We therefore took advantage of the immersive rendering power of VR technology to replicate field “team-work” by adding an optional collaborative multi-user mode to the VR application.
Several users can therefore simultaneously experience the same virtual environment. Their presence is materialized by an avatar (Fig. 19) indicating their location in the scene in real time, visible to all other users. Moreover, each user present in the virtual environment is able to independently use the aforementioned tools to produce measurements, point at structures using the embedded laser pointer, or project Curiosity imagery onto the 3D model, and share the results with the other users.
Figure 19: Illustration of the networked multi-user mode, allowing several persons to simultaneously use the VR application and work in the same virtual environment. In this example, two users are exploring the base of Mount Remarkable.
We also take advantage of the multi-user experience to enable communication and collaboration on shared projects through immersive “telepresence”. Individual users do not need to be in the same place to participate in this VR field work, because the collaborative mode has been designed to operate over the Internet as well as over a local network. This functionality thus permits collaboration from remote places, with users “gathering” in the virtual environment from their respective locations (e.g. from Nantes, Padova, Bremen, etc.). Real-time communication and data links are supported by a broadband Internet connection, all users being able to speak to and hear their colleagues in the VR application through their respective HMDs.
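In a very simplified form, the real-time exchange of user positions underlying the avatars can be pictured as serializing an avatar state and sending it over the network. The following is a generic sketch only: the field names, transport and loopback round-trip are illustrative assumptions, not the application's actual network protocol.

```python
import json
import socket

# Hypothetical avatar state update, exchanged in real time between users
# (field names are illustrative)
update = {"user": "nantes-01", "pos": [412.5, 1280.0, 12.3], "yaw": 87.5}

# Loopback UDP round-trip standing in for the Internet/LAN data link
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))                      # any free port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(json.dumps(update).encode("utf-8"), rx.getsockname())

# The receiving application decodes the state and moves the avatar accordingly
data, _ = rx.recvfrom(4096)
decoded = json.loads(data.decode("utf-8"))
print(decoded["user"], decoded["pos"])
rx.close()
tx.close()
```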
Compared to real field work, the VR application provides a “virtual environment” made of 3D reconstructions of actual terrains. Even with the most accurately reconstructed DOMs and the widest dataset, this virtual environment remains a representation (albeit the most realistic possible) of the terrain rather than the terrain itself.
This raises the question of potential errors in measurements and interpretations made from the virtual field, which could come from the original data themselves and/or from the replication of the environment. Our objective is obviously to have control over both the accuracy of the data used to recreate the virtual environment, and over the environment itself using the embedded VR tools. This degree of control should ensure maximum reliability in the reproduction of actual data in the virtual environment, to offer the most realistic and highest-quality experience possible.
The accuracy of the original dataset used to recreate our virtual environment (cf. Caravaca et al., 2019a) is something we do not control. These products are obtained “as is” from various public sources (e.g. USGS, PDS repositories, etc.), and we therefore rely on the fact that they are the best currently available. Nevertheless, when problems with these data can be identified and solved, any needed correction is performed prior to integration, as for any other scientific work and in accordance with the methods used.
We never modify these input data in a way that could alter the representation and/or accuracy of the information they contain, thereby guaranteeing that the original quality of the input data is correctly represented in the VR application. This is the same implicit rule as if the data were only read through GIS software. The only processing applied to these data concerns their representation, in order to highlight specific elements or to enhance the visual quality within the virtual environment.
Another way to validate the accuracy of the virtual environment is to use actual data gathered and/or produced in situ by the Curiosity rover and compare them with the recreated scene. This is possible thanks to the SPICE kernels and the reprojection of actual rover-generated imagery. By projecting the actual images over the 3D environment, the user can determine whether the reproduction is accurate and, if it is not, assess the degree of deformation. It is also possible to use the actual projected image to complete a characterization and/or measurement otherwise prevented by a flaw in the virtual environment modelling, particularly in areas where the photogrammetric reconstruction could not be achieved and/or where only orbital data are available.
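The principle behind comparing projected rover imagery with the reconstructed mesh can be illustrated with a minimal pinhole projection. This sketch is a strong simplification (a single camera at the origin looking due North, with a hypothetical focal length and principal point), whereas the actual pipeline relies on SPICE-derived camera poses and the MSL camera models; the idea is that a world point of the mesh maps to a pixel, and a mismatch between that pixel and the image content reveals deformation in the reconstruction.

```python
def project(p, f_px=1000.0, cx=640.0, cy=480.0):
    """Pinhole projection of a world point (easting, northing, altitude)
    for a camera at the origin looking due North, z-up. Returns pixel
    coordinates (u, v), or None if the point is behind the camera."""
    east, north, alt = p
    if north <= 0.0:
        return None
    u = cx + f_px * east / north   # image x grows eastward
    v = cy - f_px * alt / north    # image y grows downward
    return (u, v)

# A mesh vertex 10 m North, 1 m East and 0.5 m up maps to pixel (740, 430)
print(project((1.0, 10.0, 0.5)))  # (740.0, 430.0)
```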
Using the VR application provided with this deliverable, one can attest to the potential and power of VR technology for the geological exploration and characterization of remote outcrops. The VR depiction of the Kimberley outcrop was therefore studied using the dedicated VR tools presented here, allowing us to produce a robust sedimentologic study of this area.
The Kimberley formation studied here displays four distinct members (Le Deit et al., 2016; Stack et al., 2016; Rice et al., 2017): Square Top, Dillinger, Mt. Remarkable and Beagle (from bottom to top, Fig. 20a). They are composed of siliciclastic rocks ranging from fine sandstones to pebble conglomerates, presumably of fluvial origin. However, while those members can be identified, the outcropping conditions (presence of regolith and dust) and the remote points of view of the 2D images make it difficult to determine whether their interformational contacts are conformable or not. Moreover, the precise characterization of these contacts and their spatial extension is restricted to Mount Remarkable itself, with few opportunities to correlate them towards Mounts Christine and Joseph.
Figure 20b shows a VR snapshot of the base of Mt. Remarkable that was investigated in VR. The aim was to assess a potential lateral variation in the thickness of the Dillinger member. After a new mapping of the contacts between the Square Top, Dillinger, and Mount Remarkable members (Fig. 21a), we produced serialized measurements of the Dillinger member using the VR vertical distance measurement tool (Fig. 20b). These serialized measurements show the presence of a previously unseen but conspicuous lateral variation in the thickness of the Dillinger member, ranging from 55 cm in the northernmost part of the outcrop up to 85 cm in the southernmost part, which was recognized thanks to the VR environment (Figs. 20b, 21a, 21b; Caravaca et al., 2019c).
Figure 20: a) Synthetic log of the Kimberley formation at the Mount Remarkable (after Le Deit et al., 2016); b) View of the base of Mount Remarkable in the VR application, investigated with serialized measurements to assess a potential lateral variation of the thickness of this member. VR analysis confirmed the presence of a previously unseen lateral variation in thickness of the Dillinger member.
Going further in the investigation of the Dillinger member, we observe a lateral variation of the sedimentary structures within these sandstone beds (Fig. 21), superimposed over the lateral variations of thickness evidenced within these beds. In the northern part of the outcrop, several sets of well-preserved dm-scale cross-stratifications (plausible TCS, 3 to 5 on Figs. 21b and 21c) are present. Toward the South, the size of these cross-stratifications gradually decreases down to a few cm (2 and 3 on Figs. 21b and 21c). Together with this decrease, we note the appearance, around the middle of the butte, of (sub-)cm-scale planar parallel stratifications (1 to 3 on Figs. 21b and 21c). These planar stratifications first alternate with the cross-stratifications before completely replacing them in the southernmost part of the outcrop (Figs. 21b and 21c). Once again, the accurate characterization of the size and spatial distribution of these sedimentary structures and their variation would not have been possible without the real-scale exploration of the outcrop in VR, as the parallax distortion present in 2D panoramas prevents this kind of observation. Finally, we also observed that the contact between the Square Top and Dillinger members shows a ~3° dip toward the Southwest (Figs. 21a and 21b), hinting at the presence of a paleotopography at the time the Dillinger member was deposited.
All these elements (the southwestward dip of the Square Top/Dillinger contact, the southward thickening of the Dillinger member, and the shift from energetic TCS structures to low-energy planar stratifications) are evidence favouring a rapidly changing and dynamic depositional setting. This interpretation remains compatible with previous assertions of a marginal fluvio-lacustrine depositional setting for the area, as proposed by e.g. Grotzinger et al. (2015) or Le Deit et al. (2016), but the new observations carried out using the VR application advocate for a laterally-evolving pattern and a more complex suite of events at Kimberley (Caravaca et al., 2019c) that remains to be finely characterized. Further efforts can be envisaged to extend this localized study to a more regional or global perspective.
Figure 21: a) Interpreted view of the Kimberley outcrop toward Mount Remarkable in the VR application, with annotations highlighting the newly mapped contacts between the Square Top, Dillinger and Mount Remarkable members, the contact between the Square Top and Dillinger members showing a slight southwestward dip (~3°). The Dillinger member itself shows a lateral variation of its thickness across the outcrop (illustrated by the green bars and associated measurements), as well as lateral variations in the size and distribution of the sedimentary structures present in the sandstone beds; b) Schematic representation of the observed lateral variations in thickness of the member, and of the relative size and abundance of the sedimentary structures within the Dillinger member; c) Close-up view of the various sedimentary structures observed in the Dillinger member, from mm- to cm-scale laminations of planar parallel stratifications (1) to dm-scale cross-stratifications (TCS?; 5).
The PlanMap-VR application has been developed jointly by LPG and VR2Planets, which was selected as subcontracting company under public market number 52805.
Agisoft, LLC, 2018. PhotoScan Professional. https://www.agisoft.com, Accessed: 10 October 2018.
Alexander, D. & Deen, R., 2015. Mars Science Laboratory Project Software Interface Specification (SIS); Camera & LIBS Experiment Data Record (EDR) and Reduced Data Record (RDR) Data Products, version 3.0. NASA Planetary Data System. URL: https://pds-imaging.jpl.nasa.gov/data/msl/MSLNAV_0XXX/DOCUMENT/MSL_CAMERA_SIS_latest.PDF
Billant, J., Leclerc, F., Escartin, J., Gracias, N., Istenic, K., Garcia, R., Arnaubec, A., Dano, A., Marchand, C., SUBSAINTES science party, 2019. Development of a Unity package allowing GIS-like mapping in Virtual Reality environment. In Geophysical Research Abstract 21, Abstract #EGU2019-13639-3. URL: https://meetingorganizer.copernicus.org/EGU2019/EGU2019-13639-3.pdf
Calef III, F.J., Parker, T., 2016, MSL Gale Merged Orthophoto Mosaic, Publisher: PDS Annex, U.S. Geological Survey, URL: http://bit.ly/MSL_Basemap
Caravaca, G., Le Mouélic, S., Mangold, N., and the PlanMap consortium, 2019a. Deliverable 5.1: Merged products (in GIS and as maps) of orbital and in situ data of Gale crater. URL: https://wiki.planmap.eu/display/public/D5.1-public.
Caravaca, G., Le Mouélic, S., Mangold, N., and the PlanMap consortium, 2019b. Deliverable 5.2: 3D products of the merged GIS and maps of Gale crater. URL: https://wiki.planmap.eu/display/public/D5.2-public.
Caravaca, G., Mangold, N., Le Mouélic, S., Le Deit, L., Massé, M., 2019c. A new assessment of the depositional record at Kimberley (Gale crater, Mars) using Virtual Reality. In 34th IAS Meeting of Sedimentology. DOI: 10.13140/RG.2.2.14403.35367
Caravaca, G., Le Mouélic, S., Mangold, N., L’Haridon, J., Le Deit, L., Massé, M., 2020. 3D Digital Outcrop Model reconstruction of the Kimberley outcrop (Gale crater, Mars), and its integration into Virtual Reality for simulated geological analysis. Planetary and Space Science 182, 104808. DOI: 10.1016/j.pss.2019.104808
ESRI, ESRI Shapefile Technical Description, 1998, URL: https://www.esri.com/library/whitepapers/pdfs/shapefile.pdf, accessed: 6 March 2020
Gerloni, I.G., Carchiolo, V., Vitello, F.R., Sciacca, E., Becciani, U., Costa, A., Riggi, S., Bonali, F.L., Russo, E., Fallati, L., Marchese, F., Tibaldi, A., 2018. Immersive Virtual Reality for Earth Sciences. In 2018 Federated Conference on Computer Science and Information Systems (FedCSIS) 15, 527-534. DOI: 10.15439/2018F139
Girardeau-Montaut, D., 2015. Cloud compare—3d point cloud and mesh processing software. Open Source Project. EDF R&D, Telecom ParisTech. URL: http://www.danielgm.net/cc
Grotzinger, J. P. , Sumner, D. Y., Kah, L. C., Stack, K., Gupta, S., Edgar, L., Rubin, D., Lewis, K., Schieber, J, Mangold, N., Milliken, R., Conrad, P. G., DesMarais, D., Farmer, J., Siebach, K., Calef III, F., Hurowitz, J., McLennan, S. M., Ming, D., Vaniman, D., Crisp, J., Vasavada, A., Edgett, K. S., Malin, M., Blake, D., Gellert, R., Mahaffy, P., Wiens, R. C., Maurice, S., Grant, J. A., Wilson, S., Anderson, R. C., Beegle, L., Arvidson, R., Hallet, B., Sletten, R. S., Rice, M., Bell III, J., Griffes, J., Ehlmann, B., Anderson, R. B., Bristow, T. F., Dietrich, W.E., Dromart, G., Eigenbrode, J., Fraeman, A., Hardgrove, C., Herkenhoff, K., Jandura, L., Kocurek, G., Lee, S., Leshin, L. A., Leveille, R., Limonadi, D., Maki, J., McCloskey, S., Meyer, M., Minitti, M., Newsom, H., Oehler, D., Okon, A., Palucis, M., Parker, T., Rowland, S., Schmidt, M., Squyres, S., Steele,A., Stolper, E., Summons, R., Treiman, A., Williams, R., Yingst, A., MSL Science Team, 2014. A Habitable Fluvio-Lacustrine Environment at Yellowknife Bay, Gale Crater, Mars. Science 343(6169), 1242777. DOI: 10.1126/science.1242777
Grotzinger, J., Gupta, S., Malin, M., Rubin, D., Schieber, J., Siebach, K., Sumner, D., Stack, K., Vasavada, A., Arvidson, R., Calef III, F., Edgar, L., Fisher, W. F., Grant, J. A., Griffes, J., Kah, L. C., Lamb, M. P., Lewis, K. W., Mangold, N., Minitti, M. E., Palucis, M., Rice, M., Williams, R. M. E., Yingst, R. A., Blake, D., Blaney, D., Conrad, P., Crisp, J., Dietrich, W. E., Dromart, G., Edgett, K. S., Ewing, R. C., Gellert, R., Hurowitz, J. A., Kocurek, G., Mahaffy, P., McBride, M. J., McLennan, S. M., Mischna, M., Ming, D., Milliken, R., Newsom, H., Oehler, D., Parker, T. J., Vaniman, D., Wiens, R. C., Wilson, S.A., 2015. Deposition, exhumation, and paleoclimate of an ancient lake deposit, Gale crater, Mars. Science 350(6257), aac7575. DOI: 10.1126/science.aac7575
Le Deit, L., Mangold, N., Forni, O., Cousin, A., Lasue, J., Schröder, S., Wiens, R. C., Sumner, D., Fabre, C., Stack, K. M., Anderson, R. B., Blaney, D., Clegg, S., Dromart, G., Fisk, M., Gasnault, O., Grotzinger, J. P., Gupta, S., Lanza, N., Le Mouélic, S., Maurice, S., McLennan, S., Meslin, P.-Y., Nachon, M., Newsom, H., Payré, V., Rapin, W., Rice, M., Sautter, V., Treiman, A. H., 2016. The potassic sedimentary rocks in Gale Crater, Mars, as seen by ChemCam on board Curiosity. Journal of Geophysical Research: Planets 121. DOI: 10.1002/2015JE004987
Mangold, N., Schmidt, M.E., Fisk, M.R., Forni, O., McLennan, S.M., Ming, D.W., Sautter, V., Sumner, D., Williams, A.J., Clegg, S.M., Cousin, A., Gasnault, O., Gellert, R., Grotzinger, J.P., Wiens, R.C., 2017. Classification scheme for sedimentary and igneous rocks in Gale Crater, Mars. Icarus 284, 1-17. DOI: 10.1016/j.icarus.2016.11.005
Mat, R. C., Shariff, A. R. M., Zulkifli, A. N., Rahim, M. S. M., Mahayudin, M. H., 2014. Using game engine for 3D terrain visualisation of GIS data: A review. In IOP Conference Series: Earth and Environmental Science 20, 012037. URL: https://iopscience.iop.org/article/10.1088/1755-1315/20/1/012037/pdf
Murray, J. W., 2017. Building virtual reality with Unity and Steam VR. AK Peters/CRC Press. DOI: 10.1201/b21862
McGreevy, M. W., 1993. Virtual reality and planetary exploration. In Virtual Reality pp. 163-97. Elsevier. DOI: 10.1016/B978-0-12-745045-2.50018-0
Parker, T., Calef III, F.J., 2016, MSL Gale Merged Digital Elevation Model, Publisher: PDS Annex, U.S. Geological Survey, URL: http://bit.ly/MSL_DEM
Rice, M. S., Gupta, S., Treiman, A. H., Stack, K. M., Calef, F., Edgar, L. A., Grotzinger, J., Lanza, N., Le Deit, L., Lasue, J., Siebach, K. L., Vasavada, A., Wiens, R. C., Williams, J., 2017. Geologic overview of the Mars Science Laboratory rover mission at the Kimberley, Gale crater, Mars. Journal of Geophysical Research: Planets 122(1), 2-20. DOI: 10.1002/2016JE005200
Seidelmann PK, Abalakin VK, Bursa M, Davies ME, de Bergh C, Lieske JH, Oberst J, Simon JL, Standish EM, Stooke P, Thomas P, 2002. Report of the IAU/IAG working group on cartographic coordinates and rotational elements of the planets and satellites: 2000. Celest Mech Dyn Astron 82(1):83–111. DOI: 10.1023/A:1013939327465
Stack, K. M., Edwards, C. S., Grotzinger, J. P., Gupta, S., Sumner, D. Y., Calef III, F. J., Edgar, L. A., Edgett, K. S., Fraeman, A. A., Jacob, S. R., Le Deit, L., Lewis, K. W., Rice, M. S., Rubin, D., Williams, R. M. E., Williford, K. H., 2016. Comparing orbiter and rover image-based mapping of an ancient sedimentary environment, Aeolis Palus, Gale Crater, Mars. Icarus 280, 3-21. DOI: 10.1016/j.icarus.2016.02.024.
Treiman, A. H., Bish, D.L., Vaniman, D.T., Chipera, S.J., Blake, D.F., Ming, D.W., Morris, R.V., Bristow, T.F., Morrison, S.M., Baker, M.B., Rampe, E.B., Downs, R.T., Filiberto, J., Glazner, A.F., Gellert, R., Thompson, L.M., Schmidt, M.E., Le Deit, L., Wiens, R.C., McAdam, A.C., Achilles, C.N., Edgett, K.S., Farmer, J.D., Fendrich, K.V., Grotzinger, J.P., Gupta, S., Morookian, J.M., Newcombe, M.E., Rice, M.S., Spray, J.G., Stolper, E.M., Sumner, D.Y., Vasavada, A.R., Yen, A.S., 2016. Mineralogy, provenance, and diagenesis of a potassic basaltic sandstone on Mars: CheMin X-ray diffraction of the Windjana sample (Kimberley area, Gale crater). Journal of Geophysical Research: Planets 121, 75–106. DOI: 10.1002/2015JE004932.
Table A1 lists all files provided by this deliverable:
Data files for the PlanMapVR application and executable (for Windows)