
31 Aug realities.io gets featured on Road to VR

Road to VR, the world’s largest independent news publication dedicated to consumer virtual reality, featured a realities.io project. You can find the article here. Also make sure to watch this video about the project:

Further reading: some in-depth information:

Walking around in a 3D-scanned cave, plus a visualization of the database from the Paleolithic excavation.

The database consists of ~17,000 individual measurements from several excavation campaigns over 10 years and is visualized with a different 3D symbol per artifact category. Showing all 17,000 symbol meshes at once is made possible through instancing (which reduces the draw calls to ~300). With this method, the whole scene with all artifacts shown runs at a rock-solid 90 fps on a GTX 970 and an i7 4770K.
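The post doesn’t say which engine or graphics API the visualization uses, so the following is only a minimal sketch of the instancing idea in raw OpenGL/C++: one instanced draw call per artifact-category batch, with per-measurement positions supplied as an instanced vertex attribute. The names (`CategoryBatch`, `setupInstanceBuffer`, `drawAllArtifacts`), the attribute layout, and the GL loader are assumptions for illustration, not the project’s actual code.

```cpp
// Sketch: one instanced draw per artifact category, so the draw-call count
// scales with the number of categories, not with the ~17,000 measurements.
#include <vector>
#include <glad/glad.h>   // any OpenGL loader; chosen only for illustration

struct CategoryBatch {
    GLuint vao;                     // VAO of this category's symbol mesh
    GLsizei indexCount;             // index count of the symbol mesh
    std::vector<float> positions;   // xyz per measurement, flattened
};

// Upload per-instance positions once and expose them as attribute 3,
// advancing once per instance instead of once per vertex.
void setupInstanceBuffer(CategoryBatch& batch)
{
    GLuint instanceVbo;
    glGenBuffers(1, &instanceVbo);
    glBindVertexArray(batch.vao);
    glBindBuffer(GL_ARRAY_BUFFER, instanceVbo);
    glBufferData(GL_ARRAY_BUFFER,
                 batch.positions.size() * sizeof(float),
                 batch.positions.data(), GL_STATIC_DRAW);
    glEnableVertexAttribArray(3);
    glVertexAttribPointer(3, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float), nullptr);
    glVertexAttribDivisor(3, 1);    // attribute 3 steps once per instance
    glBindVertexArray(0);
}

// Draw every symbol of every category; one glDrawElementsInstanced call
// per category batch covers all instances of that symbol mesh.
void drawAllArtifacts(const std::vector<CategoryBatch>& batches)
{
    for (const CategoryBatch& b : batches) {
        glBindVertexArray(b.vao);
        glDrawElementsInstanced(GL_TRIANGLES, b.indexCount, GL_UNSIGNED_INT,
                                nullptr,
                                static_cast<GLsizei>(b.positions.size() / 3));
    }
    glBindVertexArray(0);
}
```

The vertex shader would then read attribute 3 as a per-instance offset; this is the mechanism that lets 17,000 symbols collapse into a few hundred draw calls.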

The (photo)scan of the cave itself is an unlit model. This is because delighting and baking lightmaps afterwards doesn’t make much sense for real-world scenes (the scans are not intended to be “reusable assets” like you would create for a game). Unlit models also have the advantage that they aren’t costly to render, so more GPU power can be spent on more detailed geometry (this is crucial for VR, because things like normal maps look flat and parallax mapping only works well on flat surfaces). Before the photos are processed, the dynamic range of the images is altered by tonemapping, and the scene looks really close to the actual cave (archaeologists who are familiar with the cave even avoided stepping on certain areas, just as they would in the real cave).
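The post doesn’t say which tonemapping operator is applied to the photos before photogrammetric processing; the sketch below simply uses a Reinhard curve plus gamma encoding as one illustrative way to compress the dynamic range of linear HDR pixel values, under that assumption.

```cpp
// Illustrative only: compress linear HDR values into 8-bit display values.
// The actual operator used for the cave photos is not stated in the post.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Map one linear HDR channel value (>= 0) to an 8-bit value.
static std::uint8_t tonemapChannel(float linear)
{
    float mapped  = linear / (1.0f + linear);          // Reinhard: compresses highlights
    float encoded = std::pow(mapped, 1.0f / 2.2f);     // gamma-encode for display
    return static_cast<std::uint8_t>(std::clamp(encoded, 0.0f, 1.0f) * 255.0f + 0.5f);
}

// Tone-map an interleaved RGB float image into an 8-bit RGB image.
std::vector<std::uint8_t> tonemapImage(const std::vector<float>& hdrRgb)
{
    std::vector<std::uint8_t> ldrRgb(hdrRgb.size());
    for (std::size_t i = 0; i < hdrRgb.size(); ++i)
        ldrRgb[i] = tonemapChannel(hdrRgb[i]);
    return ldrRgb;
}
```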

VR locomotion with the HTC Vive is handled differently from the typical teleport approach. Instead of just teleporting the user to a position, they “fly” there at “ludicrous speed”. This happens within a fixed time of 100 ms regardless of distance, but is clamped to a minimum movement speed in case the distance is really short. The velocity is applied instantly, because acceleration in VR can cause motion sickness due to the mismatch between visual and vestibular information. There are also some users who still feel uncomfortable at high velocities even without acceleration, but the time of 100 ms is set roughly below the 2-sigma interval of typical human reaction time. This means that by the time you notice you are moving, you have already stopped. I tested this approach with several users who get sick really fast (including myself – even after working with VR for over a year). One of the testers gets motion sick in 3D cinema, with the Oculus Rift DK2, and sometimes even on a 2D monitor, but had no issues with this “ludicrous speed” approach.
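As a rough sketch of how such a move could be computed (the names and the minimum-speed value are assumptions, not the project’s actual code): the velocity is sized so the trip takes a fixed 100 ms regardless of distance, it is applied instantly with no acceleration ramp, and it is clamped to a floor speed so very short hops simply finish even sooner.

```cpp
// Minimal sketch of the "ludicrous speed" locomotion described above.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b)      { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float length(Vec3 v)           { return std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z); }
static Vec3  scale(Vec3 v, float s)   { return {v.x*s, v.y*s, v.z*s}; }

constexpr float kTravelTime = 0.1f;   // 100 ms, independent of distance
constexpr float kMinSpeed   = 2.0f;   // m/s floor for short hops (assumed value)

// Apply `velocity` for `duration` seconds, then stop; no acceleration ramp,
// since the visual-vestibular mismatch during acceleration is what tends to
// cause motion sickness.
struct Flight { Vec3 velocity; float duration; };

Flight planFlight(Vec3 from, Vec3 to)
{
    Vec3  delta = sub(to, from);
    float dist  = length(delta);
    if (dist <= 0.0f) return {{0.0f, 0.0f, 0.0f}, 0.0f};
    float speed = std::fmax(dist / kTravelTime, kMinSpeed);  // clamp short hops
    return {scale(delta, speed / dist), dist / speed};       // duration <= 100 ms
}
```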

Unfortunately I won’t be able to invest much time in this project (so far I have spent about one week on it). The reason is that the excavation is not that well funded, and I actually only get paid for the scan of the cave and the surrounding area (UAV), with some of the usual derivatives like orthophotos and topographic data. The VR visualization itself is just a proof-of-concept project to attract funding for future projects and to promote my work. But I still want to add some minor improvements. I plan to add a few more features, such as measurement and maybe a tool for archaeological profile cuts. I also want to reconstruct the surfaces of the archaeological layers from their measurement data to give the archaeologists a good spatial understanding of the geological sedimentation and archaeological layers of the cave.

An “official” video with the university I collaborated with will come in the near future.