Commit ed5cd2e0 by Tobias Rautenkranz

WebVR-VR post

---
layout: post
title: WebVR-VR
lang: en
category: code
tags: code radiology
---
# Virtual Reality Volume Rendering for real-time Visualization of Radiologic Anatomy
![Graphical Abstract with cat!]({{ "/img/webvr-vr/webvr-vr_abstract.png" | relative_url }})
[[larger]]({{ "/img/webvr-vr/webvr-vr_abstract.svg" | relative_url }})
The 3D radiologic volume data (MRI & CT) at
[sectional-anatomy.org](https://sectional-anatomy.org)
is visualized on a 2D computer screen. Virtual reality (VR) promises
to improve the visualization and thus understanding of this 3D data
by allowing it to be presented in a natural, depth information preserving way.
A new Web technology ([WebVR](https://w3c.github.io/webvr/spec/1.1/)),
in combination with powerful smartphone graphics processors
and inexpensive virtual reality viewers (Google Cardboard),
has made virtual reality cheap and easily obtainable.
These new developments and the aforementioned possibility of a more natural
visualization have led me to implement an experimental VR webpage for an MRI
scan.
## <img src="/img/webvr-vr/vr.png" style="height: 0.8em;"/> Try It
You need an Android phone with the
[Google Chrome](https://play.google.com/store/apps/details?id=com.android.chrome)
browser and a virtual reality viewer
([Google Cardboard](https://vr.google.com/cardboard/)) to
explore the [MRA Brain](https://sectional-anatomy.org/mra-brain/) in VR.
### Available Visualizations and Sequences
* [TOF MIP](https://sectional-anatomy.org/webvr_experiment.html#)
* [TOF VR](https://sectional-anatomy.org/webvr_experiment.html#renderer=0)
* [TOF ISO](https://sectional-anatomy.org/webvr_experiment.html#renderer=2)
* [PCA MIP](https://sectional-anatomy.org/webvr_experiment.html#volume=1)
* [T1 VR](https://sectional-anatomy.org/webvr_experiment.html#volume=2)
![cardboard]({{ "/img/webvr-vr/cardboard.jpg" | relative_url }})
At the time of writing, only
[Google Chrome for Android](https://play.google.com/store/apps/details?id=com.android.chrome)
has native WebVR support.
See [webvr.info](https://webvr.info) for updated information on WebVR
compatibility.
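
The visualization and sequence links above select their options via the URL
fragment (`#renderer=…`, `#volume=…`). A minimal sketch of parsing such a
fragment might look as follows; the parameter names come from the links above,
but the function and its defaults are hypothetical, not the site's actual code:

```javascript
// Parse a URL fragment like "#renderer=2&volume=1" into an options object.
// Missing keys fall back to the given defaults.
function parseHashOptions(hash, defaults) {
  const options = Object.assign({}, defaults);
  for (const pair of hash.replace(/^#/, "").split("&")) {
    if (!pair) continue; // empty fragment: keep the defaults
    const [key, value] = pair.split("=");
    options[key] = Number.parseInt(value, 10);
  }
  return options;
}

// e.g. the "TOF ISO" link (assumed defaults):
parseHashOptions("#renderer=2", { renderer: 1, volume: 0 });
// → { renderer: 2, volume: 0 }
```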
## User Experience
Virtual reality uses a different image for each eye,
allowing depth to be perceived by stereoscopic vision.
Also unlike a computer screen, the VR viewer covers the user's field of view,
and head movement is tracked.
### Depth Information
In my experience, the depth perception allows the spatial extent of the
data to be discerned better.
![Screenshot]({{ "/img/webvr-vr/screenshot.jpg" | relative_url }})
Contrary to my expectation, a
[MIP](https://en.wikipedia.org/wiki/Maximum_intensity_projection)
rendering works well. This is surprising, since it is not a
realistic rendering: most obviously, voxels may
appear in front when they are in fact behind others,
since their order along a view ray is not considered.
Still, a volume cannot be perceived in its entirety.
Stereoscopic rendering at best
doubles the number of channels (from RGB to 2×RGB),
but does not add a third dimension to the two of the screen.
Surfaces, which require only a single additional depth value
per pixel, can therefore be visualized quite well.
But to get all the information from a volume, the individual slices are still
needed.
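
The MIP described above keeps only the largest sample along each view ray and
discards their order. Reduced to a single ray, it can be sketched as (a
hypothetical helper, not the site's shader code):

```javascript
// Maximum intensity projection for one view ray: the pixel value is the
// largest sample encountered, regardless of where along the ray it lies.
function mipAlongRay(samples) {
  return samples.reduce((max, s) => Math.max(max, s), 0);
}

// A bright voxel far from the eye dominates dimmer voxels in front of it:
mipAlongRay([0.2, 0.1, 0.9, 0.3]); // → 0.9
```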
### Rotating the Volume
Since Cardboard devices only support tracking of the viewing direction (angles)
but not the position of the head
(xyz), the user cannot walk around the volume in VR.
A different approach was needed.
As in the
[Google Cardboard Demo: Exhibit](https://play.google.com/store/apps/details?id=com.google.samples.apps.cardboarddemo),
the volume is always kept centered in the user's field of view
and rotated to reflect the head rotation.
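
Keeping the volume centered while mirroring the head rotation amounts to
rotating the model by the inverse (conjugate) of the head-orientation
quaternion. A sketch with hypothetical helpers, not the renderer's actual math
code:

```javascript
// Conjugate of a unit quaternion [x, y, z, w]: the inverse rotation.
function conjugate(q) {
  return [-q[0], -q[1], -q[2], q[3]];
}

function cross(a, b) {
  return [
    a[1] * b[2] - a[2] * b[1],
    a[2] * b[0] - a[0] * b[2],
    a[0] * b[1] - a[1] * b[0],
  ];
}

// Rotate vector v by unit quaternion q:
// v' = v + 2*w*(q_xyz × v) + 2*q_xyz × (q_xyz × v)
function rotate(q, v) {
  const [x, y, z, w] = q;
  const t = cross([x, y, z], v).map((c) => 2 * c);
  const u = cross([x, y, z], t);
  return [v[0] + w * t[0] + u[0], v[1] + w * t[1] + u[1], v[2] + w * t[2] + u[2]];
}

// A 90° head turn about the y axis rotates the volume by -90°:
const head = [0, Math.sin(Math.PI / 4), 0, Math.cos(Math.PI / 4)];
rotate(conjugate(head), [1, 0, 0]); // ≈ [0, 0, 1]
```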
### Selecting Anatomy
Querying the volume and selecting an anatomy from the segmentation to get
its label is achieved with a
[reticle](https://www.google.com/design/spec-vr/interactive-patterns/display-reticle.html).
![]({{ "/img/webvr-vr/reticle.jpg" | relative_url}})
The anatomy behind the reticle that contributes most to the value of
the center pixel is selected.
The selection process can be toggled with a click (the Cardboard button).
Only two degrees of freedom of the head rotation are used
to position the reticle. Or, more precisely: the reticle is fixed
and the head rotation rotates the volume, which is somewhat unintuitive.
This makes selecting a specific anatomy difficult or even impossible.
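
The selection itself can be sketched as: walk the center ray, accumulate each
segmentation label's contribution to the pixel, and pick the label with the
largest total. This is a hypothetical CPU-side sketch; the actual renderer
works on the GPU:

```javascript
// Select the segmentation label that contributes most to the center pixel.
// `samples` is a list of { label, intensity } entries along the center ray.
function selectLabel(samples) {
  const contribution = new Map();
  for (const { label, intensity } of samples) {
    contribution.set(label, (contribution.get(label) || 0) + intensity);
  }
  let best = null;
  let bestSum = -Infinity;
  for (const [label, sum] of contribution) {
    if (sum > bestSum) {
      best = label;
      bestSum = sum;
    }
  }
  return best;
}

// A bright vessel outweighs scattered background voxels:
selectLabel([
  { label: "background", intensity: 0.1 },
  { label: "basilar artery", intensity: 0.8 },
  { label: "background", intensity: 0.2 },
]); // → "basilar artery"
```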
## Implementation Details
The existing volume renderer for
[WebGL](https://www.khronos.org/registry/webgl/specs/1.0/)
was adapted to support the new
[WebVR](https://w3c.github.io/webvr/spec/1.1/)
standard to allow virtual reality rendering in the browser.
As for most VR renderers, a forward renderer is used. The difference is
that a forward renderer is normally chosen to be able to use MSAA
(see [Optimizing the Unreal Engine 4 Renderer for VR,
Pete Demoreuille][UnrealEngine4]),
which is of no use for volume rendering. Instead, the ability of the
existing WebGL renderer to output an image of the volume and its segmentation
simultaneously is not needed;
on the contrary, they need to be combined into one image.
Thus, the renderer was adapted to a single stage.
As a side effect, color renderings are now possible without using the
[WEBGL_draw_buffers](https://www.khronos.org/registry/webgl/extensions/WEBGL_draw_buffers)
extension.
[UnrealEngine4]: https://developer.oculus.com/blog/introducing-the-oculus-unreal-renderer/
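
With a single render stage, the volume color and the highlight of the selected
segmentation end up in one output image. Per pixel, such a combination might
look like the following sketch (the function, names, and blend choice are
hypothetical, not the renderer's actual shader):

```javascript
// Blend a highlight color over the volume color where the pixel belongs
// to the selected segmentation label (simple per-channel alpha blend).
function blendHighlight(volumeRgb, highlightRgb, alpha, isSelected) {
  if (!isSelected) return volumeRgb.slice();
  return volumeRgb.map((c, i) => (1 - alpha) * c + alpha * highlightRgb[i]);
}

blendHighlight([0.5, 0.5, 0.5], [1, 0, 0], 0.5, true); // → [0.75, 0.25, 0.25]
```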
Since multiple layers are not supported by WebVR implementations,
some additional overlay renderers are injected before or after the volume
rendering.
Please note that, due to its experimental nature, the
[source code](https://sectional-anatomy.org/code.html)
is sometimes confusing or even confused.
### Anti-Aliasing
Anti-aliasing is [recommended][UnrealEngine4] for VR due to the large
apparent pixel size.
Instead of MSAA, which is not applicable here, full-screen anti-aliasing is used.
Some aliasing is also caused by the insufficient linear interpolation of the
volume voxels
(see: [An Evaluation of Reconstruction Filters for Volume Rendering,
Stephen R. Marschner and Richard J. Lobb](http://graphics.stanford.edu/~srm/publications/Vis94-filters-abstract.html)),
or even by the nearest-neighbor interpolation of the segmentation.
[Vlachos]: http://alex.vlachos.com/graphics/Alex_Vlachos_Advanced_VR_Rendering_Performance_GDC2016.pdf
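
One simple form of full-screen anti-aliasing is supersampling: render at a
higher resolution and box-filter down. A sketch of a 2×2 box downsample on a
grayscale image (a hypothetical illustration, not necessarily the filter used
here):

```javascript
// Downsample a grayscale image by 2 in each dimension, averaging each
// 2x2 block of source pixels (a box filter).
function downsample2x(pixels, width, height) {
  const outW = width / 2;
  const outH = height / 2;
  const out = new Float32Array(outW * outH);
  for (let y = 0; y < outH; y++) {
    for (let x = 0; x < outW; x++) {
      const i = 2 * y * width + 2 * x; // top-left of the 2x2 block
      out[y * outW + x] =
        (pixels[i] + pixels[i + 1] + pixels[i + width] + pixels[i + width + 1]) / 4;
    }
  }
  return out;
}

// A hard 2x2 checkerboard averages to flat gray:
downsample2x(new Float32Array([1, 0, 0, 1]), 2, 2); // → Float32Array [0.5]
```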
### Dynamic Resolution
The WebVR layers API would easily allow dynamic resolution
(as recommended in
[Alex Vlachos, Advanced VR Rendering Performance, GDC 2016][Vlachos])
to prevent dropping frames.
The inverted rotation and constant centering of the volume would lead to
discrepancies with
[Reprojection][Vlachos] /
[Timewarp](https://developer.oculus.com/blog/asynchronous-timewarp-examined/)
in the case of a dropped frame.
It is therefore important not to drop frames and to keep the frame rate.
But there is no way to query the target frame rate, and furthermore the
required WebGL
[EXT_disjoint_timer_query](https://www.khronos.org/registry/webgl/extensions/EXT_disjoint_timer_query)
extension is not widely supported (notably, not on my phone).
Thus, this is not implemented.
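
Had it been implemented, the core of such a scheme could be a simple
controller: lower the render scale when the measured GPU frame time exceeds
the frame budget, and raise it again when there is headroom. A hypothetical
sketch along the lines of the talk's idea, with made-up thresholds:

```javascript
// Adjust the render scale from the measured GPU frame time:
// back off quickly when over budget, recover slowly when well under it.
function nextRenderScale(scale, frameTimeMs, budgetMs) {
  if (frameTimeMs > budgetMs) {
    scale *= 0.8; // drop resolution before frames start dropping
  } else if (frameTimeMs < 0.7 * budgetMs) {
    scale *= 1.05; // cautiously win resolution back
  }
  return Math.min(1.0, Math.max(0.5, scale)); // clamp to a sane range
}

nextRenderScale(1.0, 13.0, 11.1); // → 0.8 (over a ~90 Hz budget)
```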
## Discussion
Virtual reality can improve the understanding of radiologic volume data,
but looking at the slices remains necessary to get all the information.
Due to the high amount of information in the volume data,
a restriction might prove advantageous,
insofar as it retains the significant information.
A next step is to integrate the WebVR renderer in the
[sectional-anatomy.org](https://sectional-anatomy.org)
viewer.
Furthermore, support for room scale VR with tracked hand controllers can be
considered, but the implementation would require considerable work.