For my internship with the AR/VR research group at NASA's Goddard Space Flight Center, I created a volume rendering tool for importing and rendering NetCDF weather data. The tool consists of a set of scripts that import raw NetCDF data into Unity and a shader that renders the data either directly or as an isosurface. It was developed for NASA's Mixed Reality Exploration Toolkit (MRET).
Direct volume rendering draws a 3D texture as-is: the shader samples the density at points along each view ray and maps the sampled values through a colour map. Given a colour map and a legend for how to interpret it, this is a general-purpose way to view any volumetric dataset. I used raymarching to render the data within the 3D texture passed into the shader. Below are some examples of specific slices of directly rendered weather data.
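The core of direct volume rendering can be sketched on the CPU in a few lines. This is a minimal illustration, not the actual MRET shader (which runs on the GPU in HLSL and samples a real Texture3D); the `density` and `colour_map` functions here are hypothetical stand-ins for the texture sample and the colour-map lookup.

```python
def density(x, y, z):
    """Stand-in for sampling the 3D texture: a radially falling density."""
    r2 = (x - 0.5) ** 2 + (y - 0.5) ** 2 + (z - 0.5) ** 2
    return max(0.0, 1.0 - 4.0 * r2)

def colour_map(d):
    """Map a density in [0, 1] to an RGB colour (blue-to-red ramp)."""
    return (d, 0.0, 1.0 - d)

def raymarch(origin, direction, steps=64, step_size=0.02):
    """Accumulate colour and opacity front-to-back along one ray."""
    r = g = b = 0.0
    transmittance = 1.0            # fraction of light still passing through
    x, y, z = origin
    dx, dy, dz = direction
    for _ in range(steps):
        d = density(x, y, z)
        if d > 0.0:
            cr, cg, cb = colour_map(d)
            alpha = min(d * step_size * 10.0, 1.0)  # opacity of this step
            r += transmittance * alpha * cr
            g += transmittance * alpha * cg
            b += transmittance * alpha * cb
            transmittance *= 1.0 - alpha
            if transmittance < 0.01:  # early exit: ray is effectively opaque
                break
        x += dx * step_size
        y += dy * step_size
        z += dz * step_size
    return (r, g, b, 1.0 - transmittance)

# A ray through the centre of the volume accumulates density and opacity;
# a ray grazing the edge picks up almost nothing.
centre = raymarch((0.5, 0.5, 0.0), (0.0, 0.0, 1.0))
edge = raymarch((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```

The front-to-back accumulation with an early exit is the usual trick for this kind of shader: once the ray is nearly opaque, further samples cannot change the pixel, so the march stops.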
Isosurface rendering extracts the set of points in a dataset that share a single value and renders them as a 3D surface. The example seen to the right uses the same dataset described above under direct volume rendering, rendered at a specific value of the 3D texture passed into the shader. The shader again uses raymarching, this time to find a point on the surface, then uses the surface normal to compute the lighting and colour of the surface. The colour, once again, is sampled from the colour map.
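The isosurface march can be sketched the same way: step along the ray until the sampled value crosses the iso level, then estimate the surface normal from the field's gradient and apply simple diffuse lighting. Again, this is an illustrative CPU sketch with made-up names (`field`, `march_to_isosurface`), not the actual shader code.

```python
def field(x, y, z):
    """Stand-in scalar field: squared distance from the volume centre."""
    return (x - 0.5) ** 2 + (y - 0.5) ** 2 + (z - 0.5) ** 2

def gradient(x, y, z, eps=1e-3):
    """Central-difference gradient; its direction gives the surface normal."""
    gx = (field(x + eps, y, z) - field(x - eps, y, z)) / (2 * eps)
    gy = (field(x, y + eps, z) - field(x, y - eps, z)) / (2 * eps)
    gz = (field(x, y, z + eps) - field(x, y, z - eps)) / (2 * eps)
    length = (gx * gx + gy * gy + gz * gz) ** 0.5 or 1.0
    return (gx / length, gy / length, gz / length)

def march_to_isosurface(origin, direction, iso=0.04, steps=128, step_size=0.01):
    """Walk the ray until field() crosses iso; return the hit point and shade."""
    x, y, z = origin
    dx, dy, dz = direction
    prev = field(x, y, z)
    for _ in range(steps):
        x += dx * step_size
        y += dy * step_size
        z += dz * step_size
        value = field(x, y, z)
        if (prev - iso) * (value - iso) <= 0.0:  # crossed the iso level
            nx, ny, nz = gradient(x, y, z)
            light_dir = (0.0, 0.0, -1.0)  # direction from surface to the light
            diffuse = max(0.0, nx * light_dir[0] + ny * light_dir[1]
                          + nz * light_dir[2])   # Lambertian shading
            return (x, y, z), diffuse
        prev = value
    return None, 0.0

hit, shade = march_to_isosurface((0.5, 0.5, 0.0), (0.0, 0.0, 1.0))
```

For this field, a ray fired down the z-axis from the front face crosses the `iso=0.04` level sphere at roughly `z = 0.3`, and the gradient there points straight back at the camera, giving full diffuse intensity.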
NetCDF is a file format widely used by meteorologists to store weather data. A NetCDF file typically has sections for dimensions, variables, and data, where each entry in the data section is an N-dimensional array corresponding to a name declared in the dimensions or variables sections. For Earth-scale data, the dimensions usually hold latitude, longitude, and height, while the variables list the characterization quantities that depend on those latitude, longitude, and height positions. For this tool, we standardised the latitude, longitude, and height data as 1D arrays and the characterization data as 3D arrays. After parsing, we filled a 3D texture with all of the data points from the 3D array of the selected category of data and passed it, along with a colour map authored in the Unity Inspector, to the shader.
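The import step above amounts to flattening one 3D characterization array into the normalized scalar data a `Texture3D` holds. The sketch below uses hand-made arrays so it stays self-contained; real NetCDF parsing would go through a library such as netCDF4, and the axis ordering shown (height, latitude, longitude) is an assumption for illustration.

```python
# 1D coordinate arrays, as standardised for the tool.
lat = [0.0, 1.0]
lon = [0.0, 1.0, 2.0]
height = [0.0, 10.0]

# One 3D characterization variable, shape (len(height), len(lat), len(lon)).
temperature = [
    [[280.0, 281.0, 282.0],
     [283.0, 284.0, 285.0]],
    [[270.0, 271.0, 272.0],
     [273.0, 274.0, 275.0]],
]

def flatten_to_texture(volume):
    """Normalize every sample to [0, 1] and flatten the 3D array,
    innermost axis fastest, into the flat list a 3D texture would hold."""
    flat = [v for plane in volume for row in plane for v in row]
    lo, hi = min(flat), max(flat)
    scale = (hi - lo) or 1.0      # guard against a constant field
    return [(v - lo) / scale for v in flat]

texels = flatten_to_texture(temperature)
```

Normalizing to [0, 1] is what lets the shader use each texel directly as a lookup coordinate into the colour map; the original minimum and maximum would be kept alongside the texture so the legend can map colours back to physical values.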
Photos of the Data