
OzViz 2010 Program

OzViz 2010, Queensland Bioscience Precinct, University of Queensland

Schedule for 2nd December 2010

8:30 – 9:00 Registration. Coffee/Tea.
9:00 – 9:15 Con Caris WELCOME
9:15 – 10:00 Gary Delaney
Visualisation of complex large-scale industrial and geophysical fluid and particle flows

Particle methods are powerful tools for predicting the transient behaviour of particle- and fluid-based systems found in industrial applications and geophysical flows. They present substantial challenges for visualisation, including the rendering of individual particulates and the generation of the meshes required to visualise the complex free surfaces reconstructed from the particles of specific fluid flows.
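One common route from particles to a renderable free surface is to splat the particles onto a density grid and extract an iso-surface with marching cubes. The Python sketch below is an assumed illustration of that general idea, not the authors' pipeline; the grid size, smoothing width and iso-level are placeholder values.

    import numpy as np
    from scipy.ndimage import gaussian_filter
    from skimage import measure

    def particles_to_surface(points, grid_n=64, level=0.5, smooth=1.5):
        # Bin particle positions (assumed inside the unit cube) onto a grid,
        # blur the counts into a smooth density field, then extract the free
        # surface as an iso-surface of that field via marching cubes.
        density, _ = np.histogramdd(points, bins=(grid_n,) * 3, range=[(0, 1)] * 3)
        density = gaussian_filter(density, smooth)
        density /= density.max()
        verts, faces, normals, _ = measure.marching_cubes(density, level)
        return verts, faces, normals

    # Example: a blob of 10,000 particles clustered around the cube centre.
    pts = np.clip(np.random.normal(0.5, 0.1, size=(10000, 3)), 0.0, 1.0)
    verts, faces, normals = particles_to_surface(pts)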
10:00 – 10:30 David Warne,
Joe Young and
Neil Kelson (QUT)
Developments in Texture-based Vector Field Visualisation Techniques

Vector field visualisation is one of the classic sub-fields of scientific data visualisation. The need for effective visualisation of flow data arises in many scientific domains, ranging from medical sciences to aerodynamics. Though there has been much research on the topic, the question of how to communicate flow information effectively in real, practical situations remains largely unsolved, particularly for complex 3D flows. In this presentation we give a brief introduction and background to vector field visualisation and comment on the effectiveness of the most common solutions. We then give some examples of current developments in texture-based techniques, with practical examples of their use in CFD research and hydrodynamic applications.
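As background to what "texture-based" means in practice, the sketch below illustrates line integral convolution (LIC), the classic technique in this family: white noise is smeared along streamlines so the resulting texture traces the flow. It is an assumed, deliberately naive Python illustration, not the authors' implementation.

    import numpy as np

    def lic(vx, vy, noise, L=20):
        # Line integral convolution: for every pixel, average the noise
        # texture along the streamline through that pixel (both directions).
        h, w = noise.shape
        out = np.zeros_like(noise)
        for i in range(h):
            for j in range(w):
                acc, n = noise[i, j], 1
                for sign in (1.0, -1.0):
                    x, y = float(j), float(i)
                    for _ in range(L):
                        u, v = vx[int(y), int(x)], vy[int(y), int(x)]
                        mag = np.hypot(u, v) or 1.0   # avoid zero division
                        x, y = x + sign * u / mag, y + sign * v / mag
                        if not (0 <= x < w and 0 <= y < h):
                            break
                        acc += noise[int(y), int(x)]
                        n += 1
                out[i, j] = acc / n
        return out

    # Example: a circular flow field convolved with random white noise.
    n = 128
    ys, xs = np.mgrid[0:n, 0:n].astype(float)
    img = lic(-(ys - n / 2), xs - n / 2, np.random.rand(n, n))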
10:30 – 11:00 Mark Barry and Joseph Young (QUT) Rainbows belong in the sky

One of the most basic visualisation techniques is colour mapping: the mapping of a scalar data parameter to a set of colours.  The default offered by many software applications is the ‘rainbow’ colour map.  This talk aims to highlight that care needs to be taken when choosing colours to represent parameter values, and in particular why the rainbow colour map may be an inappropriate choice if one wants to faithfully represent the underlying data.
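A quick way to see the problem for yourself (an assumed Python illustration, not material from the talk): map a perfectly smooth ramp through a rainbow colour map and through a perceptually ordered one, and note the bands the rainbow introduces.

    import numpy as np
    import matplotlib.pyplot as plt

    # The data are a perfectly smooth ramp, so any apparent banding or
    # emphasis is an artefact of the colour map, not of the data.
    data = np.linspace(0, 1, 512).reshape(1, -1)

    fig, axes = plt.subplots(2, 1, figsize=(8, 2))
    for ax, cmap in zip(axes, ["jet", "gray"]):   # rainbow vs. ordered
        ax.imshow(data, aspect="auto", cmap=cmap)
        ax.set_ylabel(cmap, rotation=0, ha="right", va="center")
        ax.set_xticks([]); ax.set_yticks([])
    plt.show()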
11:00 – 11:30 Mark K. Ho, Guan H. Yeoh (ANSTO),
Victoria Timchenko, John A. Reizes (UNSW)
Intersection Marker (ISM) method for tracking a deformable 2D surface in 3D Eulerian space

A new method for tracking a deformable interface in 3D Eulerian space has been developed. The method can model an arbitrary 3D shape immersed inside an array of uniform hexahedral control volumes by using a combination of planar polygons. Since each planar polygon intersects the edges of the control volume and the combination of cell-edge intersections uniquely identifies the type of polygon a control volume holds, this new explicit tracking method has been named the ‘(I)nter(S)ection (M)arker’ (ISM) method for interface tracking. The intended application of the ISM method is in CFD simulations of multiphase flows.
Based on the observation that three points always define a plane, the simple polygon interface defined in each cell requires further subdivision into triangular elements so that the advected elements always remain planar. By subdividing the local polygon interface into smaller triangular elements, the curvature and area of the surface can be modelled to a higher degree of accuracy. Calculation of the local volume-of-fluid (VOF) value is achieved simply by summing the volumes of the triangular columns beneath the surface. Central to the ISM method is the tightly coupled manner in which information is passed between the Eulerian and Lagrangian meshes, designed to retain the volume-conservation characteristic of the VOF method while avoiding the interface diffusion associated with it. The Fortran 95 implementation was built with the Intel Visual Fortran compiler. Graphical output is in Tecplot format, and animations demonstrating the moving mesh will be shown during the presentation.
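The column-summation step lends itself to a compact illustration. The Python below is an assumed reconstruction of the geometric idea only, not the authors' Fortran 95 code: for a planar triangle, the volume of the column between it and the cell floor is its projected area times its mean vertex height.

    import numpy as np

    def column_volume(tri):
        # Volume between a planar triangle and the z = 0 cell floor:
        # (projected area in the xy-plane) x (mean vertex height).
        (x0, y0, z0), (x1, y1, z1), (x2, y2, z2) = tri
        area_xy = 0.5 * abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))
        return area_xy * (z0 + z1 + z2) / 3.0

    def cell_volume_fraction(triangles, cell_size=1.0):
        # VOF value of a cell whose interface is given as triangles
        # covering the cell's xy footprint.
        fluid = sum(column_volume(t) for t in triangles)
        return fluid / cell_size**3

    # Two triangles forming a flat interface at z = 0.25 -> fraction 0.25.
    tris = [[(0, 0, .25), (1, 0, .25), (1, 1, .25)],
            [(0, 0, .25), (1, 1, .25), (0, 1, .25)]]
    print(cell_volume_fraction(tris))  # 0.25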
11:30 – 12:00 Early lunch
12:00 – 12:30 Travel to Brisbane Planetarium
12:30 – 14:00 Mark Rigby, curator
Sir Thomas Brisbane Planetarium
Brisbane Planetarium – “Visualising a Universe of Opportunity”

The presentation will highlight examples of fulldome immersive programs, real-time flying through the Solar System and beyond, as well as examples of short artistic films created around the world for the dome environment and shown at various Domefest presentations in recent years.
14:00 – 14:30 Travel back to UQ (CSIRO QBP)
14:30 – 14:40 Coffee/Tea Break
14:40 – 15:10 David McKinnon (QUT) Real-Time 2D to 2D+Depth Video Conversion

A technique is presented for the real-time conversion of 2D video to 2D+Depth video. This technique could be used as a plug-in to a standard matchmoving tool, utilising the camera path to determine a stereo depth-map associated with each image of the video stream.  The technique described is in two parts. Firstly, a multi-view stereo adaptation of a cross-based local stereo method is implemented on the GPU to provide depth for each frame of the video sequence.  Secondly, the stereo depth-maps are filtered and refined using a multi-view consistency checking process that exploits the capabilities of OpenGL to maintain real-time performance.  The result is a very rapid process for the creation of 3D video suitable for use in VFX or robotics-type applications.
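As a flavour of what such consistency filtering involves, the sketch below shows the classic two-view left-right disparity check, a simplified stand-in for the multi-view test described above; the Python/NumPy code and its tolerance value are assumptions, not the presented GPU implementation.

    import numpy as np

    def left_right_check(disp_l, disp_r, tol=1.0):
        # Keep a left-image disparity only if the right image agrees:
        # d_L(x, y) should match d_R(x - d_L(x, y), y) to within tol pixels.
        h, w = disp_l.shape
        ys, xs = np.mgrid[0:h, 0:w]
        xr = np.clip(np.round(xs - disp_l).astype(int), 0, w - 1)
        consistent = np.abs(disp_l - disp_r[ys, xr]) <= tol
        return np.where(consistent, disp_l, np.nan)  # NaN marks rejected pixels

    # Example: a constant 5 px disparity field is trivially consistent.
    d = np.full((4, 8), 5.0)
    print(left_right_check(d, d))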
15:10 – 15:40 Tomasz Bednarz (CSIRO CESRE) Demoscene, the art of real time

The demoscene was born in the European computer underground, and demos are the product of extreme programming and self-expression. This presentation will revisit the best pieces of old-school demoscene art, from the days when computers were slower than your current mobile device and, with no GPUs, everything was software-rendered. In the second part of the presentation you will watch recent demoscene productions and enjoy the art of accelerated graphics and real-time visualisation techniques, all wrapped in beautiful artwork. Also, find out how to create a 4 KB intro using OpenGL; the core trick is sketched below.
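The heart of most 4 KB intros is a single procedural fragment shader drawn over a fullscreen quad; everything on screen is computed per pixel. The sketch below is an assumed illustration of that principle, using PyOpenGL and GLUT as a convenient stand-in for the hand-packed native code of a real intro.

    import time
    from OpenGL.GL import *
    from OpenGL.GLUT import *

    # One procedural fragment shader is the whole visual: animated rings.
    FRAG = """
    uniform float t;
    void main() {
        vec2 p = gl_FragCoord.xy / 400.0 - 1.0;
        float v = sin(10.0 * length(p) - 3.0 * t);
        gl_FragColor = vec4(0.5 + 0.5 * v * vec3(1.0, 0.6, 0.2), 1.0);
    }
    """

    def draw():
        glUniform1f(glGetUniformLocation(prog, "t"), time.time() % 1000.0)
        glRecti(-1, -1, 1, 1)      # fullscreen quad; the shader does the rest
        glutSwapBuffers()

    glutInit()
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA)
    glutCreateWindow(b"4k-style intro sketch")
    prog = glCreateProgram()
    sh = glCreateShader(GL_FRAGMENT_SHADER)
    glShaderSource(sh, FRAG)
    glCompileShader(sh)
    glAttachShader(prog, sh)
    glLinkProgram(prog)
    glUseProgram(prog)
    glutDisplayFunc(draw)
    glutIdleFunc(glutPostRedisplay)
    glutMainLoop()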
15:40 – 16:10 Jennifer Seevinck and Ernest Edmonds (USYD) Novel method for instantiating emergent shapes

This paper describes a real-time, interactive graphics application that is part of a Ph.D. submission. The application is the interactive art system +-now, a tangible, augmented-reality work by the first author. The paper begins by describing this work. The aesthetic intention behind the work, the facilitation of a reflective and meditative interaction experience, is then described. This design problem has been addressed through emergence. Specifically, the interpretation of new and surprising forms – emergent forms – is analogous to staring at the clouds and interpreting figures there [1]. In addition to describing +-now, its aesthetic intention and the emergent-shape solution, a brief explanation of emergence is given. This is followed by a description of what is new in this approach, compared with other research into emergent shapes in the areas of design research, creativity and collaborative design. Some feedback from an evaluation study of the art system is then summarised. The paper concludes with a discussion of future directions for the art system.
[1] Emergence is where something new and unexpected occurs. It is heterogeneously new to what was there before, and it is greater than the sum of its parts.
16:10 – 16:40 Andrew Strange, Jonathon Ralston
Real-time Visualisation of Subsurface Radar Data for Advanced Mining Automation

The CSIRO Mining Technology Research Group has developed a mobile subsurface imaging system that can be used to acquire and visualise subsurface radar data in real-time.   This capability is important in the development of geologically intelligent mining systems.   The presentation will provide an overview of the acquisition and communications system, with an emphasis on the real-time radar 3D data visualisation component.  A live demonstration of the system will also be given to highlight the value of the approach.
17:30 – 20:00 Pizza and Beer at UQ Pizza Store

OzViz 2010, Queensland Bioscience Precinct, University of Queensland

Schedule for 3rd December 2010

8:30 – 9:00 Registration. Coffee/Tea.
9:00 – 9:30 Paul McIntosh (VPAC) Visualisation using the GPU Computing Processor

Presented is an overview of the visualisation techniques developed for the upcoming MASSIVE (Multi-modal Australian ScienceS Imaging and Visualisation Environment) GPU clusters. These techniques have been prototyped using the Monash Sun Grid GPU nodes and demonstrate interactive visualisation based on a GPGPU HPC system.
9:30 – 10:00 Ajay Limaye (ANU) Multi-resolution Volume Rendering on GPU

Growing sizes of volumetric data sets pose a great challenge for interactive exploration.  In this talk I will present my attempts at tackling this problem via multi-resolution volume rendering.
10:00 – 10:30 Derek Gerstmann (UWA) Visualisation for Microscopy & Microanalysis

A survey of recent visualisation projects done at the Centre for Microscopy, Characterisation and Analysis will be presented, along with a discussion of the relevant challenges and difficulties encountered along the way. Highlights include samples from EM tomography, optical coherence tomography, and multi-fluorescence confocal microscopy.
10:30 – 10:45 Coffee/Tea Break
10:45 – 11:15 Stuart Ramsden (ANUSF VizLab) Visualizing Periodic Tilings with Circle Packings

After investigating the use of Circle Packings to determine unique embeddings for portraits of Periodic Tilings, we demonstrate how a small, finite description, the Delaney-Dress Tiling Symbol, contains all the information necessary to generate endlessly repeating symmetric tilings.
11:15 – 11:45 Greg Krahenbuhl, George Poropat (CSIRO CESRE) Visualisation of Large Open Pit Mines

It is becoming increasingly common for 3D imaging systems such as Sirovision [1] to generate extremely large mosaics of 3D images. These data typically do not fit in RAM, let alone in VRAM. To allow visualisation and structure mapping on these images, we use a combination of view-dependent Level of Detail (LOD) and data streaming from disk or networks. In our approach, the input 3D image is represented as a tree of constant-size chunks, where the root node corresponds to the full image at its lowest detail level, the child nodes represent parts of the subdivided image at a higher resolution level, and so on. The resolution typically increases by a factor of two at each level of the tree. Depending on the camera parameters, chunks at the appropriate level of detail are loaded and rendered, while chunks that are not on display are culled; a sketch of this selection logic follows the reference below.
[1] http://www.sirovision.com/
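The Python below is an assumed illustration of such a view-dependent LOD walk, not the Sirovision code (view-frustum culling of off-screen chunks is omitted for brevity): a chunk is refined only while its projected texel size is still coarser than one screen pixel.

    import math

    class Chunk:
        def __init__(self, centre, size, level, children=()):
            self.centre, self.size, self.level = centre, size, level
            self.children = children   # finer chunks covering the same region

    def select_chunks(node, cam_pos, pixels_per_unit, max_level):
        # Each tree level doubles the stored resolution, halving the on-screen
        # error; stop refining once that error drops below one pixel.
        dist = max(1e-6, math.dist(node.centre, cam_pos))
        screen_error = pixels_per_unit * node.size / (dist * 2 ** node.level)
        if screen_error <= 1.0 or node.level >= max_level or not node.children:
            return [node]              # coarse enough: stream and render this chunk
        out = []
        for child in node.children:
            out.extend(select_chunks(child, cam_pos, pixels_per_unit, max_level))
        return out

    # Example: a tree with a bare root always returns that root chunk.
    root = Chunk(centre=(0.0, 0.0, 0.0), size=100.0, level=0)
    print(select_chunks(root, (0.0, 0.0, 50.0), pixels_per_unit=800.0, max_level=5))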
11:45 – 12:15 Jeremy Thompson, Mark Dunn,
Chad Hargrave, Con Caris,
Tomasz Bednarz (CSIRO CESRE)
3D Data Fusion and Processing for a Teleoperated Shiploader

Bulk iron ore shiploading is a complex and demanding process that requires great concentration from the local operator in order to ensure safe and efficient loading.  However, due to both safety and productivity incentives, it is desirable that the operator be removed from the immediate vicinity of the shiploader to a non-line-of-sight remote location.  A project to implement such a teleoperation system on a shiploader was recently undertaken by the CSIRO in conjunction with Rio Tinto.  In order to provide the remote operator with equivalent or improved situational awareness from his or her remote location, new sensors were added to instrument the shiploader for teleoperated control.  Two main classes of instrument were provided: video camera technology (including a spherical camera system) to provide the operator with direct visual feedback of the region around the shiploader, and range measurement technologies that facilitated the creation of a 3D model of the ship currently under load.  The output of these systems is presented to the operator via two main interfaces: a Panoramic Display System (PDS) featuring video projection of the spherical camera data onto a panoramic dome, and a custom-built shiploader HMI that features the 3D model and related views.  This presentation will feature some of the processing and display technologies used to create both the PDS and the HMI.
12:15 – 12:45 Lunch Break
12:45 – 13:15 Bernard Meade (University of Melbourne) Vis Projects at the University of Melbourne

This presentation will briefly summarize some of the visualization projects occurring across the University.
13:15 – 13:45 Justin Baker (CSIRO) CSIRO High Resolution Visualisation Platform

CSIRO’s eResearch Program has identified visualisation as one of several key enabling technologies to help deal with the exponentially increasing data volumes and data complexity affecting all disciplines. The initial focus of the new eResearch Visualisation Service is on the provision of high-end visualisation hardware, including a very high resolution display system known as CURVS.
The presentation will give some background on CURVS development, usage and underlying architecture.
13:45 – 14:15 Craig A James, Tomasz P Bednarz, Con Caris,
Kerstin Haustein, Andrew Castleden (CSIRO CESRE), Leila Alem
“You’ve got a friend in me”: An exploration of tele-collaboration and immersive environments

Tele-operation systems aim to remove an operator from a hazardous or inconvenient environment while providing enough sensory information that efficiency is maintained or even enhanced.  Similar systems are also used to provide collaboration and assistance in remote problem-solving situations, including emergency scenarios. The increased availability of high-speed wired and wireless data networks is promoting the use of immersive environments, but there is not yet enough evidence to establish whether such environments significantly improve the field-tested performance of tele-operation systems.  This presentation discusses a recent experiment in tele-collaboration that aims to establish measures for assessing the performance of immersive environments when tele-operating a vehicle.
14:15 – 14:30 Coffee/Tea Break
14:30 – 15:00 Ken Taylor,
Bagus Manuaba, Stephen Bersot, Dingyun Zhu (CSIRO ICT Canberra)
Using computer gaming environments to develop situated virtual reality methods for teleoperation of mining equipment

An important aspect of teleoperation is situational awareness through visualisation. The actual operation and control of a remote machine must be supported by an interface that provides enough information, through visualisation from the remote location, to complete a task. This can be achieved with a Mixed Reality (MR) environment, in which information from the real world is combined with a virtual world. We built a bespoke interface for a rock breaker which integrated 3D models and video streams, and tested it in a production environment, where we found that the operators mainly used the video feedback.

To investigate the opportunity to adapt existing computer gaming technologies as an alternative to bespoke interface development, we have built similar interfaces in Second Life, one of the most popular multi-player gaming environments, as well as in the Simmersion Mycosm and Unity gaming engines. Experiments were conducted to assess the differences between these platforms and to determine the interface features required to maximise operator performance and satisfaction. For the supervisory, task-based model of control (rather than human-in-the-loop control), manipulating a model in a virtual environment and seeing the effect on a video overlay was easier to understand and more effective than specifying a location in a video overlay directly. Gaming environments used for teleoperation should therefore allow logic to be applied during object manipulation to facilitate this method of interaction. Notably, the more productive interface was not preferred by nearly half of the subjects.

Ideally we would replace the view of the physical scene entirely with the 3D model and teleoperate machinery without direct video feedback; we have built one version that operates this way, but applying this solution requires the identification and representation of every element of the scene important to completing the task. One issue operators have had difficulty with is the multitasking required to control the camera viewpoint while concentrating on the primary task. To address this we have integrated natural human interaction techniques, including eye-gaze and head tracking, to provide intuitive, hands-free camera viewpoint control; in our experiments this has been effective in improving performance, and we have integrated it into the Unity gaming environment.
15:00 – 15:30 Con Caris Virtual Mining Centre

This presentation will briefly cover the visualisation projects happening at the Queensland Centre for Advanced Technologies in Pullenvale.
15:30 – 16:00 Closing discussion at afternoon tea/coffee