“The research leading to these results has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 218814 "PRoVisG".”
You have read about the PRoVisG project, its field trials, the imaging technology being developed, and driving the rover. The main question remains: what do we actually see? What can these complicated-sounding visual technologies achieve?
You can see the 3D animations created from multiple still images on the Cool Field Trials Videos Page.
Here is an example of a colour product made with PRoVisG technology: a 3D data simulation of Clarach Bay.
The 3D data for this simulation were obtained during a field test held by the PRoVisG Consortium at Clarach Bay, Aberystwyth, UK, in July 2010. The images were taken by a stereo camera arrangement, provided by Aberystwyth University, mounted on board the Bridget rover and simulating the ExoMars PanCam (to be flown to Mars in 2018). 3D vision processing was performed by the PRoVisG processing suite PRoViP (Planetary Robotics Vision Processing), integrated by Joanneum Research. Rendering used the Aardvark Rendering Library by VRVis, customised in preparation for the ExoMars 3D vision application case.
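To give a flavour of how a stereo camera pair yields 3D data at all, here is a minimal sketch of the underlying triangulation principle. This is purely illustrative and not the actual PRoViP pipeline; the focal length and baseline values below are hypothetical, chosen only to make the arithmetic readable.

```python
# Stereo depth from disparity (illustrative sketch, not the PRoViP pipeline).
# For a rectified stereo pair, the depth Z of a scene point follows from
#   Z = f * B / d
# where f is the focal length in pixels, B is the baseline (the distance
# between the two cameras) in metres, and d is the disparity: the horizontal
# pixel shift of the matched feature between the left and right images.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the depth in metres of a point observed with the given disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Hypothetical example: a 0.5 m baseline and a 1000 px focal length;
# a feature shifted by 20 px between the two images lies 25 m away.
print(depth_from_disparity(1000.0, 0.5, 20.0))  # → 25.0
```

Nearby rocks produce large disparities and distant terrain produces small ones, which is why stereo depth estimates degrade with range; a real pipeline such as PRoViP matches many thousands of such features to build a full 3D surface.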
Credits: PRoVisG Consortium.