Relightable Buildings from Images
This work proposes a complete image-based process that recovers both gross-scale geometry and local surface structure to create highly detailed 3D models of building façades from photographs. We approximate both albedo and sufficient local geometric structure to compute complex self-shadowing effects, and fuse these with a gross-scale 3D model. Our approach yields a perceptually high-quality model, imparting the illusion of measured reflectance.

The requirements of our approach are that image capture must be performed under diffuse lighting and surfaces in the images must be predominantly Lambertian. Exemplars of materials are obtained through surface depth hallucination, and our novel method matches these with multi-view image sequences that are also used to automatically recover 3D geometry. 

Modelling Appearance and Geometry from Images (Ph.D. Thesis)
Acquisition of realistic and relightable 3D models of large outdoor objects, such as buildings, requires the modelling of detailed geometry and visual appearance. Recovering these material characteristics can be very time consuming and requires specially dedicated equipment. Alternatively, surface detail can be conveyed by textures recovered from images, whose appearance is only valid under the originally photographed viewing and lighting conditions. Methods to capture detailed local surface structure, such as cracks in stone walls, together with visual appearance require controlled lighting conditions, which restricts them to small portions of the surface captured at close range. This thesis investigates the acquisition of high-quality models from images, using simple photographic equipment and modest user intervention.

The main focus of this investigation is on approximating detailed local depth information and visual appearance, and combining this with gross-scale 3D geometry obtained using a different image-based approach. This is achieved by capturing these surface characteristics in small accessible regions and transferring them to the complete façade. This approach yields high-quality models, imparting the illusion of measured reflectance. 

In this thesis, we first present two novel algorithms for surface detail and visual appearance transfer, where these material properties are captured for small exemplars using an image-based technique. Second, we develop an interactive solution to the problems of performing the transfer over a large change of scale and across the different materials contained in a complete façade. Aiming to fully automate this process, a novel algorithm that differentiates the materials in the façade and associates them with the correct exemplars is introduced, with promising results. Third, we present a new method for texture reconstruction from multiple images that optimises texture quality by choosing the best view for every point and minimising seams. Material properties are transferred from the exemplars to the texture map, approximating reflectance and meso-structure. The combination of these techniques results in a complete working system capable of producing realistic relightable models of full building façades, containing high-resolution geometry and plausible visual appearance.
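The per-point view selection mentioned above can be illustrated with a small sketch. The scoring below is a common stand-in heuristic, not the thesis's actual criterion, and its seam minimisation is not reproduced: frontal, close-up views score highest via cos(viewing angle) divided by squared distance.

```python
import numpy as np

def best_view_per_point(points, normals, cam_positions):
    """For each surface point, pick the camera index with the highest
    quality score. Heuristic score: cos(viewing angle) / squared
    distance, so frontal and nearby views win; back-facing views
    score zero. This is an illustrative metric only."""
    best = np.empty(len(points), dtype=int)
    for p_idx, (p, n) in enumerate(zip(points, normals)):
        scores = []
        for c in cam_positions:
            v = c - p                                        # point-to-camera vector
            dist2 = v.dot(v)
            cos_theta = max(v.dot(n) / np.sqrt(dist2), 0.0)  # clamp back-facing views to 0
            scores.append(cos_theta / dist2)
        best[p_idx] = int(np.argmax(scores))
    return best
```

In practice neighbouring points often select different cameras, which is where a seam-minimisation step such as the one described in the thesis would smooth the assignment.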

Daedalus Project
I worked on a UK EPSRC-funded project to research the application of 3D reconstruction and perceptually realistic lighting methods to architectural design and planning. We used these to create interactive augmented reality environments (from photographs) which can be realistically re-illuminated and used to visualise the impact of new developments under various lighting conditions. This project was led by Prof. Roger Hubbold (AIG), working closely with Dr. Mashhuda Glencross (AIG), Greg Ward (Anyhere Software), Rob Rhodes at Napper Architects, and the Warwick Digital Laboratory.

During this project, we designed a novel method called Surface Depth Hallucination, which offers a simple, fast way to acquire albedo and depth for textured surfaces that exhibit mostly Lambertian reflectance. We obtain depth estimates entirely in image space, and from a single view, so there are no complications arising from registering the recovered depth with the texture. Further information, examples and videos can be found here.
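The core intuition behind recovering depth in image space can be sketched in a few lines: under diffuse lighting, pixels darker than their local surroundings tend to be recessed. The toy version below is not the published algorithm, whose albedo/shading separation is considerably more careful, and `radius` and `scale` are made-up parameters; it only illustrates the idea of hallucinating a depth map from the ratio of luminance to its local mean.

```python
import numpy as np

def local_mean(img, radius):
    """Mean over a (2r+1)^2 window with edge padding: a crude
    low-pass filter standing in for a proper shading estimate."""
    p = np.pad(img, radius, mode='edge')
    w = 2 * radius + 1
    out = np.zeros_like(img, dtype=float)
    for dy in range(w):
        for dx in range(w):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (w * w)

def hallucinate_depth(luminance, radius=8, scale=0.01):
    """Toy shading-to-depth sketch: darker-than-local-average pixels
    are treated as recessed, so pits receive negative depth."""
    shading = luminance / (local_mean(luminance, radius) + 1e-6)
    return scale * (shading - 1.0)
```

Because everything happens in the image plane, the resulting depth map is trivially registered with the texture, which is the practical advantage noted above.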

Building on the investigations carried out during this project, we developed a complete system for building reconstruction from images. Further details can be found here.

Generación de Imágenes Foto-realistas Mediante Iluminación Basada en la Imagen (Master's Thesis)
Generation of Photo-realistic Images using Image Based Lighting (IBL)

During this project I integrated and compared different sampling methods for efficient Image Based Lighting. The project included manipulation of High Dynamic Range environment maps, implementation of a point-light source approximation for IBL, and integration with the physically based renderer PBRT. I reviewed lighting techniques, focusing on sampling methods for environment maps, and relighting techniques, which normally require specific reflectance capture and representation.
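One widely used way to approximate an environment map with point lights is a median-cut style construction; whether this was among the methods compared is not stated here, so treat the sketch below as a representative example rather than the thesis's implementation. The idea: recursively split the luminance map along its longer axis so each half carries roughly equal energy, then place one light at each region's luminance-weighted centroid.

```python
import numpy as np

def median_cut_lights(env, n_splits=4):
    """Approximate a (luminance) environment map with 2**n_splits
    point lights: split regions so each half carries roughly equal
    energy, then emit (row, col, power) per final region."""
    regions = [(0, env.shape[0], 0, env.shape[1])]
    for _ in range(n_splits):
        new_regions = []
        for (r0, r1, c0, c1) in regions:
            block = env[r0:r1, c0:c1]
            if (r1 - r0) >= (c1 - c0):        # split along the longer axis
                cum = np.cumsum(block.sum(axis=1))
                cut = r0 + 1 + int(np.searchsorted(cum, cum[-1] / 2))
                cut = min(max(cut, r0 + 1), r1 - 1)
                new_regions += [(r0, cut, c0, c1), (cut, r1, c0, c1)]
            else:
                cum = np.cumsum(block.sum(axis=0))
                cut = c0 + 1 + int(np.searchsorted(cum, cum[-1] / 2))
                cut = min(max(cut, c0 + 1), c1 - 1)
                new_regions += [(r0, r1, c0, cut), (r0, r1, cut, c1)]
        regions = new_regions
    lights = []
    for (r0, r1, c0, c1) in regions:
        block = env[r0:r1, c0:c1]
        total = block.sum()
        rows, cols = np.mgrid[r0:r1, c0:c1]
        cy = (rows * block).sum() / total     # luminance-weighted centroid
        cx = (cols * block).sum() / total
        lights.append((cy, cx, total))
    return lights
```

Because the regions partition the map exactly, the summed light powers conserve the total energy of the environment, which keeps the approximation unbiased in overall brightness.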

You can find the thesis in the download area. 

Real-Time Surgical Simulator
This is an ongoing project started by JC Nebel at the University of Glasgow and currently continued at Kingston University London. Its aim is to investigate techniques for generating, modelling, and deforming 3D data for applications related to virtual surgery.

As part of this project, I developed the first stage of a surgical simulator, which included algorithms for animating deformable objects in real time. It focuses on computing the deformation of an object subject to external forces and detecting collisions between deformable and rigid objects. I integrated these algorithms into a 3D real-time user interface, allowing the use of several tools. This was my final bachelor's degree project, and you can find the report describing the techniques implemented in the download area.
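The report's own algorithms are not reproduced here; as a generic illustration of the two ingredients named above (deformation under external forces and collision handling), a minimal explicit-Euler mass-spring step with a crude rigid-floor response might look like this. All parameters are illustrative.

```python
import numpy as np

def step_mass_spring(pos, vel, springs, rest, k, mass, dt, gravity=-9.8):
    """One explicit-Euler step of a toy 2D mass-spring system.
    springs: list of (i, j) index pairs; rest: matching rest lengths.
    A rigid floor at y = 0 provides a crude collision response."""
    forces = np.zeros_like(pos)
    forces[:, 1] += mass * gravity          # external force: gravity
    for (i, j), L in zip(springs, rest):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - L) * d / length   # Hooke's law along the spring
        forces[i] += f
        forces[j] -= f
    vel = vel + dt * forces / mass
    pos = pos + dt * vel
    below = pos[:, 1] < 0.0                 # collision with the rigid floor
    pos[below, 1] = 0.0
    vel[below, 1] *= -0.5                   # lossy bounce
    return pos, vel
```

Real-time surgical simulators typically replace explicit Euler with more stable integrators and use meshes rather than a handful of particles, but the force-accumulate / integrate / resolve-collisions loop is the same.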