Research & Publications


Journal Articles


Patents

Conference Proceedings/Presentations

R & D Projects

Point-of-Care Testing and IoT-Based System for Real-Time Cotton Crop Disease Detection

The agriculture sector is an important pillar of Pakistan’s economy and includes various crops cultivated throughout the year. Among them, the cotton crop is considered one of the prominent agricultural resources and is widely cultivated in the Sindh and Punjab provinces. However, the cotton crop has suffered huge losses over the last few decades due to infectious diseases such as the cotton leaf curl virus (CLCV/CLCuV), bacterial blight, and boll rot. This project proposes a real-time, distributed, and coordinated system for the detection of cotton crop diseases, especially CLCV and bacterial blight, in their early stages.

(Funded By: Sindh Higher Education Commission, Pakistan)

Multi-Sided Tabletop Augmented Reality 3D Display Coupled with 3D Imaging

A tabletop augmented reality (AR) 3D display paired with a remote 3D image capture setup that provides three-dimensional AR visualization of remote objects or persons in real-time. The front-side view is presented in stereo-3D format, while the left-side and right-side views are visualized in 2D format. Transparent glass surfaces are used to demonstrate the volumetric 3D augmentation of the captured object. The developed AR display prototype mainly consists of four 40 cm × 30 cm LCD panels, 54% partially reflective glass, an in-house developed housing assembly, and a processing unit. The capture setup consists of four 720p cameras to capture the front-side stereo view and both the left- and right-side views. Real-time remote operation is demonstrated by connecting the display and imaging units through the Internet.

(Partially Funded By: Ignite, National Technology Fund, Pakistan)

Integrated Multi-view 3D Image Capture and Display System

An integrated 3D image capture and display system using a transversely moving camera, a regular 2D display screen, and user tracking that facilitates multi-view capture of a real scene or object and displays the captured perspective views in 3D. The motion parallax 3D technique is used to capture the depth information of the object and display the corresponding views to the user via head tracking. The system is composed of two parts: the first consists of a horizontally moving camera interfaced with a customized camera control and capture application; the second consists of a regular LCD screen combined with a web camera and a user tracking application. The multi-view images captured through the imaging setup are relayed to the display, and the corresponding view is dynamically shown on the screen based on the viewing angle of the user with respect to the screen.
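The view-selection step above can be sketched as follows. This is a minimal illustration, not the project's actual code; the function name, parameter names, and values are assumptions chosen for the example.

```python
import math

def select_view(head_x_cm, view_distance_cm, num_views, capture_span_deg):
    """Map a tracked head position to the nearest captured perspective view.

    head_x_cm        -- horizontal head offset from the screen centre (from tracking)
    view_distance_cm -- viewer-to-screen distance
    num_views        -- number of views captured by the moving camera
    capture_span_deg -- total angular span covered during capture
    """
    # Viewing angle of the user with respect to the screen normal
    angle_deg = math.degrees(math.atan2(head_x_cm, view_distance_cm))
    # Clamp to the captured range, then map linearly to a view index
    half_span = capture_span_deg / 2
    angle_deg = max(-half_span, min(half_span, angle_deg))
    t = (angle_deg + half_span) / capture_span_deg   # 0.0 .. 1.0
    return round(t * (num_views - 1))
```

As the tracked head moves across the screen, the returned index sweeps through the captured views, producing the motion-parallax effect.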

Projection Type Interactive Windshield Head-Up Display

A high-luminance, transparent, interactive HUD display enabled by a micro-structured retroreflective surface. The high brightness is achieved by exploiting the narrow-angle scattering property of retroreflective surfaces, while the transparency is achieved by using a transparent glass screen at 45 degrees. Real-time interaction with the displayed content is provided by a Leap Motion sensor, with the interactive user interface developed in Unity3D. The experimental setup is demonstrated using commercial retroreflective fabric and a DLP multimedia projector. The developed system was tested in a hatchback car in daylight outdoor settings. The results show high perceived brightness of the display within a viewing angle of ±10 degrees, and the transmission efficiency of the developed prototype was >60%.

Visual Acuity Response in 3D Head-Up Displays

Psychophysical study of the visual acuity response when an indigenously developed 75% transparent retroreflective screen is used as a windshield 3D HUD. A simulated optical collimation technique was used to present the virtual content at a farther depth (i.e., on the road while driving). Two user test experiments were performed. The results showed a slightly declining trend (from 20/20 to 20/25) in visual acuity response when the HUD screen was placed between the viewer and the scene. An inverse relation between the amount of accommodation-convergence (AC) conflict and visual acuity was observed under the simulated collimation condition. A user-to-screen distance of >100 cm was found to be comfortable, providing the highest acuity response.

Integrated 3D Display and Imaging System

An integrated 3D display and imaging system using a head-mounted device and a special dual-purpose passive screen that simultaneously facilitates 3D display and imaging. The screen is composed of two optical layers: the first is a projection surface of finely patterned retroreflective microspheres that provide high optical gain when illuminated by head-mounted projectors; the second is an imaging surface made up of an array of curved mirrors, which form the perspective views of the scene captured by a head-mounted camera. The display and imaging operations are separated by polarization multiplexing.

Light Field Capture Using Handheld Lens Array Surface

A light field 3D imaging technique based on a handheld optical surface that contains a two-dimensional array of thin lenses with embedded tracking markers, a regular camera, and a backend processing module. Each lens in the surface images the scene from a unique perspective, so the surface as a whole behaves as a multi-perspective camera array. All of the perspective views are recorded by the camera in a single shot and processed computationally. Geometric optics is used to trace and analyze the light field rays. Light field image capture and reconstruction are experimentally demonstrated by synthesizing digitally refocused images and reconstructing partially occluded objects in real-world scenes. Adaptive parameterization is performed by determining the spatial coordinates of the surface through the tracking markers, while real-time reconstruction is performed by parallel processing of ray projections.
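The digital refocusing mentioned above is commonly done by shift-and-add over the per-lens sub-images. The sketch below is a simplified 1D illustration under assumed parameters (a single row of lenses, integer-pixel shifts), not the project's actual reconstruction pipeline.

```python
import numpy as np

def refocus(sub_images, lens_pitch_px, alpha):
    """Shift-and-add refocusing over a row of lens-array sub-images.

    sub_images    -- list of 2D arrays, one per lens, ordered along the array
    lens_pitch_px -- baseline between adjacent lenses, in sensor pixels
    alpha         -- refocus parameter selecting the depth plane to align
    """
    h, w = sub_images[0].shape
    acc = np.zeros((h, w), dtype=np.float64)
    centre = (len(sub_images) - 1) / 2
    for i, img in enumerate(sub_images):
        # Disparity between views grows with baseline; shifting each view by
        # alpha * baseline re-aligns features lying on the chosen depth plane.
        shift = int(round(alpha * lens_pitch_px * (i - centre)))
        acc += np.roll(img, shift, axis=1)
    return acc / len(sub_images)
```

Features on the selected depth plane add coherently and appear sharp, while features at other depths are averaged across misaligned positions and blur out, which is also how partially occluded objects can be recovered.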

Light Efficient Augmented Reality 3D Displays

A light-efficient 3D display using a highly transparent desktop-size augmented reality screen. The display consists of a specially designed transparent retroreflective screen and a pair of low-power pico-projectors positioned close to the viewer’s eyes to provide stereo views. The screen is an optically clear sheet partially patterned with retroreflective microspheres for high optical gain. The retroreflective material buried in the screen reflects incident light back towards the projectors within a narrow scattering angle, allowing the viewer to perceive very bright content. The tabletop prototype mainly consists of an in-house fabricated large AR screen (60 cm × 40 cm) and a pair of laser-scanning 30-lumen pico-projectors.

Transparent Retroreflective Screens

A transparent retroreflective screen is proposed, which can be used as a head-up display (HUD) or a see-through screen for head-mounted projection displays (HMPD). The high optical gain of the screen enables the use of low-power projectors to produce very bright content. The screen assembly is based on retroreflective microspheres patterned on an optically clear substrate using a steel stencil as a shadow mask. The incident light is retroreflected in a narrow angular cone to create an eyebox for the viewer. The optical gain and transparency of the screen are varied by changing the fill factor of the mask. The optical design and fabrication of the screen are presented, and its retroreflective and transmission characteristics are evaluated.

Decoupling Content in Spatial Augmented Reality Systems

A novel time-sharing technique that facilitates real and digital content decoupling in real-time without crosstalk. The proposed technique is based on time-sequential operation between a MEMS-scanner-based mobile projector and a rolling-shutter image sensor. The MEMS-mirror-based projector scans a light beam in a raster pattern pixel-by-pixel and completes a full frame projection over one refresh period, while the rolling-shutter image sensor sequentially collects scene light row-by-row. In the proposed technique, the image sensor is synchronized with the scanning MEMS mirror and precisely follows the display scanner with a half-period lag, making the displayed content completely invisible to the camera. An experimental setup consisting of a laser pico-projector, an image sensor, and a delay and amplifier circuit is developed.
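The half-period-lag scheduling can be illustrated numerically. This is a minimal sketch under assumed parameters (the function name and values are illustrative; in the actual setup the lag is realized by the analog delay circuit, not software):

```python
def row_exposure_schedule(frame_period_ms, num_rows, exposure_ms):
    """Start times for rolling-shutter rows trailing the MEMS display scan
    by half a refresh period, so projected light never lands in an open row.

    frame_period_ms -- projector refresh period (e.g. 16.7 ms at 60 Hz)
    num_rows        -- sensor rows read out sequentially
    exposure_ms     -- per-row exposure; must fit inside the half-period gap
    """
    lag = frame_period_ms / 2            # half-period lag behind the scanner
    line_time = frame_period_ms / num_rows
    if exposure_ms > lag:
        raise ValueError("exposure longer than the half-period gap")
    # The scanner draws row r at r * line_time; the sensor opens the same
    # row lag ms later, when the scanning beam is half a frame away.
    return [lag + r * line_time for r in range(num_rows)]
```

Because each sensor row opens exactly half a period after the beam passed it, and closes before the beam returns, the projected content is never integrated by the camera.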

Mobile 3D Telepresence Using Dual Purpose Screen

A new integrated platform providing a mobile 3D telepresence experience using a head-worn device and a dual-purpose passive screen. At the core of this telepresence architecture is a portable multi-layered passive screen that facilitates stereoscopic 3D display using a pair of head-worn projectors and, at the same time, captures multi-perspective views of the user on a head-worn camera through reflections off the screen. The screen contains retroreflective material for stereo image display and an array of convex mirrors for 3D capture.