Research & Publications
Journal Articles
S.R. Soomro, S. Sager, A.M. Paniagua-Diaz, P.M. Prieto, P. Artal, "Head-mounted adaptive optics visual simulator", Biomedical Optics Express 15, 608-623 (2024).
M. Saad, S. Iqbal, S.R. Soomro, "Design and Development of a Multi-Sided Tabletop Augmented Reality 3D Display Coupled with Remote 3D Imaging Module", Journal of Engineering and Technological Sciences, 54(6), 2022.
M. Lal, S.K. Baloch, S.R. Soomro, "Integrated Multi-view 3D Image Capture and Motion Parallax 3D Display System", EMITTER International Journal of Engineering Technology, 10(1), 2022.
S.R. Soomro, H. Urey, "Visual acuity response when using 3D head-up display in the presence of accommodation-convergence conflict", Journal of Information Display, 21(2), 2020. (Recipient of the Annual JID Award by the Korean Information Display Society, 2021)
S.R. Soomro, H. Urey, "Integrated 3D Display and Imaging Using Dual-purpose Passive Screen and Head-mounted Projectors and Camera", Optics Express 26(2), 1161-1173 (2018).
S.R. Soomro, H. Urey, "Light-efficient Augmented Reality 3D Display using Highly Transparent Retro-reflective Screen", Applied Optics 56(22), 6108-6113 (2017).
S.R. Soomro, E. Ulusoy, H. Urey, "Decoupling of real and digital content in projection-based augmented reality systems using time-multiplexed image capture", Journal of Imaging Science and Technology, 61(1), 10406-1-6 (2017).
S.R. Soomro, H. Urey, "Design, Fabrication and Characterization of Transparent Retro-reflective Screen", Optics Express 24(21), 24232-24241 (2016).
S. Memon, S.K. Baloch, S.R. Soomro, I.A. Tunio, O. Rehman, S. Ahmed, "Design and Finite Element Modeling of Electro-Thermal Actuator for Biological Applications", Journal of Computing & Biomedical Informatics, 5(1), 2023.
S. Anwar, S.R. Soomro, S.K. Baloch, A.A. Patoli, A.R. Kolachi, "Performance Analysis of Deep Transfer Learning Models for Automated Detection of Cotton Plant Diseases", Journal of Engineering Technology & Applied Research, 13(5), 2023.
A.R. Kolachi, S.R. Soomro, S.K. Baloch, A.A. Patoli, S. Anwar, "Cotton Leaf Disease Classification Using YOLO Deep Learning Framework and Indigenous Dataset", International Journal of Systematic Innovation, 7(7), 2023.
Patents
H. Urey, S.R. Soomro, "Physical Object Reconstruction Through a Projection Display System", US Patent US10739670 (Issued: August 2020).
H. Urey, S.R. Soomro, M. Eralp, "A Dual-Function Display and Multi-View Imaging System", US Patent US10531070 (Issued: January 2020).
Conference Proceedings/Presentations
S.R. Soomro, P. Artal, "Adaptive Optics on Head: Towards Wearable AO Instruments for Visual Evaluation", International Workshop on Adaptive Optics for Industry and Medicine, Padova, Italy (2024).
A.M. Paniagua-Diaz, J. Mompeán, S.R. Soomro, P. Artal, "Wearable and see-through visual simulator based on liquid crystals", Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR), SPIE Photonics West (2024).
P. Artal, S.R. Soomro, "A Head-Mounted Adaptive Optics Visual Simulator", Investigative Ophthalmology & Visual Science 64(8), 2882 (2023).
S.R. Soomro, P. Artal, "A handheld adaptive optics device for personalized visual evaluation", Ophthalmic Technologies XXXIII, SPIE Photonics West (2023).
A.M. Paniagua-Diaz, J. Mompeán, S.R. Soomro, P. Artal, "Economic and compact modulation unit for visual simulation: the advantages of vertical aligned liquid crystal devices", Ophthalmic Technologies XXXIII, SPIE Photonics West (2023).
S. Anwar, A. R. Kolachi, S. K. Baloch and S. R. Soomro, "Bacterial Blight and Cotton Leaf Curl Virus Detection Using Inception V4 Based CNN Model for Cotton Crops," 2022 IEEE 5th International Conference on Image Processing Applications and Systems (IPAS), Genova, Italy, 2022, pp. 1-6, doi: 10.1109/IPAS55744.2022.10052835.
S.R. Soomro, P. Artal, "A Portable Adaptive Optics Instrument for Visual Simulation", 10th Visual and Physiological Optics Meeting, Cambridge, UK (2022).
Z.A. Shaikh, S.A. Hussain, S.A. Bhangwar, S.R. Soomro, "Projection Type Interactive Head-up Display Based on Retroreflective Surface and Leap Motion", 6th International Multitopic ICT Conference, Pakistan (2021).
M.K. Hedili, E. Ulusoy, S. Kazempour, S.R. Soomro, H. Urey, "Next Generation Augmented Reality Displays", IEEE Sensors (2018).
S.R. Soomro, O. Eldes, H. Urey, "Towards Mobile 3D Telepresence Using Head-Worn Devices and Dual-Purpose Screen", IEEE Virtual Reality and 3D User Interfaces, Munich, Germany (2018).
S.R. Soomro, O. Eldes, K. Aksit, H. Urey, "Mobile 3D Imaging Using Lens Array Sheet and Single Camera", IS&T Electronic Imaging, Imaging Sensors and Systems, California, US (2018).
S.R. Soomro, H. Urey, "Augmented Reality 3D Display Using Head-Worn Projectors and Passive Screens", Fotonik 2017 | 19. Ulusal Optik, Elektro-Optik ve Fotonik Çalıştayı (19th National Optics, Electro-Optics and Photonics Workshop), Istanbul (2017).
S.R. Soomro, H. Urey, "Desktop Size See-through Screens for 3D Augmented Reality", International OSA Network of Students, Paris (2017).
H. Urey, S.R. Soomro, E. Ulusoy, “Wearable Augmented Reality Displays”. In OSA Imaging and Applied Optics Congress, DM4F.2 (2017).
S.R. Soomro, H. Urey, "Augmented reality 3D display using head-mounted projectors and transparent retro-reflective screen", Proc. SPIE 10126E-1, Advances in Display Technologies VII (2017).
S.R. Soomro, E. Ulusoy, M. Eralp, H. Urey, "Dual purpose passive screen for simultaneous display and imaging", Proc. SPIE 10126N-1, Advances in Display Technologies VII (2017).
S.R. Soomro, H. Urey, "Retro-reflective Characteristics of Transparent Screen for Head Mounted Projection Displays", Annual OSA Meeting: Frontiers in Optics, FTu5A.2 (2016).
Ç. Genç, S.R. Soomro, Y. Duyan, S. Ölçer, F. Balcı, H. Ürey, O. Özcan, "Head Mounted Projection Display & Visual Attention: Visual Attentional Processing of Head Referenced Static and Dynamic Displays while in Motion and Standing", Proceedings of the CHI Conference on Human Factors in Computing Systems, pp. 1538-1547, ACM (2016).
S.R. Soomro, M.A. Javed, F.A. Memon, "Vehicle Number Recognition System for Automatic Toll Tax Collection", IEEE International Conference on Robotics and Artificial Intelligence, pp. 125-129 (2012).
R & D Projects
Point-of-Care Testing and IoT-Based System for Real-Time Cotton Crop Disease Detection
The agriculture sector is an important pillar of Pakistan's economy and includes various crops cultivated throughout the year. Among them, cotton is one of the most prominent agricultural resources and is widely cultivated in the Sindh and Punjab provinces. Over the last few decades, however, the cotton crop has suffered huge losses from infectious diseases such as cotton leaf curl virus (CLCV/CLCuV), bacterial blight, and boll rot. This project proposes a real-time, distributed, and coordinated system for detecting cotton crop diseases, especially CLCV and bacterial blight, in their early stages.
(Funded By: Sindh Higher Education Commission, Pakistan)
Multi-Sided Tabletop Augmented Reality 3D Display Coupled with 3D Imaging
A tabletop augmented reality (AR) 3D display paired with a remote 3D image capture setup that provides three-dimensional AR visualization of remote objects or persons in real time. The front-side view is presented in stereo-3D format, while the left-side and right-side views are visualized in 2D format. Transparent glass surfaces are used to demonstrate the volumetric 3D augmentation of the captured object. The developed AR display prototype mainly consists of four 40 × 30 cm² LCD panels, 54% partially reflective glass, an in-house developed housing assembly, and a processing unit. The capture setup consists of four 720p cameras that capture the front-side stereo view together with the left- and right-side views. Real-time remote operation is demonstrated by connecting the display and imaging units over the Internet.
(Partially Funded By: Ignite, National Technology Fund, Pakistan)
Integrated Multi-view 3D Image Capture and Display System
An integrated 3D image capture and display system using a transversely moving camera, a regular 2D display screen, and user tracking, which facilitates multi-view capture of a real scene or object and displays the captured perspective views in 3D. The motion parallax 3D technique is used to capture the depth information of the object and present the corresponding views to the user via head tracking. The system is composed of two parts: the first is a horizontally moving camera interfaced with a customized camera control and capture application; the second is a regular LCD screen combined with a web camera and a user tracking application. The multi-view images captured through the imaging setup are relayed to the display, and the corresponding view is dynamically shown based on the viewing angle of the user with respect to the screen.
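The head-tracked view selection described above can be sketched in a few lines. The function below is an illustrative assumption (uniformly spaced pre-captured views across a fixed horizontal field of view), not the project's actual calibration:

```python
import math

def select_view(head_x_cm: float, head_z_cm: float,
                num_views: int, fov_deg: float = 40.0) -> int:
    """Map a tracked head position (relative to the screen centre) to one of
    the pre-captured perspective views. Assumes views are spaced uniformly
    across the horizontal capture range -- an illustrative simplification."""
    # Viewing angle of the user with respect to the screen normal
    angle = math.degrees(math.atan2(head_x_cm, head_z_cm))
    half = fov_deg / 2
    angle = max(-half, min(half, angle))      # clamp to the capture range
    frac = (angle + half) / fov_deg           # 0.0 .. 1.0 across the views
    return min(int(frac * num_views), num_views - 1)
```

A viewer centred in front of the screen gets the middle view; moving left or right steps through the neighbouring perspectives, producing the motion parallax effect.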
Projection Type Interactive Windshield Head-Up Display
A high-luminance, transparent interactive HUD enabled by a micro-structured retroreflective surface. The high brightness is achieved by exploiting the narrow-angle scattering property of retroreflective surfaces, while the transparency is achieved by using a transparent glass screen at 45 degrees. Real-time interaction with the displayed content is provided by a Leap Motion sensor, with the interactive user interface developed in Unity3D. The experimental setup is demonstrated using commercial retroreflective fabric and a DLP multimedia projector, and the developed system is tested in a hatchback car in daylight outdoor settings. The results show high perceived brightness within a viewing angle of ±10 degrees, and the transmission efficiency of the developed prototype was >60%.
Visual Acuity Response in 3D Head-Up Displays
A psychophysical study of the visual acuity response when an indigenously developed 75% transparent retroreflective screen is used as a windshield 3D HUD. A simulated optical collimation technique was used to present the virtual content at a farther depth (i.e., on the road while driving). Two user test experiments were performed. The results showed a slightly declining trend (from 20/20 to 20/25) in visual acuity when the HUD screen was placed between the viewer and the scene. Under the simulated collimation condition, an inverse relation was observed between the amount of accommodation-convergence (AC) conflict and visual acuity. A user-to-screen distance of >100 cm was found to be comfortable, providing the highest acuity response.
Integrated 3D Display and Imaging System
An integrated 3D display and imaging system using a head-mounted device and a special dual-purpose passive screen that simultaneously facilitates 3D display and imaging. The screen is composed of two optical layers: the first is a projection surface of finely patterned retro-reflective microspheres, which provides high optical gain when illuminated by the head-mounted projectors; the second is an imaging surface made up of an array of curved mirrors, which forms the perspective views of the scene captured by a head-mounted camera. The display and imaging operations are separated by polarization multiplexing.
Light Field Capture Using Handheld Lens Array Surface
A light field 3D imaging technique based on a handheld optical surface containing a two-dimensional array of thin lenses with embedded tracking markers, a regular camera, and a backend processing module. Each lens in the surface forms an image of the scene from a unique perspective, so the surface as a whole behaves as a multi-perspective camera array. All of the perspective views are recorded by the camera in a single shot and processed computationally, with geometric optics used to trace and analyze the light field rays. Light field capture and reconstruction are experimentally demonstrated by synthesizing digitally refocused images and reconstructing partially occluded objects in real-world scenes. Adaptive parameterization is performed by determining the spatial coordinates of the surface through the tracking markers, while real-time reconstruction is achieved by parallel processing of the ray projections.
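One standard way to synthesize a digitally refocused image from such perspective views is shift-and-average refocusing: each sub-image is shifted in proportion to its lens offset so that points on a chosen depth plane align and appear sharp, while off-plane points blur out. The sketch below is a simplified stand-in for the project's ray-projection pipeline, assuming pre-extracted grayscale sub-images and integer-pixel shifts:

```python
import numpy as np

def refocus(views: np.ndarray, positions: np.ndarray, alpha: float) -> np.ndarray:
    """Shift-and-average refocusing over perspective views.
    views:     (N, H, W) grayscale sub-images, one per lens
    positions: (N, 2) lens centres (dx, dy) in pixels from the array centre
    alpha:     refocus parameter selecting the in-focus depth plane
    """
    acc = np.zeros_like(views[0], dtype=np.float64)
    for img, (dx, dy) in zip(views, positions):
        # Shift each view proportionally to its lens offset, then average;
        # scene points on the chosen depth plane realign across all views.
        sx, sy = int(round(alpha * dx)), int(round(alpha * dy))
        acc += np.roll(img, shift=(sy, sx), axis=(0, 1))
    return acc / len(views)
```

Sweeping `alpha` over a range produces a focal stack from the single captured shot, which is the essence of the computational refocusing demonstrated in the project.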
Light Efficient Augmented Reality 3D Displays
A light-efficient 3D display using a highly transparent desktop-size augmented reality screen. The display consists of a specially designed transparent retro-reflective screen and a pair of low-power pico-projectors positioned close to the viewer's eyes to provide stereo views. The screen is an optically clear sheet partially patterned with retro-reflective microspheres for high optical gain. The retro-reflective material buried in the screen reflects incident light back towards the projectors within a narrow scattering angle, allowing the viewer to perceive very bright content. The tabletop prototype mainly consists of an in-house fabricated large AR screen (60 × 40 cm²) and a pair of 30-lumen laser-scanning pico-projectors.
Transparent Retroreflective Screens
A transparent retro-reflective screen that can be used as a head-up display (HUD) or as a see-through screen for head-mounted projection displays (HMPD). The high optical gain of the screen enables the use of low-power projectors to produce very bright content. The screen assembly is based on retro-reflective microspheres patterned on an optically clear substrate using a steel stencil as a shadow mask. The incident light is retro-reflected as a narrow angular cone to create an eyebox for the viewer. The optical gain and transparency of the screen are varied by changing the fill factor of the mask. The optical design and fabrication of the screen are presented, and its retro-reflective and transmission characteristics are evaluated.
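The fill-factor trade-off can be illustrated with a first-order model: area covered by retroreflective microspheres contributes optical gain, and uncovered area contributes see-through transmission. The numbers below (`material_gain`, `substrate_transmission`) are placeholder assumptions for illustration, not the measured characterization from the paper:

```python
def screen_tradeoff(fill_factor: float, material_gain: float = 500.0,
                    substrate_transmission: float = 0.92):
    """First-order gain/transparency model of a partially patterned screen.
    fill_factor: fraction of the screen area covered by retroreflective
    microspheres (set by the shadow-mask fill factor). The default
    constants are illustrative placeholders, not measured values."""
    gain = fill_factor * material_gain                       # brighter with more coverage
    transparency = (1.0 - fill_factor) * substrate_transmission  # clearer with less
    return gain, transparency
```

The linear model makes the design tension explicit: raising the mask fill factor buys display brightness at a directly proportional cost in see-through transparency.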
Decoupling Content in Spatial Augmented Reality Systems
A novel time-sharing technique that decouples real and digital content in real time without crosstalk. The proposed technique is based on time-sequential operation between a MEMS-scanner-based mobile projector and a rolling-shutter image sensor. The MEMS-mirror-based projector scans a light beam in a raster pattern pixel by pixel, completing a full frame over one refresh period, while the rolling-shutter image sensor sequentially collects scene light row by row. The image sensor is synchronized with the scanning MEMS mirror and follows the display scanner with a precise half-period lag, making the displayed content completely invisible to the camera. An experimental setup consisting of a laser pico-projector, an image sensor, and a delay-and-amplifier circuit is developed.
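The half-period-lag timing can be sketched with a simplified model (an illustration of the synchronization idea, not the actual circuit timing): the projector reaches a given row at a time proportional to its position in the frame, and the sensor exposes that row half a refresh period later, so a row is never exposed while it is being drawn.

```python
def exposure_start(row: int, num_rows: int, frame_period_ms: float,
                   lag_fraction: float = 0.5) -> float:
    """Time (ms, modulo one refresh period) at which a sensor row begins
    exposing, delayed behind the projector's raster scan by a fixed
    fraction of the period (0.5 = the half-period lag). Simplified
    timing model of the synchronization scheme."""
    scan_time = (row / num_rows) * frame_period_ms      # projector draws this row
    return (scan_time + lag_fraction * frame_period_ms) % frame_period_ms
```

For a 60 Hz system (~16 ms period), row 0 starts exposing ~8 ms after it is drawn, which is the maximum possible separation between drawing and exposure for every row.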
Mobile 3D Telepresence Using Dual Purpose Screen
A new integrated platform providing a mobile 3D telepresence experience using a head-worn device and a dual-purpose passive screen. At the core of this telepresence architecture is a portable multi-layered passive screen, which facilitates stereoscopic 3D display with a pair of head-worn projectors and, at the same time, allows a head-worn camera to capture multi-perspective views of the user through reflections off the screen. The screen contains retroreflective material for stereo image display and an array of convex mirrors for 3D capture.