First Workshop on Photorealistic Image and Environment Synthesis
for Mixed Reality (PIES-MR)

To be held virtually in conjunction with IEEE ISMAR 2022.

Promoting the synthesis of photorealistic images and virtual environments for research purposes.

Mission of the Workshop

The First Workshop on Photorealistic Image and Environment Synthesis for Mixed Reality (PIES-MR) will engage experts and researchers working on the synthesis of photorealistic images and virtual environments, particularly in the form of public datasets, software tools, and infrastructures, for mixed reality (MR) research, including both augmented reality (AR) and virtual reality (VR). Such openly available resources will enable researchers to better investigate how photorealism affects MR, AR, and VR experiences, and will make it easier for developers to create photorealistic environments of their own. Beyond MR, photorealistic image and environment synthesis can benefit multiple other research areas, including machine learning, robotics, human perception, and multimedia systems.

Call for Papers

Note: PIES-MR will be held virtually.

Topics of Interest

The topics of interest include but are not limited to:

● 360° image/video capture

● 360° image/video playback

● Automated/semi-automated reconstruction

● Environment capture/scanning

● Evaluations of photorealistic images and environments

● High-polygon mesh representations

● Image/video capture

● Image/video playback

● Machine learning models based on photorealistic data

● Point cloud representations

● Real-time raytracing

● Simultaneous localization and mapping (SLAM)

Contribution Types

The following types of contributions will be considered for PIES-MR:

● Research, review, or position papers: 4-6 pages excluding references

● Demo/practical papers: 2-4 pages excluding references, plus a video of up to 5 minutes

Submission Guidelines

Authors are invited to submit original contributions. Accepted papers will be published in the ISMAR 2022 adjunct proceedings and on IEEE Xplore, and must be formatted using the IEEE Computer Society VGTC format.

All submissions must be made electronically by email to Charlie Hughes.

Important Dates

Submission deadline: Friday, August 5th, 2022 (23:59 AoE)

Result notification: Monday, August 15th, 2022

Camera-ready deadline: Friday, August 26th, 2022 (23:59 AoE)

Virtual workshop: Friday, October 21st, 2022 (15:30 - 18:30 SGT)


Organizers

Charlie Hughes, University of Central Florida

Jeanine Stefanucci, University of Utah

Ryan P. McMahan, University of Central Florida

Examples of PIES


Matterport3D

Chang, A., Dai, A., Funkhouser, T., Halber, M., Niessner, M., Savva, M., Song, S., Zeng, A. and Zhang, Y. 2017. Matterport3D: Learning from RGB-D data in indoor environments. arXiv preprint arXiv:1709.06158. (2017).


3D-FRONT

Fu, H., Cai, B., Gao, L., Zhang, L., Li, C., Xun, Z., Sun, C., Fei, Y., Zheng, Y. and Li, Y. 2020. 3D-FRONT: 3D Furnished Rooms with layOuts and semaNTics. arXiv preprint arXiv:2011.09127. (2020).

Replica Dataset

Straub, J., Whelan, T., Ma, L., Chen, Y., Wijmans, E., Green, S., Engel, J.J., Mur-Artal, R., Ren, C. and Verma, S. 2019. The Replica dataset: A digital replica of indoor spaces. arXiv preprint arXiv:1906.05797. (2019).


Structured3D

Zheng, J., Zhang, J., Li, J., Tang, R., Gao, S. and Zhou, Z. 2019. Structured3D: A large photorealistic dataset for structured 3D modeling. arXiv preprint arXiv:1908.00222. (2019).


Contact the organizers to get more information on the workshop.