Time: Monday 11:00 - 12:30
Description: Understanding the structural and photometric properties of star clusters is crucial for interpreting resolved and unresolved stellar populations in the LSST era. In this hands-on workshop, participants will learn how to generate realistic mock catalogs of star clusters using the mksample module of nProFit — a Python-based tool designed for modeling surface brightness profiles and creating synthetic observations consistent with LSST-like conditions. The workshop will emphasize reproducible workflows, encouraging participants to adapt these mock catalogs for testing analysis pipelines, running completeness studies, or developing machine-learning approaches for LSST data. This session is ideal for researchers interested in stellar populations, extragalactic star clusters, and LSST time-domain or image-based science.
Time: Monday 15:00 - 16:30
Description: The forthcoming Legacy Survey of Space and Time (LSST) at the Vera C. Rubin Observatory presents an unprecedented opportunity to study stars and galaxies through a massive dataset spanning about a dozen billion images. However, the sheer scale of these data poses significant challenges for creating angular masks suitable to isolate areas of interest, intersect the footprints of multiple surveys, or generate synthetic sources at random positions. These tasks are crucial not only for analyses of large-scale structure, such as two-point correlation functions or lensing, but also for a broad range of astrophysical investigations such as mapping Galactic stellar streams and identifying new Milky Way satellite galaxies. We will present novel software tools and algorithms, developed jointly with the LINCC Frameworks collaboration, that address these tasks by pixelizing catalog data and other geometric primitives in order to properly account for survey geometry, depth variations, completeness, and other effects. We will showcase two major examples: (1) how to build a complete mask for the Subaru HSC-SSP+WISE surveys in less than a minute, and (2) how to build a 10,000 deg^2, 1.5-billion-pixel stellar mask for Rubin in the "austere" environment of the Rubin Science Platform.
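The core idea, pixelizing catalog positions so that masks, footprint intersections, and random-point generation all become simple array operations, can be illustrated with a toy flat RA/Dec grid. This is only a stand-in for the HEALPix-based tools the session will present; the function names and the naive grid scheme below are invented for illustration.

```python
import numpy as np

def build_mask(ra, dec, nbins=36):
    """Toy coverage mask: flag flat RA/Dec grid cells that contain sources."""
    mask = np.zeros((nbins, nbins), dtype=bool)
    i = (np.asarray(ra) / 360.0 * nbins).astype(int) % nbins
    j = ((np.asarray(dec) + 90.0) / 180.0 * nbins).astype(int).clip(0, nbins - 1)
    mask[i, j] = True
    return mask

def randoms_in_mask(mask, n, seed=None):
    """Draw points uniformly on the sphere, keep those inside masked cells."""
    rng = np.random.default_rng(seed)
    nbins = mask.shape[0]
    ra = rng.uniform(0.0, 360.0, n)
    dec = np.degrees(np.arcsin(rng.uniform(-1.0, 1.0, n)))  # uniform on sphere
    i = (ra / 360.0 * nbins).astype(int) % nbins
    j = ((dec + 90.0) / 180.0 * nbins).astype(int).clip(0, nbins - 1)
    keep = mask[i, j]
    return ra[keep], dec[keep]

# Three catalog sources in a small patch define the footprint;
# randoms are then restricted to the covered cells.
cat_ra = np.array([10.0, 11.0, 12.0])
cat_dec = np.array([-5.0, -4.0, -5.5])
mask = build_mask(cat_ra, cat_dec)
ra_r, dec_r = randoms_in_mask(mask, 20000, seed=42)
```

A production mask would use equal-area HEALPix pixels and carry depth or completeness values per pixel rather than a boolean flag; the boolean flat grid is just the smallest version of the same bookkeeping.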
by Emille Ishida (LPCA, France)
Time: Tuesday 9:30 - 11:00
Description: Next generation experiments such as the Vera Rubin Observatory Legacy Survey of Space and Time (LSST) will provide an unprecedented volume of time-domain data, opening a new era of big data in astronomy. To fully harness the power of these surveys, we require analysis methods capable of dealing with large data volumes that can identify promising transients within minutes for follow-up coordination. In this tutorial, I will present Fink, a broker developed to face these challenges. Fink is built on modern big-data technologies (Apache Spark and Apache Kafka) and designed for fast and efficient analysis of big data streams. It has been chosen as one of the official LSST brokers and will receive the full data stream. We will go through services provided by the broker, such as data transfer, bulk cross-match and the API. I will demonstrate how to interact with and modify tools already available for your specific science case and highlight the developments that can be expected once we start receiving Rubin data.
Requirements: The tutorial will use tools available online, and participants do not need to prepare any special material. All that is required is an internet connection and a computer.
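As a warm-up for the API part of the session, the sketch below queries the Fink /objects endpoint using only the Python standard library. The endpoint URL and payload fields follow the Fink REST API documentation at the time of writing and may evolve, so treat this as a hedged example rather than a reference.

```python
import json
import urllib.request

FINK_API = "https://api.fink-portal.org/api/v1"

def objects_payload(object_id, fmt="json"):
    """Build the JSON payload for Fink's /objects endpoint
    (retrieves the alert history of one object)."""
    return {"objectId": object_id, "output-format": fmt}

def query_fink_object(object_id):
    """POST the query to the Fink REST API and return the parsed response."""
    data = json.dumps(objects_payload(object_id)).encode()
    req = urllib.request.Request(
        FINK_API + "/objects",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())

# Example (requires network access):
# alerts = query_fink_object("ZTF21abfmbix")
```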
by Melissa Graham (U. Washington, US)
Time: Tuesday 11:30 - 13:00
Description: This will be a hands-on tutorial with the Rubin Science Platform and Data Preview 1. Participants will learn the basics of how to access data with the butler, and how to run individual tasks or multi-task "steps" of the Rubin Science Pipelines. As examples, we will detect and measure sources in a single image and create custom coadded images.
Requirements: In order to participate, it is required to have Rubin data rights and an account on the Rubin Science Platform: follow the instructions at ls.st/rsp-signup at least one week in advance, and ensure you can log in at data.lsst.cloud. There will not be time to fix account issues during this session. Some familiarity with Python, Jupyter notebooks, and optical image analysis is recommended.
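Butler access in an RSP notebook typically looks like the sketch below. It is only a hedged outline: the repository label, collection name, dataset type, and dataId values are assumptions modeled on the DP1 tutorials and should be checked against the session materials.

```python
# A minimal sketch of butler data access on the Rubin Science Platform.
# Repo, collection, dataset type, and visit/detector values are assumed
# placeholders patterned after the DP1 tutorial notebooks.

def example_data_id(visit, detector):
    """Build a dataId dict identifying one detector of one visit."""
    return {"visit": visit, "detector": detector, "instrument": "LSSTComCam"}

try:
    from lsst.daf.butler import Butler

    butler = Butler("dp1", collections="LSSTComCam/DP1")      # assumed names
    image = butler.get("visit_image", dataId=example_data_id(2024110800245, 0))
except ImportError:
    # The LSST Science Pipelines are preinstalled in RSP notebooks,
    # but usually not on a laptop.
    print("LSST Science Pipelines not available; run this inside the RSP")
```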
by Hector Hernandez (UNAM), Jose Vasquez (UNAM), Garreth Martin (U. Nottingham)
Time: Tuesday 15:00 - 16:45
Description: The LSB Workshop will focus on the detection and analysis of low surface brightness (LSB) structures in astronomical images. We will introduce the LSB field, its goals, key scientific problems (e.g., interacting galaxies, intracluster light, gravitational lenses), and the role of LSST in addressing them. Participants will use Jupyter notebooks to visualize and annotate LSB features (such as tidal tails and streams) in processed images and compare results collaboratively. In a second part, we will centre on image post-processing techniques to enhance LSB detection. Activities include sky subtraction, adaptive smoothing, galaxy model subtraction, and segmentation using photutils. Participants will test these methods on astronomical images containing faint tidal streams and assess the recovery of LSB flux and structures relative to ground truth data. The workshop concludes with discussions on method sensitivity, reproducibility, and preparation for large-scale surveys like LSST.
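The post-processing steps named above (sky subtraction, smoothing, segmentation) can be sketched in a few lines. The workshop will use photutils for segmentation; the simplified numpy/scipy stand-in below only illustrates the sequence of operations, and its noise estimate is deliberately crude.

```python
import numpy as np
from scipy import ndimage

def enhance_and_segment(image, smooth_sigma=2.0, nsigma=1.5):
    """Toy LSB pipeline: subtract a median sky, smooth to boost faint
    extended flux, then label connected regions above a low threshold."""
    resid = image - np.median(image)                 # crude sky subtraction
    smooth = ndimage.gaussian_filter(resid, smooth_sigma)
    noise = np.std(smooth)                           # real work: robust sigma-clipped estimate
    labels, nsrc = ndimage.label(smooth > nsigma * noise)
    return labels, nsrc

# Synthetic test image: flat sky + noise + one faint extended blob.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 1.0, (128, 128))
yy, xx = np.mgrid[:128, :128]
img += 4.0 * np.exp(-((xx - 64) ** 2 + (yy - 40) ** 2) / (2 * 6.0 ** 2))
labels, nsrc = enhance_and_segment(img)
```

In the workshop itself this role is played by tools such as photutils segmentation, which add proper background modeling, deblending, and source catalogs on top of the same basic idea.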
by Julia Gschwend (LIneA, Brazil)
Time: Wednesday 9:30 - 11:00
Description: This hands-on session will provide a practical introduction to the Brazilian Independent Data Access Center (IDAC-BR) science platforms, using examples based on LSST DP1 data. Participants will explore how to access, visualize, and analyze data through the integrated tools developed at LIneA, including the JupyterHub environment, User Query interface, Sky Viewer, Target Viewer, and OnDemand processing services. The session will highlight how these platforms interoperate to support end-to-end scientific workflows, from data discovery to custom analysis. Step-by-step demonstrations will guide users through common use cases and best practices, while also showcasing the flexibility of the system for different research needs. Educational materials and documentation will be made available for those who wish to review or continue exploring the tools after the session.
by Lorena Hernandez (UDP, Chile)
Time: Wednesday 11:30 - 13:00
Description: Join us to explore how ALeRCE, the Chilean broker for time-domain astronomy, enables scientific discovery from millions of astronomical alerts. In this hands-on session, we will introduce the main tools to classify and study variable and transient objects, and give a preview of how ALeRCE will work once the Rubin Observatory LSST alert stream comes online. After a guided tutorial, participants will work through a Jupyter notebook to learn how to interact with the ALeRCE client and database. Depending on the audience’s interests, we may then explore additional examples or leave time for participants to experiment with the data on their own. Everyone is welcome, no prior experience required.
Requirements: Register early to secure a spot. Check information about the ALeRCE broker on the website: https://alerce.science/. Try a starter Jupyter notebook in Google Colab by following this link: https://colab.research.google.com/github/alercebroker/usecases/blob/master/notebooks/ALeRCE_Client_starter.ipynb. This will help you familiarize yourself with the ALeRCE tools without needing to install packages on your own machine.
If any issues are encountered, please contact the ALeRCE team so that we can help resolve them before the tutorial. Feel free to contact Lorena Hernandez-García (lorena.hernandez@mail.udp.cl)
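A typical first interaction with the ALeRCE Python client, in the spirit of the starter notebook above, looks like this. It is a hedged sketch: the classifier and class names are assumptions that should be checked against the ALeRCE documentation, and running the query requires `pip install alerce` plus network access.

```python
# Sketch of querying ALeRCE for likely-supernova candidates.
# Classifier/class names below are assumed; see the ALeRCE docs.

def sn_query_params(n=10):
    """Parameters for a query of supernova candidates from the
    light-curve classifier (names assumed from the ALeRCE docs)."""
    return {"classifier": "lc_classifier", "class_name": "SN", "page_size": n}

try:
    from alerce.core import Alerce

    client = Alerce()
    candidates = client.query_objects(**sn_query_params())   # pandas DataFrame
    # For one object, the detections can then be fetched, e.g.:
    # lc = client.query_lightcurve(candidates.oid.iloc[0])
except Exception:
    print("alerce not installed or no network; see https://alerce.science/")
```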
Time: Thursday 11:00 - 13:00
Description: The passing of a massive body in front of a star produces a magnification of its brightness due to gravitational lensing. This generates a characteristic microlensing light curve, which encodes information about the object acting as a lens: a compact object (black hole, neutron star, white dwarf), a planet, a normal star, or a multiple system of these. Rubin LSST is expected to detect thousands of microlensing events, with implications for stellar evolution, planet formation and dark matter. To be able to use Rubin data to find these events and to extract physical information from them, we need to carry out simulations to train and characterize classification algorithms and to assess the efficiency of modeling tools. To this end, we have developed a framework to simulate microlensing light curves for Rubin, both at the catalog level and at the image level. These tools could easily be adapted to generate light curves for other variable sources of interest.
In this tutorial we will address the several aspects involved in simulating light curves with tools developed for Rubin. First, we will go through the steps to simulate a Rubin-like light curve directly, i.e., reading the time stamps from the Rubin Operations Simulator (OpSim) and adding realistic error bars (with rubin_sim) to a light-curve model. Next, we will see how to work by injecting and extracting variable objects at the image level. We will discuss how to download individual calibrated exposures from a selected region of the sky, for both the simulated DP0 images and the real ComCam DP1 images; how to retrieve the associated catalogs; how to inject new sources into the images; and how to perform forced photometry on them in order to build a light curve from the images. Finally, we will use a pipeline developed by our team that integrates these and other tasks - such as reading stars from the TRILEGAL LSST catalog, generating microlensing parameters and the associated microlensing light curves - to produce realistic Rubin-like microlensing light curves.
Requirements: There are no formal requirements to attend the tutorial, either in terms of installed software or prior knowledge. However, working on LSST images and catalogs, whether real (DP1) or simulated (DP0), requires access to the Rubin Science Platform (RSP), which in turn requires data rights. Therefore, for attendees to be able to run the full tutorial on their own machines, an account on the RSP is required. The parts of the tutorial concerning catalog-level simulations only can be run using public tools developed by LSST.
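The catalog-level step described above amounts to evaluating a microlensing model at survey visit times and perturbing it with photometric errors. A minimal self-contained sketch, using the standard point-source point-lens (Paczynski) magnification and plain numpy in place of the OpSim time stamps and rubin_sim error model:

```python
import numpy as np

def paczynski_magnification(t, t0, tE, u0):
    """Point-source point-lens magnification:
    u(t) = sqrt(u0^2 + ((t - t0)/tE)^2),  A(u) = (u^2 + 2) / (u sqrt(u^2 + 4)),
    with t0 the peak time, tE the Einstein timescale, u0 the impact parameter."""
    u = np.sqrt(u0 ** 2 + ((t - t0) / tE) ** 2)
    return (u ** 2 + 2.0) / (u * np.sqrt(u ** 2 + 4.0))

def microlensing_lightcurve(t, t0, tE, u0, m_base, m_err, seed=1):
    """Observed magnitudes: baseline minus the lensing brightening,
    with Gaussian scatter standing in for rubin_sim-style error bars."""
    rng = np.random.default_rng(seed)
    mag = m_base - 2.5 * np.log10(paczynski_magnification(t, t0, tE, u0))
    return mag + rng.normal(0.0, m_err, size=t.shape)

# Evenly spaced epochs stand in for OpSim visit times (days).
t = np.linspace(0.0, 200.0, 400)
mag = microlensing_lightcurve(t, t0=100.0, tE=25.0, u0=0.1,
                              m_base=20.0, m_err=0.01)
```

In the real pipeline the epochs come from an OpSim cadence database, the error bars from rubin_sim given the source magnitude and observing conditions, and the source parameters from TRILEGAL; this sketch only fixes the shape of the model being sampled.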
by Michael Wood-Vasey and Christopher Hernandez (U. Pittsburgh)
Time: Thursday 15:00 - 16:30
Description: The Pitt-Google Alert Broker is designed to provide global public access to astronomical alert stream data, using Google Cloud services to mitigate the challenges our field currently faces in data transport, processing, storage, and access. In this tutorial, we will explore pittgoogle-client, a Python library developed by our team for accessing data served by our broker, and demonstrate the variety of science cases that can benefit from the services our broker provides.
We kindly ask participants to complete the following one-time setup prior to the tutorial session: https://mwvgroup.github.io/pittgoogle-client/one-time-setup/index.html#one-time-setup
by Melissa Graham (U. Washington, US)
Time: Friday 9:00 - 10:30
Description: This tutorial will provide a brief overview of the Data Preview 1 data products, but the main focus will be how to use the three aspects of the Rubin Science Platform: the Portal, Notebook, and API (remote access). The particular strengths of each aspect for data discovery, access, and analysis will be highlighted, and multi-aspect workflows illustrated. All are welcome, no past experience is necessary. Participants who already have RSP accounts can opt-in to hands-on experience, but those who don't can still follow along and learn.
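The API aspect is typically exercised with ADQL queries against the RSP TAP service. The sketch below builds a cone-search query and submits it; the `lsst.rsp.get_tap_service` helper follows the DP1 tutorial notebooks, while the table and column names (`dp1.Object`, `coord_ra`, `coord_dec`, `objectId`) are assumptions from those tutorials and should be verified against the DP1 schema.

```python
# Hedged sketch of the RSP "API" aspect via TAP/ADQL.
# Table and column names are assumed from the DP1 tutorials.

def cone_search_adql(ra, dec, radius_deg, table="dp1.Object", n=100):
    """ADQL cone search around (ra, dec), all angles in degrees."""
    return (
        f"SELECT TOP {n} objectId, coord_ra, coord_dec FROM {table} "
        f"WHERE CONTAINS(POINT('ICRS', coord_ra, coord_dec), "
        f"CIRCLE('ICRS', {ra}, {dec}, {radius_deg})) = 1"
    )

try:
    from lsst.rsp import get_tap_service   # available inside RSP notebooks

    service = get_tap_service("tap")
    results = service.search(cone_search_adql(53.1, -28.1, 0.05)).to_table()
except ImportError:
    print("lsst.rsp not available; run inside the Rubin Science Platform")
```

The same query string can be pasted into the Portal aspect's ADQL interface, which is one way the multi-aspect workflows mentioned above fit together.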