The 4th Hands-on Lab Streaming Layer Workshop, hosted by the Swartz Center for Computational Neuroscience at the University of California San Diego (UCSD), will take place on Saturday, November 22, 2025. The tutorial workshop will introduce and demonstrate the use of the Lab Streaming Layer (LSL) software environment, the associated Extensible Data Format (XDF), the LSL application programming interface (API), and the associated Neuropipe data recording and visualization software and MoBILAB data review and analysis software. The format will be a lecture by principal LSL developer Christian Kothe followed by live hands-on application demonstrations and API programming sessions. An on-site lunch and concluding tea will provide opportunities for social networking among LSL users and code developers. See the workshop program.
Lab Streaming Layer (LSL) for multimodal data collection
1:45PM-2:00PM - LSL introduction and demonstration (Christian Kothe)
2:00PM-2:15PM - The lsl_app_matlabviewer EEGLAB plug-in (Arnaud Delorme)
2:15PM-2:45PM - LSL synchronization and other topics (Christian Kothe)
2:45PM-3:15PM - LSL for Mobile Brain Imaging (Yahya Shirazi)
3:15PM-3:30PM - LSL wrap up (Tim Mullen)
This is a provisional program.
The LSL core library and first applications were created by Christian Kothe at SCCN in 2012 in response to an urgent need to record and synchronize multimodal data in experiments using Mobile Brain/Body Imaging (MoBI) paradigms. The lab streaming layer (LSL) is an open-source software framework for the unified collection of measurement time series in research experiments. It handles the networking, time synchronization, and (near-) real-time access, as well as, optionally, the centralized collection, online viewing, and disk recording of the data. In the ensuing years, many research users and equipment manufacturers have contributed drivers for a wealth of data collection devices. LSL thus appears to be on its way to becoming a community standard for software-based fusion and synchronization of multiple data streams, particularly for multimodal brain/body imaging and brain-computer interface (BCI) or brain-machine interface (BMI) applications.
The LSL distribution consists of:
-- The core transport library (liblsl) and its language interfaces (C, C++, Python, Java, C#, MATLAB). The library is general-purpose and cross-platform (Windows/Linux/macOS, 32- and 64-bit) and forms the heart of the project.
-- A suite of tools built on top of the library, including a recording program, online viewers, importers, and apps that make data from a large and growing range of acquisition hardware available on the lab network (for example audio, EEG, or motion capture).
-- There is an intro lecture/demo on LSL by Christian Kothe on YouTube (part of an online course on EEG-based brain-computer interfaces).
-- The source code for the project is hosted on GitHub.
-- There is also an LSL mailing list.
The liblsl library provides the following abstractions for use by client programs:
-- Stream Outlets: for making time series data streams available on the lab network. The data is pushed sample-by-sample or chunk-by-chunk into the outlet and can consist of single- or multi-channel data, with regular or irregular sampling rate and uniform value types (integers, floats, doubles, strings). Streams can have arbitrary XML meta-data (akin to a file header). Creating an outlet makes the stream visible to a collection of computers (defined by the network settings/layout), on any of which one can subscribe to it by creating an inlet.
-- Resolve functions: these resolve streams that are present on the lab network according to content-based queries (for example, by name, content-type, or queries on the meta-data). The service discovery features do not depend on external services such as zeroconf and are meant to drastically simplify data collection network setup.
-- Stream Inlets: for receiving time series data from a connected outlet. An inlet retrieves samples from the provider (in order, with reliable transmission, optional type conversion, and optional failure recovery). Besides the samples, the meta-data can be obtained (as an XML blob or through a small built-in DOM interface).
-- Built-in clock: provides time stamps for the transmitted samples so that they can be mutually synchronized (a brief usage sketch combining these abstractions follows this list).
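As an illustration, the following minimal sketch uses the Python interface (pylsl) to exercise all four abstractions in a single process. The stream name, content type, channel count, and sampling rate are arbitrary example values, not part of any particular device driver.

    # Minimal pylsl sketch: create an outlet, resolve it on the network, and read it through an inlet.
    import time
    from pylsl import StreamInfo, StreamOutlet, StreamInlet, resolve_byprop, local_clock

    # Stream Outlet: declare an 8-channel float stream at 100 Hz and publish it on the lab network.
    info = StreamInfo(name='ExampleEEG', type='EEG', channel_count=8,
                      nominal_srate=100, channel_format='float32',
                      source_id='example-uid-001')
    info.desc().append_child_value('manufacturer', 'ExampleVendor')  # arbitrary XML meta-data
    outlet = StreamOutlet(info)

    # Resolve function: find streams whose content-type is 'EEG' (a content-based query).
    found = resolve_byprop('type', 'EEG', timeout=5.0)

    # Stream Inlet: subscribe to the first matching stream and inspect its meta-data.
    inlet = StreamInlet(found[0])
    print(inlet.info().as_xml())                       # the stream header as an XML blob

    # Built-in clock: stamp each outgoing sample with local_clock(), then pull it back with its time stamp.
    for _ in range(10):
        outlet.push_sample([0.0] * 8, local_clock())   # one dummy 8-channel sample
        sample, timestamp = inlet.pull_sample(timeout=1.0)
        print(timestamp, sample)
        time.sleep(0.01)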
The following reliability features are implemented by the library (transparently):
-- LSL data transport inherits the reliability of TCP, is message-oriented (partitioned into samples) and type safe.
-- The LSL library provides automatic failure recovery from application or computer crashes to minimize data loss (optional); this makes it possible to replace a computer in the middle of a recording without having to restart the data collection.
-- LSL data is buffered on both the sender and receiver sides (with configurable and arbitrarily large buffers) to tolerate intermittent network failures (a brief configuration sketch follows this list).
-- The LSL data transmission is type safe and supports type conversions as necessary.
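As a rough illustration of the buffering and recovery options, the sketch below shows where they are exposed on the receiving side in pylsl; the parameter values are arbitrary and the stream query continues the example above.

    # Hedged sketch: configuring receive-side buffering and automatic recovery in pylsl.
    from pylsl import StreamInlet, resolve_byprop

    found = resolve_byprop('type', 'EEG', timeout=5.0)
    inlet = StreamInlet(found[0],
                        max_buflen=360,   # buffer up to ~360 seconds of data at the receiver
                        recover=True)     # silently re-resolve and re-connect if the sender crashes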
LSL comes with a built-in synchronized time facility for all recorded data which is designed to achieve sub-millisecond accuracy on a local network of computers. This facility serves to provide out-of-the-box support for synchronized data collection but does not preclude the use of user-supplied alternative timestamps, for example from commercial timing middleware or high-quality clocks. The built-in time synchronization implemented in the LSL library is designed after the widely deployed Network Time Protocol (NTP).
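For instance (continuing the inlet sketch above), each inlet can estimate the clock offset between the sending machine and the local machine; adding that offset maps the sender's time stamps onto the local LSL clock.

    # Hedged sketch: mapping remote time stamps onto the local LSL clock.
    offset = inlet.time_correction(timeout=5.0)        # estimated remote-to-local clock offset (seconds)
    sample, remote_ts = inlet.pull_sample(timeout=1.0)
    local_ts = remote_ts + offset                      # sample time expressed on this machine's clock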
While the LSL transport API itself does not endorse or provide a particular file format, the provided recording program (LabRecorder) records into the XDF file format (Extensible Data Format). XDF was designed concurrently with the lab streaming layer and supports the full feature set of LSL (including multi-stream container files, per-stream arbitrarily large XML headers, all sample formats as well as time-synchronization information). The MoBILAB application running on MATLAB provides a multi-modal data browser, data selector and preprocessor, with a direct conduit to EEGLAB for electrophysiological data analysis (when relevant).
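XDF recordings can also be read back outside MATLAB; the sketch below uses the separate pyxdf Python package (installed independently of liblsl), with a placeholder file name.

    # Hedged sketch: loading a LabRecorder XDF file with the pyxdf package (pip install pyxdf).
    import pyxdf

    streams, file_header = pyxdf.load_xdf('recording.xdf')   # placeholder file name
    for s in streams:
        name = s['info']['name'][0]                   # stream name from the per-stream XML header
        srate = s['info']['nominal_srate'][0]         # nominal sampling rate as recorded
        print(name, srate, len(s['time_series']), 'samples')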
Makeig S, Gramann K, Jung T-P, Sejnowski TJ, Poizner H. "Linking brain, mind and behavior: The promise of mobile brain/body imaging (MoBI)", International Journal of Psychophysiology, 2009.
Delorme A, Mullen T, Kothe C, Akalin Acar Z, Bigdely-Shamlo N, Vankov A, Makeig S. "EEGLAB, SIFT, NFT, BCILAB, and ERICA: New tools for advanced EEG processing." Computational Intelligence and Neuroscience, 2011:130714, 2011.
Ojeda A, Bigdely-Shamlo N, Makeig S. "MoBILAB: An open source toolbox for analysis and visualization of mobile brain/body imaging data." Frontiers in Human Neuroscience, 2014.
Mullen T, Kothe C, Chi M, Ojeda A, Kerth T, Makeig S, Jung T-P, Cauwenberghs G. "Real-time neuroimaging and cognitive monitoring using wearable dry EEG." IEEE Transactions on Biomedical Engineering 62.11:2553-2567, 2015.