The world is becoming increasingly instrumented and connected, driving the growing availability of data from nature, man-made systems, people, and the interactions among them. The resulting data deluge brings both challenges and opportunities. The challenge is to build scalable, usable, and extensible systems to analyze this data. The opportunity is to build novel applications that extract new insights from the data and take actions that yield competitive advantage. These challenges and opportunities have driven the development of Big Data technologies, for which scalability to large data sets is a key characteristic.
A significant portion of the data available for analysis arrives in the form of live data streams and requires timely processing. We call this form of Big Data "Fast Data". Examples include analyzing live stock ticker data in financial markets, call detail records in telecommunications, video streams in surveillance, production line status feeds in manufacturing, and vital body signals in healthcare. In all of these domains, there is a need to gather, process, and analyze data streams; detect emerging patterns and outliers; extract valuable insights; and generate actionable results. Most importantly, this analysis often needs to happen in near real time.
The goal of this workshop is to bring together research from the systems, analytics, and application areas with a focus on Fast Data. We are particularly interested in research papers that apply techniques, theories, and methodologies from distributed systems.
Topics of interest include, but are not limited to: