
Algorithms and Systems for MapReduce and Beyond (BeyondMR) is a workshop for research at the frontier of large-scale computation, with topics ranging from algorithms and computational models to the systems themselves. BeyondMR will be held in conjunction with SIGMOD/PODS 2018 in Houston, TX, USA, on Friday, June 15, 2018.

The BeyondMR workshop aims to explore algorithms, computational models, architectures, languages, and interfaces for systems that require large-scale parallelization, as well as systems designed to support efficient parallelization and fault tolerance. These include specialized programming and data-management systems based on MapReduce and its extensions, graph processing systems, and data-intensive workflow and dataflow systems.

We invite submissions on topics such as:

  • Cost Models: Formal models for evaluating the efficiency of algorithms in large-scale parallel systems, taking into account the architectural properties and parameters of such systems.

  • Task Scheduling, Load Balancing, and Fault Tolerance: Methods and algorithms that avoid data and computational skew in large-scale parallel systems. Design of scheduling algorithms for balanced task distribution. Techniques for supporting fault tolerance.

  • Algorithms and Applications: Algorithm design for specific data processing tasks in large-scale parallel systems, including query processing, graph processing, iterative and recursive computation, machine learning, and general data analytics. Applications built on large-scale parallel systems.

  • New Parallel Architectures: Novel large-scale parallel architectures and systems that support various types of data processing tasks, such as graph processing, log processing, data analytics, and machine learning. Extensions of current systems that provide additional functionality, improve performance, or support more complex processing tasks.

Keynotes:

    Author: TBA
        Title: TBA

    Author: TBA
        Title: TBA



Prior editions of BeyondMR were held in 2014 and 2015 in conjunction with EDBT/ICDT, and in 2016 and 2017 in conjunction with SIGMOD/PODS.