FGCS Special Issue:

Serverless Computing in the Cloud-Edge Continuum (Call for Papers)

Motivation and Scope

Serverless computing is a novel paradigm for operating next-generation cloud data centers. Serverless applications are developed as an ecosystem of microservices – called functions – which are loosely coupled and highly scalable. Each function is instantiated as a set of stateless, equivalent containers. As a result, consecutive invocations of a function by the same user can reach different containers, and a container can serve different users. Furthermore, serverless enables a pure pay-per-use pricing model, in which users pay for the resources they actually consume rather than for those allocated to them.


At the same time, edge computing complements the cloud by pervasively deploying compute nodes along a continuum from the cloud to the network edge, where user devices reside. This proximity to end users paves the way for a plethora of emerging applications (e.g., Internet of Things, Augmented/Virtual Reality, Vehicular Ad-Hoc Networks) with stringent requirements such as low latency, high throughput, and context awareness.


An urgent question is how to extend serverless technologies from cloud data centers to the Cloud-to-Edge continuum. On the one hand, serverless opens up many opportunities at the edge. For example, it saves energy and compute resources, which can be scarce on edge nodes, by letting users share containers and by automatically deallocating containers after a period of idleness (i.e., scale-to-zero). On the other hand, adopting serverless at the edge raises several challenges, stemming from the gap between the cloud-oriented design of serverless and the distinctive characteristics of edge systems. Open research questions include: the management of function state; the mitigation of the cold-start effect caused by scaling to zero; the execution of long-running workloads such as federated learning; and the adaptability of serverless to the wide-area distribution of edge networks and to the heterogeneity of edge nodes.


This special issue aims to bring together new ideas, the latest findings, and novel results from researchers and practitioners working in this area. Topics of interest include, but are not limited to, the following:

● Serverless computing in architectures spanning the Cloud-to-Edge continuum

● Protocols and algorithms for serverless computing in wide-area networks

● Integration of serverless computing with edge standards (e.g., ETSI MEC, 5G/6G)

● Machine Learning in/for serverless-operated cloud-edge systems

● Energy efficiency of serverless in edge computing infrastructures

● Economic studies of serverless in edge computing systems

● Development and testing of serverless-operated edge computing platforms

● Performance evaluation of serverless computing at the network edge, by means of simulation or experiments on real testbeds

Guest Editors

Corresponding Editor: Carlo Puliafito, University of Pisa, Italy, carlo.puliafito@unipi.it

Omer Rana, Cardiff University, UK

Luiz Bittencourt, University of Campinas, Brazil

Hao Wu, Beijing Normal University, China

Important Dates

● Submission portal opens: January 10, 2023

● Deadline for paper submission: July 15, 2023


Manuscript Submission Instructions

The FGCS submission system (https://www.editorialmanager.com/FGCS/default.aspx) will be open for submissions to our Special Issue from January 10, 2023. When submitting your manuscript, please select the article type

VSI: Serverless_edge.


All submissions deemed suitable by the editors for peer review will be reviewed by at least two independent reviewers. Once your manuscript is accepted, it will go into production and be published in the special issue.


FGCS Journal information:

https://www.sciencedirect.com/journal/future-generation-computer-systems