Any of the worker devices may also act as a parameter server, storing the variables. Normally, devices with high compute capacity are made workers, and a device with good connectivity to all the workers is made the parameter server. For example, in distributed training, the parameter server stores the variables (weights) while the workers run training steps in parallel. Each worker propagates its weight updates to the central parameter server, which maintains the most up-to-date weights and makes them available to every worker.
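The push/pull cycle described above can be sketched with plain Python threads. This is a minimal illustration, not any particular framework's API: the `ParameterServer` class, its `pull`/`push` methods, and the toy objective f(w) = (w - 3)^2 are all assumptions made for the example. Workers pull the latest weight, compute a gradient locally, and push the update back to the central server.

```python
import threading

class ParameterServer:
    """Central store for the shared weight; workers push gradients and pull weights.
    (Hypothetical class for illustration, not a real library API.)"""
    def __init__(self, init_w, lr=0.1):
        self.w = init_w
        self.lr = lr
        self.lock = threading.Lock()

    def pull(self):
        # Return the most up-to-date weight to the calling worker.
        with self.lock:
            return self.w

    def push(self, grad):
        # Apply the worker's gradient atomically, so the next pull
        # from any worker sees the latest weight.
        with self.lock:
            self.w -= self.lr * grad

def worker(ps, steps):
    # Each worker minimizes the toy objective f(w) = (w - 3)^2,
    # computing gradients on whatever weight it last pulled.
    for _ in range(steps):
        w = ps.pull()
        grad = 2 * (w - 3)
        ps.push(grad)

ps = ParameterServer(init_w=0.0)
workers = [threading.Thread(target=worker, args=(ps, 50)) for _ in range(4)]
for t in workers:
    t.start()
for t in workers:
    t.join()
print(ps.pull())  # converges toward the minimum at w = 3
```

Because every `push` is applied to the server's single copy of the weight, all workers collaborate on one model; the lock stands in for the serialization that a real parameter server performs when many workers send updates concurrently.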