As mentioned above, Keras is a wrapper library for lower-level DL libraries. There are many other wrapper libraries; some are quite popular, while others have a unique design. These wrappers differ in how transparently they expose the underlying framework or library, and the choice between them is largely driven by user preference and popularity.
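To illustrate what such a wrapper abstracts away, the following is a minimal sketch of defining and compiling a network with Keras (here via the tf.keras API shipped with TensorFlow); the layer widths and the 784-dimensional input are arbitrary choices for the example, not prescribed by Keras:

```python
# Minimal Keras sketch: the wrapper hides the backend's graph
# construction behind a few declarative calls. Layer sizes and the
# 784-dimensional input are arbitrary illustrative choices.
from tensorflow import keras

model = keras.Sequential([
    keras.Input(shape=(784,)),                     # e.g. a flattened 28x28 image
    keras.layers.Dense(32, activation="relu"),     # hidden layer
    keras.layers.Dense(10, activation="softmax"),  # 10-class output
])
model.compile(optimizer="adam", loss="categorical_crossentropy")

# One call reports the total trainable parameter count:
# 784*32 + 32 + 32*10 + 10 = 25450
print(model.count_params())
```

The same few lines would require substantially more code when written directly against the underlying framework's graph or tensor API, which is the main appeal of this class of wrappers.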
TensorFlow has many wrappers. External wrapper packages include TensorLayer [TensorLayer], TFLearn [TFLearn] and Keras. Wrappers from Google are Sonnet (DeepMind) [Sonnet] and PrettyTensor [PrettyTensor], while TF-Slim [TFSlim], tf.contrib.learn, tf.layers, and tf.keras [TensorFlow] ship with TensorFlow itself.
Gluon is a wrapper for MXNet [Gluon]. Its API specification is an effort to improve the speed, flexibility, and accessibility of DL technology for all developers, regardless of their choice of DL framework. Gluon is a joint product of Amazon Web Services (AWS) and Microsoft AI, released under the Apache 2.0 license.
NVIDIA DIGITS, the Deep Learning GPU Training System [DIGITS], is a web application for training DNNs for image classification, segmentation and object detection tasks. It uses DL backends such as Caffe, Torch and TensorFlow, and supports a wide variety of image formats and sources through DIGITS plug-ins. DIGITS simplifies common DL tasks such as managing data, designing and training NNs on multi-GPU systems, monitoring performance in real time with advanced visualisations, and selecting the best-performing model from the results browser for deployment. DIGITS is mainly interactive (GUI-driven). It provides pre-trained models such as AlexNet, GoogLeNet, LeNet and UNet from the DIGITS Model Store and is released under the BSD 3-clause license.
Lasagne is a lightweight library for building and training NNs in Theano, designed around six principles: Simplicity, Transparency, Modularity, Pragmatism, Restraint and Focus [Lasagne]. Other wrappers for Theano are Blocks and Pylearn2. Since Theano is no longer under active development, the popularity of these wrappers is bound to decline.