DeepArc:

Modularizing Neural Networks for Model Maintenance

Abstract

Neural networks are an emerging data-driven programming paradigm widely used in many areas. Unlike traditional software systems consisting of decomposable modules, a neural network is usually delivered as a monolithic package, raising challenges for maintenance tasks such as model restructure and retraining. We propose DeepArc, a novel modularization method for neural networks, to reduce the cost of model maintenance tasks such as model restructure and re-adaption. Specifically, DeepArc decomposes a neural network into several consecutive modules, each of which encapsulates consecutive layers with similar semantics. The network modularization facilitates practical tasks such as refactoring the model to preserve existing features (e.g., model compression) and enhancing the model with new features (e.g., fitting new samples). The modularization and encapsulation allow us to restructure or retrain the model by pruning and tuning only a few localized neurons and layers. In addition, the modularization helps identify (1) the architectural bad smell of a network model, so that we can compress modules for effective model compression, and (2) cost-saving opportunities to boost model performance by retraining only a few module weights. Our experiments show that (1) DeepArc can boost the runtime efficiency of state-of-the-art model compression techniques by 14.8%; and (2) compared to traditional model retraining, DeepArc needs to train less than 20% of the neurons to fit adversarial samples and repair under-performing models, leading to 32.85% faster training while achieving similar model prediction performance.

Motivating Example

We use a ResNet-110 model trained on the CIFAR-10 dataset to illustrate the modularization.

The model consists of 54 blocks, 110 convolution layers, and a total of 385 layers.

It can be modularized into a semantic architecture composed of semantic modules, each of which groups a set of consecutive layers that extract similar features.

Taking the layer-wise similarity matrix in the left figure as input, the network can be modularized by Algorithm 1.
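
The similarity matrix records how alike the representations computed by each pair of layers are. As a concrete illustration (our own sketch, not necessarily the paper's exact implementation), the snippet below computes pairwise linear CKA (centered kernel alignment), a common measure for comparing layer representations, over flattened layer activations; the function names are ours.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two activation matrices of shape
    (n_samples, n_features); feature dimensions may differ."""
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # ||Y^T X||_F^2 / (||X^T X||_F * ||Y^T Y||_F)
    return np.linalg.norm(Y.T @ X) ** 2 / (
        np.linalg.norm(X.T @ X) * np.linalg.norm(Y.T @ Y))

def similarity_matrix(activations):
    """Pairwise CKA over a list of per-layer activation matrices,
    each flattened to (n_samples, n_features)."""
    n = len(activations)
    sim = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            sim[i, j] = sim[j, i] = linear_cka(activations[i], activations[j])
    return sim
```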

The top-3 semantic modules are [4, 8], [25, 36], and [38, 50].
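
Algorithm 1 itself is not reproduced on this page, but its effect can be pictured as a greedy scan: keep extending the current module while each new layer stays similar to the layers already grouped, and start a new module otherwise. A minimal sketch, assuming a fixed similarity threshold `tau` (an illustrative parameter, not the paper's setting):

```python
def modularize(sim, tau=0.9):
    """Greedily group consecutive layers into modules.

    sim : (n, n) layer-wise similarity matrix
    tau : similarity threshold (illustrative assumption)
    Returns a list of [start, end] layer-index ranges.
    """
    modules, start, n = [], 0, sim.shape[0]
    for i in range(1, n):
        # Extend the current module only while layer i remains similar
        # to every layer already inside it.
        if sim[i, start:i].min() < tau:
            modules.append([start, i - 1])
            start = i
    modules.append([start, n - 1])
    return modules
```

Applied to a similarity matrix like the one above, such a scan produces consecutive [start, end] ranges of exactly the form listed for ResNet-110.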

Deep Architecture

The figure above shows an overview of our DeepArc framework. Given a DNN model, DeepArc first extracts several modules to form a semantic architecture, each module encapsulating semantically similar layers. Based on the semantic architecture, we can support (1) model restructure, which modifies the network while preserving its behaviors, and (2) re-adaption, which fixes mis-predictions or fits the model to new samples. Moreover, the restructured or improved DNN can be further modularized to support new model restructure or re-adaption tasks.
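
The re-adaption path is what makes it possible to retrain only a small fraction of the neurons: every layer outside the target module is frozen, and only the module's own weights are tuned. A minimal PyTorch sketch, assuming the model is an `nn.Sequential` and using a hypothetical helper name:

```python
import torch
import torch.nn as nn

def unfreeze_module(model: nn.Sequential, start: int, end: int):
    """Freeze all parameters, then unfreeze only the layers in the
    module range [start, end] (hypothetical helper)."""
    for p in model.parameters():
        p.requires_grad = False
    for layer in list(model.children())[start:end + 1]:
        for p in layer.parameters():
            p.requires_grad = True
    # Hand only the unfrozen parameters to the optimizer, so the
    # backward pass updates just the target module.
    return [p for p in model.parameters() if p.requires_grad]

# Usage sketch: retrain only the module spanning layers 38..50.
# optimizer = torch.optim.SGD(unfreeze_module(model, 38, 50), lr=0.01)
```

Only the unfrozen parameters receive gradient updates, which is consistent with the module-level retraining savings reported in the abstract.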

RQ1 and RQ2 evaluate the effectiveness of our modularization method and the properties of the resulting modules. RQ3 and RQ4 evaluate the usefulness of DeepArc in two application tasks.
