Summary

In this work, we propose ArchRepair, the first DNN repair method that adjusts both the model's architecture and its weights at the block level. Through this work, we reveal the potential of optimizing DNN architecture via Neural Architecture Search (NAS) for DNN repair, bringing a new idea to this area. Existing DNN repair methods focus only on adjusting the model's weights, either at the neuron level or the network level. We believe such methods have an inherent limitation: the performance of a repaired DNN is bounded when only the weights are optimized, whereas adjusting the architecture as well can lift this bound. To this end, we use data from the training set to guide the NAS while training the weights of the new architecture at the same time. We design four research questions and corresponding experiments to validate our ideas, and the results confirm the effectiveness of optimizing the architecture for DNN repair. We believe such architecture-based methods will play an important role in future DNN repair research.
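To make the joint architecture-and-weight optimization concrete, below is a minimal PyTorch sketch assuming a DARTS-style continuous relaxation of a block's search space. SearchableBlock, repair_step, the candidate operations, and all hyperparameters are illustrative assumptions of ours, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SearchableBlock(nn.Module):
    """A block whose effective operation is a softmax-weighted mixture of
    candidate ops (a DARTS-style relaxation; ArchRepair's actual search
    space and relaxation may differ)."""
    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.Identity(),
        ])
        # Architecture parameters: one logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = F.softmax(self.alpha, dim=0)
        return sum(wi * op(x) for wi, op in zip(w, self.ops))

def repair_step(block, batch, arch_opt, weight_opt, loss_fn):
    """One joint update on repair data: the same loss drives both the
    architecture logits and the weights, mirroring the simultaneous
    search-and-train described above."""
    x, target = batch
    arch_opt.zero_grad()
    weight_opt.zero_grad()
    loss = loss_fn(block(x), target)
    loss.backward()
    arch_opt.step()    # adjust the block's architecture
    weight_opt.step()  # adjust the block's weights
    return loss.item()

# Usage on dummy data (shapes and learning rates are illustrative only):
block = SearchableBlock(channels=16)
arch_opt = torch.optim.Adam([block.alpha], lr=3e-4)
weight_opt = torch.optim.SGD(
    [p for n, p in block.named_parameters() if n != "alpha"], lr=1e-2)
x, target = torch.randn(8, 16, 32, 32), torch.randn(8, 16, 32, 32)
loss = repair_step(block, (x, target), arch_opt, weight_opt, nn.MSELoss())
```

Keeping the architecture logits and the weights in separate optimizers makes it easy to schedule or decouple the two updates later, even though this sketch updates them on the same batch.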

Moreover, ArchRepair repairs the model at the block level. According to the results of our Empirical Study, repairing the whole network is time-consuming, while repairing only certain neurons does not achieve good performance. Likewise, optimizing the architecture of the whole network is also time-consuming, while optimizing only a single layer's architecture is not sufficient (see RQ4). We therefore choose to optimize the architecture at the block level; a sketch of how such block-level scoping might look is given below. According to the experimental results in the Empirical Study and RQ4, block-level repair strikes a good balance between efficiency and effectiveness. We believe block-level repair also opens a promising direction for future DNN repair research.
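As a rough illustration of block-level scoping, the following sketch assumes blocks are named submodules, as in torchvision's ResNets; freeze_except_block and the choice of "layer3" are hypothetical and do not reproduce the paper's actual block-selection criterion.

```python
import torch
import torch.nn as nn
import torchvision

def freeze_except_block(model: nn.Module, block_name: str) -> None:
    """Scope repair to a single block: only parameters under `block_name`
    (its weights and, if present, its architecture logits) stay trainable.
    `block_name` is a hypothetical attribute path such as "layer3"."""
    for name, param in model.named_parameters():
        param.requires_grad = name.startswith(block_name)

# Usage: repair only the third residual block group of a ResNet-18.
model = torchvision.models.resnet18(num_classes=10)
freeze_except_block(model, "layer3")
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=1e-2)
```

Freezing everything outside the chosen block keeps each repair step cheap and leaves the rest of the network's behavior untouched, which reflects the efficiency side of the trade-off discussed above.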