Deep Learning Algorithms

[10] W. Hao^, C. Wang^*, X. Xu^, H. Yang^. Deep Learning via Neural Energy Descent. Submitted. [pdf]

[9] Z. Huang, S. Liang, M. Liang, W. He, H. Yang*. Efficient Attention Network: Accelerate Attention by Searching Where to Plug. Submitted. [pdf] 

[8] Z. Huang, S. Liang, M. Liang, H. Yang, L. Lin*. The Lottery Ticket Hypothesis for Self-attention Networks in Computer Vision. IEEE International Conference on Multimedia and Expo (ICME), 2024. [pdf]

[7] S. Liang^, L. Lyu^, C. Wang^, H. Yang^*. Reproducing Activation Function for Deep Learning. Communications in Mathematical Sciences, 2024. [pdf] [doi]

[6] Y. Ong^, Z. Shen^, H. Yang^*. IAE-Net: Integral Autoencoders for Discretization-Invariant Learning. Journal of Machine Learning Research, 2022. [pdf] [doi]

[5] W. He^, Z. Huang^, M. Liang, S. Liang, H. Yang*. Blending Pruning Criteria for Efficient Convolutional Neural Networks. 30th International Conference on Artificial Neural Networks, ICANN, 2021. [pdf] [doi]

[4] Y. Liu, T. Gao, H. Yang*. SelectNet: Learning to Sample from the Wild for Imbalanced Data Training. Mathematical and Scientific Machine Learning Conference, 2020. [pdf] [doi]

[3] S. Liang, Y. Khoo, H. Yang*. Drop-Activation: Implicit Parameter Reduction and Harmonic Regularization. Communications on Applied Mathematics and Computation, 2020. [pdf] [doi]

[2] S. Liang^, Z. Huang^, M. Liang, H. Yang*. Instance Enhancement Batch Normalization: An Adaptive Regulator of Batch Noise. Proceedings of the AAAI Conference on Artificial Intelligence, 2020. [pdf] [doi]

[1] Z. Huang^, S. Liang^, M. Liang, H. Yang*. DIANet: Dense-and-Implicit Attention Network. Proceedings of the AAAI Conference on Artificial Intelligence, 2020. [pdf] [doi]
^: Equal contribution; *: Corresponding author.