Further Reading
Ramachandran, Prajit, et al. "Stand-alone self-attention in vision models." Advances in neural information processing systems 32 (2019).
Cordonnier, Jean-Baptiste, Andreas Loukas, and Martin Jaggi. "On the relationship between self-attention and convolutional layers." arXiv preprint arXiv:1911.03584 (2019).
Wang, Wenhai, et al. "Pyramid vision transformer: A versatile backbone for dense prediction without convolutions." Proceedings of the IEEE/CVF international conference on computer vision. 2021.
Radford, Alec, et al. "Improving language understanding by generative pre-training." (2018).
Ouyang, Long, et al. "Training language models to follow instructions with human feedback." Advances in neural information processing systems 35 (2022).
Zoph, Barret, et al. "Learning transferable architectures for scalable image recognition." 2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR). 2018.
Sitzmann, Vincent, et al. "Implicit neural representations with periodic activation functions." Advances in neural information processing systems 33 (2020).
"Space-Time Implicit Neural Representations for Atomic Electron Tomography on Dynamic Samples."