Luong attention as used in the encoder-decoder model:
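As a minimal sketch of the idea (assuming Luong's "dot" scoring variant, with hypothetical names `h_t` and `encoder_states`): the decoder state is scored against every encoder state, the scores are normalized with a softmax, and the result weights a sum of encoder states into a context vector.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def luong_dot_attention(h_t, encoder_states):
    """Luong 'dot' attention sketch.

    h_t:            decoder hidden state at the current step, shape (d,)
    encoder_states: all encoder hidden states, shape (T, d)
    Returns the context vector (d,) and alignment weights (T,).
    """
    scores = encoder_states @ h_t       # score(h_t, h_s) = h_t . h_s, shape (T,)
    weights = softmax(scores)           # alignment distribution over source positions
    context = weights @ encoder_states  # weighted sum of encoder states, shape (d,)
    return context, weights
```

Luong's paper also defines "general" and "concat" scoring functions; only the parameter-free dot product is shown here.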
A generic definition of attention
Self-attention and encoder-decoder attention are just specific cases of this generic attention. These are the concepts used in the Transformer and in Google's BERT model.
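The generic form can be sketched with the scaled dot-product formulation used by the Transformer, softmax(QK^T / sqrt(d_k)) V; the function and variable names below are illustrative. The two special cases differ only in where Q, K, and V come from: in self-attention all three are projections of the same sequence, while in encoder-decoder attention Q comes from the decoder and K, V come from the encoder.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(Q, K, V):
    """Generic scaled dot-product attention.

    Q: queries, shape (n_q, d_k)
    K: keys,    shape (n_kv, d_k)
    V: values,  shape (n_kv, d_v)
    Returns an output of shape (n_q, d_v).
    """
    d_k = K.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))  # (n_q, n_kv), rows sum to 1
    return weights @ V

# Self-attention:          Q, K, V are all projections of the same sequence X
# Encoder-decoder attention: Q from the decoder, K and V from the encoder output
```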