Transformer Architecture: Self-Attention, Multi-Head Attention, Positional Encoding, and the Encoder-Decoder Structure
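As a starting point for the Self-Attention component named in the title, here is a minimal sketch of scaled dot-product attention as defined in "Attention Is All You Need"; the function name, array shapes, and toy dimensions are illustrative assumptions, not a reference implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                     # (seq, seq) similarity scores
    # Numerically stable row-wise softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                   # weighted sum of value vectors

# Toy self-attention: 4 tokens, model width 8; Q, K, V all come from X
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8)
```

In a real Transformer layer, Q, K, and V are separate learned linear projections of the input, and multi-head attention runs several such attentions in parallel on split subspaces before concatenating the results.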