# Part 2: The Architecture of the Transformer

- [Chapter 4: Transformer Fundamentals](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004.md)
  - [1. The Attention Mechanism](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/001.md)
  - [2. Self-Attention and Multi-Head Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/002.md)
  - [3. Position Encoding](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/003.md)
- [Chapter 5: Components of the Transformer](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005.md)
  - [1. Encoder and Decoder Structure](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/001.md)
  - [2. The Role of Each Layer](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/002.md)
  - [3. Residual Connections and Layer Normalization](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/003.md)
- [Chapter 6: Principles of the Attention Mechanism](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006.md)
  - [1. Dot-Product Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/001.md)
  - [2. Scaled Dot-Product Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/002.md)
  - [3. How Multi-Head Attention Works](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/003.md)
