# Transformer

- [TOC](/booil-jung/docs/_9990/__-will-delete/transformer.book/toc.md)
- [Preface](/booil-jung/docs/_9990/__-will-delete/transformer.book/000.md)
  - [Target Audience](/booil-jung/docs/_9990/__-will-delete/transformer.book/000/001-target.md)
- [Part 1: Foundations of the Transformer](/booil-jung/docs/_9990/__-will-delete/transformer.book/001.md)
  - [Chapter 1: Background of Artificial Intelligence and Deep Learning](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/001.md)
    - [1. History of Artificial Intelligence](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/001/001.md)
    - [2. The Development of Deep Learning](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/001/002.md)
    - [3. The Need for Natural Language Processing (NLP)](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/001/003.md)
  - [Chapter 2: Basic Concepts of Deep Learning](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/002.md)
    - [1. Artificial Neural Networks (ANN)](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/002/001.md)
    - [2. Learning Algorithms](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/002/002.md)
    - [3. Activation Functions and Loss Functions](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/002/003.md)
  - [Chapter 3: Recurrent Neural Networks (RNN) and LSTM](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/003.md)
    - [1. RNN Concepts and Their Limitations](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/003/001.md)
    - [2. The Development of LSTM and GRU](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/003/002.md)
    - [003](/booil-jung/docs/_9990/__-will-delete/transformer.book/001/003/003.md)
- [Part 2: The Architecture of the Transformer](/booil-jung/docs/_9990/__-will-delete/transformer.book/002.md)
  - [Chapter 4: Transformer Basics](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004.md)
    - [1. The Attention Mechanism](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/001.md)
    - [2. Self-Attention and Multi-Head Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/002.md)
    - [3. Positional Encoding](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/004/003.md)
  - [Chapter 5: Components of the Transformer](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005.md)
    - [1. Encoder and Decoder Structure](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/001.md)
    - [2. The Role of Each Layer](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/002.md)
    - [3. Residual Connections and Layer Normalization](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/005/003.md)
  - [Chapter 6: Principles of the Attention Mechanism](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006.md)
    - [1. Dot-Product Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/001.md)
    - [2. Scaled Dot-Product Attention](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/002.md)
    - [3. How Multi-Head Attention Works](/booil-jung/docs/_9990/__-will-delete/transformer.book/002/006/003.md)
- [Part 3: Evolution of the Transformer](/booil-jung/docs/_9990/__-will-delete/transformer.book/003.md)
  - [Chapter 7: Evolution of Transformer Models](/booil-jung/docs/_9990/__-will-delete/transformer.book/003/007.md)
    - [1. BERT (Bidirectional Encoder Representations from Transformers)](/booil-jung/docs/_9990/__-will-delete/transformer.book/003/007/001.md)
    - [2. GPT (Generative Pre-trained Transformer)](/booil-jung/docs/_9990/__-will-delete/transformer.book/003/007/002.md)
    - [3. T5 (Text-To-Text Transfer Transformer)](/booil-jung/docs/_9990/__-will-delete/transformer.book/003/007/003.md)
