Kim Kihyun's Introduction to Natural Language Processing with Deep Learning All-in-One Package
Part 1. Deep Learning Basics
Ch 01. Orientation - 01. Orientation
31:04
Ch 01. Orientation - 02. MNIST Classification Practice Review
19:47
Ch 01. Orientation - 03. MNIST Classification Practice Review - Practice
43:46
Ch 02. Representation Learning - 01. What Is a Feature?
13:01
Ch 02. Representation Learning - 02. One-hot Encoding
11:04
Ch 02. Representation Learning - 03. Autoencoders
10:07
Ch 02. Representation Learning - 04. Hidden Representations
12:11
Ch 02. Representation Learning - 05. Practice: Autoencoders
40:29
Ch 03. Probabilistic Perspective - 01. Introduction
11:50
Ch 03. Probabilistic Perspective - 02. Basic Probability and Statistics
39:41
Ch 03. Probabilistic Perspective - 03. Maximum Likelihood Estimation (MLE)
25:42
Ch 03. Probabilistic Perspective - 04. Neural Networks and MLE
14:08
Ch 03. Probabilistic Perspective - 05. Math: MLE
10:52
Ch 03. Probabilistic Perspective - 06. Maximum A Posteriori (MAP)
11:47
Ch 03. Probabilistic Perspective - 07. KL-Divergence
9:36
Ch 03. Probabilistic Perspective - 08. Information & Entropy
30:53
Ch 03. Probabilistic Perspective - 09. Appendix - MSE Loss
11:21
Ch 03. Probabilistic Perspective - 10. Wrap-up
5:36
Ch 04. Geometric Perspective - 01. The Curse of Dimensionality
10:39
Ch 04. Geometric Perspective - 02. Dimensionality Reduction
13:35
Ch 04. Geometric Perspective - 03. The Manifold Hypothesis
18:24
Ch 04. Geometric Perspective - 04. Practice: The Manifold Hypothesis
8:51
Ch 04. Geometric Perspective - 05. Wrap-up
10:46
Ch 05. Advanced PyTorch Tutorials - 01. PyTorch Dataset
14:00
Ch 05. Advanced PyTorch Tutorials - 02. Practice: Implementing with PyTorch Dataset
21:55
Ch 05. Advanced PyTorch Tutorials - 03. PyTorch Ignite
28:40
Ch 05. Advanced PyTorch Tutorials - 04. Practice: Implementing with PyTorch Ignite
43:53
Ch 06. Convolutional Neural Networks - 01. Introduction to CNNs
30:55
Ch 06. Convolutional Neural Networks - 02. CNN Use Cases
11:13
Ch 06. Convolutional Neural Networks - 03. Max-pooling & Stride
9:59
Ch 06. Convolutional Neural Networks - 04. Tips for Real-World Implementation
8:31
Ch 06. Convolutional Neural Networks - 05. Practice Briefing
8:00
Ch 06. Convolutional Neural Networks - 06. Practice: MNIST Classification with a CNN
23:04
Ch 06. Convolutional Neural Networks - 07. Wrap-up
7:03
Ch 07. Computer Vision Introductions - 01. Introduction to Image Processing
16:28
Ch 07. Computer Vision Introductions - 02. Introduction to VGG
11:13
Ch 07. Computer Vision Introductions - 03. Introduction to ResNet
16:15
Ch 07. Computer Vision Introductions - 04. Introduction to Transfer Learning
15:44
Ch 07. Computer Vision Introductions - 05. Practice Briefing
10:37
Ch 07. Computer Vision Introductions - 06. Practice: Transfer Learning with a Backbone Network
23:03
Ch 08. Recurrent Neural Networks - 01. Introduction to RNNs
10:52
Ch 08. Recurrent Neural Networks - 02. A Step-by-Step Look at RNNs
36:35
Ch 08. Recurrent Neural Networks - 03. RNN Use Cases
17:52
Ch 08. Recurrent Neural Networks - 04. Back-propagation in RNNs (BPTT)
11:48
Ch 08. Recurrent Neural Networks - 05. Math: BPTT
14:11
Ch 08. Recurrent Neural Networks - 06. Long Short-Term Memory (LSTM)
14:58
Ch 08. Recurrent Neural Networks - 07. Vanishing Gradients and LSTM
11:18
Ch 08. Recurrent Neural Networks - 08. Practice Briefing
8:28
Ch 08. Recurrent Neural Networks - 09. Practice: MNIST Classification with an LSTM
20:12
Ch 08. Recurrent Neural Networks - 10. Gradient Clipping
9:13
Ch 08. Recurrent Neural Networks - 11. Practice: Implementing Gradient Clipping
6:50
Ch 08. Recurrent Neural Networks - 12. Wrap-up
8:24
Ch 09. Career Guide - 01. Career Guide
29:49
Ch 09. Career Guide - 02. Which Companies Do AI R&D?
17:11
Ch 09. Career Guide - 03. Tips for Running Machine Learning Projects
9:26
Ch 09. Career Guide - 04. How to Study on Your Own
17:28
Ch 09. Career Guide - 05. How to Read Papers
22:17
Ch 10. Summary - 01. Course Summary
12:52
Part 2. Introduction to Natural Language Processing
Ch 01. Orientation - 01. Orientation
21:52
Ch 02. Introduction - 01. What Is Natural Language Processing?
11:37
Ch 02. Introduction - 02. NLP with Deep Learning
16:57
Ch 02. Introduction - 03. How NLP Differs from Other Fields
19:58
Ch 02. Introduction - 04. Why Is NLP Hard?
15:35
Ch 02. Introduction - 05. Why Is Korean NLP Even Harder?
22:09
Ch 02. Introduction - 06. Topics and History of Deep Learning NLP
21:09
Ch 02. Introduction - 07. Recent Trends
23:25
Ch 03. Preprocessing - 01. The Preprocessing Pipeline
19:20
Ch 03. Preprocessing - 02. Corpus Collection
21:24
Ch 03. Preprocessing - 03. Corpus Cleaning
16:55
Ch 03. Preprocessing - 04. Regular Expressions
22:36
Ch 03. Preprocessing - 05. Practice: Regular Expressions
21:08
Ch 03. Preprocessing - 06. Corpus Labeling
14:04
Ch 03. Preprocessing - 07. Tokenization for Korean, Chinese, English, and Japanese Corpora
18:45
Ch 03. Preprocessing - 08. Practice: Tokenization with a Morphological Analyzer
9:30
Ch 03. Preprocessing - 09. Pros and Cons by Token Length
12:45
Ch 03. Preprocessing - 10. Subword Segmentation
27:18
Ch 03. Preprocessing - 11. Practice: Subword Segmentation
18:00
Ch 03. Preprocessing - 12. Detokenization
7:05
Ch 03. Preprocessing - 13. Practice: Detokenization
3:24
Ch 03. Preprocessing - 14. Aligning Parallel Corpora
9:42
Ch 03. Preprocessing - 16. TIP: The Importance of Preprocessing, Lessons from Experience
8:56
Ch 03. Preprocessing - 17. Building Mini-batches
21:40
Ch 03. Preprocessing - 18. Practice: TorchText
15:52
Ch 03. Preprocessing - 19. Wrap-up
20:23
Ch 04. Word Embedding - 01. Introduction
7:00
Ch 04. Word Embedding - 02. Word Sense
10:05
Ch 04. Word Embedding - 03. WordNet
9:17
Ch 04. Word Embedding - 04. Practice: Word Similarity with WordNet
5:42
Ch 04. Word Embedding - 05. Word Embeddings Before Deep Learning
21:31
Ch 04. Word Embedding - 06. Computing Similarity (Distance) Between Words
10:33
Ch 04. Word Embedding - 07. Practice: Implementing Pre-Deep-Learning Word Embeddings
12:18
Ch 04. Word Embedding - 08. Word2Vec
13:45
Ch 04. Word Embedding - 09. GloVe
6:25
Ch 04. Word Embedding - 10. FastText
10:33
Ch 04. Word Embedding - 11. Math: Word2Vec, GloVe & FastText
24:10
Ch 04. Word Embedding - 12. Understanding from a Dimensionality-Reduction Perspective
8:55
Ch 04. Word Embedding - 13. Practice: Word Embedding
8:09
Ch 04. Word Embedding - 14. Embedding Layer
18:43
Ch 04. Word Embedding - 15. Applications in Other Fields
15:03
Ch 04. Word Embedding - 16. Appendix - Sentence Embedding
8:09
Ch 04. Word Embedding - 17. Wrap-up
15:20
Ch 05. Text Classification - 01. Introduction
15:46
Ch 05. Text Classification - 02. Text Classification with RNNs
18:27
Ch 05. Text Classification - 03. Practice: Introduction
11:01
Ch 05. Text Classification - 04. Practice: Implementing an RNN Classifier
26:03
Ch 05. Text Classification - 05. Text Classification with CNNs
32:42
Ch 05. Text Classification - 06. Practice: Implementing a CNN Classifier
25:23
Ch 05. Text Classification - 07. Practice: Implementing the Trainer
17:45
Ch 05. Text Classification - 08. Practice: Implementing train.py
14:26
Ch 05. Text Classification - 09. Practice: Implementing classify.py
10:47
Ch 05. Text Classification - 10. Practice: Checking the Results
27:47
Ch 05. Text Classification - 11. Wrap-up
9:28
Ch 05. Text Classification - 12. Appendix Text Classification with BERT
21:18
Ch 05. Text Classification - 13. Appendix Text Classification with FastText
13:25
Ch 06. Summary - 01. Summary
Kim Kihyun's Natural Language Generation with Deep Learning All-in-One Package
Chapter 1. Orientation
Ch 01. Orientation - 01. Orientation
37:22
Ch 01. Orientation - 02. Stat & Geo Perspective for Deep Learning
17:22
Ch 01. Orientation - 03. Review Introduction to NLP
13:31
Ch 01. Orientation - 04. What Is Natural Language Generation?
9:37
Chapter 2. Language Modeling
Ch 02. Language Modeling - 01. Introduction
15:36
Ch 02. Language Modeling - 02. Language Model Equations
15:06
Ch 02. Language Modeling - 03. n-gram
22:03
Ch 02. Language Modeling - 04. Smoothing and Discounting
17:15
Ch 02. Language Modeling - 05. Interpolation and Backoff
18:37
Ch 02. Language Modeling - 06. Perplexity
13:13
Ch 02. Language Modeling - 07. n-gram Wrap-up
6:44
Ch 02. Language Modeling - 08. LM with RNNs
17:54
Ch 02. Language Modeling - 09. Perplexity and Cross Entropy
18:08
Ch 02. Language Modeling - 10. Autoregressive and Teacher Forcing
23:15
Ch 02. Language Modeling - 11. Wrap-up
9:05
Chapter 3. Data Preparation
Ch 03. Data Preparation - 01. Introduction to AI-Hub
8:20
Ch 03. Data Preparation - 02. Practice: Requesting and Downloading a Translation Corpus
9:48
Ch 03. Data Preparation - 03. Practice: Exploring the Data
12:46
Ch 03. Data Preparation - 04. Review Preprocessing
17:54
Ch 03. Data Preparation - 05. Practice: Tokenization
19:44
Ch 03. Data Preparation - 06. Practice: Subword Segmentation
24:23
Chapter 4. Sequence-to-Sequence
Ch 04. Sequence-to-Sequence - 01. Introduction to Machine Translation
16:54
Ch 04. Sequence-to-Sequence - 02. Sequence to Sequence
9:38
Ch 04. Sequence-to-Sequence - 03. Encoder
10:38
Ch 04. Sequence-to-Sequence - 04. Decoder
7:28
Ch 04. Sequence-to-Sequence - 05. Generator
10:59
Ch 04. Sequence-to-Sequence - 06. Attention
30:13
Ch 04. Sequence-to-Sequence - 07. Masking
13:14
Ch 04. Sequence-to-Sequence - 08. Input Feeding
21:42
Ch 04. Sequence-to-Sequence - 09. Teacher Forcing
7:38
Ch 04. Sequence-to-Sequence - 10. Practice: Introduction
20:02
Ch 04. Sequence-to-Sequence - 11. Practice: Implementing the Encoder
14:13
Ch 04. Sequence-to-Sequence - 12. Practice: Implementing Attention
8:40
Ch 04. Sequence-to-Sequence - 13. Practice: Implementing the Decoder
9:55
Ch 04. Sequence-to-Sequence - 14. Practice: Implementing the Generator
3:40
Ch 04. Sequence-to-Sequence - 15. Appendix Gradient Accumulations
28:05
Ch 04. Sequence-to-Sequence - 16. Appendix Automatic Mixed Precision
11:55
Ch 04. Sequence-to-Sequence - 17. Practice: Integrating the Modules
14:30
Ch 04. Sequence-to-Sequence - 18. Practice: Implementing the Trainer
32:38
Ch 04. Sequence-to-Sequence - 19. Practice: Implementing the Data Loader
7:29
Ch 04. Sequence-to-Sequence - 20. Practice: Implementing train.py
24:55
Ch 04. Sequence-to-Sequence - 21. Practice: Implementing continue_train.py
25:30
Ch 04. Sequence-to-Sequence - 22. Wrap-up
12:29
Chapter 5. Inference for NLG
Ch 05. Inference for NLG - 01. Introduction
17:11
Ch 05. Inference for NLG - 02. Greedy & Sampling
9:09
Ch 05. Inference for NLG - 03. Length & Coverage Penalty
9:45
Ch 05. Inference for NLG - 04. Practice: Introduction
5:14
Ch 05. Inference for NLG - 05. Practice: Writing the Inference Code
16:12
Ch 05. Inference for NLG - 06. Practice: Implementing translate.py
14:01
Ch 05. Inference for NLG - 07. Practice: Checking the Results
21:19
Ch 05. Inference for NLG - 08. Wrap-up
4:19
Chapter 6. Evaluations
Ch 06. Evaluations - 01. Introduction
13:46
Ch 06. Evaluations - 02. Perplexity and BLEU
13:50
Ch 06. Evaluations - 03. TIP: Project Experiences
19:41
Ch 06. Evaluations - 04. Practice: How to Compute BLEU
17:43
Ch 06. Evaluations - 05. Wrap-up
8:09
Chapter 7. Beam Search
Ch 07. Beam Search - 01. Introduction
26:32
Ch 07. Beam Search - 02. Introduction to Beam Search
12:46
Ch 07. Beam Search - 03. Practice: Introduction
6:33
Ch 07. Beam Search - 04. Practice: Implementing the Beam Search Function
45:42
Ch 07. Beam Search - 05. Practice: Checking the Results
20:40
Ch 07. Beam Search - 06. Wrap-up
4:43
Chapter 8. Transformer
Ch 08. Transformer - 01. Introduction to the Transformer
12:21
Ch 08. Transformer - 02. Multi-head Attention
22:08
Ch 08. Transformer - 03. Encoder
9:45
Ch 08. Transformer - 04. Decoder with Masking
17:43
Ch 08. Transformer - 05. Positional Encoding
8:50
Ch 08. Transformer - 06. Learning rate warm-up and linear decay
22:06
Ch 08. Transformer - 07. Appendix Beyond the paper
20:10
Ch 08. Transformer - 08. Practice: Introduction
12:11
Ch 08. Transformer - 09. Practice: Implementing Multi-head Attention
22:30
Ch 08. Transformer - 10. Practice: Implementing the Encoder Block
8:49
Ch 08. Transformer - 11. Practice: Implementing the Decoder Block
15:55
Ch 08. Transformer - 12. Practice: Implementing the Transformer Class
28:26
Ch 08. Transformer - 13. Practice: Implementing train.py & the Trainer
30:57
Ch 08. Transformer - 14. Practice: Implementing the Search Function
25:08
Ch 08. Transformer - 15. Practice: Beam Search Review
7:06
Ch 08. Transformer - 16. Practice: Implementing the Beam Search Function
15:13
Ch 08. Transformer - 17. Practice: Checking the Results
24:37
Ch 08. Transformer - 18. Wrap-up
15:12
Chapter 9. Advanced Topics on NLG
Ch 09. Advanced Topics on NLG - 01. Introduction
7:12
Ch 09. Advanced Topics on NLG - 02. Multilingual Machine Translation
13:19
Ch 09. Advanced Topics on NLG - 03. Language Model Ensemble
8:57
Ch 09. Advanced Topics on NLG - 04. Back Translation
23:53
Ch 09. Advanced Topics on NLG - 05. Motivations for RL in NLG
23:44
Ch 09. Advanced Topics on NLG - 06. RL Introduction
26:22
Ch 09. Advanced Topics on NLG - 07. Policy Gradients
31:22
Ch 09. Advanced Topics on NLG - 08. Minimum Risk Training (MRT)
18:59
Ch 09. Advanced Topics on NLG - 09. TIP: What I Hope You Take Away from This Section
8:28
Ch 09. Advanced Topics on NLG - 10. Practice: Introduction
28:11
Ch 09. Advanced Topics on NLG - 11. Practice: Implementing rl_trainer.py
15:34
Ch 09. Advanced Topics on NLG - 12. Practice: Implementing the Reward Function
19:49
Ch 09. Advanced Topics on NLG - 13. Practice: Implementing the Loss
30:02
Ch 09. Advanced Topics on NLG - 14. Practice: Implementing the Rest of train.py
13:48
Ch 09. Advanced Topics on NLG - 15. Practice: Checking the Results
26:35
Ch 09. Advanced Topics on NLG - 16. Wrap-up
9:05
Chapter 10. Advanced Machine Translations
Ch 10. Advanced Machine Translations - 01. What Is Dual Learning?
17:00
Ch 10. Advanced Machine Translations - 02. Dual Supervised Learning (DSL)
18:45
Ch 10. Advanced Machine Translations - 03. Practice: Introduction
22:17
Ch 10. Advanced Machine Translations - 04. Practice: Implementing the LM
6:49
Ch 10. Advanced Machine Translations - 05. Practice: Implementing the LM Trainer
13:43
Ch 10. Advanced Machine Translations - 06. Practice: Implementing the Dual Learning Trainer
16:43
Ch 10. Advanced Machine Translations - 07. Practice: Implementing the Loss
16:31
Ch 10. Advanced Machine Translations - 08. Practice: Implementing dual_train.py
15:48
Ch 10. Advanced Machine Translations - 09. Practice: Extending translate.py
28:15
Ch 10. Advanced Machine Translations - 10. Practice: Checking the Results
27:29
Ch 10. Advanced Machine Translations - 11. Dual Learning for Machine Translation
11:49
Ch 10. Advanced Machine Translations - 12. Dual Unsupervised Learning (DUL)
16:15
Ch 10. Advanced Machine Translations - 13. Back Translation Review
17:42
Ch 10. Advanced Machine Translations - 14. Wrap-up
8:45
Chapter 11. Summary
Ch 11. Summary - 01. Summary
22:58