Day73 Deep Learning Lecture Review - Lecture 5
Transformers and Foundation Models: GELU, Layer Norm, Key Concepts & Workflow
Brief Explanation of Basic Algebra and Machine Learning
Primary Goals, Common Tasks, and Deep Learning NLP
Transformer Architecture, How the Models Are Different, and Q, K, V in Self-Attention
Basic Concepts and the Detailed Architecture
Neural Net Zoo: Transformers, Recurrent Neural Networks (RNNs) and Graph Neural Networks (GNNs)
Types of Learning and Neural Net Zoo: Fully Connected Networks (MLPs), Inductive Bias, and Convolutional Neural Networks (CNNs)
Basic Mathematics, Supervised ML, and Review of Multi-Layered Perceptron
Bagging & Boosting: Basic Concepts & Code Implementation
Using the Majority Voting Principle to Make Predictions, and Evaluating & Tuning the Ensemble Classifier