Course Overview
This course provides a comprehensive introduction to Large Language Models (LLMs), focusing on what they are, how to build them using PyTorch, and how to use them for inference on language tasks. Participants will learn about the history of LLMs, how LLMs fit into the larger AI and Generative AI landscape, and how to use neural-network-based language models, including RNNs, LSTMs, and transformers, for natural language processing tasks.
Outline
- Introduction to NLP
- What is NLP?
- NLP Basics: Text Preprocessing and Tokenization
- NLP Basics: Word Embeddings
- Introducing Traditional NLP Libraries
- A brief history of modeling language
- Introducing PyTorch and Hugging Face for Text Preprocessing
- Neural Networks and Text Data
- Building Language Models using RNNs and LSTMs
- Transformers and LLMs
- Introduction to Transformers
- Using Hugging Face's Transformers for inference (a brief sketch follows this outline)
- LLMs and Generative AI
- Current LLM Options
- Fine-tuning GPT
- Aligning LLMs with Human Values
- Retrieval-Augmented Generation (RAG) Systems
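To give a sense of the inference topic listed above, here is a minimal sketch of text generation with the Hugging Face Transformers pipeline API. The model name (distilgpt2), the prompt, and the generation settings are illustrative assumptions, not course materials.

    # Minimal illustration of LLM inference with Hugging Face Transformers.
    # The model name and prompt below are assumptions chosen for brevity.
    from transformers import pipeline

    # Load a small pretrained causal language model behind a text-generation pipeline.
    generator = pipeline("text-generation", model="distilgpt2")

    # Generate a short continuation of the prompt.
    result = generator("Large language models are", max_new_tokens=20)
    print(result[0]["generated_text"])

The same pipeline interface supports other tasks (for example, "sentiment-analysis") by swapping the task name and letting the library select an appropriate pretrained model.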
Prerequisites
- Proficiency in Python programming
- Familiarity with data analysis using Pandas
Who Should Attend
- AI/ML Enthusiasts interested in learning about NLP (Natural Language Processing) and Large Language Models (LLMs).
- Data Scientists/Engineers interested in using LLMs for inference and fine-tuning.
- Software Developers who want basic, practical experience with NLP frameworks and LLMs.
- Students and Professionals curious about the basics of transformers and how they power AI models.