Subscribe and Access: 5200+ FREE Videos and 21+ Subjects like CRT, Soft Skills, JAVA, Hadoop, Microsoft .NET, Testing Tools, etc.
Batch Date: May 22nd @ 9:00 PM
Faculty: Mr. Naveen Mourya (8+ Years of Experience)
Duration: 3 Months
Venue:
DURGA SOFTWARE SOLUTIONS,
Flat No: 202,
2nd Floor,
HUDA Maitrivanam,
Ameerpet, Hyderabad - 500038
Ph.No: +91 - 9246212143, 80 96 96 96 96
Syllabus:
GENERATIVE AI
Module 1: Python Fundamentals
Objective: Build strong Programming Fundamentals for AI and data processing
1.1 Python Basics
- What is Python and why is it popular in AI?
- Syntax, Variables, Data Types (int, float, str, list, dict, tuple, set)
- Operators and Expressions
- Input/Output and String Manipulation
1.2 Control Flow
- What are Control Flow Statements in Python?
- Conditional Statements (if, elif, else)
- Loops (for, while)
- Loop Control Statements (break, continue, pass)
1.3 Functions & Modules
- What are Functions in Python and why are they used?
- Creating and Calling Functions
- *args and **kwargs
- Lambda, Map, Filter, Reduce
- Standard Modules: datetime, os, math, random, re, json, time
1.4 File & API Handling
- What is File Handling and why is it important in AI?
- Reading/Writing Text, CSV, JSON
- Calling REST APIs using requests
- Error Handling with try/except
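A minimal sketch of these file and API handling topics; the data.json file and the API URL are placeholders invented for the example:

```python
import csv
import json
import requests

# Read a JSON file and write selected fields out as CSV (data.json is hypothetical).
with open("data.json", "r", encoding="utf-8") as f:
    records = json.load(f)

with open("data.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["id", "name"])
    writer.writeheader()
    writer.writerows({"id": r["id"], "name": r["name"]} for r in records)

# Call a REST API with basic error handling (placeholder URL).
try:
    resp = requests.get("https://api.example.com/items", timeout=10)
    resp.raise_for_status()
    items = resp.json()
except requests.RequestException as exc:
    print(f"API call failed: {exc}")
```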
Libraries:
NumPy (for tensors, embeddings, vector math)
- Creating arrays and reshaping
- Matrix multiplication
- Element-wise operations (addition, subtraction, multiplication)
- Broadcasting (for scaling vectors)
- Generating random values
- Basic statistics
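A short illustrative NumPy sketch covering these operations on toy data:

```python
import numpy as np

# Create and reshape an array: 3 "embedding" vectors of dimension 4.
x = np.arange(12, dtype=np.float32).reshape(3, 4)

# Matrix multiplication and element-wise operations.
w = np.random.rand(4, 2)            # random weight matrix
projected = x @ w                   # (3, 4) @ (4, 2) -> (3, 2)
shifted = x * 2.0 - 1.0             # element-wise scale and shift

# Broadcasting: divide each row vector by its own norm.
norms = np.linalg.norm(x, axis=1, keepdims=True)   # shape (3, 1)
unit_vectors = x / norms                           # broadcast across columns

# Basic statistics.
print(x.mean(), x.std(), x.max(axis=0))
```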
Pandas (for cleaning and preparing tabular/nested data)
- Reading JSON/CSV files
- Selecting and filtering rows
- Handling missing values
- Converting nested JSON to flat table
- Useful during preprocessing datasets for training or fine-tuning LLMs
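An illustrative Pandas sketch; the reviews.json file and its columns (rating, text, user.country) are assumed purely for the example:

```python
import json
import pandas as pd

# Load a nested JSON file and flatten it into a table.
with open("reviews.json", encoding="utf-8") as f:
    records = json.load(f)                       # a list of nested dicts (assumed)
df = pd.json_normalize(records)                  # nested fields become dotted columns

# Select and filter rows, then handle missing values.
good = df[df["rating"] >= 4].copy()              # keep positive reviews
good = good.dropna(subset=["text"])              # drop rows with no review text
good["user.country"] = good["user.country"].fillna("unknown")

good.to_csv("clean_reviews.csv", index=False)    # ready for fine-tuning data prep
```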
Matplotlib (for visualizing model performance & embeddings)
- Line plots for loss/accuracy over epochs
- Scatter plots for embeddings or clusters
- Titles, axis labels, legends
- Saving plots for reports or monitoring
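A small Matplotlib sketch with made-up numbers, just to show the plot types above:

```python
import matplotlib.pyplot as plt
import numpy as np

# Illustrative training history: loss per epoch (synthetic values).
epochs = np.arange(1, 11)
loss = np.exp(-0.3 * epochs) + 0.05 * np.random.rand(10)

plt.figure()
plt.plot(epochs, loss, marker="o", label="training loss")
plt.title("Loss over epochs")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.savefig("loss_curve.png")        # save for a report or monitoring

# Scatter plot of 2-D points, e.g. embeddings after dimensionality reduction.
points = np.random.randn(100, 2)
plt.figure()
plt.scatter(points[:, 0], points[:, 1], s=10)
plt.title("Embedding scatter (illustrative)")
plt.savefig("embedding_scatter.png")
```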
Module 2: Machine Learning Essentials
Objective: Learn Machine Learning basics required for Generative AI
2.1 Supervised Learning
- What is Supervised Learning and why is it important for AI?
- Linear & Logistic Regression
2.2 Unsupervised Learning
- What is Unsupervised Learning and its real-world applications?
- PCA for Dimensionality Reduction
2.3 Model Evaluation
- Train-Test Split, Cross Validation
- Confusion Matrix, Precision, Recall, F1 Score
- ROC-AUC, Overfitting vs Underfitting
2.4 Feature Engineering
- One-Hot Encoding, Label Encoding
- Feature Scaling (MinMax, StandardScaler)
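A compact scikit-learn sketch tying Module 2 together (train/test split, scaling, logistic regression, and evaluation) on a built-in dataset:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Train/test split on a built-in binary classification dataset.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Feature scaling, then logistic regression.
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Evaluation: confusion matrix, precision/recall/F1, ROC-AUC.
pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print(confusion_matrix(y_test, pred))
print(classification_report(y_test, pred))
print("ROC-AUC:", roc_auc_score(y_test, proba))
```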
Module 3: Deep Learning & Neural Networks
Objective: Build Deep Learning Models using TensorFlow and PyTorch
3.1 Introduction to Neural Networks
- Perceptron, Feedforward Networks
- Backpropagation, Loss Functions
- Activation Functions: ReLU, Sigmoid, Softmax
3.2 Training Deep Learning Models
- Optimizers: SGD, Adam, RMSProp
- Epochs, Batches
- Learning Rate Schedules
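A minimal PyTorch training-loop sketch on synthetic data, showing epochs, mini-batches, backpropagation, and the Adam optimizer:

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Tiny feedforward network and synthetic data (for illustration only).
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

X = torch.randn(256, 10)                  # synthetic features
y = torch.randint(0, 2, (256,))           # synthetic labels
loader = DataLoader(TensorDataset(X, y), batch_size=32, shuffle=True)

for epoch in range(5):                    # epochs
    for xb, yb in loader:                 # mini-batches
        optimizer.zero_grad()
        loss = loss_fn(model(xb), yb)
        loss.backward()                   # backpropagation
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")
```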
3.3 Advanced Architectures
- Network Design Best Practices
- Transfer Learning
- Multi-modal Deep Learning
3.4 RNN, LSTM, GRU
- Sequence Modeling
- Time-Series Data Processing
- Sentiment Classification
3.5 Deployment Basics
- Saving & Loading Models
- TensorFlow Serving / TorchScript Introduction
- A mini project and exercises on this topic are covered by default (plus additional math problems)
Module 4: Natural Language Processing (NLP)
Objective: Enable machines to understand and process human language
4.1 Text Preprocessing
What is Text Pre-processing and why is it essential for NLP?
- Tokenization, Stopword Removal
- Stemming, Lemmatization
- POS Tagging, Named Entity Recognition (NER)
4.2 Text Vectorization
What are Word Embeddings and why are they important?
- Bag of Words, Count Vectorizer
- TF-IDF
- Word2Vec, GloVe, FastText
- Sentence Embeddings using BERT
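A short vectorization sketch: Bag of Words and TF-IDF with scikit-learn, plus dense sentence embeddings (the sentence-transformers package and the public all-MiniLM-L6-v2 model are assumed to be available):

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

docs = ["the movie was great", "the movie was terrible", "great acting, great story"]

# Sparse representations: Bag of Words vs TF-IDF on the same tiny corpus.
bow = CountVectorizer().fit_transform(docs)
tfidf = TfidfVectorizer().fit_transform(docs)
print(bow.shape, tfidf.shape)             # (3, vocabulary_size)

# Dense sentence embeddings from a pretrained transformer.
from sentence_transformers import SentenceTransformer
embedder = SentenceTransformer("all-MiniLM-L6-v2")
vectors = embedder.encode(docs)
print(vectors.shape)                      # (3, embedding_dimension)
```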
4.3 Applications
- Text Classification
- Spam Detection
- Sentiment Analysis (Movie Reviews, Product Reviews)
A mini project and exercises on this topic are covered by default.
Module 5: Foundations of Generative AI
Objective: Understand the Principles of Generative AI and its real-world uses.
5.1 What is Generative AI?
What is Generative AI and how is it different from traditional AI?
- Generative vs Discriminative Models
- Use Cases: Text, Audio, Image, Chatbots
- Text Generation Applications
5.2 Generative Models
- Transformer-based Generation
- Text Generation with RNN/LSTM
- Diffusion Models (Text Context)
5.3 Tools and Datasets
- Hugging Face Transformers
- OpenAI GPT APIs
- Google Gemini, Mistral, LLaMA
A mini project and exercises on this topic are covered by default.
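A quick taste of the tooling in 5.3: text generation with a Hugging Face pipeline (GPT-2 is chosen here only because it is small and freely downloadable):

```python
from transformers import pipeline

# Load a small causal language model and generate a continuation.
generator = pipeline("text-generation", model="gpt2")
out = generator("Generative AI is", max_new_tokens=30, num_return_sequences=1)
print(out[0]["generated_text"])
```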
Module 6: Transformers & Large Language Models
Objective: Master the architecture that powers GPT, BERT, and LLaMA
6.1 Transformers Architecture
- Encoder-Decoder Mechanism
- Self-Attention, Multi-Head Attention
- Positional Encoding
- Cross vs Self Attention
A detailed worked problem explaining the architecture will be covered.
Exercises based on transfer learning are covered by default at this stage.
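A toy NumPy sketch of single-head scaled dot-product self-attention, the core calculation behind the architecture in 6.1:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V for one attention head."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                          # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                                       # weighted sum of values

# 4 tokens, each represented by an 8-dimensional vector (toy numbers).
X = np.random.randn(4, 8)
out = scaled_dot_product_attention(X, X, X)                  # self-attention: Q = K = V
print(out.shape)                                             # (4, 8)
```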
6.2 Transfer Learning
- What is Transfer Learning and why is it important in AI?
- Pretrained Models and Fine-Tuning
- Use Cases and Real-World Examples
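A minimal transfer-learning sketch with Hugging Face Transformers: reuse a pretrained BERT encoder and attach a fresh 2-class head (the checkpoint name is a common public model, used only as an example):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Pretrained encoder + new classification head; the head is then fine-tuned on labeled data.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

inputs = tokenizer("This course looks promising", return_tensors="pt")
logits = model(**inputs).logits      # head is untrained, so outputs are not meaningful yet
print(logits.shape)                  # (1, 2)
```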
Module 7: Large Language Models (LLMs)
Objective: Understand the core concepts behind LLMs like GPT, BERT, and LLaMA.
7.1 What are LLMs?
- What are Large Language Models and why are they important?
- How LLMs differ from traditional NLP models
- Evolution from RNNs to Transformers to LLMs
7.2 Types of LLMs
- GPT, BERT, RoBERTa, T5, LLaMA
- Instruction-Tuned Models
- Parameter Sizes: 7B, 13B, 70B
- Fine-Tuning vs Prompt Tuning
7.3 Fine-Tuning Fundamentals
- What is Fine-Tuning and why is it critical for Generative AI?
- Full Fine-Tuning vs. Parameter-Efficient Methods
- Data Requirements and Preparation
- Avoiding Catastrophic Forgetting
- Evaluation Strategies
7.4 Parameter-Efficient Fine-Tuning
- LoRA (Low-Rank Adaptation) & QLoRA
- PEFT (Parameter Efficient Fine-Tuning)
- Adapter Tuning
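A sketch of LoRA via the Hugging Face peft library, assuming GPT-2 as the base model (the target_modules value would differ for other architectures):

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

# Wrap a small causal LM with LoRA adapters; only the adapter weights are trained.
base = AutoModelForCausalLM.from_pretrained("gpt2")
config = LoraConfig(
    r=8,                        # rank of the low-rank update matrices
    lora_alpha=16,              # scaling factor applied to the update
    lora_dropout=0.05,
    target_modules=["c_attn"],  # attention projection layers in GPT-2
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, config)
model.print_trainable_parameters()  # typically well under 1% of the total parameters
```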
Module 8: Embeddings & Semantic Search
Objective: Understand how LLMs represent text and retrieve meaning.
8.1 Embedding Fundamentals
- What Are Embeddings?
- High-Dimensional Representation
- Embedding Use Cases (Text, Audio, Images)
8.2 Embedding Models
- SentenceTransformers
- BERT-based Embeddings
- OpenAI Embedding Models
8.3 Semantic Search
- Cosine Similarity
- KNN Search in Vector Space
- Use Cases: FAQ Bots, Contextual Search
8.4 Working with Vector Stores
- Inserting Embeddings into FAISS, Pinecone, ChromaDB
- A mini project and exercises are covered by default at this stage
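A small end-to-end semantic search sketch with sentence-transformers and FAISS (the model name and example documents are illustrative):

```python
import faiss
from sentence_transformers import SentenceTransformer

# Embed a few documents and index them for nearest-neighbour search.
docs = [
    "How do I reset my password?",
    "What is the refund policy?",
    "How can I contact support?",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = embedder.encode(docs, normalize_embeddings=True).astype("float32")

index = faiss.IndexFlatIP(doc_vecs.shape[1])  # inner product == cosine on normalized vectors
index.add(doc_vecs)

query = embedder.encode(["I forgot my login details"], normalize_embeddings=True).astype("float32")
scores, ids = index.search(query, 2)          # top-2 nearest neighbours
print([docs[i] for i in ids[0]], scores[0])
```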
Module 9: RAG, LangChain & Orchestration
Objective: Build systems that combine LLMs with your custom data
9.1 LangChain Framework
What is LangChain and why is it important in LLM workflows?
- PromptTemplate, LLMChain, SequentialChain
- Memory, Tools, Agents
- Output Parsers
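A minimal LangChain sketch; the framework's API evolves quickly, so treat this as indicative only (it assumes the langchain-core and langchain-openai packages and an OPENAI_API_KEY in the environment):

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import ChatOpenAI

# PromptTemplate -> LLM -> output parser, composed into a single chain.
prompt = PromptTemplate.from_template("Write a one-line product tagline for {product}.")
llm = ChatOpenAI(model="gpt-4o-mini", temperature=0.7)   # any chat model would do
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"product": "a smart water bottle"}))
```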
9.2 Vector Databases
What are Vector Databases and why are they important for storing embeddings?
- FAISS, Pinecone, ChromaDB
- Sentence Embeddings with Transformers
- Storing & Retrieving Documents
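A quick ChromaDB sketch: an in-memory collection that embeds, stores, and retrieves documents (the documents and IDs are made up for illustration):

```python
import chromadb

# Store two documents, then query by semantic similarity.
client = chromadb.Client()
collection = client.create_collection(name="faq")
collection.add(
    documents=[
        "Password resets are done from the login page.",
        "Refunds are processed within 5 business days.",
    ],
    ids=["doc1", "doc2"],
)
results = collection.query(query_texts=["how do I get my money back"], n_results=1)
print(results["documents"])
```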
9.3 Retrieval-Augmented Generation (RAG)
What is RAG and how does it improve LLM performance?
- End-to-End Architecture: Query → Embed → Retrieve → Prompt
- Custom Loaders (PDFs, Docs)
A mini project and exercises are covered by default at this stage.
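A library-light sketch of the Query → Embed → Retrieve → Prompt flow; call_llm is a hypothetical placeholder for whichever chat/completion API you use, and the documents are invented for the example:

```python
from sentence_transformers import SentenceTransformer, util

documents = [
    "Our store ships orders within 2 business days.",
    "Returns are accepted within 30 days of delivery.",
    "Support is available by email at support@example.com.",
]
embedder = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = embedder.encode(documents, convert_to_tensor=True)

def answer(question: str) -> str:
    # 1. Embed the query; 2. retrieve the most similar document.
    q_vec = embedder.encode(question, convert_to_tensor=True)
    best = int(util.cos_sim(q_vec, doc_vecs).argmax())
    # 3. Build a grounded prompt; 4. send it to an LLM.
    prompt = f"Answer using only this context:\n{documents[best]}\n\nQuestion: {question}"
    return call_llm(prompt)   # call_llm is a hypothetical helper, not a real API

# print(answer("How long do deliveries take?"))
```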
Module 10: Fine-Tuning & Model Customization
Objective: Tailor LLMs to your specific domain and tone.
10.1 Fine-Tuning Techniques
What is Fine-Tuning and why is it critical for Generative AI?
- LoRA (Low-Rank Adaptation)
- PEFT (Parameter Efficient Fine-Tuning)
- Hugging Face Trainer for Custom Models
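A condensed Hugging Face Trainer sketch: fine-tuning a small pretrained model on a public sentiment dataset (the model and dataset names are chosen only for illustration; a GPU such as Colab's is assumed for reasonable speed):

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# Small slice of the IMDB dataset, tokenized for DistilBERT.
dataset = load_dataset("imdb", split="train[:2000]").train_test_split(test_size=0.1)
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

tokenized = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="out", num_train_epochs=1, per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"], eval_dataset=tokenized["test"])
trainer.train()
```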
10.2 Infra & Safety
- Google Colab / AWS / GCP for Fine-Tuning
Module 11: Capstone Projects
Objective: Showcase your knowledge in real-world applications.
Capstone Projects
In addition to the mini projects covered for every topic along the way, at this stage we will build complete GenAI applications such as:
Mini Project 1: PDF Q&A Chatbot
Mini Project 2: Product Description Generator
Mini Project 3: Fine-Tuned GPT for Finance
Enterprise-Grade Capstone Projects
Project 1: End-to-End AWS GenAI Application
Project 2: End-to-End GCP GenAI Application
Module 12: Interview Preparation
- Common Gen AI Interview Questions
- Resume & Portfolio Preparation
- Industry Knowledge
- Advanced Learning Resources
- Staying Current with GenAI Research