8293 Reviews · 4.5 / 5 stars

Fundamentals of Deep Learning and Generative AI

$2,995 USD
Duration 5 days
Course Code WA3462
Available Formats Classroom

Overview

Deep learning and generative AI are closely related fields within the broader domain of artificial intelligence, with deep learning serving as a fundamental framework for building generative models. Understanding the fundamentals of both subjects, including related terminology and concepts, is critical to effectively applying these technologies in practice. This training teaches participants the core ideas and connects the conceptual dots behind deep learning and generative AI. Students will reinforce their theoretical knowledge of the topics by working through numerous hands-on exercises and labs.

Skills Gained

  • Neural networks and deep learning fundamentals
  • Concepts and terminology
  • TensorFlow and Keras
  • RNNs
  • CNNs
  • Embeddings
  • The semantic aspect of embeddings
  • Generative AI fundamentals
  • Concepts and terminology
  • Large Language Models (LLMs)
  • Application of LLMs across various domains, including natural language processing, creative text generation, and code development
  • Multimodality
  • Foundation models
  • Transfer learning and fine-tuning
  • Prompt engineering
  • Generative adversarial networks (GANs)
  • Diffusion models
  • Transformers
  • AI alignment
  • Ethical AI

Who Can Benefit

Data practitioners, business analysts, software engineers, and IT architects.

Prerequisites

Basic knowledge of Python and familiarity with the NumPy library.

Course Details

Outline

Chapter 1 - Introduction to Neural Networks and Deep Learning

  • What is an Artificial Neural Network?
  • Types of Neural Networks
  • Machine Learning with Neural Networks
  • Deep Learning
  • Navigating Neural Network Layers
  • Positional Types of Layers
  • The Network and the Model
  • Model Properties
  • A Bit of Terminology
  • Data Pre-processing
  • How Does My Network Know Which Problem I Want It to Solve?
  • A Neuron
  • The Artificial Neuron
  • The Perceptron
  • The Perceptron Symbol
  • A Breakthrough in Neural Networks Design
  • Perceptrons and MLPs
  • A Basic Neural Network Example
  • A Quiz
  • Popular Activation Functions
  • Quizzes
  • A Sample Neural Network Diagram
  • Supervised Model Training
  • Measuring the Error with the Loss (Cost) Function
  • Mini-batches and Epochs
  • Neural Network Training Steps (1/2)
  • Neural Network Training Steps (2/2)
  • Applying Efficiencies with Autodiff ...
  • Neural Network Libraries and Frameworks
  • Summary
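
To make the "The Perceptron" and "A Basic Neural Network Example" topics above concrete, here is a minimal NumPy sketch of a single perceptron forward pass with a step activation. The weights, bias, and input values are arbitrary illustration choices, not material taken from the course.

    import numpy as np

    # Illustrative only: a single perceptron computing step(w . x + b).
    # The weights, bias, and input below are made-up example values.
    w = np.array([0.5, -0.6, 0.2])   # weights
    b = 0.1                          # bias
    x = np.array([1.0, 0.0, 1.0])    # one input example

    z = np.dot(w, x) + b             # weighted sum
    y = 1 if z > 0 else 0            # step activation
    print(z, y)                      # 0.8 -> the neuron "fires" (outputs 1)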

Chapter 2 - Neural Network Concepts and Terminology

  • Why We Need Terminology ...
  • Features and Targets
  • Observations (Examples)
  • Notation for Observations
  • Data Structures: Tensors
  • Continuous and Categorical Features
  • Continuous Features
  • Categorical Features
  • Feature Types Visually
  • Feature Importance
  • Supervised and Unsupervised Machine Learning
  • Self-Supervised Learning
  • Common Distance Metrics
  • The Euclidean Distance
  • Visualizing Data on the X-Y Plane
  • A Quiz
  • The Coded Solution to the Quiz
  • A Hands-On Exercise
  • What is a Model?
  • Model Life-Cycles
  • Model Parameters and Hyperparameters
  • The Train/Validate/Test Machine Learning Triad
  • Training/Validation/Test Data Split Ratios
  • Data Splitting Considerations
  • Cross-Validation Technique
  • Test Data Leakage
  • Bias-Variance (Underfitting vs Overfitting) Trade-off
  • Bias and Variance Visually
  • Model Underfitting vs Model Overfitting Visually
  • A Quiz
  • Ways to Balance the Bias-Variance Trade-off
  • Training Error vs Validation Error Diagram
  • Loss (Cost) Functions
  • Loss Function Properties
  • Mean Squared Error (MSE)
  • A Quiz
  • Mean Absolute Error (MAE)
  • (Categorical) Cross Entropy Loss
  • The Cross Entropy Loss Visually
  • The Chain Rule in Calculus
  • The Chain Rule in Neural Networks
  • Gradient Descent in Neural Networks (1/2)
  • Gradient Descent Visually
  • Gradient Descent in Neural Networks (2/2)
  • An Annotated Example of Gradient Calculation
  • A Hands-On Exercise
  • The softmax Function
  • Coding Softmax
  • Model Accuracy in Classification Tasks
  • Confusion Matrix
  • The Binary Classification Confusion Matrix
  • Multi-class Classification Confusion Matrix Example
  • Feature Engineering
  • Data Scaling and Normalization
  • The Data Normalization Tooling
  • Regularization
  • A Hands-On Exercise
  • Mathematical Formulations ...
  • Dimensionality Reduction
  • Online Machine Learning Glossaries
  • Summary
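
As a preview of the "Coding Softmax" topic above, the sketch below shows one common NumPy implementation. The example logits are arbitrary, and subtracting the maximum is a standard numerical-stability convention rather than something the outline mandates.

    import numpy as np

    def softmax(logits):
        """Convert raw scores (logits) into probabilities that sum to 1."""
        shifted = logits - np.max(logits)   # subtract max for numerical stability
        exps = np.exp(shifted)
        return exps / exps.sum()

    scores = np.array([2.0, 1.0, 0.1])      # example logits for 3 classes
    probs = softmax(scores)
    print(probs, probs.sum())               # ~[0.659 0.242 0.099], sums to 1.0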

Chapter 3 - TensorFlow Introduction

  • What is TensorFlow?
  • The TensorFlow Logo
  • Tensors and Python API
  • Python TensorFlow Interfaces Diagram
  • GPUs and TPUs
  • Google Colab
  • Data Tools
  • TensorFlow Variants
  • TensorFlow Core API
  • TensorFlow Lite
  • TFX (TensorFlow Extended)
  • A TFX Pipeline Example
  • XLA Optimization
  • TensorFlow Toolkit Stack
  • Keras
  • TensorBoard
  • Summary
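
A short sketch of the TensorFlow Core API topics touched on above: tensors, variables, and automatic differentiation with tf.GradientTape. The specific values are illustrative only.

    import tensorflow as tf

    # Tensors, variables, and autodiff with the TensorFlow Core API.
    x = tf.constant([[1.0, 2.0], [3.0, 4.0]])   # an immutable 2x2 tensor
    w = tf.Variable(3.0)                        # a trainable scalar variable

    with tf.GradientTape() as tape:             # record operations for autodiff
        y = w * w                               # y = w^2

    dy_dw = tape.gradient(y, w)                 # dy/dw = 2w = 6.0
    print(x.shape, dy_dw.numpy())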

Chapter 4 - Introduction to Keras

  • What is Keras?
  • Keras 3.0
  • Core Keras Data Structures
  • Layers in Keras
  • The Dense Layer
  • Defining the Layer Activation Function
  • Models in Keras
  • Components of a Keras Model
  • Creating Neural Networks in Keras
  • The Sequential Model
  • A Sequential Model Code Example
  • The Strengths and Weaknesses of Sequential Models
  • The Functional API
  • A Functional API Example
  • The Strengths and Weaknesses of the Functional API
  • Making New Layers and Models via Subclassing
  • A Layer Subclassing Example
  • A Model Subclassing Example
  • The Strengths and Weaknesses of Subclassing
  • Summary
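
In the spirit of the "A Sequential Model Code Example" topic above, here is a minimal Keras Sequential model. The layer widths and the 784-dimensional input (for example, a flattened 28x28 image) are illustrative assumptions, not values prescribed by the course.

    from tensorflow import keras

    # A minimal Sequential model: stacked Dense layers for 10-class classification.
    model = keras.Sequential([
        keras.Input(shape=(784,)),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        keras.layers.Dense(10, activation="softmax"),
    ])

    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.summary()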

Chapter 5 - Introduction to CNNs

  • Convolutional Neural Networks (CNNs)
  • Kernels and Convolutions
  • A Convolution Mathematically
  • A Convolution Visually
  • A Quiz
  • Kernels and Feature Maps
  • Feature Maps in CNNs
  • CNN Efficiencies
  • Feature Maps Visually
  • The Stride Hyperparameter
  • The CNN Architecture
  • The Conv2D Class
  • A Quiz
  • An Example of a Pooling Layer
  • Finally, Putting it All Together
  • Summary
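
A compact illustration of the Conv2D and pooling layers listed above, assembled into a small image classifier in Keras. The 28x28x1 grayscale input shape, filter counts, and kernel sizes are assumptions made for illustration.

    from tensorflow import keras

    # A small CNN: Conv2D + pooling blocks feeding a dense softmax classifier.
    model = keras.Sequential([
        keras.Input(shape=(28, 28, 1)),
        keras.layers.Conv2D(32, kernel_size=(3, 3), activation="relu"),
        keras.layers.MaxPooling2D(pool_size=(2, 2)),
        keras.layers.Conv2D(64, kernel_size=(3, 3), activation="relu"),
        keras.layers.MaxPooling2D(pool_size=(2, 2)),
        keras.layers.Flatten(),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.summary()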

Chapter 6 - Introduction to RNNs

  • Recurrent Neural Networks (RNNs)
  • An Example of Predicting the Next Value in a Time Series
  • How Do RNNs Do It?
  • A Quiz
  • Feedforward Neural Networks vs RNNs
  • Mathematical Formulations
  • (Simplified) RNN Visual Representations
  • A Quiz
  • A Quiz
  • A More Accurate RNN Diagram
  • An Example to Build Intuition
  • A Quiz
  • A Quiz
  • Sampling the Data
  • Problems with RNNs
  • LSTM and GRU Networks
  • Problems with LSTM and GRU Networks
  • RNNs as a Precursor of Generative AI
  • Summary
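
To accompany the "Example of Predicting the Next Value in a Time Series" topic, here is a toy Keras RNN sketch. The sine-wave data, 10-step window, and layer size are all made up for illustration; an LSTM or GRU layer could be swapped in for the SimpleRNN.

    import numpy as np
    from tensorflow import keras

    # Toy time-series setup: windows of 10 past values in, 1 predicted value out.
    t = np.arange(0, 100, 0.1)
    series = np.sin(t)
    X = np.array([series[i:i + 10] for i in range(len(series) - 10)])[..., np.newaxis]
    y = series[10:]

    model = keras.Sequential([
        keras.Input(shape=(10, 1)),          # 10 time steps, 1 feature
        keras.layers.SimpleRNN(32),          # or LSTM / GRU, as covered above
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(X, y, epochs=2, verbose=0)     # brief training just to show the API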

Chapter 7 - Embeddings

  • Embeddings ...
  • Understanding the Embeddings Visually
  • Dimensionality
  • The Semantic Aspect of Embeddings
  • Word Embeddings in NLP
  • Embeddings in Transformers
  • Cosine Similarity
  • A Quiz
  • An Exercise
  • Summary
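
The "Cosine Similarity" topic above reduces to a few lines of NumPy. The tiny hand-written vectors below merely stand in for real learned embeddings, which have hundreds of dimensions.

    import numpy as np

    def cosine_similarity(a, b):
        """Cosine of the angle between two vectors (1.0 = same direction)."""
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    # Made-up 4-dimensional "embeddings" for illustration only.
    king  = np.array([0.80, 0.65, 0.10, 0.05])
    queen = np.array([0.75, 0.70, 0.15, 0.10])
    apple = np.array([0.05, 0.10, 0.90, 0.70])

    print(cosine_similarity(king, queen))   # close to 1.0 (semantically similar)
    print(cosine_similarity(king, apple))   # much lower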

Chapter 8 - Introduction to Generative AI

  • The Age of Digital Assistants ...
  • What is Generative AI?
  • Applications
  • What Are Natural Language Models?
  • The Probabilistic Language Model
  • Training a Language Model to Predict the Next Word
  • Generative AI, the Pre-Cursor Technologies
  • RNN Limitations
  • Wait, there is More ...
  • A Quiz
  • A Hands-On Exercise
  • Transformers
  • The Problem Domain
  • LLMs
  • Multimodality of LLMs
  • Infographic of Multimodality Tasks
  • Generative Foundation Models
  • Example of an LLM Explaining a Joke
  • Example of Cause & Effect Reasoning
  • Inferring Movie from Emoji
  • Fine-Tuning and Transfer Learning
  • A Quiz
  • Transfer Learning in Computer Vision
  • The Transfer Learning Diagram
  • Can I Have My Very Own Model?
  • The Age of Digital Assistants ... Transformed ...
  • The Training Datasets
  • The Training Techniques
  • Hugging Face
  • The Evolutionary Tree of LLMs
  • The LLM Capabilities vs LLM Size (in Parameters)
  • Does the Model Size Matter?
  • Inference Accuracy vs LLM Size
  • OpenAI GPT Models
  • ChatGPT
  • The Microsoft 365 Copilot Ecosystem
  • The LLaMA Family of LLMs
  • LLaMA 2
  • The AI-Powered Chatbots
  • Options for Accessing LLMs
  • Cloud Hosting
  • Prompt Engineering
  • Context Window and Prompts
  • Zero- and Few-Shot Prompting
  • A Hands-On Exercise
  • Understanding Model Sizes
  • Physical Model Sizes
  • Quantization
  • A Quiz
  • Generative Adversarial Networks
  • Generator and Discriminator Networks
  • A High-Level GAN Diagram
  • The Above Generator's Sample Output
  • A Quiz
  • The Diffusion Models, Names, and “Competition”
  • The Core Diffusion Modeling Idea
  • The Diffusion Process
  • Example of a Denoising Process
  • AI Alignment
  • Ethical AI
  • Summary
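
As a "Hello, Generative AI!"-style teaser (in the spirit of Labs 12 and 15), the sketch below uses the Hugging Face pipeline API to generate text and to contrast zero-shot and few-shot prompts. The gpt2 checkpoint is just an example of a small, freely downloadable model; it will not follow the few-shot pattern reliably, and the point here is only the prompt structure.

    from transformers import pipeline

    # Text generation with a small causal language model.
    generator = pipeline("text-generation", model="gpt2")

    # Zero-shot prompt: just ask, with no examples.
    print(generator("Deep learning is", max_new_tokens=20)[0]["generated_text"])

    # Few-shot prompt: show a couple of examples before the real query.
    few_shot = (
        "Translate English to French.\n"
        "cat -> chat\n"
        "dog -> chien\n"
        "bird ->"
    )
    print(generator(few_shot, max_new_tokens=5)[0]["generated_text"])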

Chapter 9 - Introduction to Transformers

  • What is a Transformer?
  • Transformer Use Cases
  • Transformers, Encoders, and Decoders
  • Recurrent Neural Networks
  • Why Transformers?
  • The Transformer Evolution Path
  • A Short Summary of the Transformer Inner Workings
  • N-Grams
  • Tokenization
  • Two Types of Weights
  • (Self-)Attention (1 of 3)
  • (Self-)Attention (2 of 3)
  • (Self-)Attention (3 of 3)
  • Multi-Head Attention
  • The Encoder
  • The Decoder
  • The Head-First Approach ...
  • The Transformer Model Architecture
  • Model Training
  • The Overall Translation (Inference) Process
  • An Example of Sentence Syntax Analysis
  • Positional Encoding
  • The Encoding Part (a Big Picture)
  • The Decoder Attention Units
  • Cross-Attention
  • The Decoder Part (a Big Picture)
  • The Attention Weights Matrix
  • Summary
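
The (self-)attention mechanism listed above can be summarized as attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. Here is a NumPy sketch of that formula; the 4 tokens and dimension 8 are arbitrary illustration choices standing in for learned query, key, and value projections.

    import numpy as np

    def softmax(x, axis=-1):
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    def scaled_dot_product_attention(Q, K, V):
        """attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)      # similarity of each query to each key
        weights = softmax(scores, axis=-1)   # the attention weights matrix
        return weights @ V, weights

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))              # 4 tokens, dimension 8
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))

    output, attn = scaled_dot_product_attention(Q, K, V)
    print(output.shape, attn.shape)          # (4, 8) (4, 4); rows of attn sum to 1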

Hands-on Exercises

  • Lab 1 - Learning the Colab Jupyter Notebook Environment
  • Lab 2 - Neural Network Playground Web App
  • Lab 3 - Multi-layer Perceptron Classifier
  • Lab 4 - Vectors and Matrix Operations
  • Lab 5 - Understanding the Gradient Descent Algorithm
  • Lab 6 - Understanding Regularization
  • Lab 7 - TensorFlow Basics
  • Lab 8 - Using Keras for Image Classification
  • Lab 9 - Using CNNs for Image Classification
  • Lab 10 - Understanding RNNs
  • Lab 11 - Word2vec Pre-Trained Embeddings
  • Lab 12 - Hello, Generative AI!
  • Lab 13 - Using OpenAI API
  • Lab 14 - Colab Code Assistant
  • Lab 15 - NLP and NLU with Transformers

Schedule

FAQ

Does the course schedule include a lunch break?

Classes typically include a 1-hour lunch break around midday. However, the exact break times and duration can vary depending on the specific class. Your instructor will provide detailed information at the start of the course.

What languages are used to deliver training?

Most courses are conducted in English unless otherwise specified. Some courses will have the word "FRENCH" marked in red beside the scheduled date(s), indicating the language of instruction.

What does GTR stand for?

GTR stands for Guaranteed to Run; if you see a course with this status, it means this event is confirmed to run. View our GTR page to see our full list of Guaranteed to Run courses.

Does Ascendient Learning deliver group training?

Yes, we provide training for groups, individuals, and private on-site sessions. View our group training page for more information.

What does vendor-authorized training mean?

As a vendor-authorized training partner, we offer a curriculum that our partners have vetted. We use the same course materials and facilitate the same labs as our vendor-delivered training. These courses are considered the gold standard and, as such, are priced accordingly.

Is the training too basic, or will you go deep into technology?

It depends on your requirements, your role in your company, and your depth of knowledge. The good news is that many of our learning paths let you start with the fundamentals and progress to highly specialized training.

How up-to-date are your courses and support materials?

We continuously work with our vendors to evaluate and refresh course material to reflect the latest training courses and best practices.

Are your instructors seasoned trainers who have deep knowledge of the training topic?

Ascendient Learning instructors have an average of 27 years of practical IT experience and have also served as consultants for an average of 15 years. To stay current, instructors spend at least 25 percent of their time learning new, emerging technologies and courses.

Do you provide hands-on training and exercises in an actual lab environment?

Lab access is dependent on the vendor and the type of training you sign up for. However, many of our top vendors will provide lab access to students to test and practice. The course description will specify lab access.

Will you customize the training for our company’s specific needs and goals?

We will work with you to identify training needs and areas of growth. We offer a variety of training methods, including private group training, on-site delivery at a location of your choice, and virtual classes. We provide courses and certifications that are aligned with your business goals.

How do I get started with certification?

Getting started on a certification pathway depends on your goals and the vendor you choose to get certified in. Many vendors offer IT certifications, from entry-level to advanced, that can boost your career. To get access to certification vouchers and discounts, please contact info@ascendientlearning.com.

Will I get access to content after I complete a course?

You will get access to the PDF of course books and guides, but access to the recording and slides will depend on the vendor and type of training you receive.

How do I request a W9 for Ascendient Learning?

View our filing status and how to request a W9.

Reviews

Concise and good to follow along, although it is a lot to take in over a short period of time.

ExitCertified gave a great course on AWS that covered all of the basics in depth with good lab materials.

I was very pleased with the course setup by ExitCertified and the instructor.

Easy to use and exactly what I was looking for. Value for money was exceptional.

The class was very fast paced; however, the teacher was very good at checking in on us while giving us time to complete the labs.