English | MP4 | AVC 1280×720 | AAC 44 kHz 2ch | 132 Lessons (23h 35m) | 9.97 GB
A Casual Guide for Artificial Intelligence, Deep Learning, and Python Programmers
Common scenario: You try to get into machine learning and data science, but there’s SO MUCH MATH.
Either you never studied this math, or you studied it so long ago you’ve forgotten it all.
What do you do?
Well, my friends, that is why I created this course.
Probability is one of the most important math prerequisites for data science and machine learning. It’s required to understand essentially everything we do, from the latest LLMs like ChatGPT, to diffusion models like Stable Diffusion and Midjourney, to statistics (what I like to call “probability part 2”).
Markov chains, an important concept in probability, form the basis of popular models like the Hidden Markov Model (with applications in speech recognition, DNA analysis, and stock trading) and the Markov Decision Process or MDP (the basis for Reinforcement Learning).
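To make that concrete, here is a minimal sketch (illustrative only, not code from the course) of a two-state Markov chain in NumPy; the transition probabilities are made-up values chosen so the stationary distribution is easy to verify by hand:

```python
import numpy as np

# Transition matrix P: P[i, j] = probability of moving from state i to state j.
# (Made-up values for illustration.)
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Start in state 0 with certainty, then apply P repeatedly.
dist = np.array([1.0, 0.0])
for _ in range(100):
    dist = dist @ P

# The distribution converges to the stationary distribution pi, which
# solves pi = pi @ P. For this P, pi = (5/6, 1/6).
print(dist)  # approximately [0.8333, 0.1667]
```

Hidden Markov Models and MDPs build on exactly this kind of state-transition structure, with observations or rewards layered on top.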
Machine learning (statistical learning) itself has a probabilistic foundation. Specific models, like Linear Regression, K-Means Clustering, Principal Components Analysis, and Neural Networks, all make use of probability.
In short, probability cannot be avoided!
If you want to do machine learning beyond just copying library code from blogs and tutorials, you must know probability.
This course will cover everything that you’d learn (and maybe a bit more) in an undergraduate-level probability class. This includes random variables and random vectors, discrete and continuous probability distributions, functions of random variables, multivariate distributions, expectation, generating functions, the law of large numbers, and the central limit theorem.
Most important theorems will be derived from scratch. Don’t worry, as long as you meet the prerequisites, they won’t be difficult to understand. This will ensure you have the strongest foundation possible in this subject. No more memorizing “rules” only to apply them incorrectly / inappropriately in the future! This course will provide you with a deep understanding of probability so that you can apply it correctly and effectively in data science, machine learning, and beyond.
Suggested prerequisites:
- Differential calculus, integral calculus, and vector calculus
- Linear algebra
- General comfort with university/college-level mathematics
What you’ll learn
- Conditional probability, Independence, and Bayes’ Rule
- Use of Venn diagrams and probability trees to visualize probability problems
- Discrete random variables and distributions: Bernoulli, categorical, binomial, geometric, Poisson
- Continuous random variables and distributions: uniform, exponential, normal (Gaussian), Laplace, Gamma, Beta
- Cumulative distribution functions (CDFs), probability mass functions (PMFs), probability density functions (PDFs)
- Joint, marginal, and conditional distributions
- Multivariate distributions, random vectors
- Functions of random variables, sums of random variables, convolution
- Expected values, expectation, mean, and variance
- Skewness, kurtosis, and moments
- Covariance and correlation, covariance matrix, correlation matrix
- Moment generating functions (MGF) and characteristic functions
- Key inequalities like Markov, Chebyshev, Cauchy-Schwarz, Jensen
- Convergence in probability, convergence in distribution, almost sure convergence
- Law of large numbers and the Central Limit Theorem (CLT)
- Applications of probability in machine learning, data science, and reinforcement learning
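As a taste of the limit-theorem material above, the Law of Large Numbers and the Central Limit Theorem can both be demonstrated by simulation in a few lines of NumPy (an illustrative sketch, not the course's own code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Law of Large Numbers: the sample mean of exponential(rate=1) draws
# approaches the true mean, which is 1.
samples = rng.exponential(scale=1.0, size=100_000)
print(samples.mean())  # close to 1.0

# Central Limit Theorem: standardized sums of n i.i.d. Uniform(0, 1)
# variables look approximately standard normal. Each uniform has
# mean 1/2 and variance 1/12.
n = 30
sums = rng.uniform(0.0, 1.0, size=(10_000, n)).sum(axis=1)
standardized = (sums - n * 0.5) / np.sqrt(n / 12)

# For a standard normal, about 95% of the mass lies within 2 standard
# deviations of 0 — the simulated fraction should be close to that.
frac_within_2sd = np.mean(np.abs(standardized) < 2)
print(frac_within_2sd)
```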
Who this course is for:
- Python developers and software developers curious about Data Science
- Professionals interested in Machine Learning and Data Science who haven’t studied college-level math
- Students interested in ML and AI who find they can’t keep up with the math
- Former STEM students who want to brush up on probability before learning about artificial intelligence
Table of Contents
Welcome
1 Introduction
2 Outline
3 Where to Get the Code
4 How to Succeed in this Course
Probability Basics
5 Probability Basics Section Introduction
6 What Is Probability
7 Wrong Definition of Probability (Common Mistake)
8 Wrong Definition of Probability (Example)
9 Probability Models
10 Venn Diagrams
11 Properties of Probability Models
12 Union Example
13 Law of Total Probability
14 Conditional Probability
15 Bayes’ Rule
16 Bayes’ Rule Example
17 Independence
18 Mutual Independence Example
19 Probability Tree Diagrams
20 Probability Basics Section Summary
21 Suggestion Box
Random Variables and Probability Distributions
22 Discrete Random Variables and Distributions Section Introduction
23 What is a Random Variable
24 The Bernoulli Distribution
25 The Categorical Distribution
26 The Binomial Distribution
27 The Geometric Distribution
28 The Poisson Distribution
29 Visualizing Probability Distributions in Python
30 Discrete Random Variables Section Summary
Continuous Random Variables and Probability Density Functions
31 Continuous Random Variables and Distributions Section Introduction
32 Continuous Random Variables and Continuous Distributions
33 Physics Analogy
34 More About Continuous Distributions
35 The Uniform Distribution
36 The Exponential Distribution
37 The Normal Distribution (Gaussian Distribution)
38 The Laplace (Double Exponential) Distribution
39 Visualizing Continuous Probability Distributions in Python
40 Continuous Random Variables Section Summary
More About Probability Distributions and Random Variables
41 More About Probability Distributions and Random Variables Section Introduction
42 Cumulative Distribution Function (CDF)
43 Exercise CDF of Geometric Distribution
44 CDFs for Continuous Random Variables
45 Exercise CDF of Normal Distribution
46 Change of Variables (Functions of Random Variables) pt 1
47 Change of Variables (Functions of Random Variables) pt 2
48 Joint and Marginal Distributions pt 1
49 Joint and Marginal Distributions pt 2
50 Exercise Marginal of Bivariate Normal
51 Conditional Distributions and Bayes’ Rule
52 Exercise Conditioning with Joint Normal and Linear Regression
53 Independence
54 Exercise Bivariate Normal with Zero Correlation
55 Multivariate Distributions and Random Vectors
56 Multivariate Normal Distribution (Vector Gaussian)
57 Multinomial Distribution
58 Exercise MVN to Bivariate Normal
59 Exercise Multivariate Normal, Zero Correlation Implies Independence
60 Multidimensional Change of Variables (Discrete)
61 Multidimensional Change of Variables (Continuous)
62 Convolution From Adding Random Variables
63 Exercise Sums of Jointly Normal Random Variables (Optional)
64 Visualizing CDFs and Joint Distributions in Python
65 CDFs and Multiple Random Variables Section Summary
Expectation and Expected Values
66 Expectation Section Introduction
67 Expected Value and Mean
68 Properties of the Expected Value
69 Variance
70 Exercise Mean and Variance of Bernoulli
71 Exercise Mean and Variance of Poisson
72 Exercise Mean and Variance of Normal
73 Exercise Mean and Variance of Exponential
74 Moments, Skewness and Kurtosis
75 Exercise Kurtosis of Normal Distribution
76 Covariance and Correlation
77 Exercise Covariance and Correlation of Bivariate Normal
78 Exercise Zero Correlation Does Not Imply Independence
79 Exercise Correlation Measures Linear Relationships
80 Conditional Expectation pt 1
81 Conditional Expectation pt 2
82 Law of Total Expectation
83 Exercise Linear Combination of Normals
84 Exercise Mean and Variance of Weighted Sums
85 Expectation Section Summary
Generating Functions
86 Generating Functions Section Introduction
87 Moment Generating Functions (MGF)
88 Exercise MGF of Exponential
89 Exercise MGF of Normal
90 Characteristic Functions
91 Exercise MGF Doesn’t Exist
92 Exercise Characteristic Function of Normal
93 Sums of Independent Random Variables
94 Exercise Distribution of Sum of Poisson Random Variables
95 Exercise Distribution of Sum of Geometric Random Variables
96 Moment Generating Functions for Random Vectors
97 Characteristic Functions for Random Vectors
98 Exercise Weighted Sums of Normals
99 Generating Functions in Python
100 Generating Functions Section Summary
Inequalities
101 Inequalities Section Introduction
102 Monotonicity
103 Markov Inequality
104 Chebyshev Inequality
105 Cauchy-Schwarz Inequality
106 Inequalities Section Summary
Limit Theorems
107 Limit Theorems Section Introduction
108 Convergence In Probability
109 Weak Law of Large Numbers
110 Convergence With Probability 1 (Almost Sure Convergence)
111 Strong Law of Large Numbers
112 Application Frequentist Perspective Revisited
113 Convergence In Distribution
114 Central Limit Theorem
115 LLN and CLT in Python
116 Limit Theorems Section Summary
Advanced and Other Topics
117 The Gamma Distribution
118 The Beta Distribution
119 Chain Rule of Probability
120 Why Does the Normal Distribution Integrate to 1
Appendix FAQ Intro
121 What is the Appendix
Setting Up Your Environment (Appendix/FAQ by Student Request)
122 Pre-Installation Check
123 Anaconda Environment Setup
124 How to Install Numpy, Scipy, Matplotlib, Pandas, PyTorch, and TensorFlow
125 Where To Get the Code (Troubleshooting)
126 How to use Github & Extra Coding Tips (Optional)
Effective Learning Strategies for Machine Learning FAQ
127 Math Order for Machine Learning & Data Science
128 Can YouTube Teach Me Calculus (Optional)
129 Is this for Beginners or Experts? Academic or Practical? Fast or Slow-paced?
130 What order should I take your courses in (part 1)
131 What order should I take your courses in (part 2)
Appendix FAQ Finale
132 BONUS
