Total Internal Reflection
Technology and Art
Miscellaneous Topics
Avishek Sen Gupta
on 6 April 2021
- Newton-Raphson Method
- Gradient Descent and Stochastic Gradient Descent
- Dynamic Programming
- Missing Data and Data Imputation
- L1 and L2 Regularisation
- p-values and Hypothesis Testing
- Bagging and Boosting
- Expectation Maximisation
- Naive Bayes
- Cross-Validation
- k-Means Clustering
- Basic Explanations of Expectation and Conditional Probability
- Latest ML Papers
- Relationship between MLE and MAP
- Multivariate Gaussian and the Covariance Matrix
- Quadratic Approximation
- Logistic Regression MLE Cost Function
- Bayesian Interpretation of Linear Regression
- Basic Probability, Expectation, etc.
- Gaussian Processes and Intuition
- Information Geometry
- Singular Value Decomposition
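As a taste of the first topic in the list above, here is a minimal Newton-Raphson root finder sketched in plain Python. The example function f(x) = x² − 2 (whose positive root is √2) is an illustrative assumption, not something taken from the original post.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=100):
    """Find a root of f via the Newton-Raphson iteration
    x_{n+1} = x_n - f(x_n) / f'(x_n), stopping once the step is tiny."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Hypothetical example: root of f(x) = x^2 - 2, i.e. sqrt(2)
root = newton_raphson(lambda x: x * x - 2, lambda x: 2 * x, x0=1.0)
```

Starting from x₀ = 1, the iterates 1.5, 1.41667, … converge quadratically to √2 ≈ 1.41421.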
ML Practice
- Newton-Raphson Method
- Gradient Descent and Stochastic Gradient Descent
- Dynamic Programming
- Sequential Minimal Optimisation
- Schur Complement
- Cholesky Factorisation
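As a sketch of the last practice topic, here is Cholesky factorisation in plain Python: decomposing a symmetric positive-definite matrix A into A = LLᵀ with L lower-triangular. The 2×2 example matrix is a hypothetical illustration, not from the original post.

```python
import math

def cholesky(A):
    """Cholesky factorisation of a symmetric positive-definite matrix A
    (given as a list of lists): returns lower-triangular L with A = L L^T."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            # Subtract the contribution of the already-computed columns.
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # diagonal entry
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]  # below-diagonal entry
    return L

# Hypothetical example: A = [[4, 2], [2, 3]] factors as L = [[2, 0], [1, sqrt(2)]]
L = cholesky([[4.0, 2.0], [2.0, 3.0]])
```

Multiplying L by its transpose recovers the original matrix, which is a quick sanity check for the factorisation.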
tags: Machine Learning - Theory - Probability