# A Fish without a Bicycle

Technology and Art

## Machine Learning Theory Track

• ## Kernel Functions: Functional Analysis and Linear Algebra Preliminaries

Avishek Sen Gupta on 17 July 2021

This article lays the groundwork for an important construction called Reproducing Kernel Hilbert Spaces, which allows a certain class of functions (called Kernel Functions) to be a valid representation of an inner product in (potentially) higher-dimensional space. This construction will allow us to perform the necessary higher-dimensional computations without explicitly projecting every point in our data set into higher dimensions, in the case of Non-Linear Support Vector Machines, which will be discussed in an upcoming article.
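As a small preview of the idea, the sketch below (my own illustrative example, not taken from the article) checks numerically that the degree-2 polynomial kernel $$k(u,v) = (u \cdot v)^2$$ on $$\mathbb{R}^2$$ equals an ordinary inner product under an explicit feature map into $$\mathbb{R}^3$$:

```python
import numpy as np

def phi(v):
    """Explicit feature map for the homogeneous degree-2 polynomial kernel on R^2."""
    x, y = v
    return np.array([x * x, y * y, np.sqrt(2) * x * y])

def kernel(u, v):
    """k(u, v) = (u . v)^2, computed without ever leaving R^2."""
    return np.dot(u, v) ** 2

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0])

# The kernel value equals the inner product in the 3-dimensional feature space.
assert np.isclose(kernel(u, v), np.dot(phi(u), phi(v)))
```

This is exactly the saving the construction buys us: the left-hand side never materialises the higher-dimensional vectors.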

• ## Real Analysis: Patterns for Proving Irrationality of Square Roots

Avishek Sen Gupta on 9 July 2021

Continuing my journey through Real Analysis, we will focus here on common proof patterns for irrational square roots. These patterns apply to the following sorts of proof exercises:

• ## The Cholesky and LDL* Factorisations

Avishek Sen Gupta on 8 July 2021

This article discusses a set of two useful (and closely related) factorisations for positive-definite matrices: the Cholesky and the $$LDL^T$$ factorisations. Both find various uses: the Cholesky factorisation in particular is used when solving large systems of linear equations.

• ## The Gram-Schmidt Orthogonalisation

Avishek Sen Gupta on 27 May 2021

We discuss an important factorisation of a matrix, which allows us to convert a linearly independent but non-orthogonal basis into a linearly independent orthonormal basis. This uses a procedure that iteratively extracts vectors orthogonal to the previously extracted vectors, normalising each one, to ultimately define the orthonormal basis. This is called the Gram-Schmidt Orthogonalisation, and we will also show a proof for this.
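The iterative procedure can be sketched in a few lines of NumPy (a bare-bones illustration, not the numerically robust variant the article may discuss):

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal basis spanning the same space as `vectors`.

    Each vector has its projections onto the previously extracted
    orthonormal vectors subtracted out, and is then normalised.
    """
    basis = []
    for v in vectors:
        w = v - sum(np.dot(v, q) * q for q in basis)
        basis.append(w / np.linalg.norm(w))
    return np.array(basis)

# A linearly independent but non-orthogonal basis of R^2.
Q = gram_schmidt([np.array([3.0, 1.0]), np.array([2.0, 2.0])])

# The extracted rows are orthonormal: Q Q^T = I.
assert np.allclose(Q @ Q.T, np.eye(2))
```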

• ## Real Analysis Proofs #1

Avishek Sen Gupta on 18 May 2021

Since I’m currently self-studying Real Analysis, I’ll be listing down proofs I either initially had trouble understanding, or enjoyed proving, here. These are very mathematical posts, and are for personal documentation, mostly.

• ## Support Vector Machines from First Principles: Linear SVMs

Avishek Sen Gupta on 10 May 2021

We have looked at Lagrangian Multipliers and how they help build constraints into the function that we wish to optimise. Their relevance to Support Vector Machines lies in how the constraints on the classifier margin (i.e., the supporting hyperplanes) are incorporated into the search for the optimal hyperplane.

• ## Quadratic Optimisation: Lagrangian Dual, and the Karush-Kuhn-Tucker Conditions

Avishek Sen Gupta on 10 May 2021

This article concludes the (very abbreviated) theoretical background required to understand Quadratic Optimisation. Here, we take the Lagrangian Multipliers approach, which in its current form admits only equality constraints, and extend it to allow constraints expressed as inequalities.

• ## Quadratic Optimisation: Mathematical Background

Avishek Sen Gupta on 8 May 2021

This article continues the original discussion on Quadratic Optimisation, where we considered Principal Components Analysis as a motivation. Originally, this article was going to begin delving into the Lagrangian Dual and the Karush-Kuhn-Tucker Theorem, but the requisite mathematical machinery to understand some of the concepts necessitated breaking the preliminary setup into its own separate article (which you’re now reading).

• ## Common Ways of Looking at Matrix Multiplications

Avishek Sen Gupta on 29 April 2021

We consider the more frequently used viewpoints of matrix multiplication, and relate each to one or more applications where that viewpoint is particularly useful. These are the viewpoints we will consider.

• ## Intuitions about the Implicit Function Theorem

Avishek Sen Gupta on 29 April 2021

We discussed the Implicit Function Theorem at the end of the article on Lagrange Multipliers, with some hand-waving to justify the linear behaviour on manifolds in arbitrary $$\mathbb{R}^N$$.

• ## Quadratic Optimisation using Principal Component Analysis as Motivation: Part Two

Avishek Sen Gupta on 28 April 2021

We pick up from where we left off in Quadratic Optimisation using Principal Component Analysis as Motivation: Part One. We treated Principal Component Analysis as an optimisation, and took a detour to build our geometric intuition behind Lagrange Multipliers, wading through its proof to some level.

• ## Vector Calculus: Lagrange Multipliers, Manifolds, and the Implicit Function Theorem

Avishek Sen Gupta on 24 April 2021

In this article, we finally put all our understanding of Vector Calculus to use by showing why and how Lagrange Multipliers work. We will be focusing on several important ideas, but the most important one is around the linearisation of spaces at a local level, which might not be smooth globally. The Implicit Function Theorem will provide a strong statement around the conditions necessary to satisfy this.

• ## Vector Calculus: Graphs, Level Sets, and Constraint Manifolds

Avishek Sen Gupta on 20 April 2021

In this article, we take a detour to understand the mathematical intuition behind Constrained Optimisation, and more specifically the method of Lagrangian multipliers. We have been discussing Linear Algebra, specifically matrices, for quite a bit now. Optimisation theory, and Quadratic Optimisation as well, relies heavily on Vector Calculus for many of its results and proofs.

• ## Quadratic Optimisation using Principal Component Analysis as Motivation: Part One

Avishek Sen Gupta on 19 April 2021

This series of articles presents the intuition behind the Quadratic Form of a Matrix, as well as its optimisation counterpart, Quadratic Optimisation, motivated by the example of Principal Components Analysis. PCA is presented here, not in its own right, but as an application of these two concepts. PCA proper will be presented in another article where we will discuss eigendecomposition, eigenvalues, and eigenvectors.

• ## Road to Gaussian Processes

Avishek Sen Gupta on 17 April 2021

This article starts us down the road towards a theoretical intuition behind Gaussian Processes, another Machine Learning technique based on Bayes’ Rule. However, there is a raft of material that I needed to understand and relearn before fully appreciating some of the underpinnings of this technique.

• ## Support Vector Machines from First Principles: Part One

Avishek Sen Gupta on 14 April 2021

We will derive the intuition behind Support Vector Machines from first principles. This will involve deriving some basic vector algebra proofs, including exploring some intuitions behind hyperplanes. Then we’ll continue adding to our understanding of the concepts behind quadratic optimisation.

• ## Dot Product: Algebraic and Geometric Equivalence

Avishek Sen Gupta on 11 April 2021

The dot product of two vectors is geometrically simple: the product of the magnitudes of the two vectors and the cosine of the angle between them. What is not immediately obvious is the algebraic interpretation of the dot product.
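The equivalence is easy to check numerically (my own example vectors; the angle is computed independently via `arctan2` rather than from the dot product itself):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([5.0, 0.0])

# Algebraic form: sum of component-wise products.
algebraic = np.dot(a, b)

# Geometric form: |a| |b| cos(theta), with theta measured directly
# from each vector's angle to the x-axis.
theta = np.arctan2(a[1], a[0]) - np.arctan2(b[1], b[0])
geometric = np.linalg.norm(a) * np.linalg.norm(b) * np.cos(theta)

assert np.isclose(algebraic, geometric)
```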

• ## Linear Regression: Assumptions and Results using the Maximum Likelihood Estimator

Avishek Sen Gupta on 5 April 2021

Let’s look at Linear Regression. The term “linear” refers to the fact that the output variable is modelled as a linear combination of the input variables.
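One consequence worth previewing: under the Gaussian-noise assumption, the Maximum Likelihood Estimator coincides with ordinary least squares. A quick synthetic-data sketch (coefficients and noise level are my own choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# True model: y = 2*x1 - 3*x2 + 1, plus small Gaussian noise.
X = rng.normal(size=(200, 2))
y = 2.0 * X[:, 0] - 3.0 * X[:, 1] + 1.0 + 0.01 * rng.normal(size=200)

# Append a column of ones for the intercept, then solve least squares;
# under Gaussian noise this is exactly the MLE.
X1 = np.column_stack([X, np.ones(len(X))])
coeffs, *_ = np.linalg.lstsq(X1, y, rcond=None)

assert np.allclose(coeffs, [2.0, -3.0, 1.0], atol=0.05)
```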

• ## Matrix Rank and Some Results

Avishek Sen Gupta on 4 April 2021

I’d like to introduce some basic results about the rank of a matrix. Simply put, the rank of a matrix is the number of linearly independent vectors in the matrix. Note that I didn’t say whether these are column vectors or row vectors; that’s because of the following section, which narrows down the specific cases (we will also prove that these numbers are equal for any matrix).
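That equality of row rank and column rank can be seen on a small example (my own matrix, with a deliberately dependent row):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # 2 x row 1: not independent
              [0.0, 1.0, 1.0]])

# Row rank and column rank coincide for any matrix.
assert np.linalg.matrix_rank(A) == 2
assert np.linalg.matrix_rank(A.T) == 2
```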

• ## Assorted Intuitions about Matrices

Avishek Sen Gupta on 3 April 2021

Some of these points about matrices are worth noting down, as aids to intuition. I might expand on some of these points into their own posts.

• ## Matrix Outer Product: Columns-into-Rows and the LU Factorisation

Avishek Sen Gupta on 2 April 2021

We will discuss the Columns-into-Rows computation technique for matrix outer products. This will lead us to one of the important factorisations (the LU Decomposition) that is used computationally when solving systems of equations, or computing matrix inverses.
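The columns-into-rows view says the product $$AB$$ is a sum of rank-one outer products: the $$k$$-th column of $$A$$ against the $$k$$-th row of $$B$$. A quick numerical check on random matrices (my own illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(3, 4))
B = rng.normal(size=(4, 2))

# Columns-into-rows: AB = sum over k of (k-th column of A) outer (k-th row of B).
C = sum(np.outer(A[:, k], B[k, :]) for k in range(A.shape[1]))

assert np.allclose(C, A @ B)
```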

• ## Intuitions about the Orthogonality of Matrix Subspaces

Avishek Sen Gupta on 2 April 2021

This is the easiest way I’ve found to explain to myself the orthogonality of matrix subspaces. The argument is essentially based on the geometry of planes, which extends naturally to hyperplanes.

• ## Matrix Outer Product: Value-wise computation and the Transposition Rule

Avishek Sen Gupta on 1 April 2021

We will discuss the value-wise computation technique for matrix outer products. This will lead us to a simple sketch of the proof of reversal of order for transposed outer products.
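The rule in question, $$(AB)^T = B^T A^T$$, is easy to verify numerically before working through the value-wise proof (random matrices of my own choosing):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(2, 3))
B = rng.normal(size=(3, 4))

# Transposition reverses the order of the factors: (AB)^T = B^T A^T.
assert np.allclose((A @ B).T, B.T @ A.T)
```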

• ## Matrix Outer Product: Linear Combinations of Vectors

Avishek Sen Gupta on 30 March 2021

Matrix multiplication (outer product) is a fundamental operation in almost any Machine Learning proof, statement, or computation. Much insight may be gleaned by looking at matrix multiplication in different ways. In this post, we will look at one (and possibly the most important) interpretation: namely, the linear combination of vectors.
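In this interpretation, $$Ax$$ is a linear combination of the columns of $$A$$, weighted by the entries of $$x$$. A small check (my own example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])
x = np.array([2.0, -1.0])

# Ax = x1 * (column 1 of A) + x2 * (column 2 of A).
combo = x[0] * A[:, 0] + x[1] * A[:, 1]

assert np.allclose(combo, A @ x)
```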

• ## Vectors, Normals, and Hyperplanes

Avishek Sen Gupta on 29 March 2021

Linear Algebra deals with matrices. But that misses the point, because the more fundamental component of a matrix is what will allow us to build our intuition on this subject. This component is the vector, and in this post, I will introduce vectors, along with common notations for expressing them.

• ## Machine Learning Theory Track

Avishek Sen Gupta on 28 March 2021

I’ve always been fascinated by Machine Learning. This began in the seventh standard when I discovered a second-hand book on Neural Networks for my ZX Spectrum.

• ## First Post

Avishek Sen Gupta on 28 March 2021

My previous Wordpress site went kaput. With it went most of the content that I’ve posted over the years. I have switched over to Jekyll, and will be adding new material, as well as reintroducing old material as needed, from a backup that I have. Stay posted!