AI Math

Master AI Math with 500 free flashcards. Study using spaced repetition and focus mode for effective learning in AI.

🎓 500 cards ⏱️ ~250 min Advanced

🎯 What You'll Learn

Preview Questions

12 shown

What is a vector in the context of machine learning?

\[\mathbf{x} = \begin{bmatrix}x_1 \\ x_2 \\ \vdots \\ x_n\end{bmatrix} \in \mathbb{R}^n\]
Symbols: \(x_i\) = i-th component; \(\mathbb{R}^n\) = n-dimensional real space.
Intuition: A vector is an ordered list of n numbers. In ML it represents a data point, feature embedding, or parameter set as a point in n-dimensional space.
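A quick NumPy sketch of this idea (the values are illustrative, not from the deck):

```python
import numpy as np

# A data point represented as a vector in R^3
x = np.array([1.0, 2.0, 3.0])
n = x.shape[0]        # dimension n = 3
first = x[0]          # x_1, the first component
```

In ML libraries, a 1-D array like this typically stands for one sample or one embedding.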

How do you add two vectors?

\[\mathbf{a} + \mathbf{b} = \begin{bmatrix}a_1+b_1 \\ a_2+b_2 \\ \vdots \\ a_n+b_n\end{bmatrix}\]
Symbols: \(a_i, b_i\) = corresponding components of vectors a and b.
Intuition: Add element-wise. Geometrically, place the tail of b at the head of a; the sum points to the new head.
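The element-wise rule in a minimal NumPy check (illustrative values):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
s = a + b   # component-wise: [a_1+b_1, a_2+b_2, a_3+b_3]
```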

How do you multiply a vector by a scalar?

\[c\mathbf{x} = \begin{bmatrix}cx_1 \\ cx_2 \\ \vdots \\ cx_n\end{bmatrix}\]
Symbols: \(c \in \mathbb{R}\) = scalar; \(x_i\) = i-th component.
Intuition: Scales every component by c. Stretches the vector when |c| > 1, shrinks it when |c| < 1, and reverses its direction when c < 0.
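Scalar multiplication as a one-liner in NumPy (illustrative values):

```python
import numpy as np

x = np.array([1.0, -2.0, 3.0])
y = 2.0 * x   # every component scaled by c = 2
```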

What is the L2 (Euclidean) norm of a vector?

\[\|\mathbf{x}\|_2 = \sqrt{\sum_{i=1}^n x_i^2}\]
Symbols: \(\|\cdot\|_2\) = L2 norm; \(x_i\) = i-th component; \(n\) = dimension.
Intuition: The straight-line (Euclidean) distance from the origin to the tip of the vector. Most common norm in ML for measuring distances.
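A quick check with the classic 3-4-5 triangle (illustrative values):

```python
import numpy as np

x = np.array([3.0, 4.0])
l2 = np.linalg.norm(x)   # np.linalg.norm defaults to the L2 norm for vectors
```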

What is the L1 norm of a vector?

\[\|\mathbf{x}\|_1 = \sum_{i=1}^n |x_i|\]
Symbols: \(|x_i|\) = absolute value of i-th component.
Intuition: Sum of absolute values. Also called the 'Manhattan' or 'taxicab' distance. Encourages sparsity in optimization (L1 regularization).
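The same vector's L1 norm in NumPy, via the `ord` parameter (illustrative values):

```python
import numpy as np

x = np.array([3.0, -4.0])
l1 = np.linalg.norm(x, ord=1)   # |3| + |-4| = 7
```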

What is the general Lp norm of a vector?

\[\|\mathbf{x}\|_p = \left(\sum_{i=1}^n |x_i|^p\right)^{1/p}\]
Symbols: \(p \geq 1\) = norm order; \(x_i\) = i-th component.
Intuition: Generalizes L1 (p=1) and L2 (p=2). As \(p \to \infty\) it becomes the max norm \(\|\mathbf{x}\|_\infty = \max_i |x_i|\).
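Both the general Lp case and the max-norm limit can be checked with `ord` (illustrative values):

```python
import numpy as np

x = np.array([1.0, -3.0, 2.0])
l3 = np.linalg.norm(x, ord=3)          # (1 + 27 + 8)^(1/3)
linf = np.linalg.norm(x, ord=np.inf)   # max_i |x_i|
```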

What is the Frobenius norm of a matrix?

\[\|A\|_F = \sqrt{\sum_{i,j} A_{ij}^2}\]
Symbols: \(A_{ij}\) = element at row i, column j.
Intuition: The L2 norm applied to a matrix by treating it as a flattened vector. Equivalent to \(\sqrt{\mathrm{tr}(A^\top A)}\).
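A sketch verifying both forms of the Frobenius norm agree (illustrative matrix):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
fro = np.linalg.norm(A)                       # 'fro' is the default for matrices
via_trace = np.sqrt(np.trace(A.T @ A))        # sqrt(tr(A^T A))
```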

What is the dot product of two vectors?

\[\mathbf{a} \cdot \mathbf{b} = \sum_{i=1}^n a_i b_i = \|\mathbf{a}\|_2\|\mathbf{b}\|_2\cos\theta\]
Symbols: \(a_i, b_i\) = components; \(\theta\) = angle between vectors.
Intuition: Measures how aligned two vectors are. Zero = orthogonal; positive = same direction; negative = opposite direction.
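In NumPy the dot product is the `@` operator (or `np.dot`); illustrative values:

```python
import numpy as np

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
d = a @ b   # 1*3 + 2*4
```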

How do you use the dot product to check orthogonality?

\[\mathbf{a} \perp \mathbf{b} \iff \mathbf{a} \cdot \mathbf{b} = 0\]
Symbols: \(\perp\) = perpendicular (orthogonal).
Intuition: Two vectors are orthogonal (perpendicular) iff their dot product is zero, since cos 90° = 0.
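A tiny check with two axis-aligned vectors (illustrative; `np.isclose` guards against floating-point noise):

```python
import numpy as np

a = np.array([1.0, 0.0])
b = np.array([0.0, 5.0])
orthogonal = bool(np.isclose(a @ b, 0.0))
```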

How is the dot product used to compute the cosine similarity?

\[\cos\theta = \frac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|_2\|\mathbf{b}\|_2}\]
Symbols: \(\theta\) = angle between a and b.
Intuition: Normalizes the dot product by the product of norms to get a similarity score in [-1, 1]. Widely used in NLP to compare embedding vectors.
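A minimal cosine-similarity helper (`cos_sim` is a hypothetical name, not a library function):

```python
import numpy as np

def cos_sim(a, b):
    # Dot product normalized by the product of L2 norms
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

a = np.array([1.0, 0.0])
b = np.array([1.0, 1.0])
c = cos_sim(a, b)   # cos 45 degrees = 1/sqrt(2)
```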

What is the Cauchy-Schwarz inequality?

\[|\mathbf{a} \cdot \mathbf{b}| \leq \|\mathbf{a}\|_2 \|\mathbf{b}\|_2\]
Symbols: \(|\cdot|\) = absolute value; \(\|\cdot\|_2\) = L2 norm.
Intuition: The dot product can never exceed the product of the norms. Equality holds iff vectors are parallel.
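The inequality can be sanity-checked numerically on random vectors (a spot check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(size=5)
b = rng.normal(size=5)
lhs = abs(a @ b)                              # |a . b|
rhs = np.linalg.norm(a) * np.linalg.norm(b)   # ||a|| * ||b||
```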

What is a matrix and how is it defined?

\[A \in \mathbb{R}^{m \times n},\quad A = \begin{bmatrix}a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn}\end{bmatrix}\]
Symbols: \(m\) = rows; \(n\) = columns; \(a_{ij}\) = element at row i, col j.
Intuition: A rectangular array of numbers. In ML, represents linear maps, datasets (rows = samples, cols = features), or weight tensors.
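A matrix as a 2-D NumPy array, with rows as samples and columns as features (illustrative values):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # A in R^{2x3}
m, n = A.shape                    # m = 2 rows, n = 3 columns
a_12 = A[0, 1]                    # element a_{12} (0-indexed in code)
```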
