### Induced matrix norm

If $\mathbb{F}^n$ and $\mathbb{F}^m$ are vector spaces with respective norms $\lVert\cdot\rVert_{(n)}$ and $\lVert\cdot\rVert_{(m)}$, then the **induced matrix norm**, denoted $\lVert\cdot\rVert_{(m,n)}$, is a norm on $\mathbb{F}^{m\times n}$ defined as follows:

$$\lVert A\rVert_{(m,n)} = \sup_{x\ne 0}\frac{\lVert Ax\rVert_{(m)}}{\lVert x\rVert_{(n)}} = \sup_{\lVert x\rVert_{(n)}=1}\lVert Ax\rVert_{(m)},$$

where the second equality follows from the homogeneity of vector norms. Intuitively, an induced matrix norm measures "the greatest factor by which that matrix can stretch out a vector's length". Accordingly, we have the following inequality:

$$\lVert Ax\rVert_{(m)} \le \lVert A\rVert_{(m,n)}\,\lVert x\rVert_{(n)} \quad\text{for all } x\in\mathbb{F}^n.$$
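As a quick numerical sanity check (a NumPy sketch, not part of the original text), the stretching inequality can be tested on a random matrix and vector; here `np.linalg.norm(A, 2)` computes the induced 2-norm (the spectral norm).

```python
import numpy as np

rng = np.random.default_rng(0)

# Random matrix A in R^{m x n} and random vector x in R^n.
m, n = 4, 3
A = rng.standard_normal((m, n))
x = rng.standard_normal(n)

# Induced 2-norm of A: the supremum of ||Ax||_2 over unit vectors x.
A_norm = np.linalg.norm(A, 2)

# The defining inequality: ||Ax|| <= ||A|| * ||x|| (small tolerance for rounding).
lhs = np.linalg.norm(A @ x)
rhs = A_norm * np.linalg.norm(x)
print(lhs <= rhs + 1e-12)  # True
```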

For the p-norms, we may use $\lVert\cdot\rVert_p$ to denote the matrix norm induced by the p-norm:

$$\lVert A\rVert_p = \sup_{\lVert x\rVert_p = 1}\lVert Ax\rVert_p.$$

**Theorem 1.** The 1-norm of a matrix $A\in\mathbb{F}^{m\times n}$ is given by the maximum sum of the absolute values of the entries in one of its columns.

*Proof.* Suppose $x$ is a vector with $\lVert x\rVert_1 = 1$, so that $|x_1|+\dots+|x_n|=1$. Then we have that

$$\lVert Ax\rVert_1 = \sum_{i=1}^m \Bigl|\sum_{j=1}^n a_{ij}x_j\Bigr| \le \sum_{j=1}^n |x_j| \sum_{i=1}^m |a_{ij}| = \sum_{j=1}^n |x_j|\,c_j \le c_M \sum_{j=1}^n |x_j| = c_M,$$

where $c_j$ is the sum of the absolute values of the entries of column $j$ of the matrix $A$, and $c_M$ is the largest among these absolute column sums. This shows that $\lVert A\rVert_1 \leq c_M$. To see why equality holds, i.e. $\lVert A\rVert_1 = c_M$, consider the standard basis vector $x = e_M$, whose entries satisfy $x_i=0$ for $i\ne M$ and $x_i=1$ for $i=M$. Clearly we have $\lVert x\rVert_1 = 1$ and $\lVert Ax\rVert_1 = c_M$, showing that the bound is tight, as desired. $\blacksquare$
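Theorem 1 is easy to check numerically (a hedged NumPy sketch on a small example matrix; `np.linalg.norm(A, 1)` computes the induced 1-norm):

```python
import numpy as np

A = np.array([[1.0, -2.0,  3.0],
              [4.0,  5.0, -6.0]])

# Maximum absolute column sum, per Theorem 1.
col_sums = np.abs(A).sum(axis=0)   # column sums: 5, 7, 9
max_col_sum = col_sums.max()       # 9.0

# NumPy's induced 1-norm agrees with the max column sum.
print(np.isclose(np.linalg.norm(A, 1), max_col_sum))  # True

# The maximizer from the proof: the standard basis vector e_M
# picking out the column with the largest absolute sum.
M = col_sums.argmax()
e_M = np.zeros(A.shape[1])
e_M[M] = 1.0
print(np.isclose(np.linalg.norm(A @ e_M, 1), max_col_sum))  # True
```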

The proof of the following theorem is similar:

**Theorem 2.** The infinity-norm of a matrix $A\in\mathbb F^{m\times n}$ is the maximum sum of the absolute values of the entries in one of its rows.
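Theorem 2 can likewise be checked numerically (same hedged NumPy sketch as above; `np.linalg.norm(A, np.inf)` computes the induced infinity-norm):

```python
import numpy as np

A = np.array([[1.0, -2.0,  3.0],
              [4.0,  5.0, -6.0]])

# Maximum absolute row sum, per Theorem 2.
row_sums = np.abs(A).sum(axis=1)   # row sums: 6, 15
max_row_sum = row_sums.max()       # 15.0

# NumPy's induced infinity-norm agrees with the max row sum.
print(np.isclose(np.linalg.norm(A, np.inf), max_row_sum))  # True
```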

# linear-algebra

# matrices

# norm

back to home page