A **unitary matrix** $Q$ over an inner product space $\mathbb{F}^n$, where $\mathbb{F}$ is $\mathbb{R}$ or $\mathbb{C}$, is defined as a matrix whose columns are orthonormal, or, equivalently, a matrix with the property that $Q^\ast Q = I$, where $Q^\ast$ is the Hermitian conjugate (conjugate transpose) of $Q$.
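The equivalence of the two characterizations can be sanity-checked numerically. Below is a quick sketch (using NumPy, with an arbitrarily chosen example matrix, not one from the text) confirming that $Q^\ast Q = I$ goes hand in hand with orthonormal columns:

```python
import numpy as np

# An example unitary matrix: a real rotation scaled by a unit complex phase
# (this particular choice is an assumption of the example).
theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]]) * np.exp(0.3j)

# First characterization: Q* Q = I, where Q* is the conjugate transpose.
assert np.allclose(Q.conj().T @ Q, np.eye(2))

# Second characterization: the columns are orthonormal under the
# Hermitian inner product (np.vdot conjugates its first argument).
c0, c1 = Q[:, 0], Q[:, 1]
assert np.isclose(np.vdot(c0, c0), 1)  # unit length
assert np.isclose(np.vdot(c1, c1), 1)  # unit length
assert np.isclose(np.vdot(c0, c1), 0)  # orthogonal
```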

Unitary matrix transformations have the nice property that they always preserve lengths of vectors or distances between points (so long as the notion of "distance" is induced by an inner product):

**Proposition 1.** If $Q\in \mathbb{F}^{n\times n}$ is unitary, then $\lVert Qx\rVert =\lVert x\rVert$ for all $x\in\mathbb{F}^n$.

**Proof.** Let $e_1,\dots,e_n$ be the standard basis for $\mathbb{F}^n$, so that any vector $x\in\mathbb{F}^n$ can be written as

$$x = x_1e_1 + \cdots + x_ne_n.$$

Then, if $q_1,\dots,q_n$ are the column vectors of $Q$, we have that

$$Qx = x_1q_1 + \cdots + x_nq_n.$$

Since the columns of $Q$ are orthonormal, by the Pythagorean Theorem, we have that

$$\lVert Qx\rVert^2 = |x_1|^2\lVert q_1\rVert^2 + \cdots + |x_n|^2\lVert q_n\rVert^2 = |x_1|^2 + \cdots + |x_n|^2 = \lVert x\rVert^2,$$

and so $\lVert Qx\rVert = \lVert x\rVert$ as desired. $\blacksquare$
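Proposition 1 is easy to illustrate numerically. The sketch below uses the standard trick of extracting a unitary matrix from the QR decomposition of a random complex matrix (the construction and dimensions are assumptions of the example):

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary matrix: the Q factor of a QR decomposition
# of a random complex matrix is unitary.
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
Q, _ = np.linalg.qr(A)
assert np.allclose(Q.conj().T @ Q, np.eye(4))  # sanity check: unitary

# Proposition 1: ||Qx|| = ||x|| for an arbitrary vector x.
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.linalg.norm(Q @ x), np.linalg.norm(x))
```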

This suggests the question of whether the converse of the above proposition is true: if $A$ is a matrix whose corresponding transformation preserves lengths, is it necessarily unitary? In $\mathbb{R}^n$ the answer is yes. In $\mathbb{C}^n$, something surprising happens: length preservation does force the columns to be orthonormal under the Hermitian inner product, but it does not force them to be orthogonal under the unconjugated dot product $x\cdot y = x_1y_1 + \cdots + x_ny_n$. The reason is the failure of the converse of the Pythagorean Theorem in $\mathbb{C}^n$: the identity $\lVert u+v\rVert^2 = \lVert u\rVert^2 + \lVert v\rVert^2$ only forces $\operatorname{Re}\langle u,v\rangle = 0$, not $\langle u,v\rangle = 0$.

**Proposition 2.** In $\mathbb{R}^n$, the only length-preserving matrix transformations are given by unitary matrices. In $\mathbb{C}^n$, by contrast, a length-preserving matrix need not have columns that are orthogonal with respect to the unconjugated dot product.

**Proof.** Suppose a matrix $A\in\mathbb{R}^{n\times n}$ is a length-preserving linear transformation with column vectors $a_1,\dots,a_n$. Then, if $x=x_1e_1+\cdots+x_ne_n$, we have that

$$Ax = x_1a_1 + \cdots + x_na_n.$$

First consider the vector $x=e_i$ for some integer $i$ between $1$ and $n$, so that $x_i=1$ and all other entries of $x$ are equal to $0$. Then $Ax=a_i$, and since $\lVert Ax\rVert = \lVert x\rVert = 1$, we have that $\lVert a_i\rVert = 1$ for each column $a_i$ of $A$. Hence, the columns of $A$ are unit vectors.

Next, consider the vector $x=e_i+e_j$, where $i,j$ are distinct integers between $1$ and $n$, so that $x_i=x_j=1$ and $x_k=0$ for all $k\ne i,j$. Then we have that $Ax = a_i+a_j$, and since $\lVert Ax\rVert = \lVert x\rVert = \sqrt{2}$, we have that

$$\lVert a_i + a_j\rVert^2 = 2 = \lVert a_i\rVert^2 + \lVert a_j\rVert^2.$$

Thus, by the converse of the Pythagorean Theorem (which holds in $\mathbb{R}^n$ specifically - this is the step of the proof that fails in $\mathbb{C}^n$) we have that $a_i$ and $a_j$ are orthogonal, meaning that $A$ has orthonormal columns and is therefore a unitary matrix.

As for a counterexample in $\mathbb{C}^n$, consider the following matrix:

$$Q = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}.$$

It's easy to show through straightforward algebra that $Q^\ast Q = I$, so this matrix preserves the lengths of vectors in $\mathbb{C}^2$; but the unconjugated dot product of its columns is $\tfrac{1}{2}(1\cdot i + i\cdot 1) = i \ne 0$, so the columns are clearly not orthogonal in that sense. $\blacksquare$
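This phenomenon can be checked numerically. The sketch below takes $\frac{1}{\sqrt{2}}\begin{pmatrix}1 & i\\ i & 1\end{pmatrix}$ as one such matrix (an assumption of the example) and verifies that it preserves lengths even though its columns fail to be orthogonal under the unconjugated dot product:

```python
import numpy as np

# A unitary matrix whose columns are Hermitian-orthonormal but NOT
# orthogonal under the unconjugated dot product (assumed example).
Q = np.array([[1, 1j],
              [1j, 1]]) / np.sqrt(2)

# Length preservation: Q* Q = I.
assert np.allclose(Q.conj().T @ Q, np.eye(2))

# Hermitian inner product of the columns is zero
# (np.vdot conjugates its first argument)...
assert np.isclose(np.vdot(Q[:, 0], Q[:, 1]), 0)

# ...but the unconjugated dot product is (1*i + i*1)/2 = i, not zero.
naive = Q[:, 0] @ Q[:, 1]
assert np.isclose(naive, 1j)
```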

The eigenvalues of a unitary matrix have the following nice property:

**Proposition 3.** The eigenvalues of a unitary matrix in $\mathbb{C}^{n\times n}$ are complex numbers with $|\lambda|=1$.

**Proof.** Let $Q$ be a unitary matrix, and let $\lambda$ be one of its eigenvalues with corresponding (nonzero) eigenvector $v$. Then, since the linear transformation induced by $Q$ preserves lengths, we have that $\lVert v\rVert = \lVert Qv\rVert = \lVert \lambda v\rVert = |\lambda|\lVert v\rVert$, and dividing by $\lVert v\rVert \ne 0$ gives $|\lambda|=1$. $\blacksquare$
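A quick numerical illustration of Proposition 3 (a sketch; the random-unitary-via-QR construction and the size are assumptions of the example):

```python
import numpy as np

rng = np.random.default_rng(1)

# Build a random unitary matrix from the QR decomposition of a
# random complex matrix.
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
Q, _ = np.linalg.qr(A)

# Every eigenvalue of a unitary matrix lies on the unit circle.
eigenvalues = np.linalg.eigvals(Q)
assert np.allclose(np.abs(eigenvalues), 1.0)
```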

In linear algebra over $\mathbb{R}$ and $\mathbb{C}$, we often think of unitary matrices as "special" or "rare" because of the nice properties that they possess, and because an arbitrarily chosen matrix usually isn't unitary. However, the continuity of these spaces makes it hard to describe exactly how "numerous" they are. But in linear algebra over a finite field $\mathbb{F}_p$, this changes: we can use combinatorics to count approximately how many unitary matrices of a given size exist. (Note: over finite fields there is no norm, so here the term *unitary* just refers to matrices whose columns are nonzero and pairwise *orthogonal* with respect to the dot product $x\cdot y = x_1y_1 + \cdots + x_ny_n$.)

**Proposition 4.** If $U_{p,n}$ denotes the number of unitary matrices in $\mathbb{F}_p^{n\times n}$, then we have

$$\frac{1}{4}\,p^{\frac{n^2+n}{2}} \;<\; U_{p,n} \;<\; p^{\frac{n^2+n}{2}}.$$

**Proof.** In $\mathbb{F}_p^n$, every subspace of dimension $d$ contains $p^d$ vectors, and its orthogonal complement has dimension $n-d$ and so contains $p^{n-d}$ vectors. If we construct an arbitrary unitary element of $\mathbb{F}_p^{n\times n}$ by choosing its columns one at a time, taking the $k$th column to be a nonzero vector in the orthogonal complement of the span of the previous columns, we find that there exist exactly

$$U_{p,n} = (p^n - 1)(p^{n-1} - 1)\cdots(p - 1) = \prod_{k=1}^{n}\left(p^k - 1\right)$$

unitary matrices (glossing over, for simplicity, the complications caused by nonzero self-orthogonal vectors, which can exist over $\mathbb{F}_p$). Clearly this is less than $p^n\cdot p^{n-1}\cdot \dots \cdot p$, which equals $p^{\frac{n^2+n}{2}}$, which gives us our upper bound. For the lower bound, consider the logarithm of this quantity:

$$\ln U_{p,n} = \sum_{k=1}^{n}\ln\left(p^k - 1\right) = \frac{n^2+n}{2}\ln p + \sum_{k=1}^{n}\ln\left(1 - p^{-k}\right).$$

For $x\in [0,1/2]$, we may analytically derive the bound $\ln(1-x)\ge -2x\ln 2$, which yields the inequality

$$\ln U_{p,n} \;\ge\; \frac{n^2+n}{2}\ln p - 2\ln 2\sum_{k=1}^{n}p^{-k} \;>\; \frac{n^2+n}{2}\ln p - \frac{2\ln 2}{p-1} \;\ge\; \frac{n^2+n}{2}\ln p - \ln 4,$$

and exponentiating both sides of this inequality yields the desired lower bound. $\blacksquare$
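The count and bounds can be verified by brute force for a small case. The sketch below assumes $p=3$ and $n=2$ (chosen small enough to enumerate all $p^{n^2}$ matrices, and small enough that $\mathbb{F}_3^2$ has no nonzero self-orthogonal vectors, so the column-by-column count is exact):

```python
from itertools import product

p, n = 3, 2  # a small case amenable to brute force (assumed example)

def is_unitary(entries):
    """Check that the n x n matrix with the given row-major entries
    has nonzero, pairwise-orthogonal columns over F_p."""
    cols = [[entries[r * n + c] for r in range(n)] for c in range(n)]
    if any(all(x == 0 for x in col) for col in cols):
        return False
    return all(sum(a * b for a, b in zip(cols[i], cols[j])) % p == 0
               for i in range(n) for j in range(i + 1, n))

# Enumerate all p^(n^2) matrices and count the unitary ones.
count = sum(is_unitary(e) for e in product(range(p), repeat=n * n))

# Column-by-column count from the proof: (p^n - 1)(p^(n-1) - 1) ... (p - 1).
formula = 1
for k in range(1, n + 1):
    formula *= p**k - 1

# Bounds from Proposition 4: (1/4) p^((n^2+n)/2) < count < p^((n^2+n)/2).
bound = p ** ((n * n + n) // 2)
assert count == formula
assert bound / 4 < count < bound
```

For $p=3$, $n=2$ the enumeration gives $16 = (3^2-1)(3-1)$ unitary matrices, squeezed between $\tfrac{27}{4}$ and $27$.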