# Some mathematical preliminaries

## 0.1 Euclidean vectors

We assume that you are familiar with Euclidean vectors — those arrow-like geometric objects used to represent physical quantities such as velocities or forces.
You know that any two velocities can be added to yield a third, and the multiplication of a “velocity vector” by a real number is another “velocity vector”.
So a **linear combination** of vectors is another vector.
Mathematicians have simply taken these properties and defined vectors as *anything* that we can add and multiply by numbers, as long as everything behaves in a nice enough way.
This is basically what the Italian mathematician Giuseppe Peano (1858–1932) did in a chapter of his 1888 book with an impressive title: *Calcolo geometrico secondo l’Ausdehnungslehre di H. Grassmann preceduto dalle operazioni della logica deduttiva*.

## 0.2 Vector spaces

Following Peano, we define a **vector space** as a mathematical structure in which the notion of linear combination “makes sense”.

More formally, a **complex vector space** is a set $V$ of objects called **vectors**, such that for any two vectors $a, b \in V$ and any two complex numbers $\alpha$ and $\beta$, the linear combination $\alpha a + \beta b$ is also a vector in $V$.^{2}

A **subspace** of $V$ is any subset of $V$ which is itself closed under addition and under multiplication by complex numbers. Following Dirac, we write vectors in a somewhat fancy way as $|v\rangle$, and call them **ket** vectors, or simply **kets**.
(We will deal with “bras” in a moment).
A **basis** in $V$ is a collection of vectors $|e_1\rangle, |e_2\rangle, \ldots, |e_n\rangle$ such that every vector in $V$ can be written (in *exactly* one way) as a linear combination of the basis vectors; the number of vectors in any basis is the same, and is called the **dimension** of $V$.

The most common example is $\mathbb{C}^n$, the set of $n$-tuples of complex numbers, usually written as column vectors, with addition and multiplication by scalars defined component-wise. In fact, this is the space we will use most of the time.
Throughout the course we will deal only with vector spaces of *finite* dimensions.
This is sufficient for all our purposes, and it lets us avoid many mathematical subtleties associated with infinite-dimensional spaces, for which we would need the tools of **functional analysis**.
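To make the definition concrete, here is a minimal Python sketch of $\mathbb{C}^2$: vectors are lists of complex numbers, and any linear combination of vectors is again such a list. (The helper name `linear_combination` is my own illustration, not from the notes.)

```python
# A minimal sketch of C^n as a complex vector space: vectors are lists of
# complex numbers, and a linear combination of vectors is again a vector.
# (The function name is illustrative, not from the notes.)

def linear_combination(coeffs, vectors):
    """Return sum_k coeffs[k] * vectors[k], computed component-wise."""
    n = len(vectors[0])
    return [sum(c * v[i] for c, v in zip(coeffs, vectors)) for i in range(n)]

u = [1 + 1j, 0j]          # a ket in C^2
v = [0j, 2 - 1j]          # another ket in C^2

w = linear_combination([2, 1j], [u, v])   # the combination 2u + iv
print(w)                  # [(2+2j), (1+2j)]
```

Closure under linear combinations is the whole content of the definition: the result `w` is just another element of $\mathbb{C}^2$.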

## 0.3 Bras and kets

An **inner product** on a vector space $V$ is a function that assigns to each pair of vectors $|u\rangle, |v\rangle \in V$ a complex number $\langle u|v\rangle$, satisfying the following conditions:

- $\langle u|v\rangle=\langle v|u\rangle^\star$;
- $\langle v|v\rangle\geqslant 0$ for all $|v\rangle$;
- $\langle v|v\rangle= 0$ if and only if $|v\rangle=0$.

The inner product must also be *linear* in the second argument but *antilinear* in the first argument:
$$\langle c_1 u_1 + c_2 u_2|v\rangle = c_1^\star\langle u_1|v\rangle + c_2^\star\langle u_2|v\rangle$$
for any complex constants $c_1$ and $c_2$.

With any physical system we associate a complex vector space with an inner product, known as a **Hilbert space** $\mathcal{H}$.^{3}

For example, for column vectors $|u\rangle$ and $|v\rangle$ in $\mathbb{C}^n$, with components $u_i$ and $v_i$, the standard inner product is $\langle u|v\rangle = \sum_i u_i^\star v_i$. The object $\langle u|$ is called a **bra** vector, or a **bra**, and can be represented by a row vector: $\langle u| = (u_1^\star, u_2^\star, \ldots, u_n^\star)$, so that $\langle u|v\rangle$ is the usual matrix product of a row and a column vector.

Bras are vectors: you can add them, and multiply them by scalars (which, here, are complex numbers), but they are vectors in the space **dual** to $\mathcal{H}$. Formally, bras are **linear functionals**, that is, linear maps from $\mathcal{H}$ to the complex numbers.
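The defining properties of the inner product can be checked numerically. A Python sketch (the helpers `inner` and `bra` are my own names): the bra is the component-wise complex conjugate, and the inner product is conjugate-symmetric and antilinear in the first argument.

```python
# Sketch of the standard inner product on C^n: <u|v> = sum_i conj(u_i) v_i,
# antilinear in the first argument. Helper names are illustrative.

def inner(u, v):
    """<u|v> for column vectors u, v in C^n."""
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def bra(u):
    """The bra <u| as a row vector: the component-wise complex conjugate."""
    return [ui.conjugate() for ui in u]

u = [1 + 2j, 3j]
v = [2 + 0j, 1 - 1j]

# Conjugate symmetry: <u|v> = <v|u>*
assert inner(u, v) == inner(v, u).conjugate()

# <v|v> is real and non-negative
assert inner(v, v).imag == 0 and inner(v, v).real >= 0

# Antilinearity in the first argument: <(a u)|v> = a* <u|v>
a = 2 - 1j
assert inner([a * ui for ui in u], v) == a.conjugate() * inner(u, v)
```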

All Hilbert spaces of the same dimension are isomorphic, so the differences between quantum systems cannot really be understood without additional structure. This structure is provided by a specific algebra of operators acting on $\mathcal{H}$.

## 0.4 Daggers

Although $\mathcal{H}$ and its dual are different vector spaces, there is a one-to-one correspondence between kets and bras, which we write using the **dagger**:^{4}
$$\langle v| = (|v\rangle)^\dagger, \qquad |v\rangle = (\langle v|)^\dagger.$$

The dagger operation, also known as **Hermitian conjugation**, is *antilinear*:
$$(c_1|v_1\rangle + c_2|v_2\rangle)^\dagger = c_1^\star\langle v_1| + c_2^\star\langle v_2|.$$
In matrix terms, the dagger is the conjugate transpose.^{5}
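A small Python sketch of the dagger as a conjugate transpose (helper names `dagger_vec` and `dagger_mat` are mine): a ket becomes its bra, and applying the dagger twice returns the original.

```python
# Sketch: the dagger as conjugate transpose. For a ket (column vector) the
# dagger is the corresponding bra (row vector); for a matrix, (A†)_ij = A*_ji.
# Helper names are illustrative.

def dagger_vec(v):
    """|v>† = <v|: the conjugate row vector."""
    return [x.conjugate() for x in v]

def dagger_mat(A):
    """A†: interchange rows and columns and conjugate each entry."""
    rows, cols = len(A), len(A[0])
    return [[A[i][j].conjugate() for i in range(rows)] for j in range(cols)]

A = [[1 + 1j, 2j],
     [3 + 0j, 4 - 1j]]

Ad = dagger_mat(A)
assert Ad == [[1 - 1j, 3 - 0j], [-2j, 4 + 1j]]

# The dagger is an involution: (A†)† = A
assert dagger_mat(Ad) == A
```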

## 0.5 Geometry

The inner product brings geometry: the **length**, or **norm**, of $|v\rangle$ is $\|v\| = \sqrt{\langle v|v\rangle}$, and we say that $|u\rangle$ and $|v\rangle$ are **orthogonal** if $\langle u|v\rangle = 0$. Any maximal set of pairwise orthogonal vectors of unit length^{6} forms an orthonormal basis, and so any vector can be expressed as a linear combination of the basis vectors:
$$|v\rangle = \sum_i v_i|e_i\rangle, \qquad \text{where } v_i = \langle e_i|v\rangle.$$
Similarly, any bra can be expanded in the **dual basis** $\langle e_i|$:
$$\langle v| = \sum_i v_i^\star\langle e_i|.$$

To make the notation a bit less cumbersome, we will sometimes label the basis kets as $|i\rangle$ rather than $|e_i\rangle$, but *do not confuse $|0\rangle$ with the zero vector*!
We *never* write the zero vector as $|0\rangle$, but only as $\mathbf{0}$.
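The norm and the basis-expansion formula can be verified directly. A Python sketch in $\mathbb{C}^2$ (helper names are mine): the standard basis is orthonormal, and the components $v_i = \langle e_i|v\rangle$ recover the vector.

```python
import math

# Sketch (illustrative names): the norm from the inner product, and the
# expansion of a vector in an orthonormal basis, v_i = <e_i|v>.

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def norm(v):
    return math.sqrt(inner(v, v).real)

# The standard orthonormal basis of C^2
e1, e2 = [1 + 0j, 0j], [0j, 1 + 0j]
assert inner(e1, e2) == 0            # orthogonal
assert norm(e1) == norm(e2) == 1.0   # unit length

v = [3 + 0j, 4j]
assert norm(v) == 5.0

# Components in the basis recover the vector: |v> = sum_i <e_i|v> |e_i>
components = [inner(e1, v), inner(e2, v)]
assert components == [3 + 0j, 4j]
```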

With any *isolated* quantum system, which can be prepared in $n$ *perfectly distinguishable* states, we can associate a Hilbert space $\mathcal{H}$ of dimension $n$, such that each vector $|v\rangle\in\mathcal{H}$ of unit length (i.e. $\langle v|v\rangle = 1$) represents a quantum state of the system.
The overall phase of the vector has no physical significance: $|v\rangle$ and $e^{i\varphi}|v\rangle$ (for any real $\varphi$) both describe the same state.
The inner product $\langle u|v\rangle$ is the *probability amplitude* that a quantum system prepared in state $|v\rangle$ will be found in state $|u\rangle$ upon measurement; the corresponding probability is $|\langle u|v\rangle|^2$.
States corresponding to orthogonal vectors (i.e. $\langle u|v\rangle=0$) are *perfectly distinguishable*, since, if we prepare the system in state $|v\rangle$, then it will never be found in state $|u\rangle$, and vice versa. In particular, states forming orthonormal bases are always perfectly distinguishable from each other. Choosing such states, as we shall see in a moment, is equivalent to choosing a particular quantum measurement.
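The probabilistic reading of the inner product can be illustrated with a toy computation (a Python sketch; the states and numbers below are my own illustration): an equal superposition gives probability $1/2$, the overall phase drops out, and orthogonal states give probability $0$.

```python
import cmath

# Sketch: if the system is prepared in |v>, the probability of finding it
# in |u> is |<u|v>|^2. All names and numbers here are illustrative.

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

s = 1 / cmath.sqrt(2)
u = [1 + 0j, 0j]        # a basis state |u>
v = [s, s]              # an equal superposition

amp = inner(u, v)       # probability amplitude <u|v>
prob = abs(amp) ** 2    # probability of finding the system in |u>
assert abs(prob - 0.5) < 1e-12

# Overall phase is irrelevant: e^{i phi}|v> gives the same probability
phase = cmath.exp(1j * 0.7)
v_phased = [phase * x for x in v]
assert abs(abs(inner(u, v_phased)) ** 2 - prob) < 1e-12

# Orthogonal states are perfectly distinguishable: amplitude (hence probability) 0
w = [0j, 1 + 0j]
assert inner(u, w) == 0
```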

## 0.6 Operators

A **linear map** between two vector spaces $\mathcal{H}$ and $\mathcal{K}$ is a function $A : \mathcal{H} \to \mathcal{K}$ that respects linear combinations: $A(c_1|v_1\rangle + c_2|v_2\rangle) = c_1 A|v_1\rangle + c_2 A|v_2\rangle$. We will mostly be concerned with **endomorphisms**, that is, maps from $\mathcal{H}$ to itself, and we will call them **operators**.
Operators can be composed, and the order of composition *does* matter: in general, $AB \neq BA$; when $AB = BA$ we say that $A$ and $B$ **commute**.
The inverse of $A$, written $A^{-1}$, is defined by $AA^{-1} = A^{-1}A = \mathbf{1}$, where $\mathbf{1}$ is the identity operator. On a finite-dimensional space it suffices to check *one* of these two conditions, since any one of the two implies the other, whereas, on an infinite-dimensional space, *both* must be checked.
Finally, given a particular basis, an operator $A$ is represented by a matrix with entries $A_{ij} = \langle e_i|A|e_j\rangle$, and the **adjoint**, or **Hermitian conjugate**, of $A$, written $A^\dagger$, is represented by the conjugate transpose of this matrix.

An operator $A$ is said to be

- **normal** if $AA^\dagger = A^\dagger A$,
- **unitary** if $AA^\dagger = A^\dagger A = \mathbf{1}$,
- **Hermitian** (or **self-adjoint**) if $A^\dagger = A$.

Any physically admissible evolution of an isolated quantum system is represented by a unitary operator.
Note that unitary operators preserve the inner product: given a unitary operator $U$ and any two vectors $|u\rangle$ and $|v\rangle$, we have $\langle u|U^\dagger U|v\rangle = \langle u|\mathbf{1}|v\rangle = \langle u|v\rangle$. In particular, unitaries preserve the norm: *unitary operations are the isometries of the Euclidean norm*.
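These definitions are easy to check on small matrices. A Python sketch (all helper names and the example matrices are my own choices): the Pauli $Y$ matrix is Hermitian, the Hadamard matrix is unitary, and the unitary preserves inner products.

```python
import cmath

# Sketch with 2x2 matrices as nested lists (illustrative helpers): the
# adjoint, a Hermitian check A† = A, a unitarity check U U† = 1, and
# preservation of the inner product by a unitary.

def dagger(A):
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def apply(A, v):
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

# Pauli Y is Hermitian: Y† = Y
Y = [[0j, -1j], [1j, 0j]]
assert dagger(Y) == Y

# The Hadamard matrix is unitary: H H† = 1 (up to rounding)
s = 1 / cmath.sqrt(2)
H = [[s, s], [s, -s]]
HHd = matmul(H, dagger(H))
assert all(abs(HHd[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))

# Unitaries preserve the inner product: <Hu|Hv> = <u|v>
u, v = [1 + 2j, 0j], [3j, 1 + 0j]
assert abs(inner(apply(H, u), apply(H, v)) - inner(u, v)) < 1e-12
```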

## 0.7 Outer products

Apart from the inner product $\langle u|v\rangle$, which is a complex number, we can also form the **outer product** $|u\rangle\langle v|$, which is a linear map:

- The result of $|u\rangle\langle v|$ acting on a ket $|x\rangle$ is $|u\rangle\langle v|x\rangle$, i.e. the vector $|u\rangle$ multiplied by the complex number $\langle v|x\rangle$.
- Similarly, the result of $|u\rangle\langle v|$ acting on a bra $\langle y|$ is $\langle y|u\rangle\langle v|$, i.e. the functional $\langle v|$ multiplied by the complex number $\langle y|u\rangle$.

The product of two such maps is again of the same form: $(|u\rangle\langle v|)(|x\rangle\langle y|) = \langle v|x\rangle\,|u\rangle\langle y|$.

Any operator on $\mathcal{H}$ can be expressed as a sum of outer products. In particular, the identity operator can be written as
$$\mathbf{1} = \sum_i |e_i\rangle\langle e_i|,$$
where $\{|e_i\rangle\}$ is *any* orthonormal basis. This decomposition of the identity is one of the most ubiquitous and useful formulas in quantum theory.
For example, for any vector $|v\rangle$ we have $|v\rangle = \mathbf{1}|v\rangle = \sum_i |e_i\rangle\langle e_i|v\rangle = \sum_i v_i|e_i\rangle$.
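Both the action of an outer product and the decomposition of the identity can be checked numerically. A Python sketch in $\mathbb{C}^2$ (helper names are mine): $(|u\rangle\langle v|)|x\rangle = \langle v|x\rangle\,|u\rangle$, and summing $|e_i\rangle\langle e_i|$ over the standard basis gives the identity matrix.

```python
# Sketch: the outer product |u><v| as a matrix, and the completeness of an
# orthonormal basis, sum_i |e_i><e_i| = 1. Helper names are illustrative.

def outer(u, v):
    """The matrix of |u><v|: entry (i, j) is u_i * conj(v_j)."""
    return [[ui * vj.conjugate() for vj in v] for ui in u]

def apply(A, x):
    return [sum(A[i][k] * x[k] for k in range(len(x))) for i in range(len(A))]

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

u, v, x = [1 + 0j, 2j], [3 + 0j, 0j], [1 + 1j, 2 + 0j]

# (|u><v|) |x> = <v|x> |u>
lhs = apply(outer(u, v), x)
c = inner(v, x)
assert lhs == [c * ui for ui in u]

# Completeness: sum_i |e_i><e_i| acts as the identity
e1, e2 = [1 + 0j, 0j], [0j, 1 + 0j]
P = [[a + b for a, b in zip(r1, r2)]
     for r1, r2 in zip(outer(e1, e1), outer(e2, e2))]
assert apply(P, x) == x
```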

## 0.8 The trace

The **trace** is an operation which turns outer products into inner products: $\operatorname{tr}|u\rangle\langle v| = \langle v|u\rangle$. It extends to all operators by linearity; equivalently, in any matrix representation, $\operatorname{tr} A = \sum_i \langle e_i|A|e_i\rangle$ is the sum of the diagonal entries, and this number does
*not* depend on the choice of the basis.
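A quick numerical check of $\operatorname{tr}|a\rangle\langle b| = \langle b|a\rangle$ (a Python sketch; helper names are mine):

```python
# Sketch: the trace as the sum of diagonal entries turns the outer product
# |a><b| into the inner product <b|a>. Helper names are illustrative.

def outer(u, v):
    return [[ui * vj.conjugate() for vj in v] for ui in u]

def inner(u, v):
    return sum(ui.conjugate() * vi for ui, vi in zip(u, v))

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

a, b = [1 + 2j, 3 + 0j], [2 - 1j, 1j]

# tr |a><b| = <b|a>
assert trace(outer(a, b)) == inner(b, a)
```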

**!!to-do: mention what this whole package of data all bundled up looks like from the categorical pov!!**

## 0.9 Some useful identities

- $|a\rangle^\dagger = \langle a|$, $\quad \langle a|^\dagger = |a\rangle$
- $(\alpha|a\rangle+\beta|b\rangle)^\dagger = \alpha^\star\langle a|+\beta^\star\langle b|$
- $(|a\rangle\langle b|)^\dagger = |b\rangle\langle a|$
- $(AB)^\dagger=B^\dagger A^\dagger$
- $(\alpha A+\beta B)^\dagger=\alpha^\star A^\dagger+\beta^\star B^\dagger$
- $(A^\dagger)^\dagger=A$
- $\operatorname{tr}(\alpha A+ \beta B) = \alpha \operatorname{tr}(A)+\beta\operatorname{tr}(B)$
- $\operatorname{tr}|a\rangle\langle b| = \langle b|a\rangle$
- $\operatorname{tr}(ABC) = \operatorname{tr}(CAB) = \operatorname{tr}(BCA)$
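A few of these identities can be verified numerically on small matrices. A Python sketch (the helpers and the example matrices are my own illustration): $(AB)^\dagger = B^\dagger A^\dagger$, $(A^\dagger)^\dagger = A$, and the cyclic property of the trace.

```python
# Sketch verifying some of the identities above on 2x2 complex matrices
# (helper names are illustrative).

def dagger(A):
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

A = [[1 + 1j, 2 + 0j], [0j, 3 - 1j]]
B = [[0j, 1j], [1 + 0j, 2 + 2j]]
C = [[1 + 0j, 0j], [1j, 1 + 0j]]

# (AB)† = B† A†
assert dagger(matmul(A, B)) == matmul(dagger(B), dagger(A))

# (A†)† = A
assert dagger(dagger(A)) == A

# tr(ABC) = tr(CAB) = tr(BCA)
t = trace(matmul(matmul(A, B), C))
assert t == trace(matmul(matmul(C, A), B)) == trace(matmul(matmul(B, C), A))
```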

---

2. As we said, there are certain “nice properties” that these things must satisfy. Addition of vectors must be commutative and associative, with an identity (the zero vector, which will always be written as $\mathbf{0}$) and an inverse for each $v$ (written as $-v$). Multiplication by complex numbers must obey the two distributive laws: $(\alpha+\beta)v = \alpha v+\beta v$ and $\alpha (v+w) = \alpha v+\alpha w$.

3. The term “Hilbert space” used to be reserved for an infinite-dimensional inner product space that is **complete**, i.e. such that every Cauchy sequence in the space converges to an element in the space. Nowadays, as in these notes, the term also includes finite-dimensional spaces, which automatically satisfy the condition of completeness.

4. “Is this a $\dagger$ which I see before me…”

5. Recall that the conjugate transpose, or the Hermitian conjugate, of an $n\times m$ matrix $A$ is the $m\times n$ matrix $A^\dagger$, obtained by interchanging the rows and columns of $A$ and taking the complex conjugate of each entry, i.e. $A^\dagger_{ij}=A^\star_{ji}$. In mathematics texts it is often denoted by ${}^\star$ rather than ${}^\dagger$.

6. That is, consider sets of vectors $|e_i\rangle$ such that $\langle e_i|e_j\rangle=\delta_{ij}$ (where the **Kronecker delta** $\delta_{ij}$ is $0$ if $i\neq j$, and $1$ if $i=j$), and then pick any of the largest such sets (which must exist, since we assume our vector spaces to be finite-dimensional).