Quantum mechanics has never made an incorrect prediction. What does it predict? Numbers, basically, and real numbers at that. When you read a dial or measure an energy in a spectrum, you get a (real) number. Imaginary currents exist, but I don't know if you can measure them (I'll ask the EE who just married into the family this weekend). So couple the real-number output of a measurement with the postulate of quantum mechanics that tells you how to get them, and out pop Hermitian matrices.
A variety of equivalent postulate systems for QM exist (Atkins uses 5, our instructor used 4). All of them say that the state of the system is described by a wavefunction (which we're going to think of as a vector, since we're in linear algebra land). In LASGFQM – V the equivalence between the integral of a function and a vector in infinite dimensional space was explained. LASGFQM – VII explained why every linear transformation can be represented by a matrix, and why every matrix represents a linear transformation.
An operator is just a linear transformation from a vector space to itself. This means that if we're dealing with a finite dimensional vector space, the matrix representing the operator will be square. Recalling the rules for matrix multiplication (LASGFQM – IV), this means that you can do things like this
( y y y )  times  ( x x x )
                  ( x x x )   giving the row vector ( xy xy xy )
                  ( x x x )
and things like this
( x x x )         ( z )
( x x x )  times  ( z )   giving the column vector ( xz )
( x x x )         ( z )                            ( xz )
                                                   ( xz )
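Both products are easy to check numerically. Here's a quick numpy sketch (the particular numbers are my own, chosen just for illustration):

```python
import numpy as np

# A 3 x 3 matrix and two vectors (hypothetical values)
M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Row vector times matrix gives a row vector
row = np.array([1, 0, 2])
print(row @ M)        # a row vector: [15 18 21]

# Matrix times column vector gives a column vector
col = np.array([1, 0, 2])
print(M @ col)        # a column vector: [ 7 16 25]
```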
Of course, way back at the beginning it was explained why, in the inner product of a vector V with itself, one copy has to be the complex conjugate (V*) of the other (so that the inner product of a vector with itself is a real number), and in LASGFQM – VI it was explained why multiplying a row vector by a column vector gives a number. Here it is
( y y y )  times  ( z )
                  ( z )   giving the number yz + yz + yz
                  ( z )
So given that < V | V > really means < V* | V > to physicists, the inner product can be regarded as just another form of matrix multiplication, with the row vector being the conjugate transpose of the column vector.
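In numpy terms this looks like the following sketch (the vector's values are made up): the conjugate of V dotted into V itself always comes out real.

```python
import numpy as np

# A complex column vector (hypothetical values)
v = np.array([1 + 2j, 3 - 1j])

# < V | V > : the conjugate transpose of v times v
inner = v.conj() @ v
print(inner)          # (15+0j) -- real, as promised
```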
If you reverse the order of multiplication (column vector first, row vector second), you get an n x n matrix, not a number. It should be pretty clear by now that you can multiply all 3 matrices together (row vector, n x n matrix, column vector) as long as you keep the order correct. After all this huffing and puffing, you wind up with — drum roll — a number, which is in general complex because the vectors of quantum mechanics have complex coefficients (another one of the postulates).
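Here's a small numpy sketch of both orders (the vectors are hypothetical values of my own): row times column gives a single complex number, while column times row gives an n x n matrix (the outer product).

```python
import numpy as np

v = np.array([1 + 1j, 2j])      # think of this as the column vector
w = np.array([3, 1 - 1j])       # think of this as the row vector

# Row times column: one complex number
print(w @ v)                     # (5+5j)

# Column times row: a 2 x 2 matrix, not a number
print(np.outer(v, w))
```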
We’re at a fairly high level of abstraction here. We haven’t chosen a basis, but all vector spaces have one (even infinite dimensional ones). We’ll talk about bases in the next (and probably final) post.
Call the column vector Y, the row vector X, and the matrix M. We have X M Y = some number. It should be clear that it doesn’t matter which two matrices we multiply together first, e.g. (X M) Y = X (M Y).
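With made-up values for the row vector, matrix, and column vector, numpy confirms that the grouping doesn't matter:

```python
import numpy as np

X = np.array([1, 1j])            # row vector (hypothetical values)
M = np.array([[2, 1j],
              [-1j, 3]])
Y = np.array([1j, 1])            # column vector (hypothetical values)

# Associativity: multiply (row x matrix) first, or (matrix x column) first
left  = (X @ M) @ Y
right = X @ (M @ Y)
print(left, right)               # the same complex number both ways
```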
Recall that differentiation and integration are linear operators, so they can be represented by matrices. The wavefunction is represented by a column vector. Various things you want to know (kinetic energy, position) are represented by linear operators in QM.
Here’s the postulate: for a given wavefunction Y, the result of any measurement on it (given by a linear operator M) is always a REAL number, and is given by the conjugate transpose of Y, times M, times Y (the column vector).
You have to accept the postulate (because it works ! ! !) as the QM instructor said many times. Don’t ask how it can be like that (Feynman).
This postulate is all it takes to make the linear transformation M a very special one — namely a Hermitian matrix, with all sorts of interesting properties. Hermite described these matrices in 1855, long before QM. I’ve tried to find out what he was working on, without success. More about the properties of Hermitian matrices next time, but to whet your appetite: if an element of M is written Mij, where i is the row and j is the column, and Mij is a complex number, then Mji is the complex conjugate of Mij. Believe it or not, this all follows from the postulate.
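To see the connection in action, here's a numpy sketch (my own construction, not from the post): a standard way to manufacture a Hermitian matrix is to add a random complex matrix to its own conjugate transpose. The expectation value conjugate-transpose-of-psi times M times psi then comes out real, just as the postulate demands.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a Hermitian matrix: A + (conjugate transpose of A) satisfies
# M[i, j] == conj(M[j, i])
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M = A + A.conj().T

# A random complex "wavefunction" vector
psi = rng.standard_normal(3) + 1j * rng.standard_normal(3)

# < psi | M | psi > : conjugate transpose of psi, times M, times psi
expectation = psi.conj() @ M @ psi
print(expectation.imag)          # essentially zero -- the number is real
```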