The heavy lifting is pretty much done. Now for some fairly spectacular results, and then back to reading Clayden et al. To make things concrete, let Y be a 3-dimensional vector with complex coefficients c1, c2 and c3. The coefficients multiply a set of basis vectors (which exist, since all finite and infinite dimensional vector spaces have a basis). The glory of abstraction is that we don't actually have to worry about what the basis vectors actually are, just that they exist. We are free to use their properties, one of which is orthogonality (I may not have proved this; you should if I haven't). So the column vector is
c1
c2
c3
and the corresponding row vector (the conjugate transpose) is
c1* c2* c3*
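If you like checking this sort of thing numerically, here is a minimal sketch in Python/NumPy; the particular coefficient values are made up, just for illustration.

```python
import numpy as np

# A concrete column vector with made-up complex coefficients c1, c2, c3.
c = np.array([[1 + 2j],
              [3 - 1j],
              [0 + 4j]])

# The corresponding row vector is the conjugate transpose:
# conjugate every entry, then flip column into row.
row = c.conj().T
print(row)          # [[1.-2.j 3.+1.j 0.-4.j]]
```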
Next, I'm going to write a corresponding Hermitian matrix M as follows, where Aij is an arbitrary complex number.
A11 A12 A13
A21 A22 A23
A31 A32 A33
Now form the product
( c1* c2* c3* )  x  A11 A12 A13  =  ( X Y Z )
                    A21 A22 A23
                    A31 A32 A33
The net effect is to form another row vector with 3 components. All we need for what I want to prove is an explicit formula for X
X = c1*(A11) + c2*(A21) + c3*(A31)
When we multiply the row vector obtained by the column vector on the right we get
c1 [ c1*(A11) + c2*(A21) + c3*(A31) ] + c2 [ Y ] + c3 [ Z ] — which by assumption must be a real number
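If you want to check this numerically, here is a minimal sketch in Python/NumPy; the values of c1, c2, c3 and the Aij are made up, chosen only so that the matrix really is Hermitian.

```python
import numpy as np

# Made-up coefficients c1, c2, c3 and a made-up Hermitian matrix (Aij = Aji*, real diagonal).
c = np.array([1 + 2j, 3 - 1j, 0 + 4j])
M = np.array([[ 2.0,      1 + 1j,   0 - 2j  ],
              [ 1 - 1j,  -1.0,      3 + 0.5j],
              [ 0 + 2j,   3 - 0.5j, 4.0     ]])

# X as written above: X = c1*(A11) + c2*(A21) + c3*(A31)
X = c[0].conj() * M[0, 0] + c[1].conj() * M[1, 0] + c[2].conj() * M[2, 0]
print(np.isclose(X, (c.conj() @ M)[0]))    # True: first entry of (row vector times M)

# The full product (row vector) M (column vector) comes out real for this Hermitian M.
full = c.conj() @ M @ c
print(np.isclose(full.imag, 0.0))          # True
```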
Next, form the product of M with the column vector
A11 A12 A13     c1     X'
A21 A22 A23  x  c2  =  Y'
A31 A32 A33     c3     Z'
This time all we need is X’ which is c1(A11) + c2(A12) + c3(A13)
When we multiply the column vector obtained by the row vector on the left we get
c1* [ c1(A11) + c2(A12) + c3(A13) ] + c2* Y’ + c3* Z’ — the same number as
c1 [ c1*(A11) + c2*(A21) + c3*(A31) ] + c2 [ Y ] + c3 [ Z ]
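The same sort of numerical check works for X' and for the claim that the two products are the same number (same made-up c and M as in the earlier sketch).

```python
import numpy as np

# Same made-up c and Hermitian M as before.
c = np.array([1 + 2j, 3 - 1j, 0 + 4j])
M = np.array([[ 2.0,      1 + 1j,   0 - 2j  ],
              [ 1 - 1j,  -1.0,      3 + 0.5j],
              [ 0 + 2j,   3 - 0.5j, 4.0     ]])

# X' as written above: X' = c1(A11) + c2(A12) + c3(A13)
X_prime = c[0] * M[0, 0] + c[1] * M[0, 1] + c[2] * M[0, 2]
print(np.isclose(X_prime, (M @ c)[0]))     # True: first entry of (M times the column vector)

# Multiplying row-vector-first or column-vector-first gives the same number.
print(np.isclose((c.conj() @ M) @ c, c.conj() @ (M @ c)))   # True
```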
Notice that c1, c2, c3 can each be any of the infinite number of complex numbers, without disturbing the equality. The ONLY way this can happen is if
c1*[c1(A11)] = c1[c1*(A11)] — this is obviously true
and c1*[c2(A12)] = c1[c2*(A21)] (something fishy)
and c1*[c3(A13)] = c1[c3*(A31)] (ditto)
The last two equalities look a bit strange. If you go back to LASGFQM – II, you will see that c1*(c2) does NOT equal c1(c2*). However,
c1*(c2) does equal [ c1 (c2*) ]*. They aren't the same, but at least they are complex conjugates of each other. This means that to make
c1*[c2(A12)] = c1[c2*(A21)] hold, we need A12 = A21*, or A12* = A21, which is the same thing.
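One way to see numerically what the condition A12 = A21* buys you is the sketch below (made-up numbers): with A21 = A12* the two off-diagonal cross terms add up to a real number for any choice of c1 and c2, while with an unrelated A21 the imaginary parts generally don't cancel.

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_terms(c1, c2, A12, A21):
    # The two off-diagonal cross terms that appear in the product above.
    return c1.conjugate() * c2 * A12 + c1 * c2.conjugate() * A21

A12 = 2 - 3j                                   # a made-up off-diagonal entry
for _ in range(5):
    c1, c2 = rng.normal(size=2) + 1j * rng.normal(size=2)
    with_condition    = cross_terms(c1, c2, A12, A12.conjugate())  # A21 = A12*
    without_condition = cross_terms(c1, c2, A12, 1 + 1j)           # an unrelated A21
    # With A21 = A12* the sum is real for any c1, c2; otherwise it usually is not.
    print(np.isclose(with_condition.imag, 0.0),
          np.isclose(without_condition.imag, 0.0))
```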
So just by following the postulate of quantum mechanics about the type of linear transformation (called Hermitian) which can result in a measurement, we find that the matrix representing the linear transformation, the Hermitian matrix, has the property that Mij = Mji* (the first subscript is the row index and the second is the column index). This also means that the diagonal elements of any Hermitian matrix are real (since Mii = Mii*). Now when I first bumped up against Hermitian matrices they were DEFINED this way, making them seem rather magical. Hermitian matrices are in fact natural, and they do just what quantum mechanics wants them to do.
Some more nomenclature: Mij = Mji* means that a Hermitian matrix equals its conjugate transpose (which is another even more obscure way to define them). The conjugate transpose of a matrix is called the adjoint. This means that the row vector as we’ve defined it is the adjoint of the column vector. This also is why Hermitian matrices are called self-adjoint.
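In code the nomenclature collapses to a single check: a matrix is Hermitian (self-adjoint) exactly when it equals its conjugate transpose. A minimal sketch with a made-up matrix:

```python
import numpy as np

# A made-up Hermitian matrix.
M = np.array([[ 2.0,      1 + 1j,   0 - 2j  ],
              [ 1 - 1j,  -1.0,      3 + 0.5j],
              [ 0 + 2j,   3 - 0.5j, 4.0     ]])

adjoint = M.conj().T                          # the conjugate transpose of M

print(np.allclose(M, adjoint))                # True: M equals its adjoint, i.e. M is Hermitian
print(np.allclose(M.diagonal().imag, 0.0))    # True: the diagonal entries are real
```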
That's about it. Hopefully when you see this stuff in the future, you won't be just mumbling incantations. But perhaps you are wondering, where are the eigenvectors, where are the eigenvalues in all this? What happened to the Schrödinger equation beloved in song and story? That's for the course you're taking, but briefly and without explanation, the basis vectors I've been talking about (without explicitly describing them) all result as follows:
Any Hermitian operator times wavefunction = some number times same wavefunction. [1]
Several points: many Hermitian operators change one wavefunction into another, so [1] doesn't always hold.
IF [1] does hold, the wavefunction is called an eigenfunction, and 'some number' is the eigenvalue.
There is usually a set of eigenfunctions for a given Hermitian operator; these are the basis functions (basis vectors of the infinite-dimensional Hilbert space) of the vector space I was describing. You find them by solving the Schrödinger equation H Psi = E Psi, but that's for your course; at least now you know the lingo. Hopefully, these last few words are less frustrating than the way Tom Wolfe ended "The Bonfire of the Vanities" years ago: the book just stopped rather than ended.
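If you want to play with a finite-dimensional stand-in for [1], NumPy's eigh routine (its eigensolver for Hermitian matrices) hands back real eigenvalues and an orthonormal set of eigenvectors. A sketch with a made-up Hermitian matrix:

```python
import numpy as np

# A made-up Hermitian matrix standing in for the operator in [1].
M = np.array([[ 2.0,      1 + 1j,   0 - 2j  ],
              [ 1 - 1j,  -1.0,      3 + 0.5j],
              [ 0 + 2j,   3 - 0.5j, 4.0     ]])

# eigh is NumPy's eigensolver for Hermitian matrices.
eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)                            # all real, as expected for a Hermitian matrix

# Each column v satisfies M v = (eigenvalue) v, the matrix version of [1].
for value, v in zip(eigenvalues, eigenvectors.T):
    print(np.allclose(M @ v, value * v))      # True

# The eigenvectors form an orthonormal basis: V* V = identity.
print(np.allclose(eigenvectors.conj().T @ eigenvectors, np.eye(3)))   # True
```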
I thought the course I audited was excellent, but we never even got into bonding. Nonetheless, I think the base it gave was quite solid and it's time to find out. Michelle Francl recommended "Modern Quantum Chemistry" by Attila (yes, Attila!) Szabo and Neil Ostlund as the next step. You can't beat the price, as it's a Dover paperback. I've taken a brief look at "Molecular Quantum Mechanics" by Atkins and Friedman; it starts with the postulates and moves on from there. Plenty of pictures and diagrams, but no idea how good it is. Finally, 40 years ago I lived across the street from a Physics grad student (whose name I can't recall), and the real hot stuff back then was a book by Prugovecki called "Quantum Mechanics in Hilbert Space". Being a pack rat, I still have it. We'll see.
One further point. I sort of dumped on Giancoli's book on Physics, which I bought when the course was starting up in 9/09 (pretty pictures and all that). Having been through the first 300 pages or so (all on mechanics), I must say it's damn good. The pictures are appropriate, the diagrams well thought out, the exposition clear and user friendly without being sappy.
Time to delve.
Amen Selah
Comments
9 AM EST 5 Feb ’10
Apologies to those who looked at this post earlier. One of the crucial points was written so it didn’t make sense. Here is the corrected version
c1*(c2) does equal [ c1 (c2* ) ]*.
Earlier I had left out a parenthesis so it looked like this
c1*(c2) does equal c1 (c2* ) )* which makes no sense at all.
Sorry
It is not necessarily true that the eigenvectors of a particular operator form a basis for the vector space on which they operate. For example, consider a rotation operator in R^3. These operators will have only one eigenvector (the vector along the principal axis of the rotation), so their eigenvectors clearly do not form a basis for R^3.
However, an important theorem in linear algebra, the spectral theorem, states that you can construct an orthonormal basis from eigenvectors of any self-adjoint operator. This fact is crucial for many proofs in quantum mechanics which rely on breaking arbitrary wavefunctions down into a sum of eigenstates of the Hamiltonian.
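In matrix terms that decomposition can be sketched numerically; the Hermitian matrix and the "state" vector below are made up, and NumPy's eigh supplies the orthonormal eigenvectors.

```python
import numpy as np

# A made-up Hermitian matrix and a made-up "state" vector.
M = np.array([[ 2.0,      1 + 1j,   0 - 2j  ],
              [ 1 - 1j,  -1.0,      3 + 0.5j],
              [ 0 + 2j,   3 - 0.5j, 4.0     ]])
psi = np.array([1 - 1j, 2 + 0.5j, 0 - 3j])

eigenvalues, V = np.linalg.eigh(M)            # columns of V: orthonormal eigenvectors

coefficients = V.conj().T @ psi               # components of psi in the eigenvector basis
print(np.allclose(V @ coefficients, psi))     # True: psi is a sum of the eigenvectors
```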
Yggdrasil — thanks. We spent a fair amount of time in class undegenerating degenerate eigenfunctions. I should have mentioned this.
Szabo/Ostlund is the only QC book I know which delves into LA. Speaking of Dover, you may know that Pauling/Wilson is also a Dover paperback and cheap.