Tag Archives: Richard Feynman

The Pleasures of Reading Feynman on Physics – IV

Chemists don’t really need to know much about electromagnetism.  Understand Coulombic forces between charges and you’re pretty much done.   You can use NMR easily without knowing much about magnetism aside from the shielding of the nucleus from a magnetic field by  charge distributions and ring currents. That’s  about it.  Of course, to really understand NMR you need the whole 9 yards.

I wonder how many chemists actually have gone this far.  I certainly haven’t.  Which brings me to volume II of the Feynman Lectures on Physics which contains over 500 pages and is all about electromagnetism.

Trying to learn relativity taught me that the way Einstein got into it was figuring out how to transform Maxwell’s equations correctly (James J. Callahan, “The Geometry of Spacetime,” pp. 22 – 27).  Using the Galilean transformation (which just adds velocities), an observer moving at constant velocity gets a different set of Maxwell equations, which according to the Galilean principle of relativity (yes, Galileo got there first) shouldn’t happen.

Lorentz figured out a mathematical kludge so Maxwell’s equations transformed correctly, but it was just that,  a kludge.  Einstein derived the Lorentz transformation from first principles.

Feynman back in the 60s realized that the entering 18 year olds had heard of relativity and quantum mechanics.  He didn’t like watching them being turned off to physics by studying how blocks travel down inclined planes for 2 or more years before getting to the good stuff (e.g. relativity, quantum mechanics).  So there is special relativity (no gravity) starting in volume I, lecture 15 (p. 138), including all the paradoxes, time dilation, length contraction, a very clear explanation of the Michelson–Morley experiment, etc.

Which brings me to volume II, which is also crystal clear and contains all the vector calculus (in 3 dimensions anyway) you need to know.  As you probably know, a moving charge produces a magnetic field, and a magnetic field exerts a force on a moving charge.

Well and good, but on p. 144 Feynman asks you to consider 2 situations:

1. A stationary wire carrying a current, and a moving charge outside the wire — because the charge is moving, a magnetic force is exerted on it, causing the charge to move toward the wire (circle it, actually)

2. A stationary charge and a moving wire carrying a current

Paradox — since the charge isn’t moving there should be no magnetic force on it, so it shouldn’t move.

Then Feynman uses relativity to produce an electric force on the stationary charge so that it moves.  The world does not come equipped with coordinates, and any reference frame you choose should give you the same physics.

He has to use the length (Fitzgerald) contraction of a moving object (relativistic effect #1) and the time dilation of a moving object (relativistic effect #2) to produce  an electric force on the stationary charge.

It’s a tour de force and explains how electricity and magnetism are parts of a larger whole (electromagnetism).  Keep the charge from moving and you see only electric forces, let it move and you see only magnetic forces.  Of course there are reference frames where you see both.
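The two-frame bookkeeping can actually be checked numerically. Below is a minimal sketch (mine, not Feynman’s) with made-up numbers: a lab-neutral wire with ions at rest and electrons drifting, plus a test charge moving parallel to it. The lab frame sees a purely magnetic force; the charge’s rest frame sees a purely electric one from the length-contracted charge densities, and the two agree up to the standard transverse-force factor gamma.

```python
import math

c    = 299_792_458.0            # speed of light, m/s
mu0  = 4e-7 * math.pi           # vacuum permeability
eps0 = 1.0 / (mu0 * c**2)       # vacuum permittivity

def gamma(s):
    return 1.0 / math.sqrt(1.0 - (s / c) ** 2)

# Hypothetical numbers (not Feynman's): ions at rest with line density
# +lam, electrons drifting at u with -lam (wire neutral in the lab), and
# a test charge q moving at v parallel to the wire, a distance r away.
lam, u, v = 1e-9, 0.2 * c, 0.1 * c
q, r = 1.6e-19, 0.01

# Lab frame: the wire is neutral, so the force is purely magnetic.
I     = lam * u                          # current magnitude
B     = mu0 * I / (2 * math.pi * r)
F_lab = q * v * B

# Charge's rest frame: ions now move at v, electrons at the relativistic
# velocity difference u'.  Length contraction rescales each line density
# by the appropriate gamma, and the wire is no longer neutral.
u_rel   = (u - v) / (1 - u * v / c**2)
lam_net = lam * gamma(v) - (lam / gamma(u)) * gamma(u_rel)

# The leftover line charge makes a purely electric force in this frame.
E       = lam_net / (2 * math.pi * eps0 * r)
F_frame = q * E

# Transverse forces transform as F' = gamma(v) * F, so the frames agree:
print(F_frame / (gamma(v) * F_lab))   # ~1.0
```

The absurdly large drift speed just makes the effect big enough to see in floating point; real electron drift velocities are millimeters per second, which is what makes the relativistic origin of magnetism so surprising.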


The pleasures of reading Feynman on Physics — III

The more I read volume III of the Feynman Lectures on Physics, about quantum mechanics, the better I like it.  Even though I took two courses in it, 60 and 10 years ago, Feynman takes a completely different tack from either, plunging directly into what makes quantum mechanics different from anything else.

He starts by saying “Traditionally, all courses in quantum mechanics have begun in the same way, retracing the path followed in the historical development of the subject.  One first learns a great deal about classical mechanics so that he will be able to understand how to solve the Schrodinger equation.  Then he spends a long time working out various solutions.  Only after a detailed study of this equation does he get to the advanced subject of the electron’s spin.”

Not to worry, he gets to the Hamiltonian on p. 85 and the Schrodinger equation on p. 224.   But he is blunt about it: “We do not intend to have you think we have derived the Schrodinger equation but only wish to show you one way of thinking about it.  When Schrodinger first wrote it down, he gave a kind of derivation based on some heuristic arguments and some brilliant intuitive guesses.  Some of the arguments he used were even false, but that does not matter.”

When he gives the correct law of physics for a particle moving freely in space with no forces, no disturbances (basically the Hamiltonian), he says “Where did we get that from?  Nowhere. It’s not possible to derive it from anything you know.  It came out of the mind of Schrodinger, invented in his struggle to find an understanding of the experimental observations of the real world.”  How can you not love a book written like this?

Among the gems are the way the conservation laws of physics arise in a very deep sense from symmetry (although he doesn’t mention Noether’s name).   He shows that atoms radiate photons because of entropy (p. 69).

Then there is his blazing honesty “when philosophical ideas associated with science are dragged into another field, they are usually completely distorted.”  

He spends a lot of time on the Stern Gerlach experiment and its various modifications and how they put you face to face with the bizarrities of quantum mechanics.

He doesn’t shy away from dealing with ‘spooky action at a distance’ although he calls it the Einstein Podolsky Rosen paradox.  He shows why if you accept the way quantum mechanics works, it isn’t a paradox at all (this takes a lot of convincing).

He ends up with “Do you think that it is not a paradox, but that it is still very peculiar?  On that we can all agree. It is what makes physics fascinating”

There are tons more, but I hope this whets your appetite.

The pleasures of reading Feynman on Physics – II

If you’re tired of hearing and thinking about COVID-19 24/7 even when you don’t want to, do what I did when I was a neurology resident 50+ years ago, making clever diagnoses and then standing helplessly by while patients died.  Back then I read topology, and the intense concentration required to absorb and digest the terms and relationships took me miles and miles away.  The husband of one of my interns was a mathematician, and she said he would dream about mathematics.

Presumably some of the readership are chemists with graduate degrees, meaning that part of their acculturation as such was a course in quantum mechanics.  Back in the day it was pretty much required of chemistry grad students — homage à Prof. Marty Gouterman, who taught the course to us 3 years out from his PhD in 1961.  Definitely a great teacher.  Here he is now, a continent away — http://faculty.washington.edu/goutermn/.

So for those happy souls I strongly recommend volume III of The Feynman Lectures on Physics.  Equally strongly do I recommend getting the Millennium Edition which has been purged of the 1,100 or so errors found in the 3 volumes over the years.

“Traditionally, all courses in quantum mechanics have begun in the same way, retracing the path followed in the historical development of the subject.  One first learns a great deal about classical mechanics so that he will be able to understand how to solve the Schrodinger equation.  Then he spends a long time working out various solutions.  Only after a detailed study of this equation does he get to the advanced subject of the electron’s spin.”

The first half of volume III is about spin.

Feynman doesn’t even get to the Hamiltonian until p. 88.  I’m almost halfway through volume III and there has been no sighting of the Schrodinger equation so far.  But what you will find are clear explanations of bosons and fermions and why they are different, how masers and lasers operate (they are two-state spin systems), how one electron holds two protons together, and a great explanation of covalent bonding.  Then there is great stuff beyond the ken of most chemists (at least this one), such as the Yukawa explanation of the strong nuclear force, and why neutrons and protons are really the same.  If you’ve read about Bell’s theorem proving that ‘spooky action at a distance’ must exist, you’ll see where the numbers come from quantum mechanically that are simply impossible on a classical basis.  Zeilinger’s book “The Dance of the Photons” goes into this using 0.75 (which Feynman shows is just cos(30°)^2).
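Zeilinger’s 0.75 is easy to check for yourself. Here is a sketch under the standard assumption that entangled photons measured with analyzers at relative angle theta agree with probability cos² theta (my gloss on the usual setup, not a formula quoted from Feynman):

```python
import math

# Quantum prediction for the standard polarization-correlation setup:
# the two photons agree with probability cos^2(theta), where theta is
# the relative angle between the two analyzers.
def agreement(theta_deg):
    return math.cos(math.radians(theta_deg)) ** 2

print(agreement(30))   # ~0.75 -- Zeilinger's number
print(agreement(0))    # 1.0 -- same angle, perfect correlation
```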

Although Feynman doesn’t make much of a point about it, the essentiality of ‘imaginary’ numbers (complex numbers) to the entire project of quantum mechanics impressed me.  Without them,  wave interference is impossible.
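A toy version of that point: with complex amplitudes, two equal paths can cancel or reinforce depending on their relative phase, which ordinary positive probabilities could never do. A minimal sketch (mine, not Feynman’s):

```python
import cmath
import math

# Two paths with equal amplitude 1/2 and a relative phase phi.  With
# complex amplitudes the probability |a1 + a2|^2 oscillates with phi;
# adding plain real probabilities (1/4 + 1/4) gives 1/2 no matter what.
def detection_probability(phi):
    a1 = 0.5
    a2 = 0.5 * cmath.exp(1j * phi)
    return abs(a1 + a2) ** 2

print(detection_probability(0))        # 1.0 -- constructive interference
print(detection_probability(math.pi))  # ~0.0 -- destructive interference
```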

I’m far from sure a neophyte could actually learn QM from Feynman, but having mucked about using and being exposed to QM and its extensions for 60 years, Feynman’s development of the subject is simply a joy to read.

So get the 3 volumes and plunge in.  You’ll forget all about the pandemic (for a while anyway).


The pleasures of reading Feynman on Physics

“Traditionally, all courses in quantum mechanics have begun in the same way, retracing the path followed in the historical development of the subject.  One first learns a great deal about classical mechanics so that he will be able to understand how to solve the Schrodinger equation.  Then he spends a long time working out various solutions.  Only after a detailed study of this equation does he get to the advanced subject of the electron’s spin.”

From vol. III of the Feynman lectures on physics  p. 3 – 1.

Certainly that’s the way I was taught QM as a budding chemist in 1961. Nothing wrong with that.  For a chemist it is very useful to see how all those orbitals pop out of series solutions to the Schrodinger equation for the hydrogen atom.
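The simplest of those orbitals, the hydrogen 1s, comes out of the series solution as R10(r) = 2 exp(−r) in units of the Bohr radius. A quick numerical check that it is properly normalized (my sketch, not from the lectures):

```python
import math

# Ground-state (1s) hydrogen radial function, Bohr radius set to 1:
# R_10(r) = 2 exp(-r).  Normalization means the integral of
# R^2 * r^2 dr from 0 to infinity equals 1.
def R10(r):
    return 2.0 * math.exp(-r)

# Crude left-point Riemann sum out to r = 30 (the tail is negligible).
dr = 1e-4
norm = sum(R10(i * dr) ** 2 * (i * dr) ** 2 * dr for i in range(1, 300_000))
print(norm)   # ~1.0
```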

“We have come to the conclusion that what are usually called the advanced parts of quantum mechanics are in fact, quite simple. The mathematics that is involved is particularly simple, involving simple algebraic operations and no differential equations or at most only very simple ones.”

Quite true, but when, 50 years or so later, I audited a QM course at an elite women’s college, the underlying linear algebra wasn’t taught — so I wrote a series of posts giving the basics of the linear algebra used in QM — start at https://luysii.wordpress.com/2010/01/04/linear-algebra-survival-guide-for-quantum-mechanics-i/ and follow the links (there are 8 more posts).

Even more interesting was the way Mathematica had changed the way quantum mechanics was taught — see https://luysii.wordpress.com/2009/09/22/what-hath-mathematica-wrought/

But back to Feynman:  I’m far from sure a neophyte could actually learn QM this way, but having mucked about using and being exposed to QM and its extensions for 60 years, I find Feynman’s development of the subject simply a joy to read. Feynman starts out as a good physicist should, with the experiments.  Nothing fancy: bullets are shot at a screen through a slit, then through two slits, then the same with electrons, along with the various conundrums that arise when one slit is closed.

Onward and upward through the Stern–Gerlach experiments and how matrices are involved (although Feynman doesn’t call them that).  The only flaw in what I’ve found so far is his treatment of phase factors (p. 4-1).  They aren’t really defined, but they are crucial, as phase factors are what separate the objects of physics into fermions and bosons.

If you’ve taken any course in QM and have some time (who doesn’t, now that we’re all essentially inmates in our own homes/apartments), then have a look.   You’ll love it.  As Bill Gates said about the books, “It is good to sit at the feet of the master.”

One piece of advice — get the new Millennium edition — it has removed some 1,100 errors and misprints found over the decades, so if you’re studying it by yourself, you won’t be tripped up by a misprint in the text when you don’t understand something.

How can it be like that?

The following quote is from an old book on LISP programming (Let’s Talk LISP) by Laurent Siklossy: “Remember, if you don’t understand it right away, don’t worry. You never learn anything, you only get used to it.”

Unlike quantum mechanics, where Feynman warned never to ask ‘how can it be like that’, those of us in any area of biology should always  be asking ourselves that question.  Despite studying the brain and its neurons for years and years and years, here’s a question I should have asked myself (but didn’t, and as far as I can tell no one has until this paper [ Proc. Natl. Acad. Sci. vol. 117 pp. 4368 – 4374 ’20 ] ).

It’s a simple enough question: how does a neuron know what receptor to put at a given synapse, given that all neurons in the CNS have both excitatory and inhibitory synapses on them?  Had you ever thought about that?  I hadn’t.

Remember many synapses are far away from the cell body.  Putting a GABA receptor at a glutamic acid synapse would be less than useful.

The paper used a rather bizarre system to at least try to answer the question.  Vertebrate muscle cells respond to acetylcholine.  The authors bathed embryonic skeletal muscle cells (before innervation) with glutamic acid, and sure enough, glutamic acid receptors appeared.

There’s a lot in the paper about transcription factors and mechanism, which is probably irrelevant to the CNS (muscle nuclei underlie the neuromuscular junction).   Even if you send receptors for many different neurotransmitters everywhere in a neuron, how is the correct one inserted at a given synapse and the rest not?

I’d never thought of this.  Had you?


Tough 60s chick disappoints woke young interviewer

Quanta — https://www.quantamagazine.org — always has science and math worth reading.   The following is worth reading for the contrasting mindsets of interviewer and interviewee — https://www.quantamagazine.org/virginia-trimble-has-seen-the-stars-20191111/.

Virginia Trimble — now 76 and in declining health — was once young, beautiful, brilliant and a grad student at Caltech in astronomy.  Now a prof at UC Irvine, she was interviewed by a young woman about her experiences.

The interviewer did her best to make her appear exploited, but Trimble was having none of it.

Q. So you get to Caltech, and you’re the only woman —

A. I wasn’t. There was this tiny little cluster of seven of us that went through all at the same time …

She was so gorgeous (see her picture in the article) that Feynman asked her to be a model as (presumably) he was learning to draw, paying her $5.50 an hour.

The interviewer couldn’t contain herself

Q. When you spoke to him, did he treat you collegially?

Trimble, never one to pussyfoot around, responds.

A. The obvious question, did we make love? As it happens, no, but not a big deal. It might just as well have been yes, and we would still have been good friends.

The intrepid interviewer persists.

Q: I want to talk about sexual harassment in academia.

A: I don’t particularly care for the word harassment.

Q: Let’s go with “hanky-panky,” OK?

A: When I was young and beautiful, I engaged in a great deal of hanky-panky. I could never see anything very wrong with it. You may claim that if a very distinguished senior scientist engages in hanky-panky with a much younger scientist — never mind which gender, and make it a conductor and a violinist if you wish, instead of senior and junior scientists — if that results in real injustice to people not involved in the hanky-panky, then, yeah, you have to have a rule against it. But I’m not so sure that that happens very often.

Clearly not the sort of thing the interviewer expected.  She persists and gamely frames a question so she can get the answer she wants.

Q: I just want to clarify, when someone in power makes someone who is subordinate feel that they have to engage in sexual activities for the sake of their job or the sake of their academic career, that is what angers so many people right now.

Trimble is having none of it

A: The step in that syllogism where I part company is the younger — or whatever — person feeling they “have to,” and let me say again that a young attractive female has a lot of power, where older, less attractive men are concerned.

Defeated, the interviewer changes the subject.

Q: What are you most proud of in terms of your scientific work?

There’s a lot of interesting science in the article, but the clash of mindsets interested me more.

Did these guys just repeal the second law of thermodynamics and solve the global warming problem?

Did these guys just repeal the second law of thermodynamics and solve the global warming problem to boot? [ Science vol. 355 pp. 1023 – 1024, 1062 – 1066 ’17 ] Heady stuff, but they put a sheet of metamaterial over water during the day in Arizona and cooled it by 8 degrees Centigrade in two hours!

How did they do it? Time for a little atmospheric physics. There is nothing in the Earth’s atmosphere which absorbs light of wavelength between 8 and 13 microns (this is called the atmospheric window). So anything radiating energy in this range sends it out into space. This is called radiative cooling. It doesn’t work during the day because most materials absorb sunlight in the visible and near infrared range (0.7 – 2.5 microns), heating them up. Solar power density overwhelms the room temperature radiation spectrum shorter than 4 microns. So for daytime cooling you need a material reflecting all the light shorter than 4 microns, while being fully emissive for longer wavelengths.
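Wien’s displacement law makes those numbers concrete (a back-of-envelope sketch of my own, not from the paper):

```python
# Wien's displacement law: lambda_max = b / T, with Wien's constant
# b ~ 2898 micron-kelvin.  A room-temperature object radiates right in
# the 8-13 micron atmospheric window, while the ~5800 K Sun peaks near
# 0.5 micron -- hence: reflect below ~4 microns, emit above.
B_WIEN = 2898.0   # microns * kelvin

def peak_wavelength_microns(t_kelvin):
    return B_WIEN / t_kelvin

print(peak_wavelength_microns(300))    # ~9.7 microns: inside the window
print(peak_wavelength_microns(5800))   # ~0.5 microns: visible sunlight
```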

This work describes a metamaterial — https://en.wikipedia.org/wiki/Metamaterial — in which small (average diameter 4 microns) spheres of SiO2 (glass) are randomly dispersed in a polymer matrix transparent to visible and infrared light. The matrix is 50 microns thick. The whole shebang is backed by a very thin (0.2 micron) silver mirror. So light easily passes through the film and is then bounced back by the mirror without being absorbed.

Chemists have already studied the Carnot cycle, which gives the maximum efficiency of a heat engine. That efficiency grows with the temperature difference between the hot and cold ends of the cycle — which is why the cooling tower is the biggest (and almost the most important) part of a nuclear power plant. Well, few things are colder than the cosmic microwave background (2.7 degrees above absolute zero).
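For concreteness, the Carnot bound in two lines (my sketch, not from the paper):

```python
# Carnot's limit: eta_max = 1 - T_cold / T_hot, temperatures in kelvin.
# The bigger the spread, the more of the heat flow you can turn to work.
def carnot_efficiency(t_hot, t_cold):
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5 -- a typical steam-cycle spread
print(carnot_efficiency(600.0, 2.7))    # 0.9955 -- rejecting heat to the CMB
```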

So while the entropy of the universe increases as the heat goes somewhere, locally it looks like the second law of thermodynamics is being violated. No work is done (as far as I can tell), yet the objects spontaneously cool.

Perhaps the physics mavens out there can help. I seem to remember Feynman and Wheeler once saying something to the effect that radiation is impossible without something around to absorb it. If I haven’t totally garbled the physics, it almost sounds like emitter and absorber are entangled.

Anyway beaming heat out into space through the atmospheric window sounds like a good way to combat global warming.

No wonder DARPA supported this research.

SmORFs and DWORFs — has molecular biology lost its mind?

There’s Plenty of Room at The Bottom is a famous talk given by Richard Feynman 56 years ago. He was talking about something not invented until decades later — nanotechnology. He didn’t know that the same advice now applies to molecular biology. The talk itself is well worth reading — here’s the link http://www.zyvex.com/nanotech/feynman.html.

Those not up to speed on molecular biology can find what they need at — https://luysii.wordpress.com/2010/07/07/molecular-biology-survival-guide-for-chemists-i-dna-and-protein-coding-gene-structure/. Just follow the links (there are only 5) in the series.

lncRNA stands for long nonCoding RNA — nonCoding for protein that is. Long is taken to mean over 200 nucleotides. There is considerable debate concerning how many there are — but “most estimates place the number in the tens of thousands” [ Cell vol. 164 p. 69 ’16 ]. Whether they have any cellular function is also under debate. Could they be like the turnings from a lathe, produced by the various RNA polymerases we have (3 actually) simply transcribing the genome compulsively? I doubt this, because transcription takes energy and cells are a lot of things but wasteful isn’t one of them.

Where does Feynman come in? Because at least one lncRNA codes for a very small protein, using a Small Open Reading Frame (smORF) to do so. The protein in question is called DWORF (for DWarf Open Reading Frame). It contains only 34 amino acids. Its function is definitely not trivial. It binds to something called SERCA, a large enzyme in the sarcoplasmic reticulum of muscle which allows muscle to relax after contracting. Muscle contraction occurs when calcium is released from the sarcoplasmic reticulum.  SERCA takes the released calcium back into the sarcoplasmic reticulum, allowing muscle to relax. So repetitive muscle contraction depends on the flow and ebb of calcium tides in the cell. Amazingly there are 3 other small proteins which also bind to SERCA, modifying its function. Their names are phospholamban (no kidding), sarcolipin and myoregulin — also small proteins of 52, 31 and 46 amino acids.

So here is a lncRNA making an oxymoron of its name by actually coding for a protein. DWORF is small, but so are its 3 exons, one of which is only 4 amino acids long. Imagine the gigantic spliceosome — with a mass over 1,300,000 Daltons and 10,574 amino acids making up 37 proteins, along with several catalytic RNAs — being that precise and operating on something that small.

So there’s a whole other world down there which we’ve just begun to investigate. It’s probably a vestige of the RNA world from which life is thought to have sprung.

Then there are the small molecules of intermediary metabolism. Undoubtedly some of them are used for control as well as metabolism. I’ll discuss this later, but the Human Metabolome DataBase (HMDB) has 42,000 entries and METLIN, a metabolic database has 240,000 entries.

Then there is competitive endogenous RNA –https://luysii.wordpress.com/2012/01/29/why-drug-discovery-is-so-hard-reason-20-competitive-endogenous-rnas/

Do you need chemistry to understand this? Yes and no. How the molecules do what they do is the province of chemistry. The description of their function doesn’t require chemistry at all. As David Hilbert said about axiomatizing geometry, you don’t need points, straight lines and planes. You could use tables, chairs and beer mugs. What is important are the relations between them. Ditto for the chemical entities making us up.

I wouldn’t like that.  It’s neat to picture in my mind our various molecular machines, nuts and bolts doing what they do.  It’s a much richer experience.  Not having the background is being chemically blind.  Not a good thing, but better than nothing.

How formal tensor mathematics and the postulates of quantum mechanics give rise to entanglement

Tensors continue to amaze. I never thought I’d get a simple mathematical explanation of entanglement, but here it is. Explanation is probably too strong a word, because it relies on the postulates of quantum mechanics, which are extremely simple but which lead to extremely bizarre consequences (such as entanglement). As Feynman famously said, ‘no one understands quantum mechanics’. Despite that, it has never made a prediction not confirmed by experiment, so the theory is correct even if we don’t understand ‘how it can be like that’. 100 years of correctly predicted experiments are not to be sneezed at.

If you’re a bit foggy on just what entanglement is — have a look at https://luysii.wordpress.com/2010/12/13/bells-inequality-entanglement-and-the-demise-of-local-reality-i/. Even better; read the book by Zeilinger referred to in the link (if you have the time).

Actually you don’t even need all the postulates of quantum mechanics (as given in the book “Quantum Computation and Quantum Information” by Nielsen and Chuang). No differential equations. No Schrodinger equation. No operators. No eigenvalues. What could be nicer for those thirsting for knowledge? Such a deal ! ! ! Just 2 postulates and a little formal mathematics.

Postulate #1: “Associated to any isolated physical system is a complex vector space with inner product (that is, a Hilbert space) known as the state space of the system. The system is completely described by its state vector, which is a unit vector in the system’s state space.” If this is unsatisfying, see the explication on p. 80 of Nielsen and Chuang (where the postulate appears).

Because the linear algebra underlying quantum mechanics seemed to be largely ignored in the course I audited, I wrote a series of posts called Linear Algebra Survival Guide for Quantum Mechanics. The first should be all you need. https://luysii.wordpress.com/2010/01/04/linear-algebra-survival-guide-for-quantum-mechanics-i/ but there are several more.

Even though I wrote a post on tensors, showing how they describe an object independently of the coordinates used to describe it, I didn’t even discuss another aspect of tensors — multilinearity — which is crucial here. The post itself can be viewed at https://luysii.wordpress.com/2014/12/08/tensors/

Start by thinking of a simple tensor as a vector in a vector space. The tensor product is just a way of combining vectors in vector spaces to get another (and larger) vector space. So the tensor product isn’t a product in the sense that multiplication of two objects (real numbers, complex numbers, square matrices) produces another object of exactly the same kind.

So mathematicians use a special symbol for the tensor product — a circle with an x inside. I’m going to use something similar ‘®’ because I can’t figure out how to produce the actual symbol. So let V and W be the quantum mechanical state spaces of two systems.

Their tensor product is just V ® W. Mathematicians can define things any way they want. A crucial aspect of the tensor product is that it is multilinear. So if v and v’ are elements of V, then v + v’ is also an element of V (because two vectors in a given vector space can always be added). Similarly w + w’ is an element of W if w and w’ are. Adding to the confusion when trying to learn this stuff is the fact that all vectors are themselves tensors.

Multilinearity of the tensor product is what you’d think

(v + v’) ® (w + w’) = v ® (w + w’ ) + v’ ® (w + w’)

= v ® w + v ® w’ + v’ ® w + v’ ® w’

You get all 4 tensor products in this case.
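You can see this concretely by realizing the tensor product of two vectors as the Kronecker product (a standard concrete model of ®; the sketch below is mine, not from Nielsen and Chuang):

```python
# Kronecker product: a concrete tensor product of two vectors, giving a
# vector of all pairwise products (dimension = product of dimensions).
def kron(x, y):
    return [xi * yj for xi in x for yj in y]

def vadd(x, y):
    return [xi + yi for xi, yi in zip(x, y)]

v, vp = [1.0, 2.0], [3.0, -1.0]    # v and v' in V
w, wp = [0.5, 4.0], [2.0, 1.0]     # w and w' in W

# Multilinearity: (v + v') (x) (w + w') expands into all 4 simple products.
lhs = kron(vadd(v, vp), vadd(w, wp))
rhs = [sum(t) for t in zip(kron(v, w), kron(v, wp), kron(vp, w), kron(vp, wp))]
print(lhs == rhs)   # True
```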

This brings us to Postulate #2 (actually #4 in the book, on p. 94 — we don’t need the other two — I told you this was fairly simple).

Postulate #2 “The state space of a composite physical system is the tensor product of the state spaces of the component physical systems.”


Where does entanglement come in? Patience, we’re nearly done. One now must distinguish simple from non-simple tensors. Each of the 4 tensor products in the sum on the last line is simple, being the tensor product of two vectors.

What about v ® w’ + v’ ® w?  It isn’t simple, because there is no way to get it by itself as simple_tensor1 ® simple_tensor2, so it’s called a compound tensor. (v + v’) ® (w + w’) is a simple tensor because v + v’ is just another single element of V (call it v”) and w + w’ is just another single element of W (call it w”).

So for the tensor product (v + v’) ® (w + w’), the elements of the two state spaces can be understood as though V has state v” and W has state w”.

v ® w’ + v’ ® w can’t be understood this way. The full system can’t be understood by considering V and W in isolation, i.e. the two subsystems V and W are ENTANGLED.
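For two qubits, the simple-versus-compound distinction can be tested mechanically: write the state’s four coefficients as a 2×2 matrix; the state is a simple (product) tensor exactly when that matrix has rank 1, i.e. zero determinant. A minimal sketch (mine, not from Nielsen and Chuang):

```python
# A two-qubit state is a length-4 vector in V (x) W with coefficients
# a, b, c, d on |00>, |01>, |10>, |11>.  It factors as a simple tensor
# exactly when the matrix [[a, b], [c, d]] has rank 1, i.e. a*d = b*c.
def is_entangled(state):
    a, b, c, d = state
    return abs(a * d - b * c) > 1e-12

product_state = [0.5, 0.5, 0.5, 0.5]           # (|0>+|1>) (x) (|0>+|1>) / 2
bell_state    = [2**-0.5, 0.0, 0.0, 2**-0.5]   # (|00>+|11>) / sqrt(2)

print(is_entangled(product_state))   # False -- the subsystems separate
print(is_entangled(bell_state))      # True  -- they don't
```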

Yup, that’s all there is to entanglement (mathematically at least). The paradoxes of entanglement, including Einstein’s ‘spooky action at a distance’, are left for you to explore — again, Zeilinger’s book is a great source.

But how can it be like that, you ask? Feynman said not to start thinking these thoughts, and if he didn’t know, do you expect a retired neurologist to tell you? Please.