# The time-energy uncertainty relation

Update (14 Oct 2018): the diagram has been converted into a table.

|  | Theory | Best value | OPERA | Hypothesis |
| --- | --- | --- | --- | --- |
| Photon-mass deviation | $0 \hspace{3pt}eV$ | $10^{-18} \hspace{3pt}eV$ | $10^{-8}\hspace{3pt}eV$ | $2 \hspace{3pt}eV$ |
| "Time window" according to the Heisenberg uncertainty principle | $\infty$ | $4.1 \times 10^{3} \hspace{3pt}s$ | $4.1 \times 10^{-7}\hspace{3pt}s$ | $2.1 \times 10^{-15}\hspace{3pt}s$ |
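The second row of the table follows from the first via the uncertainty relation, $\Delta T = h/\Delta E$. A minimal sketch in Python (the value of $h$ in $eV \cdot s$ is the one quoted later in this post) that reproduces the time windows:

```python
H_EV_S = 4.136e-15  # Planck constant h in eV·s, as quoted later in this post

def time_window(delta_e_ev):
    """Minimum time window Delta-T = h / Delta-E allowed by the
    time-energy uncertainty relation, for an energy width in eV."""
    if delta_e_ev == 0:
        return float("inf")  # a strictly zero energy width allows an infinite window
    return H_EV_S / delta_e_ev

# photon-mass deviations from the table, in eV
for dm in (0.0, 1e-18, 1e-8, 2.0):
    print(f"{dm:g} eV -> {time_window(dm):g} s")
```

For the nonzero masses this gives roughly $4.1\times10^{3}\hspace{3pt}s$, $4.1\times10^{-7}\hspace{3pt}s$ and $2.1\times10^{-15}\hspace{3pt}s$ respectively.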

This article, together with the one I posted just before it, is my surmised answer to why the OPERA experiment was capable of measuring the neutrinos pacing beyond the photons. The answer is the quantum mechanical energy-time uncertainty principle: the present errors and uncertainties on the neutrino mass allow, in principle, a time measurement to a precision of about $10^{-21} \hspace{3pt}s$; OPERA did just $10^{-9} \hspace{3pt}s$.

Nature says, I am quoting, attention please: "The limits imposed on theory and experiment by human beings are good enough for them to track down the photon, and anything else, by using neutrinos, to about $10^{-21}$ of a second; OPERA did just a $10^{-9}$th of a second."

The time-energy uncertainty relation is a blessing in disguise: it comes in handy for checking various quoted values, to see whether something is inconsistent. It is very powerful in guiding us to check whether we are ourselves doing something silly.

I have described why in two recent articles; I will link them later.

1. One must be careful about which energy and which time one is relating. One does not simply take any time and any energy and form the relation; in fact, one can tell a good physicist from a novice by seeing how he uses this relation.

Landau joked about this: I can measure the energy and then look at my watch; time is just a parameter. But Einstein and Niels Bohr argued that during a very short time interval one must be careful about what energy is allowed and what is not; there is a constraint on the windows of errors, or uncertainties.

2. Lifetimes are random variables, as are energies. Their mean values are not necessarily linked inversely; it is the uncertainty relation itself that links the error windows, and those are what are inversely related.

So watch out for how much inconsistency there is in the average description, e.g. in Wikipedia and even in our textbooks. These train future physicists very wrongly. One needs the experience of solving good problems, and one should work in experiments of the highest standard and understand them.

Then this practice makes the case easy. Feynman said, "It is safe to say no one understands quantum mechanics." I say: that, plus practice, makes it easier to control the inconsistencies we introduce when applying quantum mechanics. Problems, and only problems, solving them, can give one a feel for what is being talked about.

So get good training in quantum mechanics, i.e. coursework in undergrad and grad school; solve real-life problems and learn from good sources.

Why is no one telling us that the photon cannot have exactly zero mass, as per the quantum mechanical uncertainty principle? Possibly no one thought of it. Competent physicists know this in some way or other, even if they do not think it explicitly; that is why I am mentioning it.

The photon really cannot have an absolutely zero mass, but an extremely small mass is allowed, so as to allow a corresponding error on time. E.g., I wrote an article yesterday noting that the photon mass is experimentally constrained to between $10^{-27}\hspace{3pt}eV$ and $10^{-7}\hspace{3pt}eV$. If you take that as the width on the mass, or energy, it is essentially $10^{-7}\hspace{3pt}eV$, because the lower end is negligible by comparison.

So $\Delta E \times \Delta T \geq h$, where $h = 4.136 \times 10^{-15} \hspace{3pt} eV \cdot s$.

$\Delta E = 10^{-7}\hspace{3pt} eV$ then gives a $\Delta T$ of about 41 nanoseconds. In other words, this is not the lifetime of the photon; it says the photon lives at least this long if we are to measure its energy to $10^{-7}\hspace{3pt} eV$.

*Note* That is, about 41 nanoseconds or more. The actual lifetime we quote for the photon is an infinitely large mean, inferred from detailed measurements; here I am arguing only from the basic idea.
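As a quick check of this arithmetic (a sketch, using the value of $h$ quoted in this post):

```python
H_EV_S = 4.136e-15  # Planck constant h in eV·s

delta_e = 1e-7  # photon-mass (energy) width in eV, from the bound quoted above
delta_t = H_EV_S / delta_e  # minimum time window in seconds
print(f"{delta_t * 1e9:.2f} ns")  # prints "41.36 ns"
```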

Since we have that capability of time precision, we have been able to measure the photon mass down to much smaller limits.

*Note* The present best limit on the photon mass is $10^{-18}\hspace{3pt} eV$; hence we allow a correspondingly large error on time. This is what I described in the article "Why do scientists think the photon is massless?": the GHz-frequency measurement of the photon mass entailed a large error on the photon mass, of order $10^{-7}\hspace{3pt} eV$. Note that a GHz is the inverse of a nanosecond.

Now, the OPERA neutrino experiment measured time at the nanosecond level, which by the same relation corresponds to an energy uncertainty of order $10^{-6}\hspace{3pt} eV$. That is, this measurement of OPERA, with the inherent methods of the experiment, could not have dealt with a situation in which the neutrinos had an energy uncertainty smaller than this; that would have required a better time measurement than a nanosecond.

*Note* In actuality they probably had a better time resolution, but given that neutrino mixing implied a far worse energy uncertainty, the timing did not need to be more accurate for the experiment to be precise and accurate. The good statistics made the measurement a discovery, and the good control over systematic errors made it a robust and valid measurement. The other issues, whether there is an anomaly and whether everything is free from error, will be checked again, by them or by others, as is needed.

But it is clear that an error as large as an MeV in the neutrino mass makes the measurement sensitive, in theory, to times of order $4 \times 10^{-21}\hspace{3pt}s$, a few thousandths of an attosecond.
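The two correspondences used in this and the previous paragraphs, nanosecond timing versus energy width, and an MeV width versus sub-attosecond timing, can be sketched the same way (again assuming $\Delta E \times \Delta T \approx h$):

```python
H_EV_S = 4.136e-15  # Planck constant h in eV·s

def energy_width(delta_t_s):
    """Energy width Delta-E = h / Delta-T paired with a timing resolution in s."""
    return H_EV_S / delta_t_s

def time_window(delta_e_ev):
    """Time window Delta-T = h / Delta-E paired with an energy width in eV."""
    return H_EV_S / delta_e_ev

print(energy_width(1e-9))  # nanosecond timing -> a few times 1e-6 eV
print(time_window(1e6))    # a 1 MeV width -> about 4e-21 s
```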

How precise our clocks are, and how much the time resolution is smeared in our experimental setup, is a matter of technology and research; but nature is kind down to far below the attosecond level, even with MeV errors on the rest mass of the particle.

The good point of OPERA, which I did not understand in the beginning but now see (I have friends in the blogosphere who are affected by the same syndrome), is this: it is just a speed measurement. So establish a baseline, let the two participants run along it, and measure your time to the desired level of accuracy, by allowing yourself the corresponding level of uncertainty on the energy.

Once the speed has been measured, we have proved our point beyond doubt and uncertainty. As is customary, the next thing in line is the "anomaly". The anomaly has to be addressed with new measurement techniques or with external, independent experiments. But this is a very clever measurement, an example of how experiments are carried out in the best traditions of scientific practice.
