Is the famous uncertainty relation wrong? Here is how!

2-minute read

|Difficulty level: Medium|

The most popular form of the uncertainty relation, the one we study in our textbooks, goes roughly as follows. It states that the product of the standard deviations of two quantities (more precisely, two non-commuting observables, say position and momentum) is greater than or equal to some definite quantity. In other words, we cannot measure both of these observables precisely at the same time: if we measure one of them precisely (say position), the other (momentum) will come out with a very large standard deviation. The more general form of the famous uncertainty principle looks like this:

For two operators A and B,

\(\Delta_{\psi}A \Delta_{\psi}B \geq {1\over 2} \vert \left<{[A,B]}\right>_{\psi}\vert \)

where \(\Delta_{\psi}A\), for example, denotes the standard deviation of observable A for state \(\psi \).
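To see this relation in action, here is a minimal numerical sketch in Python (numpy assumed; the helper `std_dev` is just an illustrative name, not a standard function) that checks the inequality for the Pauli matrices \(\sigma_x\) and \(\sigma_z\) and a randomly chosen qubit state:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # observable A = sigma_x
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # observable B = sigma_z

rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                      # a random normalized qubit state

def std_dev(op, state):
    """Standard deviation of an observable in a given state."""
    mean = np.vdot(state, op @ state).real
    mean_sq = np.vdot(state, op @ op @ state).real
    return np.sqrt(mean_sq - mean**2)

lhs = std_dev(X, psi) * std_dev(Z, psi)
commutator = X @ Z - Z @ X                      # the commutator [A, B]
rhs = 0.5 * abs(np.vdot(psi, commutator @ psi))

print(lhs, ">=", rhs)   # the product of deviations never falls below the bound
```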

What if I told you that the above is not the correct way to express the uncertainty principle? Many physicists have criticised this inequality. The problem is the lower bound of the uncertainty product: the right-hand side of the inequality is not a fixed value but depends on the state \(\psi \). Physicists would prefer a lower bound that does not depend on the state itself. (Note that the left-hand side of the uncertainty relation depends on the state \(\psi \) as well, of course.)

How is this bound dependent on \(\psi \)?

Consider this situation. Suppose the state \(\psi \) is an eigenvector of one of the operators (let's say A). In that case, the standard deviation of A is exactly 0, so the product on the left-hand side is 0, and the right-hand side must then be 0 as well; indeed, \(\left<{[A,B]}\right>_{\psi}\) vanishes whenever \(\psi \) is an eigenstate of A. Here is the weird part: with the bound reduced to 0 and the standard deviation of A already 0, the inequality no longer tells us anything about the standard deviation of the other operator B.
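We can watch this degenerate case happen numerically. The sketch below (same assumptions as the one above) feeds an eigenvector of \(\sigma_z\) into both sides:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 0], dtype=complex)           # an eigenvector of sigma_z

def std_dev(op, state):
    mean = np.vdot(state, op @ state).real
    return np.sqrt(np.vdot(state, op @ op @ state).real - mean**2)

print(std_dev(Z, psi) * std_dev(X, psi))               # 0.0 -- the left-hand side
print(0.5 * abs(np.vdot(psi, (Z @ X - X @ Z) @ psi)))  # 0.0 -- the bound
print(std_dev(X, psi))                                 # 1.0, completely unconstrained
```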

To overcome this confusion, people came up with alternative expressions of the uncertainty principle. The new expression uses the concept of Shannon entropy. Don't worry if you are not familiar with Shannon entropy; for our purposes it is just a mathematical quantity, defined below.

Let \(\left|{a_i}\right>\) and \(\left|{b_i}\right>\) denote the complete sets of normalized eigenvectors of the two operators A and B respectively. Let \(p=(p_1, \dots, p_N)\) and \(q=(q_1, \dots, q_N)\) denote the probability distributions of the measurement outcomes of A and B in the state \(\psi \); for example, \(p_i=\vert \left<{a_i \vert\psi}\right>\vert^2\).

For a probability distribution \(p\) (one satisfying \(p_i \geq 0\) and \(\sum_{i}p_i =1\)), the Shannon entropy is defined as

\( S(p)=-\sum_{i} p_i \ln p_i \).
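In code, this definition is only a few lines. Here is a minimal sketch, again assuming numpy (the helper name `shannon_entropy` is just for illustration), using the standard convention that \(0 \ln 0 = 0\):

```python
import numpy as np

def shannon_entropy(p):
    """S(p) = -sum_i p_i ln(p_i), with the convention 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    nonzero = p[p > 0]                  # skip zero entries (0 ln 0 = 0)
    return -np.sum(nonzero * np.log(nonzero))

print(shannon_entropy([0.5, 0.5]))  # ln 2 ~ 0.693: a maximally uncertain coin
print(shannon_entropy([1.0, 0.0]))  # 0.0: a perfectly predictable outcome
```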

With that definition in hand, the new uncertainty relation is written as

\( S(p) + S(q) \geq -2 \ln c \)

where \(c=\max_{i,j} \vert\left<{a_i \vert b_j}\right>\vert\), maximizing over all \(i\) and \(j\).
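To make the new relation concrete, here is a sketch that checks it for \(\sigma_z\) and \(\sigma_x\) on a qubit. Their eigenbases are mutually unbiased, so \(c = 1/\sqrt{2}\) and the bound is \(-2\ln c = \ln 2\) (the `entropy` helper and the random state are illustrative choices, not part of the original text):

```python
import numpy as np

# Eigenbases of sigma_z (for A) and sigma_x (for B)
a_basis = [np.array([1, 0], dtype=complex),
           np.array([0, 1], dtype=complex)]
b_basis = [np.array([1, 1], dtype=complex) / np.sqrt(2),
           np.array([1, -1], dtype=complex) / np.sqrt(2)]

def entropy(probs):
    probs = np.asarray(probs, dtype=float)
    nonzero = probs[probs > 0]
    return -np.sum(nonzero * np.log(nonzero))

rng = np.random.default_rng(1)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)                        # an arbitrary qubit state

p = [abs(np.vdot(a, psi)) ** 2 for a in a_basis]  # outcome distribution for A
q = [abs(np.vdot(b, psi)) ** 2 for b in b_basis]  # outcome distribution for B
c = max(abs(np.vdot(a, b)) for a in a_basis for b in b_basis)

print(entropy(p) + entropy(q), ">=", -2 * np.log(c))  # bound is ln 2 ~ 0.693

# Even when psi is an eigenvector of A (so S(p) = 0), S(q) stays bounded:
q0 = [abs(np.vdot(b, a_basis[0])) ** 2 for b in b_basis]
print(entropy(q0))                                # ln 2, saturating the bound
```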

We can now see that the right-hand side of the uncertainty relation no longer depends on the state; it depends only on the overlaps of the eigenvectors of the two operators, which are a property of the operators alone. Moreover, unlike with the old expression, one of the uncertainties being zero no longer destroys the bound on the other observable: if \(S(p)=0\), the relation still demands \(S(q) \geq -2\ln c\).


If you found this article interesting and want to see more like this, please share this page. You are our strength! 


 

© Quantuse

 

Nishant Pathak

He is a researcher working on Quantum Information. He loves playing soccer. To contact, use the Contact Us page.
