Product Probability in Quantum Mechanics With One Factor Linked to Maximum Entropy And Information

Main Author: Francesco R. Ruggeri
Format: Preprint
Published: 2021
Online Access: https://zenodo.org/record/5391424
Table of Contents:
  • In classical statistical mechanics there exists the idea of maximum entropy subject to constraints, which is linked to the idea of information. At equilibrium, a given information value is related to the idea that the system does not distinguish between products of probabilities (each a function of its own information) as long as the information of one plus that of the other equals the given value. For example, for a Maxwell-Boltzmann gas, maximizing the entropy -Sum over i f(ei) ln(f(ei)) subject to a fixed average energy Sum over i ei f(ei) (with Lagrange multiplier b) leads to a statement of information, ln(f(ei)) = -b ei (up to a normalization constant), where b = 1/T and ln(f(ei)) is called the information. At equilibrium a given information value, say E = ei + ej, means that the joint probability (product) of any ei, ej pair with that sum is fixed, i.e. a function of E alone. (For the Fermi-Dirac and Bose-Einstein gases there is a function g(f(ei)) which satisfies the same condition.)

    In this note we attempt to consider the idea of information in a quantum bound state. Such a system contains a single particle, so it cannot collide with other particles to establish equilibrium. It may, however, collide with the potential in a series of impulse hits which on average create V(x) and mimic an average acceleration, i.e. a kinetic energy that changes from point to point. Breaking V(x) into probabilities linked to impulse hits requires V(x) = Sum over k Vk G(kx), where G(kx) keeps the same form over all space and satisfies d/dx G(kx) = c k G(kx) with c a constant; k is the impulse received. A solution is Vk exp(ikx), which has two factors, Vk and exp(ikx). Each is a type of probability, but the two are of very different natures. exp(ikx), which is linked to the wavelength, shows the nonlocality associated with an impulse k and also that one does not really know k at x, which is why this factor is complex. It is this factor, we argue in the note, that is linked to information in a statistical sense, playing the role of ei in the classical statistical mechanical situation. The other factor Vk is a probability which is fixed by the constraint V(x) = Sum over k Vk exp(ikx).

    Analogously, the quantum bound particle is described by a distribution W(x) = Sum over p a(p) exp(ipx), where W(x) is the wavefunction. This again has a product probability for each p, with factors a(p) and exp(ipx). exp(ipx) is related to information (i.e. to maximum entropy) as it combines with exp(ikx) from V(x). In equilibrium a specific exp(ipx) is created by various exp(ikx) exp(i(p-k)x) terms, but the other factors Vk a(p-k) come into play, with a(p-k) being fixed by the constraint (p^2/2m) a(p) + Sum over k Vk a(p-k) = E a(p). Thus in quantum mechanics, unlike in classical statistical mechanics, there is a product of probabilities for each p, with the complex probability exp(ipx) linked to information as in the classical statistical case. It is also linked to nonlocality, i.e. to a wavefunction and the idea that one may not really discern p at x. The second probability factor a(p) may be real and yields information about the probability of finding a given p, but given the nature of exp(ipx) and of W*(x)W(x) as a density, it is a(p)a(p) which is the probability P(p) with all information about x removed. a(p)a(p), which depends on the information p, is like f(ei) from classical statistical mechanics in that it is real, but it does not hold at a specific point x and is fixed only through a constraint, not through any maximization of entropy.
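As a compact restatement of the classical step summarized above, the LaTeX fragment below writes out the constrained entropy functional and its stationarity condition. The sign convention on the multiplier b (written β) and the explicit normalization multiplier λ are choices made here so that the quoted result ln f(ei) = -b ei follows; they are not notation taken from the preprint itself.

```latex
% Maxwell-Boltzmann case: maximize entropy subject to a fixed average energy
% and to normalization (Lagrange multipliers \beta and \lambda).
\begin{align}
  \mathcal{L}[f] &= -\sum_i f(e_i)\ln f(e_i)
      - \beta\Big(\sum_i e_i\, f(e_i) - \langle E\rangle\Big)
      - \lambda\Big(\sum_i f(e_i) - 1\Big), \\
  \frac{\partial \mathcal{L}}{\partial f(e_i)} = 0
      \;\;&\Longrightarrow\;\;
      \ln f(e_i) = -\beta e_i - (\lambda + 1),
      \qquad \beta = \tfrac{1}{T}.
\end{align}
% So f(e_i) is proportional to exp(-beta e_i), and ln f(e_i) = -beta e_i up to
% a normalization constant.  For a fixed total E = e_i + e_j the product
% f(e_i) f(e_j) is proportional to exp(-beta E), i.e. it depends on E alone:
% equilibrium does not distinguish which (e_i, e_j) pair supplies that total.
```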
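Similarly, the quantum-side relations quoted in the abstract can be collected in one place. The sketch below assumes units with ħ = 1 and a real a(p), as in the text; the substitution into the time-independent Schrödinger equation is spelled out here only to show where the quoted constraint on a(p) comes from.

```latex
% Impulse (Fourier) decompositions used in the abstract (hbar = 1 assumed).
% Note that exp(ikx) has the required form d/dx G(kx) = c k G(kx), with c = i.
\begin{align}
  V(x) &= \sum_k V_k\, e^{ikx}, &
  \frac{d}{dx}\, e^{ikx} &= i k\, e^{ikx}, \\
  W(x) &= \sum_p a(p)\, e^{ipx}, &
  P(p) &= a(p)\, a(p) \quad (\text{real } a(p)).
\end{align}
% Substituting both expansions into  -(1/2m) W''(x) + V(x) W(x) = E W(x)  and
% matching the coefficient of each exp(ipx) reproduces the constraint that
% fixes the a(p):
\begin{equation}
  \frac{p^2}{2m}\, a(p) + \sum_k V_k\, a(p-k) = E\, a(p).
\end{equation}
```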