Appendix D. Expectation values of observables

This appendix deviates from the convention introduced in Sec. 2 by using $x$ to denote a set of $\Ndof$ coordinates.

D.1 Preliminaries

Let $\configspace$ denote the configuration space of a physical system with $\Ndof$ degrees of freedom. In other words, ${\configspace}$ is the set of all possible microstructures $x$ of the system.

Let ${\pdf:\configspace\to\realpos}$, ${x\mapsto\pdf(x)}$ be a probability density function. Then, if ${\varphi\in\lebesgue(\configspace)}$ is any function

\begin{align*} \varphi:\configspace\to\complex, x\mapsto \varphi(x)\equiv \sqrt{\pdf(x)}e^{i\theta(x)}, \end{align*}
where ${\theta:\configspace\to\realone}$ is arbitrary for now, it determines ${\pdf=\varphi^*\varphi}$. Therefore $\varphi$ specifies the same statistical state as $\pdf$.
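As a quick numerical check (a sketch with an illustrative discrete distribution; none of the array names below come from the text), ${\varphi=\sqrt{\pdf}\,e^{i\theta}}$ reproduces $\pdf$ for any choice of the phase $\theta$:

```python
import numpy as np

rng = np.random.default_rng(0)

# An illustrative discrete probability distribution p over n microstructures
# (probability masses rather than a density, for simplicity).
n = 100
p = rng.random(n)
p /= p.sum()

# phi(x) = sqrt(p(x)) e^{i theta(x)}, with a completely arbitrary phase.
theta = rng.uniform(0.0, 2.0 * np.pi, n)
phi = np.sqrt(p) * np.exp(1j * theta)

# phi* phi recovers p exactly, independently of the choice of theta.
assert np.allclose((phi.conj() * phi).real, p)
```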

The state can also be represented by the element,

\begin{align*} \ket{\varphi}\equiv\intconfig \dd{x}\varphi(x)\ket{x}\in\hilbert, \end{align*}
of an abstract Hilbert space ${\hilbert}$, whose elements are in one-to-one correspondence with elements of ${\lebesgue(\configspace)}$.

The vector space ${\hilbert}$ and the vector ${\ket{\varphi}}$ are defined by a construction analogous to the one outlined in Appendix C: Briefly, the integral ${\intconfig\cdots\dd{x}}$ is really the discrete Riemann sum ${\hilbv\sum_{x\in\discreal}}$ over a partition of $\configspace$. The elements of the partition are all simply connected subsets of $\configspace$, with the same arbitrarily small finite measure $\hilbv$, centered at points on a lattice ${\discreal=\discreal(\hilbv)}$. Therefore ${\hilbert}$ is a finite-dimensional vector space, but its dimension is arbitrarily large: it increases monotonically as ${\hilbv\in\realpos}$ decreases, and it diverges in the limit ${\hilbv\to 0^+}$.

It will be assumed in this appendix that all functions and vectors that specify statistical states are normalized to one, i.e.,

\begin{align*} \norm{\varphi}_2^2\equiv\braket{\varphi}{\varphi}=1. \end{align*}
The vector ${\ket{x}}$ does not represent a statistical state. It represents a smooth function that (almost) vanishes everywhere except in the element of the partition of $\configspace$ that is centered at ${x\in\discreal}$, where its (almost) constant value is ${1/\hilbv}$. Therefore, for any function ${f:\configspace\to\complex}$,
\begin{align*} \intconfig\dd{x} f(x)\braket{x}{x'} = \hilbv\sum_{x\in\discreal} f(x)\braket{x}{x'} = f(x'). \end{align*}
For all practical purposes, ${\braket{x}{x'}}$ is the Dirac delta function, ${\delta(x-x')}$.
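The construction above can be sketched numerically. In the following (illustrative names; a one-dimensional $\configspace$ truncated to ${[0,1]}$), each ${\ket{x}}$ is stored as its array of function values over the lattice, and the inner product is the Riemann sum with measure $\hilbv$:

```python
import numpy as np

# A 1D configuration space cut into cells of measure v, centered on lattice
# points (a sketch of the construction in the text; names are illustrative).
v = 0.01
lattice = np.arange(0.0, 1.0, v) + v / 2
n = lattice.size

# As an array of function values over the lattice, |x_k> is 1/v inside its
# own cell and zero elsewhere.
kets = np.eye(n) / v

def inner(f, g):
    """<f|g> = integral dx f*(x) g(x), i.e., v times the Riemann sum."""
    return v * np.vdot(f, g)

# <x|x'> = delta_{xx'} / v, the discrete stand-in for delta(x - x'):
assert np.isclose(inner(kets[3], kets[3]), 1.0 / v)
assert np.isclose(inner(kets[3], kets[7]), 0.0)

# Sifting property: integral dx f(x) <x|x'> = f(x').
f = np.sin(2 * np.pi * lattice)
xp = 42
overlaps = np.array([inner(kets[k], kets[xp]) for k in range(n)])
assert np.isclose(v * np.sum(f * overlaps), f[xp])
```

As ${\hilbv\to 0^+}$ the diagonal value ${\braket{x}{x}=1/\hilbv}$ diverges while the sifting property persists, which is the sense in which ${\braket{x}{x'}}$ behaves as ${\delta(x-x')}$.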

D.2 An observable and its expectation value

Let ${\Obs:\configspace\to\realone,\;x\mapsto\Obs(x)}$ be an observable whose value is determined by the microstructure. Its expectation value in the statistical state specified by $\pdf$ or $\varphi$ is
\begin{align*} \expval{\Obs}=\expOp[\pdf] &\equiv \intconfig \pdf(x)\Obs(x)\dd{x} \\ & = \intconfig \varphi^*(x) \Obs(x)\varphi(x)\dd{x}. \end{align*}
Since ${\Obs(x)\in\realone, \;\forall x\in\configspace}$, the operator ${\hO:\hilbert\to\hilbert}$ defined by
\begin{align*} \hO \equiv \intconfig\dd{x} \Obs(x)\dyad{x} \end{align*}
is a Hermitian or self-adjoint operator, i.e.,
\begin{align*} \hO^\dagger = \intconfig\dd{x} \Obs^*(x)\left(\dyad{x}\right)^\dagger = \intconfig\dd{x} \Obs(x)\dyad{x} = \hO. \end{align*}
The result of its action on ${\ket{\varphi}}$ is
\begin{align*} \hO\ket{\varphi} &= \left(\intconfig\dd{x}\Obs(x)\dyad{x}\right)\left(\intconfig\dd{x'}\varphi(x')\ket{x'}\right) \\ & = \intconfig\dd{x}\Obs(x)\ket{x}\left(\intconfig\dd{x'}\varphi(x')\braket{x}{x'}\right) \\ &=\ket{\Obs\varphi} \equiv \intconfig\dd{x} \Obs(x)\varphi(x)\ket{x} = \bra{\Obs\varphi}^\dagger. \end{align*}
Therefore ${\expval{\Obs}}$ can be expressed as
\begin{align*} \expval{\Obs} &=\expvaltwo{\hO}{\varphi}=\braket{\varphi}{\Obs\varphi}=\braket{\Obs\varphi}{\varphi} \\ &= \left(\intconfig\dd{x}\varphi^*(x)\bra{x}\right)\left(\intconfig\dd{x'}\Obs(x')\varphi(x')\ket{x'}\right) \\ & = \intconfig\dd{x}\varphi^*(x)\Obs(x)\varphi(x). \end{align*}
In other words, ${\expval{\Obs}}$ is the real-valued inner product ${\braket{\varphi}{\Obs\varphi}=\braket{\Obs\varphi}{\varphi}}$ of ${\varphi}$ and the function
\begin{align*} \Obs\varphi:\configspace\to\complex; \;x\mapsto \Obs(x)\varphi(x). \end{align*}
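On the lattice, the two expressions for ${\expval{\Obs}}$ — the integral of ${\pdf(x)\Obs(x)}$ and the matrix element of the diagonal operator $\hO$ — can be checked against one another directly (a sketch; the state, phase, and observable below are illustrative choices, not from the text):

```python
import numpy as np

v = 0.01
x = np.arange(0.0, 1.0, v) + v / 2

# A normalized state phi(x) = sqrt(p(x)) e^{i theta(x)} on the lattice.
p = np.exp(-((x - 0.5) ** 2) / 0.02)
p /= v * p.sum()                        # so that integral dx p(x) = 1
theta = 3.0 * x                         # an arbitrary phase
phi = np.sqrt(p) * np.exp(1j * theta)

O = x ** 2                              # an observable O(x)

# <O> as the integral of p(x) O(x) ...
expval_pdf = v * np.sum(p * O)

# ... and as <phi|O-hat|phi>, with O-hat diagonal in the lattice basis.
Ohat = np.diag(O)
expval_op = (v * (phi.conj() @ Ohat @ phi)).real

assert np.isclose(expval_pdf, expval_op)
```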

D.3 Stationary states

In this subsection, with an eye toward generalizations introduced in later subsections, I will sometimes denote ${\Obs(x)}$ by ${\hOx}$; and the expectation value of ${\Obs}$ in a unit-normalized state ${\ket{\varphi}}$ will sometimes be denoted by ${\obs\lbrack \varphi\rbrack }$, i.e.,
\begin{align*} \obs\lbrack \varphi\rbrack \equiv\expvaltwo{\hO}{\varphi} = \mybraket{\varphi}{\hOx\varphi}. \end{align*}
A necessary condition for this expectation value to be stationary with respect to variations of $\varphi$ that conserve total probability is ${\delta\functionalarg{\Obs}\lbrack \varphi;\eta\rbrack =0, \forall\eta\in\lebesgue(\configspace)}$, where
\begin{align*} \functionalarg{\Obs}\lbrack \varphi\rbrack \equiv \obs\lbrack \varphi\rbrack -\lagrangearg{\Obs}\braket{\varphi}{\varphi}; \end{align*}
${\lagrangearg{\Obs}}$ is a Lagrange multiplier; and ${\delta\functionalarg{\Obs}\lbrack \varphi;\eta\rbrack }$ is the Gateaux derivative of ${\functionalarg{\Obs}}$ in direction $\eta$ at ${\varphi}$. That is,
\begin{align*} \delta\functionalarg{\Obs}\lbrack \varphi;\eta\rbrack &= \dvone{\zeta}\functionalarg{\Obs}\lbrack \varphi+\zeta\eta\rbrack \eval_{\zeta=0} \\ &=\lim_{\zeta\to 0}\frac{\functionalarg{\Obs}\lbrack \varphi+\zeta\eta\rbrack -\functionalarg{\Obs}\lbrack \varphi\rbrack }{\zeta}=0. \end{align*}
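The limit definition of the Gateaux derivative can be evaluated by finite differences. The sketch below takes ${\functionalarg{\Obs}[\varphi]=\obs[\varphi]-\lagrangearg{\Obs}\braket{\varphi}{\varphi}}$ (the sign convention for which stationarity yields ${\hO\ket{\varphi}=\lagrangearg{\Obs}\ket{\varphi}}$), and compares the numerical quotient with the analytic form ${2\Re\mel{\eta}{\hO-\lagrangearg{\Obs}}{\varphi}}$ for a real direction $\eta$; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# A diagonal (hence Hermitian) O-hat on a small lattice, and the functional
# F[phi] = o[phi] - lam <phi|phi>.
n = 50
O = np.diag(rng.random(n))
lam = 0.3

def F(phi):
    return (phi.conj() @ O @ phi - lam * (phi.conj() @ phi)).real

phi = rng.normal(size=n) + 1j * rng.normal(size=n)   # a generic state
eta = rng.normal(size=n)                             # a real direction

# Gateaux derivative from the limit definition, with a small zeta ...
zeta = 1e-7
dF_numeric = (F(phi + zeta * eta) - F(phi)) / zeta

# ... agrees with the analytic form 2 Re <eta|(O-hat - lam)|phi>.
dF_analytic = 2.0 * (eta @ ((O - lam * np.eye(n)) @ phi)).real
assert np.isclose(dF_numeric, dF_analytic, rtol=1e-4, atol=1e-4)
```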

D.3.1 Stationary states are eigenstates

Let ${\varphi_r\equiv\Re\{\varphi\}}$ and ${\varphi_i\equiv\Im\{\varphi\}}$; and let us choose ${\eta}$ to be an element of a real-valued complete orthonormal basis of ${\lebesgue(\configspace)=\lebesgue(\configspace;\complex)}$. Then,
\begin{align*} 2\Re\left\{\delta\functionalarg{\Obs}\lbrack \varphi;\eta\rbrack \right\} &= \mel{\varphi}{\hO}{\eta} + \mel{\eta}{\hO}{\varphi} -\lagrangearg{\Obs}\left[\braket{\varphi}{\eta}+\braket{\eta}{\varphi}\right] \\ & = 2\mel{\eta}{\hO}{\varphi_r} - 2\lagrangearg{\Obs}\braket{\eta}{\varphi_r}=0 \\ \implies &\mel{\eta}{\hO}{\varphi_r} =\lagrangearg{\Obs}\braket{\eta}{\varphi_r}; \tag{111} \end{align*}
and
\begin{align*} 2i\Im\left\{\delta\functionalarg{\Obs}\lbrack \varphi;\eta\rbrack \right\} &= \mel{\eta}{\hO}{\varphi} - \mel{\varphi}{\hO}{\eta} -\lagrangearg{\Obs}\left[\braket{\eta}{\varphi}-\braket{\varphi}{\eta}\right] \\ & = 2i\mel{\eta}{\hO}{\varphi_i} - 2i\lagrangearg{\Obs}\braket{\eta}{\varphi_i}=0 \\ \implies & \mel{\eta}{\hO}{\varphi_i} = \lagrangearg{\Obs}\braket{\eta}{\varphi_i}. \tag{112} \end{align*}
Since ${\eta\in\lebesgue(\configspace;\realone)\subset\lebesgue(\configspace)}$ is an arbitrary element of a complete basis, Eq. (111) and Eq. (112) must hold for every element of the basis. Therefore the functions ${\hOx\!\varphi_r,\hOx\!\varphi_i\in\lebesgue(\configspace;\realone)}$ can be expressed as
\begin{align*} \hOx\!\varphi_r & = \sum_\eta \mel{\eta}{\hO}{\varphi_r}\eta = \lagrangearg{\Obs}\sum_\eta \braket{\eta}{\varphi_r}\eta = \lagrangearg{\Obs}\varphi_r, \\ \hOx\!\varphi_i & = \sum_\eta \mel{\eta}{\hO}{\varphi_i}\eta = \lagrangearg{\Obs}\sum_\eta \braket{\eta}{\varphi_i}\eta = \lagrangearg{\Obs}\varphi_i, \end{align*}
where the sums are over all elements of the basis. It follows that
\begin{align*} \hOx\!\varphi= \hOx(\varphi_r+i\varphi_i)=\lagrangearg{\Obs}\varphi \implies\hO\ket{\varphi}=\lagrangearg{\Obs}\ket{\varphi}. \end{align*}
Therefore $\varphi$ and ${\ket{\varphi}}$ are normalized eigenstates of ${\hOx}$ and ${\hO}$, respectively, if and only if they are states at which the expectation value ${\obs\lbrack \varphi\rbrack =\expvaltwo{\hO}{\varphi}}$ is stationary with respect to variations that preserve their normalizations.
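The equivalence can be observed numerically: for a random Hermitian operator, the normalization-preserving expectation value (a Rayleigh quotient, which builds the constraint in) has vanishing first-order variation at an eigenvector, in every direction, and its stationary value is the eigenvalue. A sketch with illustrative names:

```python
import numpy as np

rng = np.random.default_rng(2)

# A random real symmetric (hence Hermitian) operator on a small lattice.
n = 30
A = rng.normal(size=(n, n))
O = (A + A.T) / 2

vals, vecs = np.linalg.eigh(O)
phi = vecs[:, 4]                        # a unit-normalized eigenvector
lam = vals[4]                           # its eigenvalue

def expval(psi):
    """<psi|O|psi> / <psi|psi>: normalization-preserving expectation value."""
    return (psi.conj() @ O @ psi).real / (psi.conj() @ psi).real

# The first-order variation vanishes in every direction eta ...
zeta = 1e-6
for _ in range(5):
    eta = rng.normal(size=n)
    eta /= np.linalg.norm(eta)
    dO = (expval(phi + zeta * eta) - expval(phi)) / zeta
    assert abs(dO) < 1e-3

# ... and the stationary value is the eigenvalue (Lagrange multiplier).
assert np.isclose(expval(phi), lam)
```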

D.4 Symmetry

D.4.1 Degeneracies imply symmetries

Let us assume that the set
\begin{align*} \odomain\equiv\left\{\expvaltwo{\hO}{\varphi}:\ket{\varphi}\in\hilbert,\;\braket{\varphi}{\varphi}=1\right\} \end{align*}
of all possible values of ${\expval{\Obs}}$ is $\realone$ or a connected subset of $\realone$, such as an interval or ${\realnonneg}$.

Let $\varphi_i$ and ${\varphi_j\neq\varphi_i}$ be two distinct stationary states of $\Obs$, whose stationary values are ${\obs_i \equiv \obs\lbrack \varphi_i\rbrack }$ and ${\obs_j \equiv \obs\lbrack \varphi_j\rbrack }$, respectively. Let us also assume that ${\obs_i}$ and ${\obs_j}$ are not both boundary points of ${\odomain}$. Then, if ${\deltaO>0}$ is sufficiently small, the probability

\begin{align*} \Pr\left(\abs{\obs_i-\obs_j}< \deltaO\right) \end{align*}
decreases as ${\deltaO}$ decreases, and it vanishes in the limit ${\deltaO\to 0^+}$. In other words, it is improbable that, by chance, two stationary values turn out to be very close; and it is impossible that they turn out to be exactly equal (i.e., equal to infinite precision) by chance.

Since ${\obs_i}$ and ${\obs_j}$ cannot be equal by chance, ${\obs_i=\obs_j}$ implies that the stationary states specified by $\varphi_i$ and $\varphi_j$ are equivalent to one another by a symmetry of the system.
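This can be illustrated on the lattice (a sketch with illustrative names): the observable ${\Obs(x)=x^2}$ on a lattice symmetric about ${x=0}$ is invariant under the parity map ${x\mapsto -x}$, its nonzero stationary values come in exactly degenerate pairs related by that symmetry, and a symmetry-breaking perturbation lifts the degeneracy:

```python
import numpy as np

# The observable O(x) = x^2 on a lattice symmetric about x = 0 is invariant
# under the parity map x -> -x.
v = 0.1
x = np.arange(-1.0, 1.0 + v / 2, v)
O = x ** 2                       # stationary values of the diagonal O-hat

vals = np.sort(O)
# Away from the boundary point O = 0, the stationary values come in exactly
# degenerate pairs, O(-x) = O(+x), related by the symmetry.
assert np.isclose(vals[1], vals[2])

# A perturbation that breaks the parity symmetry lifts the degeneracy.
O_broken = O + 1e-3 * np.linspace(0.0, 1.0, x.size)
vals_broken = np.sort(O_broken)
assert not np.isclose(vals_broken[1], vals_broken[2])
```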

SEVERAL SUBSECTIONS ARE MISSING
(mid rewrite)

