References: Shankar, R. (1994), *Principles of Quantum Mechanics*, Plenum Press. Section 4.3.

The first three postulates of quantum mechanics concern the properties of a quantum state. The fourth postulate concerns how states evolve with time. The postulate simply states that in non-relativistic quantum mechanics, a state $\left|\psi(t)\right\rangle$ satisfies the Schrödinger equation:

$$i\hbar\frac{d}{dt}\left|\psi(t)\right\rangle = H\left|\psi(t)\right\rangle \tag{1}$$

where $H$ is the Hamiltonian, which is obtained from the classical Hamiltonian by means of the other postulates of quantum mechanics, namely that we replace all references to the position $x$ by the quantum position operator $X$ with matrix elements (in the $X$ basis) of

$$\left\langle x\right|X\left|x'\right\rangle = x\,\delta\left(x-x'\right) \tag{2}$$

and all references to the classical momentum $p$ by the momentum operator $P$ with matrix elements

$$\left\langle x\right|P\left|x'\right\rangle = -i\hbar\,\delta'\left(x-x'\right) \tag{3}$$

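To make these matrix elements concrete, here is a rough numerical sketch (my illustration, not Shankar's). It assumes NumPy, natural units with $\hbar = 1$, and a crude finite-difference grid: $X$ becomes a diagonal matrix of grid positions, and the derivative of the delta function in $P$ becomes a central finite difference.

```python
import numpy as np

hbar = 1.0                 # assumption: natural units, hbar = 1
N, L = 200, 10.0           # assumption: toy grid of 200 points on a box of size 10
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Position operator: diagonal in the x basis, <x|X|x'> = x delta(x - x')
X = np.diag(x)

# Momentum operator: -i hbar d/dx, with the derivative of the delta
# function realized as a central finite difference
P = np.zeros((N, N), dtype=complex)
for n in range(1, N - 1):
    P[n, n - 1] = 1j * hbar / (2 * dx)
    P[n, n + 1] = -1j * hbar / (2 * dx)

# On a smooth test function, P psi should approximate -i hbar psi'
psi = np.exp(-x**2)
approx = P @ psi
exact = -1j * hbar * (-2 * x) * np.exp(-x**2)
print(np.allclose(approx[1:-1], exact[1:-1], atol=1e-2))  # True
```

The first and last rows of $P$ are left zero here, so the check excludes the grid endpoints; a more careful treatment would impose boundary conditions.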
Although we’ve posted many articles based on Griffiths’s book in which we solved the Schrödinger equation, the approach taken by Shankar is a bit different and, in some ways, a lot more elegant. We begin with a Hamiltonian that does not depend explicitly on time. Since the Schrödinger equation contains only the first derivative with respect to time, the time evolution of a state is uniquely determined if we specify only the initial state $\left|\psi(0)\right\rangle$. [A differential equation that is second order in time, such as the wave equation, requires both the initial position and the initial velocity to be specified.]

The solution of the Schrödinger equation is then found in analogy to the approach we used in solving the coupled masses problem earlier. We find the eigenvalues and eigenvectors of the Hamiltonian in some basis and use these to construct the propagator $U(t)$. We can then write the solution as

$$\left|\psi(t)\right\rangle = U(t)\left|\psi(0)\right\rangle \tag{4}$$

For the case of a time-independent Hamiltonian, we can actually construct $U(t)$ in terms of the eigenvectors of $H$. The eigenvalue equation is

$$H\left|E\right\rangle = E\left|E\right\rangle \tag{5}$$

where $E$ is an eigenvalue of $H$ and $\left|E\right\rangle$ is its corresponding eigenvector. Since the eigenvectors form a basis for the vector space, we can expand the wave function in terms of them in the usual way:

$$\left|\psi(t)\right\rangle = \sum_E a_E(t)\left|E\right\rangle \tag{6}$$

The coefficient $a_E(t) = \left\langle E\middle|\psi(t)\right\rangle$ is the component of $\left|\psi(t)\right\rangle$ along the vector $\left|E\right\rangle$ as a function of time. We can now apply the Schrödinger equation 1 to get (a dot over a symbol indicates a time derivative):

$$i\hbar\sum_E \dot{a}_E(t)\left|E\right\rangle = \sum_E a_E(t)\,H\left|E\right\rangle = \sum_E E\,a_E(t)\left|E\right\rangle$$

Since the eigenvectors are linearly independent (as they form a basis for the vector space), each term in the sum in the first line must be equal to the corresponding term in the sum in the last line, so we have

$$i\hbar\,\dot{a}_E(t) = E\,a_E(t) \tag{7}$$

The solution is

$$a_E(t) = a_E(0)\,e^{-iEt/\hbar} \tag{8}$$

Substituting this back into the expansion 6, the general solution is therefore

$$\left|\psi(t)\right\rangle = \sum_E a_E(0)\,e^{-iEt/\hbar}\left|E\right\rangle = \sum_E \left|E\right\rangle\left\langle E\middle|\psi(0)\right\rangle e^{-iEt/\hbar} \tag{9}$$

from which we can read off the propagator:

$$U(t) = \sum_E \left|E\right\rangle\left\langle E\right| e^{-iEt/\hbar} \tag{10}$$

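This recipe is easy to test numerically. The sketch below is my own illustration, not Shankar's: it assumes NumPy, natural units with $\hbar = 1$, and a random $4\times 4$ Hermitian matrix standing in for the Hamiltonian. It builds $U(t)$ from the eigenvectors and checks, by finite differences, that $U(t)\left|\psi(0)\right\rangle$ satisfies the Schrödinger equation.

```python
import numpy as np

hbar = 1.0  # assumption: natural units

rng = np.random.default_rng(42)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2   # a random Hermitian "Hamiltonian"

# Columns of V are the orthonormal eigenvectors |E_k>, E holds the eigenvalues
E, V = np.linalg.eigh(H)

def propagator(t):
    """U(t) = sum_k exp(-i E_k t / hbar) |E_k><E_k|."""
    return sum(np.exp(-1j * Ek * t / hbar) * np.outer(V[:, k], V[:, k].conj())
               for k, Ek in enumerate(E))

# Check i hbar d/dt |psi(t)> = H |psi(t)> with a central finite difference
psi0 = rng.standard_normal(4) + 1j * rng.standard_normal(4)
psi0 /= np.linalg.norm(psi0)

t, dt = 0.7, 1e-6
dpsi_dt = (propagator(t + dt) @ psi0 - propagator(t - dt) @ psi0) / (2 * dt)
lhs = 1j * hbar * dpsi_dt
rhs = H @ (propagator(t) @ psi0)
print(np.allclose(lhs, rhs, atol=1e-6))  # True
```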
Thus if we can determine the eigenvalues and eigenvectors of $H$, we can write the propagator in terms of them and get the general solution. We can see from this that $U(t)$ is unitary:

$$U^{\dagger}(t)\,U(t) = \sum_{E'}\sum_E \left|E'\right\rangle\left\langle E'\middle|E\right\rangle\left\langle E\right| e^{i\left(E'-E\right)t/\hbar} = \sum_E \left|E\right\rangle\left\langle E\right| = I$$

This derivation uses the fact that the eigenvectors are orthonormal and form a complete set, so that $\left\langle E'\middle|E\right\rangle = \delta_{E'E}$ and $\sum_E \left|E\right\rangle\left\langle E\right| = I$. Since a unitary operator doesn’t change the norm of a vector, we see from 4 that if $\left|\psi(0)\right\rangle$ is normalized, then so is $\left|\psi(t)\right\rangle$ for all times $t$. Further, the probability that the state will be measured to be in eigenstate $\left|E\right\rangle$ is constant over time, since this probability is given by

$$\left|\left\langle E\middle|\psi(t)\right\rangle\right|^2 = \left|a_E(0)\,e^{-iEt/\hbar}\right|^2 = \left|a_E(0)\right|^2$$

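Both of these conservation properties can be checked directly on a toy system. The sketch below (again my illustration, assuming NumPy, $\hbar = 1$, and a random Hermitian matrix as the Hamiltonian) verifies that $U^{\dagger}U = I$, that the norm of the state is preserved, and that each $\left|\left\langle E\middle|\psi(t)\right\rangle\right|^2$ stays constant.

```python
import numpy as np

hbar = 1.0  # assumption: natural units
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 1j * rng.standard_normal((5, 5))
H = (A + A.conj().T) / 2
E, V = np.linalg.eigh(H)

def U(t):
    # U(t) = sum_k e^{-i E_k t/hbar} |E_k><E_k| = V diag(e^{-i E t/hbar}) V^dagger
    return (V * np.exp(-1j * E * t / hbar)) @ V.conj().T

psi0 = rng.standard_normal(5) + 1j * rng.standard_normal(5)
psi0 /= np.linalg.norm(psi0)

p0 = np.abs(V.conj().T @ psi0) ** 2          # |<E_k|psi(0)>|^2
for t in (0.0, 1.3, 10.0):
    Ut = U(t)
    # Unitarity: U†U = I
    print(np.allclose(Ut.conj().T @ Ut, np.eye(5)))   # True
    # Norm preserved: ||psi(t)|| = ||psi(0)|| = 1
    print(np.isclose(np.linalg.norm(Ut @ psi0), 1.0))  # True
    # Probabilities constant: |<E_k|psi(t)>|^2 = |<E_k|psi(0)>|^2
    pt = np.abs(V.conj().T @ (Ut @ psi0)) ** 2
    print(np.allclose(pt, p0))                # True
```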
This derivation assumed that the spectrum of $H$ was discrete and non-degenerate. If the possible eigenvalues $E$ are continuous, then the sum is replaced by an integral:

$$U(t) = \int \left|E\right\rangle\left\langle E\right| e^{-iEt/\hbar}\,dE \tag{11}$$

If the spectrum is discrete and degenerate, then we need to find an orthonormal set of eigenvectors that spans each degenerate subspace, and sum over these sets. For example, if $E$ is degenerate, then we find a set of eigenvectors $\left|E,\alpha\right\rangle$ that spans the subspace for which $E$ is the eigenvalue. The index $\alpha$ runs from 1 up to the degree of degeneracy of $E$, and the propagator is then

$$U(t) = \sum_E \sum_{\alpha} \left|E,\alpha\right\rangle\left\langle E,\alpha\right| e^{-iEt/\hbar} \tag{12}$$

The sum over $E$ runs over all the distinct eigenvalues, and the sum over $\alpha$ runs over the eigenvectors for each different $E$.
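A point worth checking is that the degenerate sum does not depend on *which* orthonormal basis we choose for the degenerate subspace, since $\sum_{\alpha}\left|E,\alpha\right\rangle\left\langle E,\alpha\right|$ is just the projector onto that subspace. The sketch below (my illustration, assuming NumPy and a hand-built $3\times 3$ Hamiltonian with a doubly degenerate eigenvalue) rotates the degenerate pair of eigenvectors and shows the projector is unchanged.

```python
import numpy as np

# Toy Hamiltonian with a doubly degenerate eigenvalue E = 2 (assumption)
H = np.diag([2.0, 2.0, 5.0])

# eigh returns an orthonormal set of eigenvectors even when the
# spectrum is degenerate; eigenvalues come back in ascending order
E, V = np.linalg.eigh(H)

# Projector onto the E = 2 subspace: sum over alpha = 1, 2
P2 = sum(np.outer(V[:, k], V[:, k].conj())
         for k in range(3) if np.isclose(E[k], 2.0))

# Rotate the degenerate pair by 30 degrees to get a *different*
# orthonormal basis of the same subspace
c, s = np.cos(np.pi / 6), np.sin(np.pi / 6)
w1 = c * V[:, 0] + s * V[:, 1]
w2 = -s * V[:, 0] + c * V[:, 1]
P2_alt = np.outer(w1, w1.conj()) + np.outer(w2, w2.conj())

print(np.allclose(P2, P2_alt))  # True
```

Since the propagator is a sum of such projectors weighted by the phases $e^{-iEt/\hbar}$, it too is independent of the basis chosen within each degenerate subspace.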

Another form of the propagator can be written directly in terms of the time-independent Hamiltonian as

$$U(t) = e^{-iHt/\hbar} \tag{13}$$

This relies on the concept of a function of an operator, so that $e^{-iHt/\hbar}$ is a matrix whose elements are power series in the exponent $-iHt/\hbar$. The power series must, of course, converge for this solution to be valid. Since $H$ is Hermitian, $U(t)$ is unitary. We can verify that the solution using this form of $U(t)$ satisfies the Schrödinger equation:

$$i\hbar\frac{d}{dt}\left|\psi(t)\right\rangle = i\hbar\frac{d}{dt}\left[e^{-iHt/\hbar}\left|\psi(0)\right\rangle\right] = i\hbar\left(-\frac{iH}{\hbar}\right)e^{-iHt/\hbar}\left|\psi(0)\right\rangle = H\left|\psi(t)\right\rangle$$

The derivative of $e^{-iHt/\hbar}$ can be calculated from the derivatives of its matrix elements, which are all power series.
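The two forms of the propagator can be compared numerically. The sketch below (my illustration, assuming NumPy and SciPy are available, with $\hbar = 1$ and a random Hermitian matrix as the Hamiltonian) computes $e^{-iHt/\hbar}$ with SciPy's matrix exponential and checks that it agrees with the eigenbasis form $\sum_E \left|E\right\rangle\left\langle E\right| e^{-iEt/\hbar}$.

```python
import numpy as np
from scipy.linalg import expm

hbar = 1.0  # assumption: natural units
rng = np.random.default_rng(7)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2
t = 2.5

# Operator-exponential form: U(t) = exp(-i H t / hbar)
U_exp = expm(-1j * H * t / hbar)

# Eigenbasis form: U(t) = sum_k e^{-i E_k t/hbar} |E_k><E_k|
E, V = np.linalg.eigh(H)
U_eig = (V * np.exp(-1j * E * t / hbar)) @ V.conj().T

print(np.allclose(U_exp, U_eig))  # True
```

For a finite Hermitian matrix the power series always converges, so the two forms agree exactly up to floating-point error.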