Stochastic methods are based on Markov chains. A Markov
process is characterized by a lack of memory (sound familiar?):
the statistical properties of the immediate future are uniquely
determined by the present, independent of the past. Several
methods implement this approach:
- Brownian Dynamics
- The Brownian dynamics method computes phase space trajectories
of a collection of particles that individually obey Langevin
equations in a field of force.
- Assign an initial position and velocity.
- Draw a random number from a Gaussian distribution.
- Integrate to obtain the velocity at step n+1.
- Add the random component to the velocity to mimic the
interaction of the system with the environment.
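The steps above can be sketched as follows. This is a minimal illustration, not the method in full: it assumes a single particle in a 1D harmonic potential U(x) = x^2/2 and a simple Euler-Maruyama update, with illustrative values for the mass, friction coefficient, and temperature.

```python
import numpy as np

def brownian_dynamics(n_steps=10_000, dt=1e-3, gamma=1.0, kT=1.0, m=1.0, seed=0):
    """Integrate one particle obeying a Langevin equation in a
    harmonic force field F(x) = -x (spring constant 1, illustrative)."""
    rng = np.random.default_rng(seed)
    x, v = 1.0, 0.0                              # assign initial position and velocity
    sigma = np.sqrt(2.0 * gamma * kT * dt / m)   # noise amplitude per step
    traj = np.empty(n_steps)
    for n in range(n_steps):
        F = -x                                   # system force from the potential
        R = rng.normal()                         # random number from a Gaussian distribution
        # velocity at step n+1: deterministic part plus random component
        v = v + (F / m - gamma * v) * dt + sigma * R
        x = x + v * dt
        traj[n] = x
    return traj
```

The random kick `sigma * R` is what couples the particle to its (implicit) environment; without it the trajectory would simply be a damped oscillation.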
- Generalized Langevin Equations
- The generalized Langevin approach is based on non-equilibrium
statistical thermodynamics. A stochastic frictional force is
included, and the random force is related to the friction by the
second fluctuation-dissipation theorem.
The general equation is

  m dv/dt = F(t) − m ∫₀ᵗ γ(t − t′) v(t′) dt′ + R(t)

where
- F(t) is the system force derived from a potential.
- −m ∫₀ᵗ γ(t − t′) v(t′) dt′ is the friction force, a damping term
with memory kernel γ(t).
- R(t) is the random driving force.
- ⟨R(0) R(t)⟩ = m k_B T γ(t), from the second fluctuation-dissipation
theorem.
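In the Markovian (memoryless) limit the generalized equation reduces to the ordinary Langevin equation, with white-noise correlation ⟨R(0) R(t)⟩ = 2 m γ k_B T δ(t). A minimal sketch of that limit for a free particle, checking that the generated velocities satisfy equipartition, ⟨v²⟩ → k_B T / m (all parameter values illustrative):

```python
import numpy as np

def langevin_velocities(n_steps=100_000, dt=0.01, gamma=1.0, kT=1.0, m=1.0, seed=1):
    """Euler-Maruyama integration of m dv/dt = -m*gamma*v + R(t) for a free
    particle, with the noise amplitude fixed by the fluctuation-dissipation
    theorem in the memoryless limit."""
    rng = np.random.default_rng(seed)
    sigma = np.sqrt(2.0 * gamma * kT * dt / m)   # FDT-consistent noise per step
    v = np.empty(n_steps)
    v[0] = 0.0
    for n in range(n_steps - 1):
        v[n + 1] = v[n] - gamma * v[n] * dt + sigma * rng.normal()
    return v

v = langevin_velocities()
# equipartition predicts <v^2> -> kT/m = 1 here (up to statistical error);
# discard the initial transient before averaging
mean_v2 = np.mean(v[10_000:] ** 2)
```

If the noise amplitude is chosen any other way, the friction and the random force no longer balance and the sampled temperature drifts away from kT.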
- Monte Carlo Methods
- Monte Carlo methods represent the solution of a problem as a
parameter of a hypothetical population, then use a random sequence
of numbers to construct a sample of that population, from which
statistical estimates of the parameter can be obtained.
In particular, the Metropolis Algorithm (Canonical Monte Carlo
Method) is commonly used.
- Specify an initial configuration.
- Generate a trial configuration by a small random move.
- Compute the energy difference ΔE between the trial and current
configurations.
If ΔE < 0, accept the new configuration.
If ΔE > 0, accept the new configuration with probability
exp(−ΔE / k_B T); otherwise keep the current one.
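The Metropolis steps above can be sketched for a 1D toy system; this assumes a harmonic energy U(x) = x^2/2 and an illustrative step size and temperature, not any particular molecular system.

```python
import numpy as np

def metropolis(n_steps=200_000, step=1.0, kT=1.0, seed=2):
    """Sample x with weight exp(-U(x)/kT), U(x) = x^2/2, via the Metropolis rule."""
    rng = np.random.default_rng(seed)
    x = 0.0                                    # specify an initial configuration
    samples = np.empty(n_steps)
    for n in range(n_steps):
        x_new = x + rng.uniform(-step, step)   # generate a trial configuration
        dE = 0.5 * x_new**2 - 0.5 * x**2       # energy difference
        # accept if dE < 0; otherwise accept with probability exp(-dE/kT)
        if dE < 0 or rng.random() < np.exp(-dE / kT):
            x = x_new
        samples[n] = x
    return samples
```

For this toy energy the Boltzmann distribution is a Gaussian with ⟨x²⟩ = kT, which gives a quick consistency check on the sampler.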
Author: Ken Flurchick