Metropolis Monte Carlo
The Metropolis Monte Carlo technique [1] is a variant of the original Monte Carlo method proposed by Nicholas Metropolis and Stanislaw Ulam in 1949 [2].
Main features

Metropolis Monte Carlo simulations can be carried out in different ensembles. For the case of one-component systems the usual ensembles are:

- Canonical ensemble (<math>NVT</math>)
- Isothermal-isobaric ensemble (<math>NpT</math>)
- Grand canonical ensemble (<math>\mu VT</math>)

In the case of mixtures, it is useful to consider the so-called Semi-grand ensembles.

The purpose of these techniques is to sample representative configurations of the system at the corresponding thermodynamic conditions. The sampling techniques make use of so-called pseudo-random number generators.
Configuration

A configuration is a microscopic realisation of the thermodynamic state of the system. To define a configuration (denoted as <math>X</math>) we usually require:
- The position coordinates of the particles
- Depending on the problem, other variables like volume, number of particles, etc.
The probability of a given configuration, denoted as <math>\Pi(X|k)</math>, depends on the parameters <math>k</math> (e.g. temperature, pressure).
Example: in the canonical ensemble (<math>NVT</math>) the probability of a configuration is proportional to its Boltzmann factor,

<math> \Pi(X|N,V,T) \propto \exp\left[-\frac{U(X)}{k_B T}\right] </math>

where <math>U(X)</math> is the potential energy of the configuration and <math>k_B</math> is the Boltzmann constant.
In most cases <math>\Pi(X|k)</math> exhibits the following features:
- It is a function of many variables
- The value of <math>\Pi(X|k)</math> is non-negligible only for a very small fraction of the configurational space.

Due to these properties, Metropolis Monte Carlo requires the use of importance sampling techniques.
Importance sampling

Importance sampling is useful for evaluating average values given by:

<math> \langle A(X|k) \rangle = \int dX \, \Pi(X|k) \, A(X) </math>
where:
- <math>X</math> represents a set of many variables,
- <math>\Pi</math> is a probability distribution function which depends on <math>X</math> and on the constraints (parameters) <math>k</math>,
- <math>A</math> is an observable which depends on <math>X</math>.

Depending on the behavior of <math>\Pi</math>, different numerical methods can be used to compute <math> \langle A(X|k) \rangle </math>:

- If <math>\Pi</math> is, roughly speaking, quite uniform, Monte Carlo integration methods can be effective
- If <math>\Pi</math> has significant values only for a small part of the configurational space, importance sampling could be the appropriate technique
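As an illustration of the difference between the two approaches, the following minimal Python sketch (an addition for illustration only, not part of the original article) estimates <math> \langle A \rangle </math> for a sharply peaked one-dimensional <math>\Pi</math>, first by plain Monte Carlo integration with uniformly distributed points and then by drawing the points from <math>\Pi</math> itself. The Gaussian density, the observable <math>A(x)=x^2</math> and the sample size are arbitrary choices made for this example.

<pre>
import numpy as np

rng = np.random.default_rng(42)

sigma = 0.05                      # Pi is a narrow Gaussian centred at zero

def Pi(x):                        # normalised probability density Pi(x)
    return np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

def A(x):                         # observable; the exact average is sigma**2
    return x**2

n = 100_000

# (a) Plain Monte Carlo integration: uniform points over [-5, 5].
#     Almost all points land where Pi is negligible, so the estimate is noisy.
x_uniform = rng.uniform(-5.0, 5.0, n)
estimate_uniform = 10.0 * np.mean(Pi(x_uniform) * A(x_uniform))

# (b) Importance sampling: draw the points from Pi itself and average A directly.
x_pi = rng.normal(0.0, sigma, n)
estimate_importance = np.mean(A(x_pi))

print(f"exact       : {sigma**2:.6f}")
print(f"uniform MC  : {estimate_uniform:.6f}")
print(f"importance  : {estimate_importance:.6f}")
</pre>

With the same number of points, the importance-sampled estimate is typically far less noisy, because every sample falls where <math>\Pi</math> is appreciable.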
Outline of the Method

- Random walk over <math>X</math>:

<math> X_{i+1}^{test} = X_{i} + \delta X </math>

From the configuration at the i-th step one builds up a test configuration by slightly modifying some of the variables <math>X</math>.

- The test configuration is accepted as the new (i+1)-th configuration according to a certain criterion (which depends basically on <math>\Pi</math>)
- If the test configuration is not accepted as the new configuration then <math> X_{i+1} = X_i </math>

The procedure is based on the Markov chain formalism and on the Perron-Frobenius theorem. The acceptance criterion must be chosen to guarantee that, after a certain equilibration time, a given configuration appears with the probability given by <math>\Pi(X|k)</math>.
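As a concrete (and deliberately simplified) illustration, the following Python sketch performs such a random walk using the standard Metropolis acceptance rule, i.e. the test configuration is accepted with probability <math> \min\left(1, \Pi(X^{test})/\Pi(X_i)\right) </math>. A single harmonic degree of freedom is assumed, so that <math>\Pi(x) \propto \exp(-\beta E(x))</math> with <math>E(x)=x^2/2</math>; the inverse temperature, maximum displacement and number of steps are arbitrary choices, and in a real simulation <math>X</math> would be the full set of particle coordinates.

<pre>
import numpy as np

rng = np.random.default_rng(0)

beta = 1.0                        # 1/(k_B T)

def energy(x):                    # illustrative potential energy: harmonic well
    return 0.5 * x**2             # Pi(x) is proportional to exp(-beta*energy(x))

n_steps = 200_000
delta = 1.0                       # maximum trial displacement
x = 0.0                           # initial configuration X_0
samples = np.empty(n_steps)
accepted = 0

for i in range(n_steps):
    x_test = x + delta * rng.uniform(-1.0, 1.0)       # X_test = X_i + delta X
    dE = energy(x_test) - energy(x)
    # Metropolis rule: accept with probability min(1, exp(-beta*dE))
    if dE <= 0.0 or rng.random() < np.exp(-beta * dE):
        x = x_test                                    # X_{i+1} = X_test
        accepted += 1
    # otherwise X_{i+1} = X_i: the old configuration is counted again
    samples[i] = x

print(f"acceptance ratio : {accepted / n_steps:.2f}")
print(f"<x^2> estimate   : {np.mean(samples**2):.3f}  (exact value 1/beta = 1.0)")
</pre>

In practice the first part of the walk (the equilibration period mentioned above) is discarded before averages are accumulated.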
Temperature

The temperature is usually fixed in Metropolis Monte Carlo simulations, since in classical statistics the kinetic degrees of freedom (momenta) can generally be integrated out. However, it is possible to design procedures to perform Metropolis Monte Carlo simulations in the microcanonical ensemble (NVE).
See Monte Carlo in the microcanonical ensemble
Boundary Conditions
The simulation of homogeneous systems is usually carried out using periodic boundary conditions.
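A minimal Python sketch of how periodic boundary conditions are commonly applied in a cubic box is given below; the box length and the helper names wrap and minimum_image are illustrative choices, not taken from any particular simulation package.

<pre>
import numpy as np

L = 10.0                                    # side of the cubic simulation box

def wrap(positions):
    """Put coordinates back into the central box [0, L)."""
    return positions - L * np.floor(positions / L)

def minimum_image(r_ij):
    """Nearest-image separation vector, used when computing distances and energies."""
    return r_ij - L * np.round(r_ij / L)

print(wrap(np.array([10.3, -0.2, 5.0])))    # -> approximately [0.3 9.8 5.0]

r1 = np.array([0.5, 9.8, 5.0])
r2 = np.array([9.7, 0.3, 5.0])
d = np.linalg.norm(minimum_image(r1 - r2))
print(f"minimum-image distance: {d:.3f}")   # about 0.94 rather than about 13.2
</pre>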
Initial configuration
The usual choices for the initial configuration in fluid simulations are:
- an equilibrated configuration under similar conditions (for example see [3])
- an ordered lattice structure. For details concerning the construction of such structures see: lattice structures.
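As an illustration of the second option, the following minimal Python sketch builds a simple cubic lattice inside a cubic box; the lattice type, the number of particles and the box length are arbitrary choices made for this example.

<pre>
import numpy as np

def simple_cubic(n_per_side, box_length):
    """Place n_per_side**3 particles on a simple cubic lattice in a cubic box."""
    a = box_length / n_per_side                       # lattice spacing
    grid = (np.arange(n_per_side) + 0.5) * a          # cell-centred coordinates
    x, y, z = np.meshgrid(grid, grid, grid, indexing="ij")
    return np.column_stack((x.ravel(), y.ravel(), z.ravel()))

positions = simple_cubic(n_per_side=5, box_length=10.0)
print(positions.shape)                                # (125, 3)
</pre>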
Advanced techniques
- Main article: Monte Carlo
References
1. Nicholas Metropolis, Arianna W. Rosenbluth, Marshall N. Rosenbluth, Augusta H. Teller and Edward Teller "Equation of State Calculations by Fast Computing Machines", Journal of Chemical Physics 21 pp. 1087-1092 (1953)
2. Nicholas Metropolis and S. Ulam "The Monte Carlo Method", Journal of the American Statistical Association 44 pp. 335-341 (1949)
3. Carl McBride, Carlos Vega and Eduardo Sanz "Non-Markovian melting: a novel procedure to generate initial liquid like phases for small molecules for use in computer simulation studies", Computer Physics Communications 170 pp. 137-143 (2005)
Related reading
- Nicholas Metropolis and S. Ulam "The Monte Carlo Method", Journal of the American Statistical Association 44 pp. 335-341 (1949)
- Herbert L. Anderson "Metropolis, Monte Carlo, and the MANIAC", Los Alamos Science 14 pp. 96-107 (1986)
- N. Metropolis "The Beginning of the Monte Carlo Method" Los Alamos Science 15 pp. 125-130 (1987)
- Isabel Beichl and Francis Sullivan "The Metropolis Algorithm", Computing in Science & Engineering 2 Issue 1 pp. 65-69 (2000)
- Marshall N. Rosenbluth "Genesis of the Monte Carlo Algorithm for Statistical Mechanics", AIP Conference Proceedings 690 pp. 22-30 (2003)
- Marshall N. Rosenbluth "Proof of Validity of Monte Carlo Method for Canonical Averaging", AIP Conference Proceedings 690 pp. 32-38 (2003)
- William W. Wood "A Brief History of the Use of the Metropolis Method at LANL in the 1950s", AIP Conference Proceedings 690 pp. 39-44 (2003)
- David P. Landau "The Metropolis Monte Carlo Method in Statistical Physics", AIP Conference Proceedings 690 pp. 134-146 (2003)