Artificial neural network potentials
'''Artificial neural network potentials''' (ANNP). Neural networks (NN) are increasingly used for a wide array of applications. Here we are concerned with a narrower application: their use in fitting an atomic or molecular potential energy surface <ref>[http://dx.doi.org/10.1007/BF02551274 G. Cybenko "Approximation by superpositions of a sigmoidal function", Mathematics of Control, Signals and Systems '''2''' pp. 303-314 (1989)]</ref><ref>[http://dx.doi.org/10.1016/0893-6080(89)90020-8 Kurt Hornik, Maxwell Stinchcombe, Halbert White "Multilayer feedforward networks are universal approximators", Neural Networks '''2''' pp. 359-366 (1989)]</ref>. In particular, the ''output layer'', or ''node'', provides an energy as a function of the coordinates, which form the ''input layer''.
==Activation functions==
==Training==
==Example==
The output of a feedforward NN with a single layer of hidden neurons, each with a sigmoid (here the hyperbolic tangent) activation function, and a linear output neuron, is given by:
:<math>g(\mathbf{x},\mathbf{w}) = \sum_{i=1}^{N_c} \left[ w_{N_c+1,i} \tanh \left( \sum_{j=1}^n w_{i,j} x_j + w_{i,0} \right) \right] + w_{N_c+1,0} </math>
where <math>n</math> is the number of input coordinates <math>x_j</math>, <math>N_c</math> is the number of hidden neurons, <math>w_{i,j}</math> and <math>w_{i,0}</math> are the weights and bias of hidden neuron <math>i</math>, and <math>w_{N_c+1,i}</math> and <math>w_{N_c+1,0}</math> are the weights and bias of the output neuron.
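As a minimal illustration (a sketch only, not taken from any of the references below), the expression above can be evaluated directly once the weights are known. In the following Python/NumPy snippet the array names <code>w_hidden</code>, <code>b_hidden</code>, <code>w_output</code> and <code>b_output</code> are illustrative stand-ins for <math>w_{i,j}</math>, <math>w_{i,0}</math>, <math>w_{N_c+1,i}</math> and <math>w_{N_c+1,0}</math>:
<pre>
import numpy as np

def nn_energy(x, w_hidden, b_hidden, w_output, b_output):
    """Energy g(x, w) of a feedforward NN with one tanh hidden layer.

    x        : (n,)      input-layer coordinates
    w_hidden : (N_c, n)  weights w_{i,j} from the inputs to the hidden neurons
    b_hidden : (N_c,)    hidden-neuron biases w_{i,0}
    w_output : (N_c,)    weights w_{N_c+1,i} from the hidden layer to the output
    b_output : float     output bias w_{N_c+1,0}
    """
    hidden = np.tanh(w_hidden @ x + b_hidden)   # activation of each hidden neuron
    return float(w_output @ hidden + b_output)  # linear output neuron: the energy

# Untrained example: 3 input coordinates, 5 hidden neurons, random weights
rng = np.random.default_rng(0)
x = rng.normal(size=3)
energy = nn_energy(x,
                   rng.normal(size=(5, 3)),  # w_hidden
                   rng.normal(size=5),       # b_hidden
                   rng.normal(size=5),       # w_output
                   rng.normal())             # b_output
</pre>
In an actual ANNP the weights would of course be obtained by training the network against reference energies rather than drawn at random as above.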
==Applications==
Since the early work of Blank ''et al''. <ref>[http://dx.doi.org/10.1063/1.469597 Thomas B. Blank, Steven D. Brown, August W. Calhoun, and Douglas J. Doren "Neural network models of potential energy surfaces", Journal of Chemical Physics '''103''' 4129 (1995)]</ref> ANNPs have been successfully developed for [[water]] <ref>[http://dx.doi.org/10.1073/pnas.1602375113 Tobias Morawietz, Andreas Singraber, Christoph Dellago, and Jörg Behler "How van der Waals interactions determine the unique properties of water", PNAS '''113''' pp. 8368-8373 (2016)]</ref>,
Al<sup>3+</sup> ions dissolved in water <ref>[http://dx.doi.org/10.1021/jp972209d Helmut Gassner, Michael Probst, Albert Lauenstein, and Kersti Hermansson "Representation of Intermolecular Potential Functions by Neural Networks", Journal of Physical Chemistry A '''102''' pp. 4596-4605 (1998)]</ref>,
[[Sodium hydroxide-water mixture | aqueous NaOH solutions]] <ref>[http://dx.doi.org/10.1039/C6CP06547C Matti Hellström and Jörg Behler "Structure of aqueous NaOH solutions: insights from neural-network-based molecular dynamics simulations", Physical Chemistry Chemical Physics '''19''' pp. 82-96 (2017)]</ref>,
[[gold]] nanoparticles <ref>[http://dx.doi.org/10.1063/1.4977050 Siva Chiriki, Shweta Jindal, and Satya S. Bulusu "Neural network potentials for dynamics and thermodynamics of gold nanoparticles", Journal of Chemical Physics '''146''' 084314 (2017)]</ref>, as well as many other systems <ref>[http://dx.doi.org/10.1016/j.cplett.2004.07.076 Sönke Lorenz, Axel Groß and Matthias Scheffler "Representing high-dimensional potential-energy surfaces for reactions at surfaces by neural networks", Chemical Physics Letters '''395''' pp. 210-215 (2004)]</ref><ref>[http://dx.doi.org/10.1021/jp055253z Sergei Manzhos, Xiaogang Wang, Richard Dawes, and Tucker Carrington Jr. "A Nested Molecule-Independent Neural Network Approach for High-Quality Potential Fits", Journal of Physical Chemistry A '''110''' pp. 5295-5304 (2006)]</ref>.
==References==
<references/>
;Related reading
*Christopher Michael Handley and Jörg Behler "Next generation interatomic potentials for condensed systems", European Physical Journal B '''87''' 152 (2014)
*[http://dx.doi.org/10.1002/qua.24890 Jörg Behler "Constructing high-dimensional neural network potentials: A tutorial review", International Journal of Quantum Chemistry '''115''' pp. 1032-1050 (2015)]
*[http://dx.doi.org/10.1063/1.4966192 Jörg Behler "Perspective: Machine learning potentials for atomistic simulations", Journal of Chemical Physics '''145''' 170901 (2016)]
*[https://doi.org/10.1063/1.5027645 Linfeng Zhang, Jiequn Han, Han Wang, Roberto Car, and Weinan E "DeePCG: Constructing coarse-grained models via deep neural networks", Journal of Chemical Physics '''149''' 034101 (2018)]
*[https://doi.org/10.1063/1.5037098 Caroline Desgranges and Jerome Delhommelle "A new approach for the prediction of partition functions using machine learning techniques", Journal of Chemical Physics '''149''' 044118 (2018)]
[[category:models]]