Evolutionary and Gradient-Based Algorithms for Lennard-Jones Cluster Optimization

S. Müller, N. N. Schraudolph, and P. Koumoutsakos. Evolutionary and Gradient-Based Algorithms for Lennard-Jones Cluster Optimization. In Genetic and Evolutionary Computation Conference Workshop Program, pp. 160–165, AAAI, Chicago, 2003.

Download

pdf (164.6 kB)   djvu (75.6 kB)   ps.gz (60.4 kB)

Abstract

Finding the equilibrated configuration of atomic clusters modeled by the Lennard-Jones potential poses a challenging task for numerical optimization strategies, as the number of local minima grows exponentially with the number of atoms in the cluster. We use this massively multimodal problem to test different evolutionary, deterministic, and randomized gradient methods with respect to their global search behavior. The randomized gradient method was designed to combine the advantages of gradient-based and stochastic direct optimization.
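
For readers who want a concrete picture of the objective, here is a minimal sketch (not the algorithm from the paper): it evaluates the total Lennard-Jones energy of an N-atom cluster in reduced units (epsilon = sigma = 1), where each pair of atoms at distance r contributes V(r) = 4(r^-12 - r^-6), and then runs a single local relaxation with SciPy's L-BFGS-B. The atom count, starting box, and choice of optimizer are illustrative assumptions; the paper's evolutionary and randomized gradient strategies are not reproduced here.

    import numpy as np
    from scipy.optimize import minimize

    def lj_energy(flat_coords):
        """Total Lennard-Jones energy (reduced units) of a cluster.

        flat_coords is a length-3N array of x, y, z positions.
        """
        x = flat_coords.reshape(-1, 3)
        # Squared distances between all distinct atom pairs.
        diff = x[:, None, :] - x[None, :, :]
        r2 = (diff ** 2).sum(axis=-1)
        iu = np.triu_indices(len(x), k=1)
        inv_r6 = 1.0 / r2[iu] ** 3          # r^-6 for each pair
        return float(np.sum(4.0 * (inv_r6 ** 2 - inv_r6)))

    # Random start for a small cluster (N = 13, chosen for illustration).
    rng = np.random.default_rng(0)
    x0 = rng.uniform(-1.5, 1.5, size=3 * 13)

    # One gradient-based descent finds *a* local minimum, not the global
    # one; for N = 13 the known global minimum (an icosahedron) lies at
    # about -44.33 in these units.
    result = minimize(lj_energy, x0, method="L-BFGS-B")
    print("local minimum energy:", result.fun)

A single descent of this kind ends in whichever basin the starting point happens to fall into; escaping such basins to reach the global minimum is exactly what the evolutionary and randomized gradient methods compared in the paper are tested on.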

BibTeX Entry

@inproceedings{MueSchKou03,
     author = {Sybille M\"uller and Nicol N. Schraudolph
               and Petros Koumoutsakos},
      title = {\href{http://nic.schraudolph.org/pubs/MueSchKou03.pdf}{
               Evolutionary and Gradient-Based Algorithms
               for {L}ennard-{J}ones Cluster Optimization}},
      pages = {160--165},
     editor = {Alwyn M. Barry},
  booktitle = {Genetic and Evolutionary Computation Conference
               Workshop Program},
  publisher = {AAAI},
    address = {Chicago},
       year =  2003,
   b2h_type = {Other},
  b2h_topic = {Evolutionary Algorithms},
   abstract = {
    Finding the equilibrated configuration of atomic clusters
    modeled by the Lennard-Jones potential poses a challenging task
    for numerical optimization strategies, as the number of local
    minima grows exponentially with the number of atoms in the
    cluster.  We use this massively multimodal problem to test
    different evolutionary, deterministic, and randomized gradient
    methods with respect to their global search behavior.  The
    randomized gradient method was designed to combine the advantages
    of gradient-based and stochastic direct optimization.
}}
