Optimization by random search with jumps

Chunshien Li, Roland Priemer*, Kuo Hsiang Cheng

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

12 Scopus citations


We present a random optimization (RO) algorithm for minimizing a real-valued function of n real variables. During the optimization process, interpolation points are examined to follow valleys, and jumps to new starting points are executed to avoid spending many iterations in local minima. Convergence with probability one to the global minimum of a function is proved. The proposed RO method is a simple, derivative-free, and computationally moderate algorithm with excellent performance compared to other RO methods. Seven functions commonly used to test optimization methods are used to evaluate the performance of the RO algorithm given here.
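The abstract's core idea, accepting random perturbations that improve the objective and jumping to a fresh starting point after a long run without improvement, can be sketched as follows. This is a minimal illustration of a generic random search with restarts, not the authors' exact algorithm; the paper's valley-following interpolation step is omitted, and all parameter names (`step`, `stall_limit`, `bounds`) are illustrative assumptions.

```python
import random

def random_search_with_jumps(f, x0, step=0.5, stall_limit=200,
                             n_iters=5000, bounds=(-5.0, 5.0), seed=0):
    """Minimize f over n real variables by random perturbation.

    A candidate is accepted only if it improves f. After `stall_limit`
    consecutive rejections, jump to a new uniform random starting point
    to escape a local minimum; the best point seen overall is retained.
    """
    rng = random.Random(seed)
    n = len(x0)
    x, fx = list(x0), f(x0)
    best_x, best_f = list(x), fx
    stall = 0
    for _ in range(n_iters):
        # Gaussian perturbation around the current point
        cand = [xi + rng.gauss(0.0, step) for xi in x]
        fc = f(cand)
        if fc < fx:
            x, fx, stall = cand, fc, 0
            if fc < best_f:
                best_x, best_f = list(cand), fc
        else:
            stall += 1
            if stall >= stall_limit:
                # Jump: restart from a new random point in the box
                x = [rng.uniform(*bounds) for _ in range(n)]
                fx = f(x)
                stall = 0
    return best_x, best_f

# Example: sphere function, whose global minimum is 0 at the origin
sphere = lambda v: sum(t * t for t in v)
xmin, fmin = random_search_with_jumps(sphere, [3.0, -4.0])
```

Tracking the best point separately from the current point is what makes jumping safe: a restart can only add exploration, never lose the best solution found so far.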

Original language: English
Pages (from-to): 1301-1315
Number of pages: 15
Journal: International Journal for Numerical Methods in Engineering
Issue number: 7
State: Published - 21 June 2004


Keywords:

  • Derivative-free optimization
  • Global minimum
  • Interpolation check
  • Jump
  • Multi-variable function optimization
  • Random optimization
  • Random search


