In this paper, we use the Ehlich-Zeller-Gärtel inequality to derive an algorithm for finding the global minima of polynomials over hyperrectangles, as well as to provide a bounding method for the branch-and-bound algorithm. The latter application of the inequality results in an improved algorithm which, at each iteration, simultaneously gives a decreasing upper bound and an increasing lower bound for the global minimum. The algorithm can also be used to find the Lipschitz constant of a polynomial.
The supervisor and searcher cooperation framework (SSC), introduced in Refs. 1 and 2, provides an effective way to design efficient optimization algorithms that combine the desirable features of two existing methods. This work aims to develop efficient algorithms for a wide range of noisy optimization problems, including those posed by feedforward neural network training. It introduces two basic SSC algorithms: the first seems suited for generic problems; the second is motivated by neural network training problems. It also introduces inexact variants of the two algorithms, which seem to possess desirable properties. It establishes general theoretical results about the convergence and speed of SSC algorithms and illustrates their appealing attributes through numerical tests on deterministic, stochastic, and neural network training problems.