151. Information Perspective of Optimization
- Author
- Riccardo Poli and Yossi Borenstein
- Subjects
- Mathematical optimization, Computational complexity theory, Functional analysis, Kolmogorov complexity, Computer science, Entropy (information theory), No free lunch theorem, Information theory, Combinatorics, Random search, Distribution function, Probability distribution, Combinatorial optimization
- Abstract
- In this paper we relate information theory and Kolmogorov complexity (KC) to optimization in the black-box scenario. We define the set of all possible decisions an algorithm might make during a run, associate with each function a probability distribution over this set, and accordingly define the function's entropy. We show that the expected KC of this set (rather than of the function itself) is a better measure of problem difficulty, and we analyze the effect of the entropy on the expected KC. Finally, we show, in a restricted scenario, that the permutation closure of a single function, the finest level of granularity at which a No Free Lunch theorem can hold [7], can be associated with a particular value of entropy. This implies bounds on the expected performance of an algorithm on members of that closure.
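The abstract's link between a function, an induced probability distribution, its entropy, and the permutation closure can be illustrated with a small sketch. The paper's exact construction (a distribution over an algorithm's decision set) is not reproduced here; as an assumption, the sketch uses the simpler distribution of values a function takes over a finite domain, which is invariant under domain permutations, so every member of a permutation closure shares one entropy value:

```python
import math
from itertools import permutations

def entropy(p):
    """Shannon entropy (in bits) of a probability distribution given as a list."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def value_distribution(f, domain):
    """Distribution induced by the multiset of values f takes on a finite domain.

    f is represented as an indexable sequence of values (a lookup table).
    """
    counts = {}
    for x in domain:
        counts[f[x]] = counts.get(f[x], 0) + 1
    n = len(domain)
    return [c / n for c in counts.values()]

def permutation_closure(f):
    """All functions (as value tuples) obtained by permuting the domain of f."""
    return {tuple(f[i] for i in perm) for perm in permutations(range(len(f)))}

# Hypothetical example: a function on a 4-point domain, given as its value table.
f = (0, 0, 1, 2)
p = value_distribution(f, range(len(f)))
print(entropy(p))                    # entropy of the induced distribution: 1.5 bits
print(len(permutation_closure(f)))   # 12 distinct permutations of the value table
```

Since permuting the domain leaves the multiset of values unchanged, every member of `permutation_closure(f)` induces the same distribution and hence the same entropy, mirroring the abstract's claim that a permutation closure can be associated with a particular entropy value.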
- Published
- 2006