1. Multi-armed bandit algorithm for sequential experiments of molecular properties with dynamic feature selection.
- Authors
- Abedin, Md. Menhazul, Tabata, Koji, Matsumura, Yoshihiro, and Komatsuzaki, Tamiki
- Subjects
- *FEATURE selection, *OPTIMIZATION algorithms, *PROBLEM-based learning, *REINFORCEMENT learning, *ALGORITHMS, *CHEMICAL yield, *ENANTIOMERS
- Abstract
Sequential optimization is a promising approach for identifying the optimal candidate(s) (molecules, reactants, drugs, etc.) with desired properties (reaction yield, selectivity, efficacy, etc.) from a large pool of potential candidates while minimizing the number of experiments required. However, the high dimensionality of the feature space (e.g., molecular descriptors) often makes it difficult to exploit the relevant features while updating the set of candidates to be examined. In this article, we develop a new sequential optimization algorithm for molecular problems based on reinforcement learning, a multi-armed linear bandit framework, and an online, dynamic feature-selection scheme in which the relevant molecular descriptors are updated as the experiments proceed. We also design a stopping condition intended to guarantee the reliability of the candidate chosen from the dataset pool. The developed algorithm was evaluated against Bayesian optimization (BO) on two synthetic datasets and two real datasets, one containing hydration free energies of molecules and the other the free-energy difference between enantiomeric products of a chemical reaction. We found that, as an overall trend, dynamically selecting the features that represent the desired properties along the experiments improves performance (e.g., the time required to find the best candidate and stop the experiment), and that our multi-armed linear bandit approach with a dynamic feature-selection scheme outperforms standard BO with fixed feature variables. A comparison of our algorithm to BO with dynamic feature selection is also addressed. [ABSTRACT FROM AUTHOR]
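To illustrate the general idea of a linear bandit with online feature selection (this is a minimal generic sketch, not the authors' algorithm: the warm-up length, the correlation-based feature ranking, and all parameter values below are illustrative assumptions):

```python
import numpy as np

def linucb_dynamic_features(X, rewards, n_rounds=100, k=3, alpha=0.5, seed=0):
    """Sequentially pick candidates with a LinUCB-style linear bandit,
    re-selecting the k most reward-correlated descriptors each round.
    Illustrative sketch only; not the algorithm described in the article."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pulled, observed = [], []
    for _ in range(n_rounds):
        if len(pulled) < k + 1:
            # warm-up: random exploration until a model can be fit
            arm = int(rng.integers(n))
        else:
            Xp = X[pulled]
            y = np.array(observed)
            # dynamic feature selection: rank descriptors by |correlation|
            # with the rewards observed so far, keep the top k
            corr = np.array([
                abs(np.corrcoef(Xp[:, j], y)[0, 1]) if np.std(Xp[:, j]) > 0 else 0.0
                for j in range(d)
            ])
            feats = np.argsort(np.nan_to_num(corr))[-k:]
            # ridge regression on the selected features (LinUCB update)
            A = Xp[:, feats].T @ Xp[:, feats] + np.eye(k)
            b = Xp[:, feats].T @ y
            A_inv = np.linalg.inv(A)
            theta = A_inv @ b
            # upper confidence bound for every candidate in the pool
            Z = X[:, feats]
            ucb = Z @ theta + alpha * np.sqrt(np.einsum('ij,jk,ik->i', Z, A_inv, Z))
            arm = int(np.argmax(ucb))
        pulled.append(arm)
        observed.append(rewards[arm] + rng.normal(0.0, 0.1))  # noisy measurement
    return pulled
```

Because the feature subset is recomputed every round, descriptors that turn out to be uninformative for the observed property are dropped automatically, which is the behavior the abstract contrasts with fixed-feature BO.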
- Published
- 2024