Abstract
Bayesian optimization (BO) has recently become a popular approach to the global optimization of black-box functions. However, existing BO approaches either do not scale to large numbers of observations or require specialized hardware, which limits their usability for regular users. The main goal of this thesis is to improve the scalability and efficiency of existing BO algorithms so that they remain applicable to a broad range of optimization problems. We propose two new algorithms based on the Generalized Product of Experts (gPoE) model that allow BO to scale to problems with a large number of observations without the need for specialized hardware. Additionally, we show that optimization accuracy can be further improved by combining the gPoE model with a search-space reduction method. We experimentally demonstrate the efficiency and scalability of the proposed algorithms compared to existing ones, both in accuracy and in runtime reduction. Finally, we show that the gPoE-based BO algorithm can be extended to global optimization problems with heteroscedastic noise.
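As background, the gPoE model referenced above combines the predictions of $M$ independently trained Gaussian process experts into a single predictive distribution. A minimal sketch of the standard combination rule (following Cao and Fleet, 2014), written in our own notation rather than the thesis', is

\[
\sigma_{\mathrm{gPoE}}^{-2}(x_*) \;=\; \sum_{k=1}^{M} \beta_k(x_*)\,\sigma_k^{-2}(x_*),
\qquad
\mu_{\mathrm{gPoE}}(x_*) \;=\; \sigma_{\mathrm{gPoE}}^{2}(x_*)\,\sum_{k=1}^{M} \beta_k(x_*)\,\sigma_k^{-2}(x_*)\,\mu_k(x_*),
\]

where $\mu_k(x_*)$ and $\sigma_k^2(x_*)$ are the predictive mean and variance of the $k$-th expert at a test point $x_*$, and the weights $\beta_k(x_*)$ typically sum to one. Because each expert is fit on only a subset of the observations, this combination avoids the cubic cost of a single full Gaussian process, which is what allows BO to scale to many observations.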