HyperTuner: a cross-layer multi-objective hyperparameter auto-tuning framework for data analytic services

The Journal of Supercomputing (2024)

Abstract
Hyperparameter optimization (HPO) is vital for machine learning models. Besides model accuracy, other tuning objectives such as model training time and energy consumption also deserve attention from data analytics service providers. It is therefore essential to consider both model hyperparameters and system parameters in order to perform cross-layer multi-objective hyperparameter auto-tuning. Toward this challenging target, this paper proposes HyperTuner, which leverages a well-designed ADUMBO algorithm to find the Pareto-optimal configuration set. Compared with vanilla Bayesian optimization-based methods, ADUMBO selects the most promising configuration from the generated Pareto candidate set in each iteration by maximizing a novel adaptive uncertainty metric. We evaluate HyperTuner on our local distributed TensorFlow cluster, and experimental results show that it consistently finds a Pareto configuration front superior in both convergence and diversity to those of four baseline algorithms. In addition, experiments with different training datasets, different optimization objectives, and different machine learning platforms verify that HyperTuner adapts well to various data analytics service scenarios.
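The abstract describes ADUMBO's per-iteration selection rule only at a high level. The following is a minimal sketch of that general idea, assuming one Gaussian-process surrogate per objective, minimization of all objectives, and an assumed decaying-exploration schedule; the helper names (pareto_mask, suggest_next) and the exact form of the uncertainty metric are hypothetical illustrations, not the authors' implementation.

# Sketch: pick the next configuration from the surrogate-predicted Pareto
# candidate set by maximizing an (assumed) adaptive uncertainty score.
# This illustrates the abstract's description, not the paper's exact ADUMBO.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def pareto_mask(Y):
    """Boolean mask of non-dominated rows of Y (all objectives minimized)."""
    n = Y.shape[0]
    mask = np.ones(n, dtype=bool)
    for i in range(n):
        # Row i is dominated if some row is <= it everywhere and < somewhere.
        dominated_by = np.all(Y <= Y[i], axis=1) & np.any(Y < Y[i], axis=1)
        if dominated_by.any():
            mask[i] = False
    return mask

def suggest_next(X_obs, Y_obs, candidates, iteration):
    """X_obs: (n, d) evaluated configs; Y_obs: (n, m) observed objectives
    (e.g., error rate, training time, energy); candidates: (c, d) pool."""
    m = Y_obs.shape[1]
    # One GP surrogate per objective.
    gps = [GaussianProcessRegressor(normalize_y=True).fit(X_obs, Y_obs[:, k])
           for k in range(m)]
    preds = [gp.predict(candidates, return_std=True) for gp in gps]
    mu = np.column_stack([p[0] for p in preds])     # predicted objectives
    sigma = np.column_stack([p[1] for p in preds])  # predictive std devs

    # Pareto candidate set under the surrogate means.
    front = np.where(pareto_mask(mu))[0]

    # Hypothetical "adaptive uncertainty": total predictive std scaled by a
    # slowly growing schedule (an assumed form, not the paper's metric).
    beta = np.sqrt(np.log(2.0 + iteration))
    scores = beta * sigma[front].sum(axis=1)
    return candidates[front[np.argmax(scores)]]

In this sketch, restricting the choice to the surrogate-predicted Pareto front biases the search toward configurations that could actually improve the front, while the uncertainty score steers evaluation toward under-explored regions of the joint hyperparameter/system-parameter space.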
Keywords
Multi-objective hyperparameter optimization, Bayesian optimization, Configuration parameter, Data analytic services