Self-Learning Mapreduce Scheduler In Multi-Job Environment

Changhang Lin, Wenzhong Guo, Changhui Lin

2013 International Conference on Cloud Computing and Big Data (CloudCom-Asia)

Abstract
Hadoop, the most widely adopted open-source implementation of the MapReduce framework, has made MapReduce broadly accessible. However, its performance is currently limited by the default MapReduce scheduler. To achieve better performance, the scheduler should take into account nodes' computing power and system resources in heterogeneous environments. Furthermore, from the job perspective, the non-linear progress of tasks is also an important factor. Prior work has sought to enhance MapReduce performance, but it does not adequately consider the characteristics of both nodes and jobs. To overcome this drawback, we propose a Self-Learning MapReduce Scheduler (SLM), which outperforms existing schedulers in multi-job environments. Since competition for system resources can make a task's progress unpredictable, SLM determines the progress of each job based on its own historical information. In particular, during a job's self-learning stage, SLM calculates task phase weights from the feedback of the first few completed tasks. With these phase weights, SLM obtains more accurate execution time estimates, which is the key prerequisite for identifying stragglers (slow tasks). Experimental results show that SLM effectively improves the accuracy of execution time estimation and straggler identification, leading to more rational resource utilization and shorter job execution times, especially in multi-job environments.
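The self-learning idea described above can be sketched as follows. This is a minimal, hypothetical illustration (the function names, the constant-rate assumption, and the exact formulas are not taken from the paper): phase weights are learned as the average fraction of total time each task phase consumed across a job's first few completed tasks, and a running task's remaining time is then estimated from its weighted progress score.

```python
# Hypothetical sketch of SLM-style phase-weight learning and execution
# time estimation; names and formulas are illustrative assumptions.

def learn_phase_weights(completed_tasks):
    """Self-learning stage: average the fraction of time each phase
    took across the first few completed tasks of a job.

    completed_tasks: list of per-task phase durations, e.g.
    [[t_phase1, t_phase2, ...], ...].
    """
    n_phases = len(completed_tasks[0])
    weights = [0.0] * n_phases
    for phase_times in completed_tasks:
        total = sum(phase_times)
        for i, t in enumerate(phase_times):
            weights[i] += t / total
    return [w / len(completed_tasks) for w in weights]

def progress_score(weights, current_phase, frac_in_phase):
    """Weighted progress: weights of finished phases plus the
    fraction completed of the current phase."""
    return sum(weights[:current_phase]) + weights[current_phase] * frac_in_phase

def estimate_remaining(elapsed, score):
    """Estimate time left, assuming progress continues at the
    task's observed average rate."""
    return elapsed * (1.0 - score) / score
```

A task whose estimated remaining time greatly exceeds that of its peers would then be flagged as a straggler and become a candidate for speculative execution.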
Keywords
Hadoop, heterogeneous environment, multi-job, speculative execution, straggler