Dynamic and Transparent Data Tiering for In-Memory Databases in Mixed Workload Environments

ADMS@VLDB (2015)

Abstract
Current in-memory databases clearly outperform their disk-based counterparts. In parallel, recent PCIe-connected NAND flash devices provide significantly lower access latencies than traditional disks, making it possible to re-introduce classical memory paging as a cost-efficient alternative to storing all data in main memory. This is further eased by new, dedicated APIs that bypass the operating system and optimize the way data is managed and transferred between a DRAM caching layer and NAND flash. In this paper, we present a new approach for in-memory databases that leverages such an API to improve data management without jeopardizing the original performance superiority of in-memory databases. The approach exploits data relevance and places less relevant data onto a NAND flash device. For real-world data access skews, the approach is able to efficiently evict a substantial share of the data stored in memory while suffering a performance loss of less than 30%.
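
The abstract describes the approach only at a high level: track how relevant data is and transparently evict the less relevant share from a DRAM caching layer to a NAND flash device. The following is a minimal, hypothetical C++ sketch of that idea; the class and function names (FlashStore, TieringManager, make_room) and the simple access-frequency relevance metric are illustrative assumptions and do not come from the paper or from the dedicated flash API it mentions.

// A minimal, hypothetical sketch of relevance-based tiering between a DRAM
// cache and a flash-backed store. Names and the relevance metric are
// assumptions for illustration only.
#include <cstdint>
#include <iostream>
#include <unordered_map>
#include <vector>

using PageId = std::uint64_t;
using Page = std::vector<char>;

// Stand-in for a PCIe NAND flash device accessed through a dedicated API
// that bypasses the operating system (the abstract gives no API details).
class FlashStore {
public:
    void write(PageId id, Page page) { pages_[id] = std::move(page); }
    Page read(PageId id) { return pages_.at(id); }
private:
    std::unordered_map<PageId, Page> pages_;
};

// Transparent tiering layer: hot pages stay in DRAM, cold pages are evicted
// to flash based on a simple access-frequency notion of data relevance.
class TieringManager {
public:
    TieringManager(std::size_t dram_budget, FlashStore& flash)
        : dram_budget_(dram_budget), flash_(flash) {}

    // Serve a page from DRAM if resident, otherwise fault it in from flash.
    Page& access(PageId id) {
        ++access_count_[id];                           // track data relevance
        auto it = dram_.find(id);
        if (it == dram_.end()) {                       // page fault
            make_room();                               // evict cold pages first
            it = dram_.emplace(id, flash_.read(id)).first;
        }
        return it->second;
    }

    // Add a new page; less relevant pages may be pushed to flash to make room.
    void insert(PageId id, Page page) {
        ++access_count_[id];
        make_room();
        dram_[id] = std::move(page);
    }

private:
    // Evict the least frequently accessed pages until the DRAM budget
    // allows one more resident page.
    void make_room() {
        while (dram_.size() >= dram_budget_) {
            PageId victim = dram_.begin()->first;
            std::uint64_t min_count = access_count_[victim];
            for (const auto& entry : dram_) {
                if (access_count_[entry.first] < min_count) {
                    min_count = access_count_[entry.first];
                    victim = entry.first;
                }
            }
            flash_.write(victim, std::move(dram_[victim]));
            dram_.erase(victim);
        }
    }

    std::size_t dram_budget_;                          // max pages resident in DRAM
    FlashStore& flash_;
    std::unordered_map<PageId, Page> dram_;            // DRAM caching layer
    std::unordered_map<PageId, std::uint64_t> access_count_;
};

int main() {
    FlashStore flash;
    TieringManager tiering(/*dram_budget=*/2, flash);
    tiering.insert(1, Page(4096, 'a'));
    tiering.insert(2, Page(4096, 'b'));
    tiering.insert(3, Page(4096, 'c'));  // exceeds the budget: a cold page moves to flash
    tiering.access(1);                   // transparently faulted back in if it was evicted
    std::cout << "pages accessed through the tiering layer\n";
    return 0;
}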