An Associativity Threshold Phenomenon in Set-Associative Caches

Proceedings of the 35th ACM Symposium on Parallelism in Algorithms and Architectures (SPAA 2023)

Abstract

In an α-way set-associative cache, the cache is partitioned into disjoint sets of size α, and each item can be cached in only one set, typically selected via a hash function. Set-associative caches are widely used and have many benefits over fully associative caches, e.g., in terms of latency or concurrency, but they often incur more cache misses. As the set size α decreases, these benefits grow, but the paging cost worsens. In this paper we characterize the performance of an α-way set-associative LRU cache of total size k, as a function of α = α(k). Assuming that sets are selected using a fully random hash function, we prove the following:

- For α = ω(log k), the paging cost of an α-way set-associative LRU cache is within an additive O(1) of that of a fully associative LRU cache of size (1 - o(1))k, with probability 1 - 1/poly(k), for all request sequences of length poly(k).
- For α = o(log k), and for all c = O(1) and r = O(1), the paging cost of an α-way set-associative LRU cache is not within a factor c of that of a fully associative LRU cache of size k/r, for some request sequence of length O(k^1.01).
- For α = ω(log k), if the hash function can occasionally be changed, the paging cost of an α-way set-associative LRU cache is within a factor 1 + o(1) of that of a fully associative LRU cache of size (1 - o(1))k, with probability 1 - 1/poly(k), for request sequences of arbitrary (e.g., super-polynomial) length.

Some of our results generalize to paging algorithms other than LRU, such as least-frequently-used (LFU).
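To make the model concrete, here is a minimal Python sketch (illustrative only, not the paper's code) of an α-way set-associative cache with per-set LRU eviction: a cache of total size k is split into k/α disjoint sets, a hash function assigns each item to exactly one set, and a miss evicts the least recently used item within that set. All class and method names are assumptions introduced for illustration.

```python
from collections import OrderedDict
import random

class SetAssociativeLRUCache:
    """alpha-way set-associative cache with per-set LRU eviction.

    Illustrative sketch: total size k is split into k // alpha disjoint
    sets; a salted hash (standing in for the fully random hash function
    assumed in the paper) maps each item to exactly one set.
    """

    def __init__(self, k, alpha, seed=0):
        assert k % alpha == 0, "total size must be a multiple of alpha"
        self.alpha = alpha
        self.num_sets = k // alpha
        # Each set is an OrderedDict used as an LRU queue, oldest first.
        self.sets = [OrderedDict() for _ in range(self.num_sets)]
        self._salt = random.Random(seed).getrandbits(64)
        self.misses = 0

    def _set_index(self, item):
        return hash((self._salt, item)) % self.num_sets

    def request(self, item):
        """Serve one request; return True on a hit, False on a miss."""
        s = self.sets[self._set_index(item)]
        if item in s:
            s.move_to_end(item)    # refresh the item's LRU position
            return True
        self.misses += 1
        if len(s) >= self.alpha:
            s.popitem(last=False)  # evict the least recently used item
        s[item] = None
        return False
```

Running the same request sequence through SetAssociativeLRUCache(k, alpha) with a small α (e.g., 4) and with α = k (a single set, i.e., fully associative LRU) and comparing the miss counters is one way to observe the paging-cost gap that the results above quantify.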
Keywords

Set-associative cache, paging, LRU