Machine Unlearning in Learned Databases: An Experimental Analysis
Proceedings of the ACM on Management of Data (2023)
Abstract
Machine learning models based on neural networks (NNs) are enjoying
ever-increasing attention in the DB community. However, an important issue has
been largely overlooked, namely the challenge of dealing with the highly
dynamic nature of DBs, where data updates are fundamental, highly-frequent
operations. Although some recent research has addressed the issues of
maintaining updated NN models in the presence of new data insertions, the
effects of data deletions (a.k.a., "machine unlearning") remain a blind spot.
With this work, for the first time to our knowledge, we pose and answer the
following key questions: What is the effect of unlearning algorithms on
NN-based DB models? How do these effects translate to effects on downstream DB
tasks, such as selectivity estimation (SE), approximate query processing (AQP),
data generation (DG), and upstream tasks like data classification (DC)? What
metrics should we use to assess the impact and efficacy of unlearning
algorithms in learned DBs? Is the problem of machine unlearning in DBs
different from that of machine learning in DBs in the face of data insertions?
Is the problem of machine unlearning for DBs different from unlearning in the
ML literature? What are the overhead and efficiency of unlearning algorithms?
How sensitive is unlearning to the batching of delete operations? If we have
a suitable unlearning algorithm, can we combine it with an algorithm handling
data insertions en route to solving the general adaptability/updatability
requirement in learned DBs in the face of both data inserts and deletes? We
answer these questions using a comprehensive set of experiments, various
unlearning algorithms, a variety of downstream DB tasks, and an upstream task
(DC), each with different NNs, and using a variety of metrics on a variety of
real datasets, making this also a first key step towards a benchmark for
learned DB unlearning.