Individually Conditional Individual Mutual Information Bound On Generalization Error

2021 IEEE International Symposium on Information Theory (ISIT)

Cited by 32 | Views 48
Abstract
We propose a new information-theoretic bound on generalization error, based on combining the error decomposition technique of Bu et al. with the conditional mutual information (CMI) construction of Steinke and Zakynthinou. In previous work, Haghifam et al. proposed a different bound combining the same two techniques, which we refer to as the conditional individual mutual information (CIMI) bound. However, in a simple Gaussian setting, both the CMI and the CIMI bounds are order-wise worse than the bound of Bu et al. This observation motivated the new bound, which overcomes the issue by reducing the conditioning terms in the conditional mutual information. In the process of establishing this bound, a conditional decoupling lemma is proved, which also leads to a meaningful dichotomy and comparison among these information-theoretic bounds.
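
For orientation, the LaTeX sketch below records the typical shape of the bounds the abstract compares, in standard notation: W is the learned hypothesis, S = (Z_1, ..., Z_n) the training sample, Z-tilde the n-by-2 supersample of the CMI construction with selection variables U = (U_1, ..., U_n), and the loss is assumed sigma^2-sub-Gaussian (bounded in [0,1] for the CMI-family bounds). This is a hedged sketch, not the paper's exact statements; constants and the precise form of the conditioning may differ. The intended contrast is that the new (ICIMI) bound conditions only on the i-th supersample pair rather than on the full supersample.

\begin{align*}
  \text{Individual MI (Bu et al.):}\quad
    & |\mathrm{gen}| \le \frac{1}{n}\sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)} \\
  \text{CMI (Steinke--Zakynthinou):}\quad
    & |\mathrm{gen}| \le \sqrt{\frac{2}{n}\, I(W; U \mid \tilde{Z})} \\
  \text{CIMI (Haghifam et al.):}\quad
    & |\mathrm{gen}| \le \frac{1}{n}\sum_{i=1}^{n} \sqrt{2\, I(W; U_i \mid \tilde{Z})} \\
  \text{ICIMI (this paper):}\quad
    & |\mathrm{gen}| \le \frac{1}{n}\sum_{i=1}^{n} \sqrt{2\, I(W; U_i \mid \tilde{Z}_i)}
\end{align*}

Here \tilde{Z}_i denotes the i-th pair of the supersample \tilde{Z}, so the ICIMI conditioning is strictly weaker than conditioning on all of \tilde{Z}, which is what allows it to recover the order of the Bu et al. bound in the Gaussian example mentioned above.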
Keywords
Mutual information, Training, Random variables, Heuristic algorithms, Training data, Noise measurement, Upper bound