Noisy Interpolation Learning with Shallow Univariate ReLU Networks

ICLR 2024

Abstract
We study the asymptotic overfitting behavior of interpolation with minimum norm ($\ell_2$ of the weights) two-layer ReLU networks for noisy univariate regression. We show that overfitting is tempered for the $L_1$ loss, and any $L_p$ loss for $p<2$, but catastrophic for $p\geq 2$.
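The setting in the abstract can be sketched numerically. Below is a minimal NumPy illustration of interpolating noisy univariate data with a two-layer ReLU network; it uses plain gradient descent with a tiny ridge penalty as a crude stand-in for the exact minimum-$\ell_2$-norm interpolant the paper analyzes, and the target function, width, and all hyperparameters are assumptions for illustration only, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Noisy univariate regression data (hypothetical target function).
n = 16
x = np.sort(rng.uniform(-1.0, 1.0, n))
y = np.sin(3.0 * x) + 0.3 * rng.normal(size=n)

# Two-layer ReLU network: f(x) = sum_j v_j * relu(w_j * x + b_j).
m = 100  # hidden width, chosen >> n so interpolation is possible
w = rng.normal(size=m)
b = rng.normal(size=m)
v = rng.normal(size=m) / m

def forward(x, w, b, v):
    h = np.maximum(0.0, np.outer(x, w) + b)  # (n, m) hidden activations
    return h @ v

init_mse = float(np.mean((forward(x, w, b, v) - y) ** 2))

# Full-batch gradient descent on squared loss plus a tiny l2 penalty on
# all weights -- a rough proxy for the minimum-norm interpolant
# (the paper's object is the lam -> 0 limit, not this trainer).
lam, lr = 1e-6, 5e-3
for _ in range(20000):
    h = np.maximum(0.0, np.outer(x, w) + b)
    r = h @ v - y                        # residuals, shape (n,)
    g = (h > 0.0) * (r[:, None] * v)     # backprop through the ReLU
    v -= lr * (h.T @ r / n + lam * v)
    w -= lr * (g.T @ x / n + lam * w)
    b -= lr * (g.sum(axis=0) / n + lam * b)

train_mse = float(np.mean((forward(x, w, b, v) - y) ** 2))
```

Because the network memorizes the noise, its train error shrinks toward zero while the fitted function need not track the clean target; the paper's result is about how badly this hurts test risk under different $L_p$ losses in the asymptotic regime, which this finite sketch does not capture.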
Keywords
Interpolation Learning, Benign Overfitting, ReLU Networks