Harnessing Hierarchical Label Distribution Variations in Test-Agnostic Long-tail Recognition
arXiv (2024)
Abstract
This paper explores test-agnostic long-tail recognition, a challenging
long-tail task where the test label distributions are unknown and arbitrarily
imbalanced. We argue that the variation in these distributions can be broken
down hierarchically into global and local levels. The global ones reflect a
broad range of diversity, while the local ones typically arise from milder
changes, often focused on a particular neighbor. Traditional methods
predominantly use a Mixture-of-Experts (MoE) approach, targeting a few fixed
test label distributions that exhibit substantial global variations. However,
the local variations are left unconsidered. To address this issue, we propose a
new MoE strategy, DirMixE, which assigns experts to different
Dirichlet meta-distributions of the label distribution, each targeting a
specific aspect of local variations. Additionally, the diversity among these
Dirichlet meta-distributions inherently captures global variations. This
dual-level approach also leads to a more stable objective function, allowing us
to sample diverse test distributions and thereby quantify the mean and variance
of performance outcomes. Theoretically, we show that our proposed objective
benefits from enhanced generalization by virtue of the variance-based
regularization. Comprehensive experiments across multiple benchmarks confirm
the effectiveness of DirMixE. The code is available at .
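The dual-level idea in the abstract can be illustrated with a small sketch: each expert is tied to a Dirichlet meta-distribution over label distributions (global variation), local variations are obtained by sampling from each meta-distribution, and the objective combines the mean of per-sample performance with a variance penalty. This is a minimal illustrative sketch, not the authors' implementation; the alpha vectors, the placeholder "performance" score, and the penalty weight `lam` are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes = 10

# Global level: a few Dirichlet meta-distributions, one per expert.
# Each concentration vector encodes a different imbalance profile (assumed values).
alphas = [
    np.linspace(5.0, 0.5, num_classes),  # forward long-tail
    np.ones(num_classes),                # roughly uniform
    np.linspace(0.5, 5.0, num_classes),  # backward long-tail
]

def mean_variance_objective(per_sample_scores, lam=1.0):
    """Mean score minus lam * std across sampled test distributions.
    Penalizing the variance is the stabilizing, variance-based
    regularization the abstract alludes to."""
    scores = np.asarray(per_sample_scores, dtype=float)
    return scores.mean() - lam * scores.std()

uniform = np.full(num_classes, 1.0 / num_classes)
for alpha in alphas:
    # Local level: several sampled test label distributions per meta-distribution.
    label_dists = rng.dirichlet(alpha, size=5)
    # Placeholder performance: negative KL divergence to uniform,
    # standing in for an expert's accuracy under each sampled distribution.
    scores = [-np.sum(p * np.log(p / uniform)) for p in label_dists]
    obj = mean_variance_objective(scores)
```

Each expert would then be trained against samples from its own meta-distribution, so the diversity across the `alphas` captures global variation while the per-expert sampling captures local variation.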