FOAA: Flattened Outer Arithmetic Attention For Multimodal Tumor Classification
CoRR (2024)
Abstract
Fusion of multimodal healthcare data holds great promise to provide a
holistic view of a patient's health, taking advantage of the complementarity of
different modalities while leveraging their correlation. This paper proposes a
simple and effective approach, inspired by attention, to fuse discriminative
features from different modalities. We propose a novel attention mechanism,
called Flattened Outer Arithmetic Attention (FOAA), which relies on outer
arithmetic operators (addition, subtraction, product, and division) to compute
attention scores from keys, queries and values derived from flattened
embeddings of each modality. We demonstrate how FOAA can be implemented for
self-attention and cross-attention, providing a reusable component in neural
network architectures. We evaluate FOAA on two datasets for multimodal tumor
classification, achieving state-of-the-art results, and we demonstrate that
features enriched by FOAA are superior to those derived from other fusion
approaches. The code is publicly available at
https://github.com/omniaalwazzan/FOAA
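The core idea of combining queries and keys with outer arithmetic operators might be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the averaging of the four operator outputs into a single score matrix, the projection dimensions, and the epsilon-stabilised division are all assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def flatten_embed(x):
    """Flatten a modality's feature map into a 1-D embedding vector."""
    return np.asarray(x).reshape(-1)

def outer_arithmetic_scores(q, k, eps=1e-6):
    """Combine query and key vectors with the four outer arithmetic
    operators (addition, subtraction, product, division).
    Averaging the four score matrices is an assumption of this sketch."""
    oa = q[:, None] + k[None, :]            # outer addition
    osub = q[:, None] - k[None, :]          # outer subtraction
    op = q[:, None] * k[None, :]            # outer product
    od = q[:, None] / (k[None, :] + eps)    # outer division (stabilised)
    return (oa + osub + op + od) / 4.0

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def foaa_cross_attention(emb_a, emb_b, d=16):
    """Hypothetical cross-attention: queries from modality A,
    keys/values from modality B, scores from outer arithmetic."""
    Wq = rng.standard_normal((emb_a.size, d)) / np.sqrt(emb_a.size)
    Wk = rng.standard_normal((emb_b.size, d)) / np.sqrt(emb_b.size)
    Wv = rng.standard_normal((emb_b.size, d)) / np.sqrt(emb_b.size)
    q, k, v = emb_a @ Wq, emb_b @ Wk, emb_b @ Wv
    scores = outer_arithmetic_scores(q, k) / np.sqrt(d)
    attn = softmax(scores, axis=-1)         # rows sum to 1
    return attn @ v                         # fused representation, shape (d,)

# Two toy modality feature maps (e.g. image patches and clinical features)
img_feat = rng.standard_normal((4, 8))
tab_feat = rng.standard_normal((12,))
fused = foaa_cross_attention(flatten_embed(img_feat), flatten_embed(tab_feat))
print(fused.shape)  # (16,)
```

Self-attention would follow the same pattern with both embeddings coming from a single modality; the reusable piece is `outer_arithmetic_scores`, which replaces the usual dot-product score with the four outer operators on flattened embeddings.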