AdaptSFL: Adaptive Split Federated Learning in Resource-constrained Edge Networks
CoRR (2024)
Abstract
The increasing complexity of deep neural networks poses significant barriers
to democratizing them to resource-limited edge devices. To address this
challenge, split federated learning (SFL) has emerged as a promising solution
by offloading the primary training workload to a server via model partitioning
while enabling parallel training among edge devices. However, although system
optimization substantially influences the performance of SFL under
resource-constrained systems, the problem remains largely uncharted. In this
paper, we provide a convergence analysis of SFL which quantifies the impact of
model splitting (MS) and client-side model aggregation (MA) on the learning
performance, serving as a theoretical foundation. Then, we propose AdaptSFL, a
novel resource-adaptive SFL framework, to expedite SFL under
resource-constrained edge computing systems. Specifically, AdaptSFL adaptively
controls client-side MA and MS to balance communication-computing latency and
training convergence. Extensive simulations across various datasets validate
that our proposed AdaptSFL framework takes considerably less time to achieve a
target accuracy than benchmarks, demonstrating the effectiveness of the
proposed strategies.
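To make the model-splitting (MS) idea concrete, the following is a minimal sketch of one split-training round: a client computes activations up to a cut layer, the server finishes the forward and backward passes, and the gradient at the cut is returned so the client can update its own layers. The two-layer linear model, shapes, and learning rate are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative split-learning round: a two-layer linear model cut between
# the layers. All names and hyperparameters are assumptions for this sketch.
rng = np.random.default_rng(0)

# Toy regression data held by the client
X = rng.normal(size=(32, 4))
true_w = rng.normal(size=(4, 1))
y = X @ true_w

# Client-side model: layers up to the cut
Wc = rng.normal(scale=0.1, size=(4, 8))
# Server-side model: layers after the cut
Ws = rng.normal(scale=0.1, size=(8, 1))

lr = 0.05
losses = []
for step in range(200):
    # Client forward: "smashed data" (activations at the cut), sent to server
    smashed = X @ Wc
    # Server forward and loss
    pred = smashed @ Ws
    err = pred - y
    losses.append(float((err ** 2).mean()))
    # Server backward: gradients for its weights and for the smashed data
    g_pred = 2 * err / len(X)
    g_Ws = smashed.T @ g_pred
    g_smashed = g_pred @ Ws.T  # sent back to the client
    # Client backward: finish backprop through the client-side layers
    g_Wc = X.T @ g_smashed
    # Each side updates its own partition locally
    Ws -= lr * g_Ws
    Wc -= lr * g_Wc

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

In a full SFL system, many clients run this round in parallel and the client-side partitions are periodically aggregated (MA); the cut-layer choice trades client computation against the communication cost of exchanging the activations and their gradients.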