Model-Distributed Inference in Multi-Source Edge Networks

2023 IEEE International Conference on Acoustics, Speech, and Signal Processing Workshops (ICASSPW) (2023)

Abstract
Distributed inference techniques can be broadly classified into data-distributed and model-distributed schemes. In data-distributed inference (DDI), each worker carries the entire deep neural network (DNN) model but processes only a subset of the data. However, feeding the data to workers incurs high communication costs, especially when the data is large. An emerging paradigm is model-distributed inference (MDI), where each worker carries only a subset of the DNN layers. In MDI, a source device that holds the data processes a few layers of the DNN and sends the output to a neighboring device; this continues until all layers have been processed in a distributed manner. In this paper, we investigate MDI with multiple sources, i.e., when more than one device has data. We design a multi-source MDI (MS-MDI) scheme, which optimizes task scheduling decisions across multiple source devices and workers. Experimental results on a real-life testbed of NVIDIA Jetson TX2 edge devices show that MS-MDI significantly improves inference time compared to baselines.
Keywords
Model distribution, model distributed inference, deep neural network (DNN), distributed DNN