Increased Leverage of Transprecision Computing for Machine Vision Applications at the Edge

Journal of Signal Processing Systems (2022)

Abstract
The practical deployment of machine vision presents particular challenges for resource-constrained edge devices. With a clear need to execute multiple tasks with variable workloads, a robust approach is required that can dynamically adapt at runtime and maintain the maximum quality of service (QoS) within the available resource constraints. A lightweight approach is presented that monitors the runtime workload constraints and leverages accuracy-throughput trade-offs on a graphics processing unit (GPU). It includes optimisation techniques that identify the optimal configuration for each task in terms of accuracy, energy and memory, and manages the transparent switching between configurations. Using a neural network architecture search that statically generates a range of implementations targeting a resource-precision trade-off, we explore the detection of the optimal parameters for the required QoS under specific memory and energy constraints. For an accuracy loss of 1
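As a rough illustration of the runtime selection step described in the abstract (not the authors' implementation), the sketch below assumes each statically generated network variant has been annotated offline with accuracy, energy and memory estimates; a lightweight selector then picks the highest-accuracy variant that fits the current budgets, which is one simple way to realise the accuracy-resource trade-off. All names, fields and numbers are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional, Sequence


@dataclass(frozen=True)
class Config:
    """One statically generated network variant (hypothetical fields)."""
    name: str
    accuracy: float    # validation accuracy, 0..1
    energy_mj: float   # estimated energy per inference, millijoules
    memory_mb: float   # peak memory footprint, megabytes


def select_config(configs: Sequence[Config],
                  energy_budget_mj: float,
                  memory_budget_mb: float) -> Optional[Config]:
    """Pick the highest-accuracy variant that fits both runtime budgets."""
    feasible = [c for c in configs
                if c.energy_mj <= energy_budget_mj
                and c.memory_mb <= memory_budget_mb]
    return max(feasible, key=lambda c: c.accuracy, default=None)


if __name__ == "__main__":
    # Hypothetical precision variants produced ahead of time.
    variants = [
        Config("fp32", accuracy=0.920, energy_mj=42.0, memory_mb=220.0),
        Config("fp16", accuracy=0.915, energy_mj=24.0, memory_mb=120.0),
        Config("int8", accuracy=0.905, energy_mj=11.0, memory_mb=60.0),
    ]
    # Budgets would be tightened at runtime as the workload grows,
    # triggering a transparent switch to a cheaper configuration.
    chosen = select_config(variants, energy_budget_mj=30.0, memory_budget_mb=150.0)
    print(chosen.name if chosen else "no feasible configuration")
```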
Keywords
Edge Computing, Approximate Computing, Transprecision Computing, Machine Vision