Wearable Split Computing with DANCE:

Data Adaptive Neural Compression Engine

Paper: https://danjacobellis.net/_static/split.pdf

Code: https://github.com/danjacobellis/PAC


Abstract

Split computing is an approach to mobile, embedded, and wearable sensing in which a minimal amount of on-device computation transforms raw sensor data into compressed representations that can be efficiently transmitted for further processing in the cloud. The widening gap between foundation models and conventional perception approaches in terms of versatility, ease of deployment, and computational cost makes split computing more attractive than ever. However, standard lossy compression techniques have not kept pace with the number and fidelity of modern sensors, causing vast amounts of data to be stored indefinitely or discarded entirely. To address this, we propose DANCE (Data Adaptive Neural Compression Engine), a framework for building high-efficiency neural codecs specialized to a given dataset or sensor. For the 7-channel spatial audio and fisheye camera on the Project Aria smart glasses, split computing via DANCE outperforms conventional codecs like JPEG in terms of both compression ratio and machine perception quality, while offering more than 100x more efficient inference compared to existing neural audio and image compression methods.
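
The sketch below illustrates the split-computing pattern described above: a lightweight encoder runs on the device, a quantized latent is transmitted, and a heavier decoder runs in the cloud ahead of downstream perception. This is a minimal, hypothetical example, not the DANCE/PAC implementation; the layer sizes, channel counts, and 8-bit quantization scheme are placeholder assumptions.

```python
import torch
import torch.nn as nn

# Minimal illustrative sketch of a split-computing pipeline (not the DANCE/PAC code).
# A small on-device encoder compresses a raw frame, the quantized latent is sent
# over the network, and a cloud-side decoder reconstructs it for further processing.

class OnDeviceEncoder(nn.Module):
    """Small strided CNN intended to run on the wearable device (placeholder architecture)."""
    def __init__(self, in_channels=3, latent_channels=16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=4, stride=4),            # 4x downsampling
            nn.GELU(),
            nn.Conv2d(32, latent_channels, kernel_size=4, stride=4),        # 16x total downsampling
        )

    def forward(self, x):
        return self.net(x)

class CloudDecoder(nn.Module):
    """Heavier decoder that runs in the cloud before a downstream perception model."""
    def __init__(self, latent_channels=16, out_channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_channels, 64, kernel_size=4, stride=4),
            nn.GELU(),
            nn.ConvTranspose2d(64, out_channels, kernel_size=4, stride=4),
        )

    def forward(self, z):
        return self.net(z)

def quantize_for_transmission(z, bits=8):
    """Uniform scalar quantization of the latent to `bits` bits (placeholder scheme)."""
    scale = 2 ** bits - 1
    z_min, z_max = z.min(), z.max()
    q = torch.round((z - z_min) / (z_max - z_min + 1e-8) * scale).to(torch.uint8)
    return q, z_min, z_max

def dequantize(q, z_min, z_max, bits=8):
    scale = 2 ** bits - 1
    return q.float() / scale * (z_max - z_min) + z_min

if __name__ == "__main__":
    frame = torch.rand(1, 3, 256, 256)            # stand-in for a raw camera frame
    z = OnDeviceEncoder()(frame)                  # on-device: compress to a latent
    q, lo, hi = quantize_for_transmission(z)      # on-device: quantize the payload
    z_hat = dequantize(q, lo, hi)                 # cloud: dequantize
    recon = CloudDecoder()(z_hat)                 # cloud: decode for downstream perception
    print(f"payload: {q.numel()} bytes vs {frame.numel() * 4} bytes for the float32 frame")
```

In a real deployment the quantized latent would additionally be entropy coded before transmission, and the cloud side would typically feed the decoded representation to a perception model rather than reconstructing pixels; the example above only shows the division of compute between device and cloud.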