Network Offloading Policies for Cloud Robotics: a Learning-based Approach

Sandeep Chinchali∗, Apoorva Sharma‡, James Harrison§, Amine Elhafsi‡, Daniel Kang∗, Evgenya Pergament†, Eyal Cidon†, Sachin Katti∗†, Marco Pavone‡
Departments of Computer Science∗, Electrical Engineering†, Aeronautics and Astronautics‡, and Mechanical Engineering§
Stanford University, Stanford, CA
{csandeep, apoorva, jharrison, amine, ddkang, evgenyap, ecidon, skatti, pavone}@stanford.edu
[Fig. 1: Cloud robotics offloading architecture. Given sensory input, a mobile robot's offload logic chooses between local compute with an on-board robot model and offloading over a limited network to a cloud model backed by image and map databases ("Query the cloud for better accuracy?"), trading off latency vs. accuracy vs. power.]

Abstract—Today's robotic systems are increasingly turning to computationally expensive models such as deep neural networks (DNNs) for tasks like localization, perception, planning, and object detection. However, resource-constrained robots, like low-power drones, often have insufficient on-board compute resources or power reserves to scalably run the most accurate, state-of-the-art neural network compute models. Cloud robotics allows mobile robots the benefit of offloading compute to centralized servers if they are uncertain locally or want to run more accurate, compute-intensive models. However, cloud robotics comes with a key, often understated cost: communicating with the cloud over congested wireless networks may result in latency or loss of data. In fact, sending high data-rate video or LIDAR from multiple robots over congested networks can lead to prohibitive delay for
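For intuition, the sketch below illustrates the kind of decision the offload logic in Fig. 1 must make: query the cloud only when the on-board model is uncertain. This is a minimal, hypothetical confidence-threshold rule; the names (robot_model, cloud_model, offload_decision) and the threshold value are illustrative assumptions, not the paper's learned offloading policy.

```python
import numpy as np

# Hypothetical sketch: a naive confidence-threshold offload rule.
# The robot runs its cheap on-board model; if the prediction looks
# uncertain (low top-class probability), it pays the network cost
# to query the more accurate cloud model instead.

def offload_decision(local_probs: np.ndarray,
                     confidence_threshold: float = 0.8) -> bool:
    """Return True if this input should be offloaded to the cloud."""
    return float(np.max(local_probs)) < confidence_threshold

def classify(x, robot_model, cloud_model, confidence_threshold=0.8):
    probs = robot_model(x)            # fast, low-power on-board inference
    if offload_decision(probs, confidence_threshold):
        probs = cloud_model(x)        # accurate, but costs latency/bandwidth
    return int(np.argmax(probs))
```

A fixed threshold like this ignores network congestion and power budgets, which is precisely the latency vs. accuracy vs. power trade-off the paper's learning-based approach is meant to address.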