Robotics Development That Feels Like Modern Software Engineering

    Phillip Thomas
    Oct 16, 2025

    Overview

    Traditional robotics development is slow because your code lives far from your hardware. Every change means a new build, a redeploy, and a round of trial and error before you even see what broke. Remote development on make87 eliminates that loop — you code on the robot itself.

    With dev mode, your Git repo syncs directly into a running container on the robot. You can edit, rebuild, and debug live against real sensors and actuators — no image rebuilds, no SSH gymnastics, and no stale logs. Whether your robot sits in the lab or on another continent, it feels like developing locally.

    Table of Contents

    1. Why Iteration Is So Slow
    2. Remote Development on the Robot
    3. How This Fits With Existing Workflows
    4. Benefits for Developers
    5. Try It Yourself

    Introduction

    In Part 1 we showed a voice-controlled robot arm you can deploy in minutes. Fast deployment is only the first step — the next is making development itself feel like modern software engineering.

    With make87 you can run apps in dev mode directly on the robot: edit, debug, and test against real hardware in real time. Instead of a few painful iterations per day, you can make dozens.


    Why Iteration Is So Slow

    Every robotics developer has lived this loop:

    • Write code in sim or with logged data
    • Package and deploy to the robot
    • Watch it fail in some new, hardware-only way
    • Collect logs, guess, redeploy
    • Repeat until you’re out of time

    The result: only a handful of hardware iterations per day, even for small changes.


    Remote Development on the Robot

    On make87, every app can run in two ways:

    • Production mode: immutable, optimized container images built for deployment
    • Dev mode: a base image that pulls your Git repo into the container at startup, letting you work against real hardware without repackaging

    In dev mode you can:

    • Work directly on your Git branch inside the container
    • Edit, build, and run code as you would locally
    • Restart programs or rebuild/re-run compiled code instantly
    • Debug with breakpoints while receiving live data from sensors and other components (see the sketch below)
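
    That last point needs nothing more than a debug server running inside the dev container. Here is a minimal sketch, assuming a Python app and the debugpy package; the sensor loop and the port are illustrative placeholders, not part of make87:

        import time
        import debugpy

        # Expose a debug adapter inside the dev container. Attach to port 5678
        # from VS Code or any IDE that speaks the Debug Adapter Protocol.
        debugpy.listen(("0.0.0.0", 5678))
        print("Waiting for debugger to attach...")
        debugpy.wait_for_client()

        def read_lidar_scan():
            # Placeholder for your real sensor driver or subscription.
            return [1.0] * 360

        while True:
            scan = read_lidar_scan()
            # Set a breakpoint on the next line in your IDE to inspect live
            # scans while the rest of the system keeps running.
            nearest = min(scan)
            time.sleep(0.1)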

    Build Additions: Customize Your Dev Environment

    Dev mode starts with a base dev image. You can use one of our build kits as a base — or define your own if you have specific dependencies.

    On top of that, you can add build additions from our public repo. Each addition extends the container with tools for faster interaction:

    • Web IDE: run a VS Code server in the browser for quick edits and experiments
    • SSH server: connect from your own IDE with your own workflow and AI coding assistant
    • Custom additions: define your own if you need other tools or setup scripts

    Each addition also runs setup scripts (Git pull to /home/state/code, dependency install, etc.) so the container is ready to iterate immediately. Access is tunneled through the make87 platform — encrypted and restricted to your client.
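
    As a rough illustration, such a setup script only has to clone or update the repo and install dependencies. A sketch in Python, assuming the repo URL and branch arrive via environment variables; the variable names here are invented for the example, and only the /home/state/code path comes from above:

        import os
        import subprocess

        # Hypothetical environment variables, not official make87 settings.
        REPO = os.environ.get("DEV_REPO_URL", "git@github.com:you/robot-app.git")
        BRANCH = os.environ.get("DEV_BRANCH", "main")
        CODE_DIR = "/home/state/code"  # checkout location mentioned above

        def run(*cmd, cwd=None):
            subprocess.run(cmd, cwd=cwd, check=True)

        if os.path.isdir(os.path.join(CODE_DIR, ".git")):
            # Container restarted in dev mode: fast-forward the existing checkout.
            run("git", "fetch", "origin", BRANCH, cwd=CODE_DIR)
            run("git", "checkout", BRANCH, cwd=CODE_DIR)
            run("git", "pull", "--ff-only", "origin", BRANCH, cwd=CODE_DIR)
        else:
            run("git", "clone", "--branch", BRANCH, REPO, CODE_DIR)

        # Install project dependencies so the container is ready to iterate.
        requirements = os.path.join(CODE_DIR, "requirements.txt")
        if os.path.exists(requirements):
            run("pip", "install", "-r", requirements)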

    [Figure: Local IDE connected via SSH, debugging robot behavior in real time]


    How This Fits With Existing Workflows

    This doesn’t replace simulation or rosbag replay — both are essential. Simulation is for safe stress-testing, and rosbags for reproducible bugs and model training.

    Live development on the robot is another tool: ideal for prototypes, parameter tuning, and chasing real-world edge cases.

    Remote dev isn’t new — but in robotics it’s usually a mess:

    • Direct network or VPN hacks
    • Dependency pollution on the robot
    • Manual setup that destroys speed gains

    make87 removes that overhead. Any robot in your fleet can be reached securely over the internet. Dev environments spin up automatically, with logs and shell access always available — in both dev and prod mode.


    Benefits for Developers

    • Dozens of iterations per day instead of a few
    • Breakpoints with live data — debug perception, control, or agent logic against real hardware
    • Logs and container access — inspect logs from any app in dev or prod; errors persist across runs
    • Visualizers on demand — deploy custom visualizers like Rerun alongside your apps to see real-time interaction (sketched below)
    • Remote-ready — develop or troubleshoot on any robot in your fleet, anywhere with an internet connection
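
    To make the visualizer point concrete, streaming live data from an app into Rerun takes only a few lines. A minimal sketch, assuming the rerun-sdk and numpy Python packages; the entity path and the point source are illustrative only:

        import numpy as np
        import rerun as rr

        # Start a Rerun recording. spawn=True opens a local viewer; on a robot
        # you would instead stream to a viewer deployed alongside the app.
        rr.init("arm_debug", spawn=True)

        def get_point_cloud():
            # Placeholder for a real depth camera or lidar driver.
            return np.random.uniform(-1.0, 1.0, size=(1000, 3))

        for _ in range(100):
            rr.log("camera/points", rr.Points3D(get_point_cloud()))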

    Try It Yourself

    Deploy the Voice-Controlled Robot Arm template, then enable development builds in the options.

    Within minutes you’ll have:

    • A working voice-controlled robot system
    • Web IDE access for live edits and debugging
    • SSH access for your preferred local tools
    • Logs and direct container access for each app
    • Real hardware responding to your changes instantly — from anywhere

    Dev mode isn’t limited to this demo — you can enable it on any app you deploy.

