ROS2 Robot Platform

Rover, the AI Dog: a robot that can come when called

This is the build log of my Raspberry Pi 4 robot. It is a blend of ROS2, real sensors, and a control layer I can iterate on quickly. The goal is simple: call it from another room, watch it reason, and see it roll in like a good dog.

Demo video coming soon

“Rover, come.”

The demo I am working toward starts with a voice prompt from another room. The robot spins up, plots a path, shows its live reasoning, and then rolls straight to me. I want that moment to feel fast, clear, and a little bit magical.

Why I built this

Most robotics projects either stop at the sensor demo or vanish into perfect but non‑moving diagrams. This one is meant to be a story: real hardware, real tradeoffs, and a consistent push toward autonomy. RoboPi is where I test ideas around mapping, perception, and AI control without waiting for a lab.

Right now the fundamentals work: the LIDAR is live, the IMU is stable, and I can visualize and control everything from a browser. That gives me a clean baseline for experimenting with motion, mapping, and higher-level decision making.

What works today

Live Sensors

  • RPLIDAR publishes `/scan` at 10 Hz for 360° coverage.
  • BNO085 IMU publishes `/imu/data` at 20 Hz.
  • Optional camera stream via `usb_cam`.
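
To sanity-check those rates without opening Foxglove, a minimal rclpy node can count messages on both topics and log an approximate frequency. This is a throwaway sketch rather than part of the bring-up scripts; the topic names and message types match the sensor nodes above, but the node name and reporting interval are my own choices.

# rate_check.py - rough sanity check of /scan and /imu/data publish rates (sketch).
import time

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan, Imu


class RateCheck(Node):
    def __init__(self):
        super().__init__('rate_check')
        self.counts = {'/scan': 0, '/imu/data': 0}
        self.start = time.monotonic()
        self.create_subscription(LaserScan, '/scan', lambda m: self._tick('/scan'), 10)
        self.create_subscription(Imu, '/imu/data', lambda m: self._tick('/imu/data'), 50)
        self.create_timer(1.0, self._report)  # log once per second

    def _tick(self, topic):
        self.counts[topic] += 1

    def _report(self):
        elapsed = time.monotonic() - self.start
        for topic, n in self.counts.items():
            self.get_logger().info(f'{topic}: ~{n / elapsed:.1f} Hz')


rclpy.init()
rclpy.spin(RateCheck())

Run it on the Pi after bring-up; the logged rates should settle near 10 Hz for the LIDAR and 20 Hz for the IMU.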

Visualization + Control

  • Foxglove bridge for live 3D visualization.
  • FastAPI server for web-based control.
  • Scripted bring‑up, status, and shutdown.

How it is built

The ROS2 workspace contains the LIDAR driver, custom message definitions, and my bring‑up package. A separate robot-control service handles GPIO, exposes the API, and publishes commands back into ROS2. Foxglove rides on a WebSocket bridge so I can inspect live data without extra tooling.
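
To make that architecture concrete, here is the rough shape of the HTTP-to-ROS2 bridge: a FastAPI route that turns a JSON request into a geometry_msgs/Twist on `/cmd_vel`. It is a sketch under assumptions, not the actual service: the `/drive` route, the field names, and the single-node layout are hypothetical, and the real robot-control service also drives GPIO.

# control_api_sketch.py - illustrative HTTP -> ROS2 bridge; route and field names are hypothetical.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Drive(BaseModel):
    linear: float = 0.0   # forward velocity, m/s
    angular: float = 0.0  # yaw rate, rad/s


rclpy.init()
node = Node('web_control_bridge')
cmd_pub = node.create_publisher(Twist, '/cmd_vel', 10)


@app.post('/drive')
def drive(cmd: Drive):
    # Translate the HTTP body into a Twist and hand it to ROS2.
    msg = Twist()
    msg.linear.x = cmd.linear
    msg.angular.z = cmd.angular
    cmd_pub.publish(msg)
    return {'ok': True}

Served with uvicorn on port 8000 (the control API port listed below), a POST to /drive with a body like {"linear": 0.2} would nudge the robot forward once a motor driver is subscribed to `/cmd_vel`.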

Key Topics

  • `/scan` from the RPLIDAR driver.
  • `/imu/data` from the BNO085 node.
  • `/map` when `slam_toolbox` is enabled (see the map-inspection sketch below).
  • `/cmd_vel` for motion commands.
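
The `/map` sketch referenced above is a tiny subscriber that logs the size and resolution of each map update while `slam_toolbox` is running. Everything here is standard `nav_msgs/OccupancyGrid`; only the node name is mine, and with default QoS it will only see maps published after it starts.

# map_info.py - log the dimensions of each /map update from slam_toolbox (sketch).
import rclpy
from rclpy.node import Node
from nav_msgs.msg import OccupancyGrid


class MapInfo(Node):
    def __init__(self):
        super().__init__('map_info')
        self.create_subscription(OccupancyGrid, '/map', self.on_map, 1)

    def on_map(self, msg):
        # msg.info carries the grid metadata; the cell values live in msg.data.
        i = msg.info
        self.get_logger().info(f'map: {i.width} x {i.height} cells @ {i.resolution:.3f} m/cell')


rclpy.init()
rclpy.spin(MapInfo())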

Ports

  • 8765: Foxglove WebSocket
  • 8000: Robot Control API
  • 8001/8002: MCP endpoints

Quick start (the short version)

Start the stack

~/ros2_robot/scripts/start_robot.sh

Start mapping (IMU + SLAM)
~/ros2_robot/scripts/start_robot.sh --with-imu --with-slam

Status + shutdown

~/ros2_robot/scripts/status_robot.sh

~/ros2_robot/scripts/stop_robot.sh
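
When I want a check that is scriptable from Python rather than shelling out to the status script, the ROS2 graph can be queried directly. A small sketch, assuming the default bring-up; which topics you expect depends on the flags used (for example `/imu/data` only appears with `--with-imu`).

# topic_check.py - quick scriptable check that the expected topics are on the graph (sketch).
import time

import rclpy
from rclpy.node import Node

EXPECTED = ['/scan', '/imu/data']  # adjust to the bring-up flags in use

rclpy.init()
node = Node('topic_check')
time.sleep(2.0)  # give DDS discovery a moment to populate the graph
available = {name for name, _ in node.get_topic_names_and_types()}
for topic in EXPECTED:
    print(f"{topic}: {'OK' if topic in available else 'MISSING'}")
node.destroy_node()
rclpy.shutdown()

status_robot.sh stays the supported check; this is just the same idea exposed to Python.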

Foxglove

Connect to `ws://<pi_ip>:8765`, add a 3D panel, and subscribe to `/scan`.

Hardware snapshot

Compute

Raspberry Pi 4 Model B (4GB) running Ubuntu 22.04 aarch64 with ROS2 Humble.

Primary sensors

SLAMTEC RPLIDAR A2M12 over USB and an Adafruit BNO085 IMU on I2C6.

Expansion

Camera + motor hardware are staged for navigation and autonomy work.

What is next

Navigation

  • slam_toolbox integration for consistent maps.
  • Nav2 for goal-based autonomy.
  • Full TF tree + URDF model.
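
The TF and URDF work has not started, but the first brick is usually a static transform from the robot body to the LIDAR. A sketch with assumed frame names (`base_link`, `laser`) and a placeholder mounting offset; the real numbers belong in the URDF once it exists, published via robot_state_publisher.

# static_laser_tf.py - sketch of a static base_link -> laser transform.
# Frame names and the 8 cm offset are placeholders, not measured values.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from tf2_ros import StaticTransformBroadcaster


class LaserTf(Node):
    def __init__(self):
        super().__init__('laser_tf')
        self.broadcaster = StaticTransformBroadcaster(self)
        t = TransformStamped()
        t.header.stamp = self.get_clock().now().to_msg()
        t.header.frame_id = 'base_link'
        t.child_frame_id = 'laser'
        t.transform.translation.z = 0.08  # LIDAR mounted ~8 cm above base_link (placeholder)
        t.transform.rotation.w = 1.0      # identity rotation
        self.broadcaster.sendTransform(t)


rclpy.init()
rclpy.spin(LaserTf())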

Perception

  • Odometry via sensor fusion (LIDAR + IMU + camera).
  • Compressed camera transport.
  • Stabilize `usb_cam` pixel formats.

Control

  • Refine obstacle avoidance + safety timeouts.
  • Expose higher-level behaviors through MCP APIs.
  • Battery monitoring + status LEDs.
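
Of those, the safety timeout is the most concrete in my head: a watchdog that sits between `/cmd_vel` and the motor driver and forces a stop when commands go stale. A sketch of that shape; the `/cmd_vel_safe` output topic and the 0.5 s timeout are assumptions, not something that exists in the stack yet.

# cmd_vel_watchdog.py - deadman sketch: forward /cmd_vel, force a stop when commands go stale.
# The /cmd_vel_safe output topic and the timeout value are assumptions.
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import Twist

TIMEOUT_S = 0.5  # seconds without a fresh command before forcing a stop


class CmdVelWatchdog(Node):
    def __init__(self):
        super().__init__('cmd_vel_watchdog')
        self.last_cmd_time = self.get_clock().now()
        self.safe_pub = self.create_publisher(Twist, '/cmd_vel_safe', 10)
        self.create_subscription(Twist, '/cmd_vel', self.on_cmd, 10)
        self.create_timer(0.1, self.check)  # 10 Hz staleness check

    def on_cmd(self, msg):
        # Fresh command: forward it and reset the deadman timer.
        self.last_cmd_time = self.get_clock().now()
        self.safe_pub.publish(msg)

    def check(self):
        age_s = (self.get_clock().now() - self.last_cmd_time).nanoseconds / 1e9
        if age_s > TIMEOUT_S:
            self.safe_pub.publish(Twist())  # all-zero Twist = stop


rclpy.init()
rclpy.spin(CmdVelWatchdog())

The motor side would then listen on /cmd_vel_safe instead of /cmd_vel, so a crashed controller or a dropped Wi-Fi link degrades to a stationary robot.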

Docs last updated January 28, 2026.

Links

  • Code Repository
  • Docs Repository