Systems Architecture

Spookey's software is split across two processors, each handling what it does best. The ESP32 runs all real-time tasks: a PID balance loop that reads the BNO085 IMU and corrects posture continuously, plus data collection from the FSR pressure sensors (two per foot) that measure weight distribution during stance and gait. All of that sensor data funnels through the ESP32, which streams the processed state to the Raspberry Pi 5 over a UART serial link (TX/RX).
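
On the Pi side, reading that stream can be as simple as a line-oriented serial loop. The sketch below is a minimal illustration, assuming the ESP32 sends newline-terminated CSV frames; the port name, baud rate, and frame layout are assumptions, not the documented protocol.

    # Pi-side UART reader sketch. Assumes the ESP32 streams frames like
    # "pitch,roll,fsr0,fsr1,fsr2,fsr3\n"; port, baud, and format here are
    # illustrative, not the actual protocol.
    import serial

    link = serial.Serial("/dev/ttyAMA0", baudrate=115200, timeout=0.1)

    def read_state():
        """Return one parsed state frame, or None on a partial/garbled read."""
        line = link.readline().decode("ascii", errors="ignore").strip()
        fields = line.split(",")
        if len(fields) != 6:  # pitch + roll + four FSR readings
            return None
        pitch, roll, *fsr = (float(f) for f in fields)
        return {"pitch": pitch, "roll": roll, "fsr": fsr}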

The Raspberry Pi 5 handles higher-level logic. It receives the processed state from the ESP32 and runs the ROS 2 node network that coordinates everything else. Two LiDAR sensors, one in each ankle, connect directly to the Pi over USB and publish distance data to an obstacle detection node. If anything is closer than 30 cm, the node flags it and the system responds accordingly, whether that means pausing motion or rerouting the step.
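
A minimal version of that check could look like the node below. It assumes each LiDAR driver publishes sensor_msgs/Range on a per-ankle topic; the topic names and message type are guesses for illustration, not the package's actual interfaces.

    # Illustrative proximity check: warn when either ankle LiDAR reports
    # a reading closer than 30 cm. Topic names and message type assumed.
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Range
    from std_msgs.msg import Bool

    THRESHOLD_M = 0.30  # 30 cm proximity threshold

    class ObstacleCheck(Node):
        def __init__(self):
            super().__init__("obstacle_check")
            self.pub = self.create_publisher(Bool, "proximity_warning", 10)
            for topic in ("lidar_left", "lidar_right"):
                self.create_subscription(Range, topic, self.on_range, 10)

        def on_range(self, msg: Range):
            warn = Bool()
            warn.data = msg.range < THRESHOLD_M
            self.pub.publish(warn)

    def main():
        rclpy.init()
        rclpy.spin(ObstacleCheck())

    if __name__ == "__main__":
        main()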

ROS 2 Node Structure (src/robot_scripts)

The codebase lives inside a ROS 2 workspace organized under the robot_scripts package. Each node handles one job:

  • balance.py reads IMU feedback from the ESP32 and issues corrective servo commands to maintain upright posture

  • servo_mover.py translates target angles into PWM signals through the PCA9685 driver (see the sketch after this list)

  • walk_basic.py executes a simple forward step sequence, the foundation for more complex gait patterns

  • obstacle_node.py subscribes to LiDAR data from both ankles and publishes proximity warnings when an object crosses the 30 cm threshold

  • spukai.py connects SpukAI, Spookey's reasoning layer, to the rest of the control network

  • spukai_command_bridge.py parses JSON commands from SpukAI and routes them to the appropriate motion or balance node

  • spukai_cli.py provides a local terminal interface for manual testing and overrides

  • heartbeat.py monitors system activity and confirms all critical processes are running
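
As a sketch of the servo_mover.py idea, a node like the one below maps a target-angle message to PWM through Adafruit's ServoKit driver for the PCA9685. The topic name and two-element message layout are hypothetical, invented here for illustration.

    # servo_mover-style sketch: subscribe to [channel, angle] pairs and
    # drive the PCA9685 via ServoKit. Topic and message layout assumed.
    import rclpy
    from rclpy.node import Node
    from std_msgs.msg import Float32MultiArray
    from adafruit_servokit import ServoKit

    class ServoMover(Node):
        def __init__(self):
            super().__init__("servo_mover")
            self.kit = ServoKit(channels=16)  # 16-channel PCA9685 board
            self.create_subscription(Float32MultiArray, "servo_targets",
                                     self.on_target, 10)

        def on_target(self, msg: Float32MultiArray):
            channel, angle = int(msg.data[0]), float(msg.data[1])
            self.kit.servo[channel].angle = max(0.0, min(180.0, angle))

    def main():
        rclpy.init()
        rclpy.spin(ServoMover())

    if __name__ == "__main__":
        main()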

Nodes are launched through ROS 2 launch files that bring up the full stack as one synchronized system.
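
A launch file for that stack might look roughly like the sketch below; the executable names come from the node list above, but how the real launch files group things is an assumption.

    # Hypothetical launch file bringing up the core nodes together.
    from launch import LaunchDescription
    from launch_ros.actions import Node

    def generate_launch_description():
        return LaunchDescription([
            Node(package="robot_scripts", executable="balance"),
            Node(package="robot_scripts", executable="servo_mover"),
            Node(package="robot_scripts", executable="obstacle_node"),
            Node(package="robot_scripts", executable="spukai_command_bridge"),
        ])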

Simulation

Before testing anything on hardware, I prototype it in NVIDIA Isaac Sim. The workflow starts with Spookey's URDF, which I built and exported from Fusion 360, and uses it to simulate sensor behavior, joint response, and balance dynamics in a physics environment. This lets me validate node logic, tune PID parameters, and catch issues before a single servo moves on the real robot. Several adjustments that made it onto the physical build were first identified through sim, saving a lot of debugging time.
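
For the PID tuning step, the controller under test reduces to something like the snippet below; the gains and physics-step rate are placeholders, not Spookey's actual values.

    # Bare-bones PID controller of the kind tuned in sim before hardware.
    class PID:
        def __init__(self, kp: float, ki: float, kd: float):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = 0.0

        def update(self, error: float, dt: float) -> float:
            self.integral += error * dt
            derivative = (error - self.prev_error) / dt if dt > 0 else 0.0
            self.prev_error = error
            return (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)

    # e.g. correcting pitch toward upright (0 rad) each physics step:
    pitch_pid = PID(kp=4.0, ki=0.1, kd=0.3)                  # placeholder gains
    correction = pitch_pid.update(error=-0.02, dt=1 / 240)   # assumed 240 Hz step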

The next stage is Isaac Lab, NVIDIA's GPU-accelerated reinforcement learning framework built on top of Isaac Sim. The plan is to train two separate policies: one focused on dexterous hand control using the 8-DOF AmazingHand units, and one focused on stable bipedal locomotion, getting Spookey to walk forward reliably across different surface conditions. Both will run in parallelized simulation environments before anything is deployed to hardware.