Fjärrstridsgrupp Alfa
EDITION 2026-Q2 ACTIVE
UNCLASSIFIED
FSG-A // CLUSTER 2 — AUTONOMOUS // 2.6

UGV INTEGRATION
ROS2 GROUND VEHICLES IN LISA 26

KEY TAKEAWAY
UGVs (Unmanned Ground Vehicles) run ROS2 (Robot Operating System 2), not ArduPilot. Lisa 26 accepts ROS2 data through a MAVLink bridge — a small program that translates ROS2 topics into MAVLink messages that Lisa 26 understands. UGVs provide ground-level observation, logistics delivery, and CASEVAC in areas too dangerous for soldiers.


UGV software: ROS2 Humble on Ubuntu 22.04
Navigation: Nav2 (ROS2 navigation stack) — GPS-denied capable with SLAM
Lisa 26 bridge: MAVROS node — translates ROS2 topics → MAVLink UDP
Communication: LTE or WiFi mesh to GCS → Starlink → Lisa 26
AI: Jetson Orin Nano with YOLOv8 (same as the aerial drones)
Example platform: AgileX Scout Mini (€3,500) or custom tracked platform

Lisa 26 speaks MAVLink; ROS2 UGVs speak ROS2 topics. The bridge is MAVROS — a ROS2 node that translates between the two protocols (the ROS2 package is still named mavros). Install: sudo apt install ros-humble-mavros. Configure: point MAVROS at the Lisa 26 server's UDP port. The UGV's position (/odom topic), camera detections (/detections topic), and status (/battery_state topic) are then translated automatically into MAVLink messages that Lisa 26 understands. From Lisa 26's perspective the UGV looks like any other drone — it appears on the COP with a ground-vehicle icon instead of an air-vehicle icon.
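
The field scaling the bridge performs can be sketched in a few lines. This is an illustrative sketch, not the MAVROS implementation; the function name and its plain-argument interface are ours.

```python
# Sketch: scale a geodetic pose into GLOBAL_POSITION_INT wire-format integers.
# Illustrative only -- MAVROS does this internally; this helper is ours.

def pose_to_global_position_int(lat_deg, lon_deg, alt_m, heading_deg):
    """Convert floating-point pose values to MAVLink integer fields."""
    return {
        "lat": int(lat_deg * 1e7),      # degrees * 1e7 (int32)
        "lon": int(lon_deg * 1e7),      # degrees * 1e7 (int32)
        "alt": int(alt_m * 1000),       # millimetres (int32)
        "hdg": int(heading_deg * 100),  # centidegrees (uint16)
    }

fields = pose_to_global_position_int(58.5, 15.25, 45.5, 271.25)
# fields["lat"] is 585000000, fields["hdg"] is 27125
```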

GPS-Denied Ground Navigation

UGVs running ROS2 Nav2 can navigate without GPS using LiDAR SLAM (Simultaneous Localization and Mapping). A 2D LiDAR scanner (RPLiDAR A1, €90) mounted on the UGV builds a map of walls, trees, and obstacles while simultaneously tracking the vehicle's position on that map. Combined with wheel odometry (measuring wheel rotations), the UGV maintains centimeter-level position accuracy in structured environments (buildings, forests with distinct features). In open terrain with few features, accuracy degrades to 1-5m over long distances.
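
Wheel odometry itself is simple dead reckoning. The sketch below shows one differential-drive odometry update; the encoder resolution, wheel radius, and track width are example values, not Scout Mini specifications.

```python
import math

# Illustrative differential-drive dead reckoning from wheel encoder ticks.
# All parameters below are example values, not platform specifications.
TICKS_PER_REV = 1024      # encoder ticks per wheel revolution (assumed)
WHEEL_RADIUS_M = 0.08     # assumed
TRACK_WIDTH_M = 0.45      # distance between left/right wheels (assumed)

def odometry_step(x, y, theta, left_ticks, right_ticks):
    """Advance pose (x, y, heading) from one pair of encoder samples."""
    per_tick = 2 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    d_left = left_ticks * per_tick
    d_right = right_ticks * per_tick
    d_center = (d_left + d_right) / 2            # distance travelled
    d_theta = (d_right - d_left) / TRACK_WIDTH_M  # heading change (rad)
    x += d_center * math.cos(theta + d_theta / 2)
    y += d_center * math.sin(theta + d_theta / 2)
    return x, y, theta + d_theta
```

Errors accumulate with distance, which is why Nav2 fuses this with LiDAR SLAM corrections rather than trusting odometry alone.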

Example UGV Platform

UGV — MINIMUM VIABLE CONFIGURATION

Chassis: AgileX Scout Mini (4WD, 10 kg payload) — €3,500
Computer: Jetson Orin Nano Super — €230
LiDAR: RPLiDAR A1 (2D, 12 m range, SLAM) — €90
Camera: Intel RealSense D435i (depth + RGB) — €300
MANET radio: Silvus SL5200 — contact for pricing
Battery: 24V 20Ah LiFePO4 (included with chassis)
Runtime: 3 hours at walking speed (8 km/h)
Total (excl. MANET): ~€4,120

ROS2 Humble installation: sudo apt install ros-humble-desktop ros-humble-navigation2 ros-humble-slam-toolbox ros-humble-mavros. Full SLAM navigation works out of the box with RPLiDAR A1. No GPS required — LiDAR SLAM provides centimeter-level accuracy in structured environments (buildings, forest with distinct features).


External source: "Obemannat markfordon" (Unmanned ground vehicle) – Wikipedia

ROS2 Navigation Stack — From LiDAR to Autonomous Driving

ROS2 Nav2 provides the complete autonomous navigation pipeline: LiDAR point cloud processing (obstacle detection within 30 meters in all directions), SLAM mapping (building a 2D occupancy grid of the environment in real time), path planning (A* or Dijkstra algorithm finding the shortest collision-free route), and trajectory tracking (PID control keeping the UGV on the planned path within 10 cm accuracy on flat terrain). The entire stack runs on an onboard computer (Jetson Orin Nano or Intel NUC) processing at 10 Hz update rate.
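
The path-planning stage can be illustrated with a minimal A* search over a toy occupancy grid. This is a sketch of the algorithm family Nav2's planners use, not the Nav2 API; the grid, start, and goal are illustrative.

```python
import heapq

# Minimal 4-connected A* over a 2D occupancy grid (0 = free, 1 = obstacle).
# Sketch of the path-planning stage only -- not the Nav2 planner interface.

def astar(grid, start, goal):
    """Return the shortest collision-free cell path, or None."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    open_set = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while open_set:
        _, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(
                    open_set, (g + 1 + h(nxt), g + 1, nxt, path + [nxt])
                )
    return None  # no collision-free route exists
```

In Nav2 the occupancy grid comes from the SLAM costmap and the planner runs continuously as the map updates; the search itself is the same idea.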

Obstacle avoidance is reactive: if a new obstacle appears in the planned path (fallen tree, crater from artillery, person crossing the route), the local planner generates an avoidance trajectory within 200 milliseconds. The UGV slows, diverts around the obstacle, and resumes the original path. This works for static and slow-moving obstacles. Fast-moving threats (incoming vehicles, running soldiers) require higher processing rates that stress the onboard computer — current limitation is reliable avoidance up to 3 m/s closing speed at 5 km/h UGV travel speed.
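
The decision logic can be sketched as a time-to-collision check. The 200 ms replanning latency comes from the text above; the braking deceleration and the 2x margin are assumptions for illustration, not measured values.

```python
# Illustrative reactive safety check for one obstacle detection.
# REPLAN_LATENCY_S is from the text; the other constants are assumptions.
REPLAN_LATENCY_S = 0.2    # local planner reaction time (from text)
BRAKE_DECEL_MPS2 = 2.0    # assumed braking deceleration

def reactive_decision(obstacle_range_m, closing_speed_mps):
    """Return 'continue', 'divert', or 'stop' for one detection."""
    if closing_speed_mps <= 0:
        return "continue"                  # obstacle not closing
    ttc = obstacle_range_m / closing_speed_mps
    stop_time = REPLAN_LATENCY_S + closing_speed_mps / BRAKE_DECEL_MPS2
    if ttc > 2 * stop_time:
        return "divert"                    # margin to plan around it
    return "stop"                          # brake, replan while halted
```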

The UGV appears on the Lisa 26 COP as a blue ground vehicle icon. Position updates transmit via MAVLink GLOBAL_POSITION_INT at 1 Hz. Lisa 26 L2 can assign waypoint missions to the UGV: "deliver cargo to grid PA 2345 6789 via route avoiding known minefields." The UGV receives the waypoint via MAVLink MISSION_ITEM_INT and executes autonomously. During transit, Lisa 26 monitors progress and reroutes if new threats appear on the planned path.

The integration uses the same MAVLink protocol that drones use — no separate communication system or interface needed. The Silvus MANET radio on the UGV joins the same mesh network as airborne drones. This means the UGV shares bandwidth and routing with the rest of the drone fleet. At brigade scale with 50+ drones and 5-10 UGVs, bandwidth allocation becomes important — UGV telemetry at 1 Hz consumes negligible bandwidth (200 bytes/second) compared to drone video streams (160 kbit/s per drone).
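
The bandwidth claim is easy to check with the figures above (160 kbit/s per drone video stream, roughly 200 bytes/s per UGV):

```python
# Back-of-envelope mesh load at brigade scale, using the figures from the
# text: 160 kbit/s per drone video stream, ~200 bytes/s per UGV telemetry.
DRONE_VIDEO_KBITPS = 160
UGV_TELEMETRY_BPS = 200  # bytes/second

def mesh_load_kbitps(n_drones, n_ugvs):
    """Aggregate offered load in kbit/s (drone video + UGV telemetry)."""
    video = n_drones * DRONE_VIDEO_KBITPS
    telemetry = n_ugvs * UGV_TELEMETRY_BPS * 8 / 1000  # bytes/s -> kbit/s
    return video + telemetry

# 50 drones + 10 UGVs: the UGVs add 16 kbit/s on top of 8,000 kbit/s of video
```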

Terrain Limitations and Workarounds

Wheeled UGVs (Clearpath Husky) fail in mud deeper than 15 cm, slopes above 20 degrees, and dense vegetation without paths. Tracked UGVs (Milrem THeMIS) handle all of these but cost 10 times more. The practical workaround: use wheeled UGVs on established paths (roads, trails, cleared corridors) and accept that off-path delivery requires either tracked platforms or human carriers. Lisa 26 route planning accounts for trafficability — the DEM-derived slope analysis identifies passable routes for wheeled and tracked vehicles separately, ensuring the UGV is never assigned a route it cannot physically traverse.
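
The slope gating can be sketched as a per-cell check against the DEM. The 20-degree wheeled limit is from the text; the grid cell size and the 35-degree tracked limit are assumptions for illustration.

```python
import math

# Sketch of DEM-derived slope gating for trafficability analysis.
# WHEELED_MAX_DEG is from the text; the other constants are assumptions.
CELL_SIZE_M = 10.0        # DEM grid spacing (assumed)
WHEELED_MAX_DEG = 20.0    # from text
TRACKED_MAX_DEG = 35.0    # assumed

def passable(dem, r, c, max_slope_deg):
    """True if the steepest slope from cell (r, c) to any 4-neighbour
    stays within the platform's slope limit."""
    worst = 0.0
    for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
        if 0 <= nr < len(dem) and 0 <= nc < len(dem[0]):
            rise = abs(dem[nr][nc] - dem[r][c])
            worst = max(worst, math.degrees(math.atan2(rise, CELL_SIZE_M)))
    return worst <= max_slope_deg
```

Running this with both limits yields separate passability masks for wheeled and tracked platforms, which is what route planning needs to avoid assigning a wheeled UGV an impassable route.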

Implementation

# UGV Integration — ROS2 to Lisa 26 via MAVLink Bridge
# pip install pymavlink

import time

class UGVBridge:
    """Bridge ROS2 UGV navigation to Lisa 26 COP."""
    
    def __init__(self, ros2_node, mavlink_conn):
        self.ros2 = ros2_node
        self.mav = mavlink_conn
        self.payload_kg = 0
        self.battery_pct = 100
    
    def send_position_to_lisa26(self):
        """Send UGV position as ground vehicle on Lisa 26 COP."""
        pose = self.ros2.get_current_pose()
        
        self.mav.mav.global_position_int_send(
            int(time.monotonic() * 1000) & 0xFFFFFFFF,  # time_boot_ms (uint32, wraps)
            int(pose.latitude * 1e7),
            int(pose.longitude * 1e7),
            int(pose.altitude * 1000),
            0,  # relative altitude
            int(pose.vx * 100), int(pose.vy * 100), int(pose.vz * 100),
            int(pose.heading * 100)
        )
    
    def receive_waypoint_from_lisa26(self):
        """Lisa 26 sends delivery waypoint to UGV."""
        msg = self.mav.recv_match(type='MISSION_ITEM_INT', blocking=True, timeout=5)
        if msg:
            # MISSION_ITEM_INT encodes lat/lon as int32 degrees * 1e7
            return {"lat": msg.x / 1e7, "lon": msg.y / 1e7, "action": "deliver"}
        return None
    
    def navigate_to(self, lat, lon):
        """ROS2 Nav2 autonomous navigation with obstacle avoidance."""
        goal = self.ros2.create_nav_goal(lat, lon)
        self.ros2.navigate(goal)  # Uses LiDAR SLAM + obstacle avoidance
        
        while not self.ros2.goal_reached():
            self.send_position_to_lisa26()
            time.sleep(1)

# Lisa 26 sees UGV as blue ground icon on COP
# Commander can assign delivery missions from tablet

Sources

See the categorized source sections earlier on this page for specific citations supporting each claim. Cross-referenced technical baselines: ArduPilot developer documentation; ExpressLRS hardware documentation; NATO STANAG 4609 Ed. 4 (motion imagery metadata), 4671 (UAV airworthiness), 2022 (intelligence evaluation); Watling & Reynolds, "Meatgrinder: Russian Tactics in the Second Year of Its Invasion of Ukraine", RUSI (2023); ISW daily campaign assessments at understandingwar.org (archive). FSG-A has no operational experience of its own.