In this thesis we address the problem of multi-robot localization with a heterogeneous team of low-cost mobile robots. The team consists of a single centralized “observer” equipped with an inertial measurement unit (IMU) and a monocular camera, and multiple “picket” robots carrying only IMUs and red-green-blue (RGB) light-emitting diodes (LEDs). The team cooperatively navigates visually featureless environments such as collapsed buildings. Camera imagery captured by the observer and IMU measurements from both the pickets and the observer are fused to estimate the motion of the team. A team movement strategy, referred to as inchworm, is formulated as follows: pickets move ahead of the observer and then act as temporary landmarks for the observer to follow. This cooperative approach employs a single extended Kalman filter (EKF) to localize the entire heterogeneous multi-robot team, using a formulation of the measurement Jacobian that relates the pose of the observer to the poses of the pickets in the global reference frame. An initial experiment with the inchworm strategy achieved localization with 0.14 m position error and 2.18 degree orientation error over a path length of 5 m in an environment with irregular ground, partial occlusions, and a ramp. This is an improvement over a camera-only relative pose measurement localization technique, adapted to our team dynamic, which produced 0.18 m position error and 3.12 degree orientation error on the same dataset.
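To illustrate the kind of measurement model and Jacobian the abstract refers to, the following is a minimal planar (2D) sketch, not the thesis's actual 6-DOF filter. The state stacks the observer pose (x, y, heading) with one picket position; the camera measurement is assumed to be the picket's position expressed in the observer's body frame, and all names and noise values are illustrative.

```python
import numpy as np

def h(x):
    """Predicted measurement: picket position in the observer's body frame."""
    xo, yo, th, xp, yp = x
    c, s = np.cos(th), np.sin(th)
    Rt = np.array([[c, s], [-s, c]])          # rotation from global to body frame
    return Rt @ np.array([xp - xo, yp - yo])

def H(x):
    """Measurement Jacobian relating observer pose and picket position (2x5)."""
    xo, yo, th, xp, yp = x
    c, s = np.cos(th), np.sin(th)
    Rt = np.array([[c, s], [-s, c]])
    dRt = np.array([[-s, c], [-c, -s]])       # d(Rt)/d(theta)
    d = np.array([xp - xo, yp - yo])
    J = np.zeros((2, 5))
    J[:, 0:2] = -Rt                           # sensitivity to observer position
    J[:, 2] = dRt @ d                         # sensitivity to observer heading
    J[:, 3:5] = Rt                            # sensitivity to picket position
    return J

def ekf_update(x, P, z, Rn):
    """Standard EKF correction step with measurement noise covariance Rn."""
    y = z - h(x)                              # innovation
    Hx = H(x)
    S = Hx @ P @ Hx.T + Rn                    # innovation covariance
    K = P @ Hx.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ Hx) @ P
    return x_new, P_new
```

In the full filter the same pattern extends to 3D poses and multiple pickets: each camera observation of a picket's LEDs contributes one such measurement block, coupling the observer's pose to that picket's pose in the joint covariance.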