Someday, soldiers fighting in the streets of a sprawling megacity will need an airdrop of ammunition, food, or water that can’t be safely delivered by ground convoy or helicopter. But the supplies parachuting from the sky won’t have to rely on GPS signals, which suffer from inaccuracy in cluttered city environments and can be disrupted by enemies. The military has been testing new supply airdrops that automatically aim for a precise landing based on images of the target area.
Recent tests of the Army’s Joint Precision Airdrop System (JPADS) have been trying new navigational software—developed by the Draper Laboratory in Cambridge, Mass., and other companies—to achieve GPS-style accuracy with images alone. The software figures out its current location by comparing ground terrain features, such as trees or buildings seen by onboard cameras, with the latest satellite or drone images of the target area in its database. That allows the software to accurately guide the descent of the parafoil-equipped cargo as it glides toward the ground. It’s all part of a broader effort by the military to test computer-driven versions of old-fashioned navigation by sight.
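As a rough illustration of the matching step, the localization problem can be framed as finding where the onboard camera’s view best fits inside a georeferenced satellite image. The sketch below uses zero-mean normalized cross-correlation over a brute-force search; the function names, the flat pixel-grid model, and the ground-sample distance are assumptions for illustration, not Draper’s actual “Lost Robot” algorithm, which is not public.

```python
import numpy as np

def locate(camera_img: np.ndarray, sat_map: np.ndarray) -> tuple[int, int]:
    """Return the (row, col) offset in sat_map where camera_img best matches,
    scored by zero-mean normalized cross-correlation (a simple stand-in for
    the terrain-feature matching a real system would use)."""
    ch, cw = camera_img.shape
    mh, mw = sat_map.shape
    patch = camera_img - camera_img.mean()
    pnorm = np.linalg.norm(patch)
    best_score, best_rc = -np.inf, (0, 0)
    for r in range(mh - ch + 1):            # brute-force scan of the map
        for c in range(mw - cw + 1):
            window = sat_map[r:r + ch, c:c + cw]
            w = window - window.mean()
            denom = pnorm * np.linalg.norm(w)
            score = float((patch * w).sum() / denom) if denom > 0 else 0.0
            if score > best_score:
                best_score, best_rc = score, (r, c)
    return best_rc

def offset_to_metres(rc: tuple[int, int], m_per_px: float = 0.5):
    """Convert the best-match pixel offset to metres in the map frame.
    m_per_px is the satellite image's ground-sample distance (value assumed)."""
    return rc[0] * m_per_px, rc[1] * m_per_px
```

A production system would match sparse features rather than raw pixels and fuse the fix with inertial measurements, but the principle is the same: the map itself, loaded before takeoff, replaces the GPS signal.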
Moving away from the modern military’s reliance on GPS has big advantages. Anyone who has tried using GPS directions on their smartphone while walking or driving in a city knows how GPS accuracy can suffer at times. The current reliance on GPS-guided airdrops could prove challenging for troops who will inevitably find themselves patrolling or fighting within huge cities in the future. Enemy jamming of GPS signals or possibly even direct attacks on the satellites forming the GPS constellation could also deny crucial positional information.
That’s why the JPADS testing is just one possible use of vision-based navigation for the military. Similar systems could guide the descent of paratroopers jumping out of aircraft, robotic drones flying surveillance or strike missions, military aircraft piloted by humans, or possibly even vehicles on the ground.
“This camera-based navigation can conceivably be extended to any platform that has a need to know where it is, whether it be an autonomous vehicle, manned aircraft, unmanned aircraft or something else,” said Chris Bessette, JPADS program manager at Draper Laboratory.
The JPADS program has been testing Draper Laboratory’s vision-based software called “Lost Robot.” For the latest JPADS tests, commercial off-the-shelf cameras attached to the airdropped cargo provide images of the ground below. The software compares what the camera sees with the latest satellite images of the ground target area—satellite images that can be loaded into the JPADS database just before the mission takes off. That means the vision-based JPADS can function as “drop and forget” cargo delivery that automatically steers itself to the target without requiring outside signals or information.
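Once the software has a position fix, steering the parafoil toward the target is a separate guidance step. The function below is a minimal proportional-steering sketch in a local east/north frame; the name, sign conventions, and gain are illustrative assumptions, not the JPADS guidance law, which must also manage wind drift and altitude-versus-distance energy.

```python
import math

def steering_command(pos_en: tuple[float, float],
                     target_en: tuple[float, float],
                     heading_rad: float,
                     max_deflection: float = 1.0) -> float:
    """Proportional steering toward the bearing of the target.
    pos_en/target_en are (east, north) metres in a local frame;
    heading_rad is 0 at north, positive clockwise.
    Positive output = deflect the right steering toggle (assumed convention)."""
    de = target_en[0] - pos_en[0]
    dn = target_en[1] - pos_en[1]
    bearing = math.atan2(de, dn)  # bearing to target, 0 = north
    # wrap heading error into [-pi, pi] so the parafoil takes the short turn
    err = (bearing - heading_rad + math.pi) % (2 * math.pi) - math.pi
    gain = 0.5  # illustrative proportional gain
    return max(-max_deflection, min(max_deflection, gain * err))
```

For example, a load heading due north with its target due east gets a positive (right-turn) command, and a load already pointed at the target gets none.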
The Army wants to see whether the vision-based version of JPADS can match the accuracy of the current GPS-guided system. One current GPS-guided system can deliver 2,000 pounds of airdropped cargo to a target area with accuracy significantly better than its 150-meter threshold. A larger version of JPADS is required to deliver 10,000 pounds of cargo within 250 meters of its target.
Eventually, the Army may consider infrared cameras or other video equipment to overcome the system’s current sight limitations. Higher-end sensors beyond off-the-shelf cameras could also improve the software’s performance by capturing more detailed image data of the ground. But the JPADS team wants to keep added costs relatively low so that supply airdrops remain cost-effective in the future.