C4ISR and the big picture

On a per-line-item basis, year after year, the majority of the DoD’s technology budget has gone into digitizing the battlefield. As soon as a weapon or a platform became “smart,” planners wanted it interconnected with everything else. To this day, programs that help digitize the battlefield get preferential funding treatment – especially if they show immediate value in saving warfighters’ lives or radically improving current mission outcomes. Everything has computers, from the ordnance itself to WWII-era towed howitzers. Data is routed around the battlefield and beamed to TOCs or back to CONUS, then stored on huge disks or held in real-time local memory.

So we’ve got “data,” but what the heck do we do with it all?

On one hand, we have the digital capabilities that technology offers. On the other, there are warfighting problems that need solutions. When battlefield data is used to solve problems and accentuate capabilities, everyone wins. Companies like GE Intelligent Platforms have recently been very successful winning C4ISR programs that marry small form factor systems to data to provide surveillance and reconnaissance information. The M1A2 Abrams Evolutionary Design (AED) tank program awarded just shy of $1.0 million to GE to prove that Ethernet switch products and SBCs could connect new vetronics with the Commander’s Independent Thermal Viewer (CITV) and provide previously unavailable “sit rep” info using video and graphics overlays.

Not only that, at this year’s Association of the U.S. Army (AUSA) meeting, GE demonstrated a 360-degree head-in capability that allows tankers to “see through” a vehicle’s hull using a helmet-mounted display and external sensors. This is similar to the previous Eagle Vision program demonstrated on the Bradley Fighting Vehicle by SBS (now GE), Sarnoff, and BAE. (It’s also similar to the USAF’s vision of the “glass canopy,” where pilots can “look” down through an aircraft’s floor and “see” targets or threats below.)

The GE system concept has evolved into the Distributed Aperture program, according to Frank Willis, director of product management and marketing at GE Intelligent Platforms. Willis says that digital battlefield data collected at an “asset source” becomes locally consolidated by small form factor computers – such as VPX- and PC/104-based systems – and is then “communicated across the battlefield.” The key is local data consolidation and decision making to avoid fire-hosing information onto low-bandwidth nets. This way, the data gets used by the asset that needs it most before being passed along.
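The consolidate-before-transmit idea Willis describes can be sketched in a few lines. This is an illustrative toy, not GE’s actual software: the record fields, threshold, and function names are all invented. The point is that an edge node forwards high-priority detections verbatim and compresses the rest into a summary, so the low-bandwidth net never sees the raw firehose.

```python
from statistics import mean

def consolidate(readings, threshold=0.8):
    """Keep high-confidence detections verbatim; summarize routine ones."""
    priority = [r for r in readings if r["confidence"] >= threshold]
    routine = [r for r in readings if r["confidence"] < threshold]
    summary = {
        "count": len(routine),
        "avg_confidence": mean(r["confidence"] for r in routine) if routine else 0.0,
    }
    # Only priority records plus one small summary cross the net,
    # instead of every raw reading.
    return {"priority": priority, "summary": summary}

readings = [
    {"id": 1, "confidence": 0.95},  # likely threat: passed through whole
    {"id": 2, "confidence": 0.30},  # routine: folded into the summary
    {"id": 3, "confidence": 0.25},
]
packet = consolidate(readings)
```

Here three raw readings collapse into one full record plus a two-item summary; the decision about what matters is made at the asset, before anything is transmitted.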

According to Larry Schaffer, VP of business development for applied image processing at GE, his company plans to capitalize on turning data into situational awareness using its 3U VPX/REDI-sized IPS5000 as an Army Appliqué-like upgrade that ties into fire control and on-vehicle sensors, and also aggregates heretofore unused “metadata.” Vetronic platform targets are the Abrams, Bradley, Stryker, and the UK’s Warrior. A smaller PC/104-based IPS500 uses even more COTS content, with sights set on Humvees, MRAPs, and the Joint Light Tactical Vehicle (JLTV) – a possible HMMWV replacement. Both GE systems contain video engines (input), visualization processors (rendering, post-processing), and storage – along with optional graphics processors for additional displays.

It’s easy to see how data from new sensors can either feed into these types of GE systems or be sent to other assets to paint a “first person” view of the battlefield. For example, DRS Technologies was recently awarded a $1.9 billion IDIQ contract from U.S. Army CECOM for Driver’s Vision Enhancers (DVEs) that allow “seeing” through fog, smoke, and other battlefield obstacles on M1A2s, M2s, HMMWVs, and other ground vehicles. There’s no reason DVE can’t also send this data to unmanned aircraft systems (UASs) to augment a platform’s local view and equally enhance situational awareness.

Along the same theme, General Dynamics C4 Systems (Scottsdale) recently delivered the first Prophet Enhanced tactical signals intelligence system to the U.S. Army. Here too, data from battlefield sensors is aggregated to present a battlefield threat picture. The networked ISR system, integrated into a BAE Panther rapid-deployment C2 vehicle, takes sensor data and allows tactical commanders to “see,” “hear,” and “respond” to battlespace data. According to Jane’s Defence Weekly, the program is a follow-on to a fielded and secretive SIGINT program.

But using all this networked digital data doesn’t depend only on hardware; software and data mining play big roles in reconstituting ISR information into new pictures. McObject, a COTS data management software company, has improved its eXtremeDB database product to parse and organize in-memory data in real time, complete with hooks to managed environments such as Microsoft’s .NET. In-memory data mining allows real-time, microsecond information retrieval in deployed, harsh-environment, flash-based computers. In addition, semantic fusion technology – an offshoot of artificial intelligence – coupled with natural language processing can combine battlefield data with human-created text such as sit rep reports to paint a totally different picture of the battlefield.
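Why does keeping the database in memory buy microsecond retrieval? A minimal sketch makes the idea concrete – to be clear, this is not eXtremeDB’s API, just the underlying principle: when both the records and their index live in RAM, a lookup is a hash probe rather than a disk seek. The class and field names below are invented for illustration.

```python
class InMemoryStore:
    """Toy in-memory table with a hash index -- the principle behind
    in-memory databases, not any vendor's actual interface."""

    def __init__(self):
        self._rows = []       # records live entirely in RAM
        self._by_track = {}   # hash index: track_id -> row offset

    def insert(self, track_id, record):
        self._by_track[track_id] = len(self._rows)
        self._rows.append(record)

    def lookup(self, track_id):
        # O(1) probe with no I/O on the retrieval path
        return self._rows[self._by_track[track_id]]

store = InMemoryStore()
store.insert("T-17", {"sensor": "CITV", "bearing": 42})
record = store.lookup("T-17")
```

In a deployed flash-based system the same structure persists to flash for durability, but the hot retrieval path never leaves memory.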

The company Modus Operandi recently won a DARPA contract designed to exploit previously stovepiped information, such as Word or HTML documents. By overlaying sensor data with human text documents – such as “… at 20:00 hours there was single vehicle activity at the Northern border crossing …” – commanders will get a different view of the battlefield to aid in decision making. In this example, that single vehicle might’ve been previously tracked by a Predator, lost once it left the Area of Operations (AO), but now “found” again when it crossed a border checkpoint. By combining disparate databases, new intelligence is gained. This is just another example of how data previously streamed onto the digital battlefield but left unused is being exploited using sophisticated COTS hardware and software.
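The border-crossing example above boils down to a join between two databases that never used to talk to each other: a sensor track table and facts extracted from free text. The sketch below shows that correlation step under invented field names and data – it assumes an upstream NLP stage has already turned the sit rep sentence into a structured fact, which is the hard part in practice.

```python
from datetime import datetime, timedelta

# A Predator track that went stale when the vehicle left the AO.
sensor_tracks = [
    {"track_id": "V-8", "last_seen": datetime(2011, 5, 1, 18, 40),
     "location": "northern border crossing", "status": "lost"},
]

# Hypothetical fact an NLP stage extracted from "... at 20:00 hours there
# was single vehicle activity at the Northern border crossing ..."
report_fact = {"when": datetime(2011, 5, 1, 20, 0),
               "location": "northern border crossing",
               "event": "single vehicle activity"}

def correlate(tracks, fact, window=timedelta(hours=3)):
    """Re-associate a lost sensor track with a text-derived sighting
    that matches on location and falls inside a time window."""
    for t in tracks:
        if (t["status"] == "lost"
                and t["location"] == fact["location"]
                and timedelta(0) <= fact["when"] - t["last_seen"] <= window):
            return {**t, "status": "re-acquired", "via": fact["event"]}
    return None

match = correlate(sensor_tracks, report_fact)
```

The track lost at 18:40 matches the 20:00 report on location and time, so the vehicle is “found” again – new intelligence produced purely by joining data that already existed.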