Military Embedded Systems

HD streaming video in wide-area surveillance systems: A reconfigurable hardware processor approach


August 26, 2010

Adam Smith

Alpha Data, Inc.


The demand for higher-quality video imagery in surveillance systems requires an advanced link between the sensors and the usable video data. An image processing system built exclusively on software processing cannot meet these demands, and an ASIC-based system does not provide the needed flexibility. Because of the compute-intensive nature of pixel processing, an FPGA-based image processing design using JPEG2000 encoding provides a powerful and flexible approach.

High-definition digital video has become commonplace; most of us have HD equipment right in our own homes. However, moving the technology to real-time military surveillance applications poses major challenges in capturing the HD images and distributing the enormous amounts of data. Image sensors are now being produced in the gigapixel realm, implemented using arrays of smaller megapixel detectors. Image data acquisition must be flexible to facilitate interfacing with a variety of sensors and must be closely coupled to data processing and compression in order to distribute the data. Utilizing an FPGA coprocessor has many benefits over a traditional software- or ASIC-based image processing system, packing the complex algorithms into a small, powerful device.

Accordingly, FPGA-based processors can provide the critical link between HD image sensors and low-bandwidth JPEG2000 compressed data. This trend in reconfigurable computing makes it possible to satisfy the needs of powerful imaging systems in airborne and ground vehicle platforms such as Predator UAVs, Reaper UAVs, Hummingbird UAV helicopters, MULE autonomous ground robots, aerostats, and many more. High-performance, FPGA-based computing allows military surveillance systems to receive and process HD sensor data from a variety of sources and compress the data for transmission using JPEG2000.

Pixel processing is painless for FPGAs

The highly computational nature of image pixel processing can be implemented efficiently in modern FPGA devices. A Virtex-6 SX475T FPGA has more than 475,000 logic cells, 2,016 embedded DSP blocks, flexible connections to the rest of the system including integrated 6.5 Gbps transceivers, and many other powerful elements. These features allow many of the image processing functions to be pipelined together in a single FPGA device for multiple image sensors. The example image processing design detailed in Table 1 shows that less than 7 percent of the FPGA's resources are needed for a full-function image processing and compression channel.
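
As a rough illustration of that headroom, the short C sketch below estimates how many independent processing channels could fit in a single device. The logic-cell, DSP, and 7 percent figures are taken from the discussion above; the assumption that channels scale linearly is made only for illustration.

    #include <stdio.h>

    /* Rough capacity estimate for a Virtex-6 SX475T using the figures quoted
     * above: ~475,000 logic cells, 2,016 DSP blocks, and a full image
     * processing + compression channel using less than 7 percent of each.
     * Linear scaling across channels is assumed for illustration only.      */
    int main(void)
    {
        const double logic_cells = 475000.0;
        const double dsp_blocks  = 2016.0;
        const double per_channel = 0.07;   /* fraction of the device per channel */

        printf("Channels per FPGA (by utilization): %d\n", (int)(1.0 / per_channel));
        printf("Logic cells per channel: ~%.0f\n", logic_cells * per_channel);
        printf("DSP blocks per channel:  ~%.0f\n", dsp_blocks * per_channel);
        return 0;
    }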

 

Table 1: Example image processing system – FPGA resource requirements

Image processing system needs: Tough enough?

A military surveillance system should be able to view a wide area, for example greater than 16 km², since not much information can be gained by “looking through a soda straw.” Furthermore, the surveillance system must typically operate many kilometers away from the target to preserve safety while still generating high-resolution images. All this requires a colossal amount of image data to be captured and processed. For example, the DARPA ARGUS-IS system uses 368 5-megapixel detectors (a total of 1.8 gigapixels) capable of generating more than 260 Gbps of raw data.
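
To put that figure in perspective, the short C sketch below reconstructs a raw data rate of roughly that magnitude from the detector count quoted above. The 10-bit pixel depth and 15 Hz frame rate are illustrative assumptions, not published ARGUS-IS parameters.

    #include <stdio.h>

    /* Back-of-the-envelope raw data rate for a mosaic sensor like ARGUS-IS.
     * Detector count and resolution come from the text; bit depth and frame
     * rate are assumed values chosen only to illustrate the magnitude.       */
    int main(void)
    {
        const double detectors      = 368.0;
        const double pixels_each    = 5.0e6;   /* 5-megapixel detectors */
        const double bits_per_pixel = 10.0;    /* assumed raw bit depth */
        const double frames_per_sec = 15.0;    /* assumed frame rate    */

        double gigapixels = detectors * pixels_each / 1.0e9;
        double gbps = detectors * pixels_each * bits_per_pixel * frames_per_sec / 1.0e9;

        printf("Mosaic size: %.2f gigapixels\n", gigapixels);  /* ~1.84        */
        printf("Raw data rate: %.0f Gbps\n", gbps);            /* ~276 (>260)  */
        return 0;
    }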

This data must be captured and processed in a real-time system, often under extreme SWaP limitations. Additionally, after capturing the raw pixel data, the imaging system typically needs many other processing features (a sketch of one such stage follows this list):

  • Defective pixel correction
  • Color filter array interpolation
  • Color correction
  • Gamma correction
  • Color space conversion
  • Image noise reduction
  • Edge enhancement
  • Video scaling
  • Video windowing – Object tracking
  • Image stabilization
  • Video compression (JPEG2000, H.264, MPEG-4)
  • Data encryption
  • Video output for transmission and/or storage
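
As a simple illustration of one of these stages, the C sketch below models gamma correction the way it is often implemented in FPGA fabric: a precomputed lookup table (one block-RAM read per pixel) instead of per-pixel power-function arithmetic. The 10-bit pixel width and the gamma value of 2.2 are assumptions chosen for illustration, not parameters of any particular sensor.

    #include <math.h>
    #include <stdio.h>

    #define IN_BITS   10
    #define IN_LEVELS (1 << IN_BITS)

    /* Gamma-correction lookup table, built once (by the host or at design
     * time) and then applied in hardware as a single table read per pixel. */
    static unsigned short lut[IN_LEVELS];

    static void build_gamma_lut(double gamma)
    {
        for (int i = 0; i < IN_LEVELS; i++) {
            double normalized = (double)i / (IN_LEVELS - 1);
            lut[i] = (unsigned short)(pow(normalized, 1.0 / gamma) * (IN_LEVELS - 1) + 0.5);
        }
    }

    int main(void)
    {
        build_gamma_lut(2.2);                            /* assumed gamma value  */
        unsigned short raw[4] = { 0, 128, 512, 1023 };   /* sample 10-bit pixels */
        for (int i = 0; i < 4; i++)
            printf("in %4u -> out %4u\n", raw[i], lut[raw[i]]);
        return 0;
    }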

Reconfigurable computing technologies based on FPGA processors are excellent for meeting these demanding needs. This approach is smaller, faster, and more efficient than an exclusively software-based processing system – and much more flexible and cost-effective than ASIC approaches. The essential link between the image sensors and software processing is a perfect fit for FPGA technology.

Adapting to various image-sensor interfaces

Surveillance systems and other imaging platforms are quickly evolving and expanding to utilize a variety of sensors for optical, infrared, and synthetic aperture radar technologies, and the image sizes are always increasing. The physical interface to the image sensors varies, and therefore the processing system needs a flexible and modular interface in order to scale and expand with these systems. Reconfigurable computing systems meet this need by supporting incoming data in many formats and physical interfaces, for example, CameraLink, SDI, raw pixels, LVDS, multi-gigabit serial interfaces, fiber-optics, and so on. An FPGA-based system is well suited for acquiring the image data from a variety of sources and protocols because of the programmable I/O blocks, integrated transceivers, and SERDES blocks.

Additionally, a surveillance system may have many different mission objectives that require flexibility in processing the video data. For example, the system may need to provide full-resolution images of the entire field of view every couple of seconds, or to track a moving target with a smaller image window that updates at 10 Hz. The system may need to limit the data bandwidth that the imaging system must downlink, or to store the data for later transmission. The FPGA-based image processor can be reconfigured in the system, allowing upgrades and additional features such as pattern matching or motion detection to be added without modifying the hardware. This is a great advantage over any predefined frame grabber or ASIC-based hardware. Figure 1 shows the flexibility of a reconfigurable image processing system capable of receiving image data from a variety of sources and interfaces, implementing a selection of computationally complex image processing functions, and outputting a compressed video stream via a choice of interfaces.

 

Figure 1: Reconfigurable image processing hardware’s flexibility


Upfront data compression using JPEG2000

The data received from color image sensors must first be interpolated to recover the RGB components for each pixel, and then is typically converted to YCrCb color space so that chroma sub-sampling can reduce the amount of data. By utilizing an FPGA processing system, the video data can be further compressed with JPEG2000, reducing the bandwidth by an additional factor of 2:1 with lossless compression, 10:1 with visually lossless compression, and up to 100:1 with lossy compression. The compressed data provided by the FPGA image processor allows the system to easily store the data, process it further in software, and transmit it to users. Image data compression using JPEG2000 provides superior results compared with a system that performs no upfront data compression, and it greatly reduces the amount of data the system must handle. Figure 2 shows an example of an image compressed at different JPEG2000 ratios, along with the resulting data sizes.
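
The C sketch below illustrates both steps for a single pixel and a single frame: the BT.601-style RGB-to-YCbCr conversion that precedes chroma sub-sampling, and the approximate frame sizes produced by the 2:1, 10:1, and 100:1 ratios quoted above. The 5-megapixel, 24-bit frame is an assumed example, not a figure from a specific sensor.

    #include <stdio.h>

    /* BT.601 RGB -> YCbCr conversion for one 8-bit pixel (full-range form),
     * the step that typically precedes chroma sub-sampling and JPEG2000.   */
    static void rgb_to_ycbcr(int r, int g, int b, int *y, int *cb, int *cr)
    {
        *y  = (int)( 0.299 * r + 0.587 * g + 0.114 * b);
        *cb = (int)(-0.169 * r - 0.331 * g + 0.500 * b) + 128;
        *cr = (int)( 0.500 * r - 0.419 * g - 0.081 * b) + 128;
    }

    int main(void)
    {
        int y, cb, cr;
        rgb_to_ycbcr(200, 120, 60, &y, &cb, &cr);
        printf("YCbCr = %d %d %d\n", y, cb, cr);

        /* Approximate compressed sizes for one 5-megapixel RGB frame
         * (24 bits/pixel assumed) at the ratios discussed in the text. */
        const double raw_mbytes = 5.0e6 * 3.0 / 1.0e6;   /* 15 MB raw */
        printf("Raw: %.1f MB, 2:1 -> %.1f MB, 10:1 -> %.1f MB, 100:1 -> %.2f MB\n",
               raw_mbytes, raw_mbytes / 2, raw_mbytes / 10, raw_mbytes / 100);
        return 0;
    }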

 

Figure 2: JPEG2000 image compression example


The JPEG2000 codec is especially popular for military surveillance applications, as it offers very high performance, low latency, the ability to perform lossless compression, and other very useful features. This compression standard works by breaking individual image frames down into successively lower-resolution sub-bands using wavelet transformations. The data is then arranged into packets, each containing information about a group of pixels, in succession. The resulting image stream offers useful features such as selectable regions, selectable resolution, error resilience, and bandwidth limiting, and it allows a system to work with the stream in its compressed form. For instance, the JPEG2000 data can be used in remote surveillance systems to downlink flexible combinations of small HD image windows or wide-area snapshots from the same image sensors, all without having to decompress the data first.
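
To make the wavelet step concrete, the C sketch below performs one level of the reversible LeGall 5/3 lifting transform that JPEG2000 uses for lossless coding, splitting a short 1-D row of pixel values into a low-pass (coarse) and a high-pass (detail) sub-band. A real encoder applies this in two dimensions over several decomposition levels and then quantizes and entropy-codes the coefficients; those stages are omitted here.

    #include <stdio.h>

    #define N 8   /* even number of samples in this toy example */

    /* Floor division for possibly-negative numerators (JPEG2000 uses floor). */
    static int floordiv(int a, int b)
    {
        return (a >= 0) ? a / b : -((-a + b - 1) / b);
    }

    /* One level of the reversible LeGall 5/3 lifting transform (1-D,
     * mirrored edges). Odd samples become the high-pass (detail) sub-band,
     * even samples the low-pass (coarse) sub-band.                         */
    static void dwt53_forward(const int x[N], int low[N / 2], int high[N / 2])
    {
        for (int n = 0; n < N / 2; n++) {                 /* predict step */
            int right = (2 * n + 2 < N) ? x[2 * n + 2] : x[N - 2];
            high[n] = x[2 * n + 1] - floordiv(x[2 * n] + right, 2);
        }
        for (int n = 0; n < N / 2; n++) {                 /* update step  */
            int prev = (n > 0) ? high[n - 1] : high[0];
            low[n] = x[2 * n] + floordiv(prev + high[n] + 2, 4);
        }
    }

    int main(void)
    {
        int row[N] = { 10, 12, 14, 13, 50, 52, 51, 49 };
        int low[N / 2], high[N / 2];
        dwt53_forward(row, low, high);
        for (int n = 0; n < N / 2; n++)
            printf("low[%d] = %3d   high[%d] = %3d\n", n, low[n], n, high[n]);
        return 0;
    }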

The stream can also be fit to a specific bandwidth by simply extracting a portion of the JPEG2000 data and truncating the higher-resolution portion of the stream. This is very useful for transmitting the image data, as the most important information in an image can be delivered using less than 20 percent of the JPEG2000 data stream. Error resilience is also improved because each video frame is individually compressed, which allows the data link to operate with less error correction. There are therefore fewer concerns over data loss with JPEG2000 than with compression standards that require information from multiple frames for decoding. Finally, JPEG2000 data can be viewed at different resolutions and quality levels from the same original stream.
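
As a sketch of how such a truncation point might be chosen, the following C fragment computes a per-frame byte budget from a downlink rate and an update rate, then the fraction of a compressed frame that fits within it. The link rate, frame rate, and compressed frame size are all illustrative assumptions, not figures from a fielded system.

    #include <stdio.h>

    /* Pick a JPEG2000 truncation point from a downlink budget. All numbers
     * below are illustrative assumptions only.                             */
    int main(void)
    {
        const double link_mbps      = 10.0;   /* assumed downlink rate         */
        const double frames_per_sec = 2.0;    /* assumed update rate           */
        const double frame_kbytes   = 1500.0; /* assumed compressed frame size */

        double budget_kbytes = link_mbps * 1000.0 / 8.0 / frames_per_sec;
        double fraction      = budget_kbytes / frame_kbytes;
        if (fraction > 1.0)
            fraction = 1.0;

        printf("Per-frame budget: %.0f KB\n", budget_kbytes);
        printf("Fraction of JPEG2000 stream to keep: %.0f%%\n", fraction * 100.0);
        return 0;
    }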

Although a JPEG2000 encoder is quite complex to implement, an FPGA provides plenty of resources to include this feature in an image processing design. There are IP cores readily available to integrate into an FPGA design, or an FPGA can be used to interface to ADV212 ASICs (JPEG2000 System-on-Chip). Both of these options have their own benefits and trade-offs (Table 2), and either of these codec options can be used on COTS hardware such as the Alpha Data ADM-XRC-5T2-ADV mezzanine FPGA board.

 

Table 2: Design trade-offs when implementing JPEG2000 image compression using an IP core in an FPGA versus using external ASIC devices interfaced to an FPGA.


Utilizing COTS FPGA hardware for image processors

To facilitate advanced image data processing, a modular FPGA image processor board – such as a PMC/XMC-based module – can be used in industry-standard embedded computing systems including the CompactPCI, VME, VXS, VPX, PCI, PCI Express, and other form factors. These small (74 mm x 144 mm) modules are available with the largest FPGA devices offered and can be used in workstation development systems or moved directly into ruggedized deployed surveillance systems. For JPEG2000 compression, COTS boards offer the large amounts of onboard memory that IP cores require and/or dedicated JPEG2000 ASICs. The host computing system can easily control the hardware image processor cards to apply user settings, reconfigure the FPGA for various video operations, control image windowing, command target tracking, and receive the output video streams. Such COTS hardware is typically provided with software drivers and API interfaces that make control and data transfer seamless in the system.
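
The C sketch below shows roughly what such host-side control code looks like. The function and structure names (wsys_open, wsys_set_window, and so on) are hypothetical placeholders standing in for a vendor SDK; they are not the actual Alpha Data driver API.

    #include <stdio.h>

    /* Hypothetical host-side control sequence for an FPGA image processor
     * card. None of these names correspond to a real driver API; they only
     * sketch the kind of calls a vendor SDK typically exposes.              */

    struct wsys_window { int x, y, width, height, rate_hz; };

    static int  wsys_open(const char *device)             { printf("open %s\n", device); return 0; }
    static void wsys_load_bitstream(int h, const char *f) { printf("load %s\n", f); (void)h; }
    static void wsys_set_window(int h, struct wsys_window *w)
    {
        printf("window %dx%d @ (%d,%d), %d Hz\n", w->width, w->height, w->x, w->y, w->rate_hz);
        (void)h;
    }
    static void wsys_set_compression(int h, int ratio)    { printf("JPEG2000 %d:1\n", ratio); (void)h; }

    int main(void)
    {
        int card = wsys_open("/dev/fpga0");            /* hypothetical device node */
        wsys_load_bitstream(card, "image_pipe.bit");   /* reconfigure the FPGA     */

        struct wsys_window track = { 1024, 768, 1920, 1080, 10 };  /* HD tracking window */
        wsys_set_window(card, &track);
        wsys_set_compression(card, 10);                /* visually lossless        */
        return 0;
    }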

Adam Smith is Vice President of Sales and Support at Alpha Data Inc. His background is in designing FPGA firmware and digital and analog electronics. Adam has enjoyed a career with Lockheed Martin Space Systems and Alpha Data after earning a BSEE from the Colorado School of Mines and an MBA from the University of Colorado. He can be contacted at [email protected].

Alpha Data Inc. 408-467-5076 www.alpha-data.com

 
