Remote sensing systems such as unmanned aerial vehicles (UAVs) and terrestrial sensor networks are increasingly used in surveillance, reconnaissance, and intelligence-gathering roles at both the civilian and battlegroup levels. These systems benefit from advances in communications and computing technology that enable the design of low-cost devices incorporating multimodal sensing, processing, and communication capabilities.
Along with other sensing modalities, remote sensing with visible and infrared cameras has also grown significantly. Especially since the September 11th attacks, video surveillance for security purposes has become widespread in cities and in critical industrial and transportation infrastructures. Moreover, for environmental applications, persistent surveillance of areas using unmanned aerial vehicles or unattended ground sensors is being developed. In all airborne sensor applications, and in many distributed or unattended sensor systems, wireless transmission of video images must be achieved by power-limited devices. In existing systems, the emphasis has been on reducing data rates by running compression algorithms within, or close to, the sensors.
Nevertheless, existing solutions do not adequately accommodate remote imaging systems, since: (i) the power consumption of the data compression hardware is a major issue for unattended sensors and for light UAVs, and one not well addressed by the MPEGx compression standards; and (ii) very low data rates are required in all remote sensing applications, whether to minimize the radiofrequency footprint of the transmission or because UAV payloads normally include several sensors of high spatial, spectral, or temporal resolution. At the growing compression ratios these constraints demand, existing MPEGx compression profiles may yield poor image quality.
Moreover, sampling, a key step in signal acquisition, still follows the long-standing paradigm dictated by the classical Shannon/Nyquist theorem, which states that to avoid losing information when capturing a signal, we must sample at a rate at least twice the signal bandwidth. The theory of compressed sensing (CS), developed at the dawn of the new millennium, reveals instead that a relatively small number of random incoherent projections of a natural signal can capture most of its salient information. As a result, if a signal is compressible in some orthonormal basis, a very accurate reconstruction can be obtained from a small subset of random projection coefficients. Viewed in the framework of data compression and decompression, CS theory provides new insights that favor the design of efficient algorithms for real-time remote sensing applications.
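As a minimal illustration of this recovery principle (a toy sketch, not the CS-ORION pipeline), the example below measures a synthetic k-sparse signal with far fewer random Gaussian projections than the Nyquist paradigm would suggest, then reconstructs it with orthogonal matching pursuit. The dimensions, the canonical sparsity basis, and the greedy solver are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the project): a length-n signal
# that is k-sparse in the canonical basis, observed via m << n projections.
n, m, k = 256, 80, 8

x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(size=k)

# Random Gaussian measurement matrix: m incoherent projections of x.
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x

def omp(Phi, y, k):
    """Orthogonal matching pursuit: greedily recover a k-sparse signal."""
    residual = y.copy()
    idx = []
    for _ in range(k):
        # Select the column most correlated with the current residual.
        idx.append(int(np.argmax(np.abs(Phi.T @ residual))))
        # Least-squares fit of y on the columns selected so far.
        coef, *_ = np.linalg.lstsq(Phi[:, idx], y, rcond=None)
        residual = y - Phi[:, idx] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[idx] = coef
    return x_hat

x_hat = omp(Phi, y, k)
err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(f"relative reconstruction error: {err:.2e}")
```

With m = 80 measurements instead of n = 256 samples, the sparse signal is recovered essentially exactly, which is the "small number of incoherent projections" effect the text describes.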
Hence, in CS-ORION our focus is on the design, testing, and evaluation of compressive sensing architectures that enhance the high-quality video acquisition and delivery capabilities of remote sensing devices, enabling efficient remote imaging in aerial and terrestrial surveillance. We will address the limitations of current video coding methods, which restrict remote sensing devices to offering the user only low-quality streaming video. Overcoming these limitations requires novel algorithms that go well beyond currently developed techniques and standards. Equally importantly, the proposed methods must be developed with the hardware constraints of the operating platform in mind, including restrictions on computation, memory, and power consumption, as well as the available transmission bandwidth.