Detection from hyperspectral images compressed using rate distortion and optimization techniques under JPEG2000 part 2
This research studies the effect of two different bit rate allocation strategies in JPEG2000 Part 2 compression of hyperspectral data on the results of background classification. Hyperspectral imagery (HSI) brings a whole new set of capabilities to the field of remote sensing; its major disadvantage is that analysis and processing incur high computation and memory costs. This thesis proposes lossy compression of HSI that preserves a very high target hit rate. We compare the traditional bit rate allocation approach, based on the high bit rate quantizer model, with a Rate Distortion Optimal (RDO) approach that produces a bit rate allocation optimal in the mean squared error (MSE) sense. The experiments show that at relatively low bit rates both rate allocation strategies perform with excellent and nearly identical accuracy (96% at 0.125 bits per pixel per band (bpppb)). However, at very low bit rates RDO outperforms the high bit rate quantizer approach in terms of background classification results (90% at 0.0375 bpppb). The experiments also confirm that RDO bit rate allocation achieves a lower MSE than the high bit rate quantizer model approach. (Abstract shortened by UMI.)
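The two allocation strategies contrasted in the abstract can be illustrated with a minimal sketch. This is not the thesis's implementation: it assumes the standard textbook distortion model D_k = sigma_k^2 * 2^(-2*b_k) per band, the classical closed-form high bit rate allocation rule, and a simple greedy marginal-return search standing in for the RDO allocation; function names and the bit-step granularity are hypothetical choices for illustration.

```python
import numpy as np

def high_rate_allocation(variances, total_bits):
    """Classical high bit rate quantizer model allocation:
    b_k = B_avg + 0.5 * log2(sigma_k^2 / geometric mean of variances).
    Negative allocations are clipped to zero, which is where the model
    breaks down at very low rates."""
    n = len(variances)
    b_avg = total_bits / n
    geo_mean = np.exp(np.mean(np.log(variances)))
    bits = b_avg + 0.5 * np.log2(variances / geo_mean)
    return np.clip(bits, 0.0, None)

def rdo_allocation(variances, total_bits, step=0.125):
    """Greedy rate-distortion-optimal allocation (illustrative): repeatedly
    give `step` bits to the band whose MSE drops the most, using the
    exponential distortion model D_k = var_k * 2^(-2*b_k). For a convex
    distortion curve this greedy rule yields the MSE-optimal discrete
    allocation for the given budget."""
    bits = np.zeros(len(variances))
    budget = total_bits
    while budget >= step:
        d_now = variances * 2.0 ** (-2.0 * bits)
        d_next = variances * 2.0 ** (-2.0 * (bits + step))
        k = np.argmax(d_now - d_next)  # band with largest marginal MSE gain
        bits[k] += step
        budget -= step
    return bits
```

At moderate budgets the two rules agree closely, but at very low budgets the clipped closed-form rule no longer tracks the true rate-distortion trade-off, which mirrors the abstract's finding that RDO wins only at very low bpppb.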
Engineering, Electronics and Electrical|Computer Science
Jayaram, Vikram, "Detection from hyperspectral images compressed using rate distortion and optimization techniques under JPEG2000 part 2" (2004). ETD Collection for University of Texas, El Paso. AAI1423694.