Performance Measure-Based Band Group Selection for Agricultural Multispectral Sensor Design

Matthew Allen Lee


Hyperspectral sensors are unfortunately plagued by relatively high financial cost, and they require large amounts of storage and computation. Too often, these factors limit their applicability in precision agriculture, particularly in applications requiring real-time signal processing. Multispectral sensors, on the other hand, are less expensive and require far fewer computational resources. If the task of the sensor or its platform is well known ahead of time, it can be beneficial to design a multispectral sensor to accomplish that task, provided performance is not degraded. In this article, we explore the design of a task-specific multispectral sensor based on a new band grouping technique. Band grouping algorithms typically rely on a proximity measure to determine how similar (or dissimilar) the information contained in hyperspectral bands is, and similar bands are then grouped. However, the proximity measure typically does not account for the interactions among different band groups or for how the band groups will be used. The approach put forth in this article is unique because it makes global decisions about band groups: it uses a performance measure to gauge the effectiveness of random partitionings at a given task, such as classification or anomaly detection, and the band groups most correlated with good performance are then selected. Our technique is compared to the uniform partitioning technique using the Pecan1 data set. The results show that our technique achieves better overall accuracy when the same number of band groups is selected.
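The selection idea described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the band count, group count, and especially the `performance` function are placeholders (a real system would train and evaluate a classifier on band-group features). The sketch generates random contiguous partitionings of the spectrum, scores each with the stand-in performance measure, ranks every band group by the mean performance of the partitionings that contain it, and greedily keeps the highest-ranked non-overlapping groups.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(42)

N_BANDS = 20    # hypothetical hyperspectral band count
N_GROUPS = 4    # groups per random partitioning (multispectral channels)
N_TRIALS = 500  # number of random partitionings to evaluate

def random_partition():
    """Random contiguous partition of bands 0..N_BANDS-1 into N_GROUPS groups."""
    cuts = np.sort(rng.choice(np.arange(1, N_BANDS), size=N_GROUPS - 1, replace=False))
    edges = [0, *cuts.tolist(), N_BANDS]
    return [(edges[i], edges[i + 1]) for i in range(N_GROUPS)]

def performance(partition):
    """Stand-in for the task performance measure (e.g. classification accuracy).
    This toy surrogate simply rewards evenly sized groups."""
    sizes = np.array([b - a for a, b in partition])
    return 1.0 / (1.0 + sizes.std())

# Score every band group by the mean performance of partitionings containing it.
scores = defaultdict(list)
for _ in range(N_TRIALS):
    part = random_partition()
    p = performance(part)
    for group in part:
        scores[group].append(p)

ranked = sorted(scores, key=lambda g: np.mean(scores[g]), reverse=True)

# Greedily keep the highest-scoring groups that do not overlap.
selected, covered = [], set()
for a, b in ranked:
    if covered.isdisjoint(range(a, b)):
        selected.append((a, b))
        covered.update(range(a, b))
print(sorted(selected))
```

The greedy step at the end is one simple way to turn per-group scores into a final band-group set; any procedure that prefers groups correlated with good performance would fit the same framework.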



This work is licensed under a Creative Commons Attribution 3.0 License.