
Bajovic D., Sinopoli B., Xavier J.
2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton 2009)
pp. 363-370

Abstract:

This paper addresses robust linear dimensionality reduction (RLDR) for binary Gaussian hypothesis testing. The goal is to find a linear map from the high-dimensional space where the data vector lives to a low-dimensional space where the hypothesis test is carried out. The linear map is designed to maximize detector performance, which translates into maximizing the Kullback-Leibler (KL) distance between the two projected distributions. In practice, the distribution parameters are estimated from training data and are therefore subject to uncertainty. This is modeled by allowing the distribution parameters to drift within confidence regions. We address the case where only the mean values of the Gaussian distributions, m0 and m1, are uncertain, with confidence ellipsoids defined by the corresponding covariance matrices, S0 and S1. Under this setup, we find the linear map that maximizes the KL distance under the worst-case drift of the mean values. We solve the problem globally for the case of a linear map to one dimension, reducing it to a grid search over a finite interval. Our solution outperforms robust linear discriminant analysis techniques recently proposed in the literature. In addition, we use our RLDR solution as a building block to derive a sensor selection algorithm for robust event detection in the context of sensor networks. The sensor selection algorithm is quasi-optimal: its worst-case KL distance is at most 15% smaller than that of the optimal sensor selection obtained by exhaustive search.
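The abstract states that the one-dimensional case reduces to a grid search over a finite interval but does not spell out the construction. The following is therefore only a minimal sketch, assuming the uncertainty model suggested above: each mean mi may drift within the ellipsoid {mi + d : d'Si^{-1}d <= r^2} for some radius r. For a fixed direction a, the projected data are scalar Gaussians N(a'mi, a'Si a); by Cauchy-Schwarz, each projected mean can shift by at most r*sqrt(a'Si a), so the worst case shrinks the projected mean separation accordingly. The sketch then scans candidate directions in R^2 parametrized by an angle; that angle parametrization, the radius r, and the function names are our assumptions for illustration, not the paper's construction.

import numpy as np

def projected_worst_case_kl(a, m0, m1, S0, S1, r):
    # Worst-case KL distance KL(p0 || p1) between the two projected
    # scalar Gaussians when each mean may drift within the confidence
    # ellipsoid {m_i + d : d^T S_i^{-1} d <= r^2}. Illustrative only.
    a = a / np.linalg.norm(a)
    s0 = a @ S0 @ a                      # variance of a^T x under H0
    s1 = a @ S1 @ a                      # variance of a^T x under H1
    # Over the ellipsoid, a^T d ranges over [-r*sqrt(s_i), +r*sqrt(s_i)]
    # (Cauchy-Schwarz in the S_i^{-1} metric), so the adversary can
    # shrink the projected mean separation by r*(sqrt(s0) + sqrt(s1)).
    sep = abs(a @ (m1 - m0))
    worst_sep = max(sep - r * (np.sqrt(s0) + np.sqrt(s1)), 0.0)
    # Closed-form KL between scalar Gaussians N(mu0, s0) and N(mu1, s1).
    return 0.5 * (np.log(s1 / s0) + s0 / s1 + worst_sep ** 2 / s1 - 1.0)

# Toy example in R^2: sweep directions a(theta) = (cos theta, sin theta)
# over theta in [0, pi) and keep the direction with the largest
# worst-case KL distance.
m0 = np.array([0.0, 0.0])
m1 = np.array([2.0, 1.0])
S0 = np.array([[1.0, 0.2], [0.2, 1.0]])
S1 = np.array([[1.5, -0.3], [-0.3, 0.8]])
thetas = np.linspace(0.0, np.pi, 500, endpoint=False)
scores = [projected_worst_case_kl(np.array([np.cos(t), np.sin(t)]),
                                  m0, m1, S0, S1, r=0.5)
          for t in thetas]
best_direction = thetas[int(np.argmax(scores))]

In R^2 a single angle happens to sweep all directions, so the grid search above is trivial; for general dimension the point of the paper, per the abstract, is precisely that the global problem can still be reduced to a grid search over a finite interval.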