Saturday, February 14, 2015

Decorrelation Stretching

Decorrelation stretch is used to enhance color differences in images with high interchannel correlation. It therefore lets us see details that would otherwise be hard to notice or invisible to the human eye.

I first came across decorrelation stretch in a Matlab webinar and was curious to find out how it works. From the Matlab online help [1] and some other online resources [2, 3] I learned the theory and implemented it in C++ using OpenCV. You can find a very nice application of decorrelation stretching to uncovering rock art here.

The theory is well explained in [2, 3], so I won't go into the details here but will only give a brief explanation.

To remove the interchannel correlation, we project the input data into a space where the channels are uncorrelated. Such a space can be found by performing PCA on the input data. The projected data is then scaled in the eigenspace so that each component has unit variance, projected back to the original space, and finally stretched to the desired variance. The process is illustrated in the figure below for a 2-D dataset.



The operation can be expressed as a pointwise linear transformation:

y = St Rᵀ Sc R (x - μ) + target_μ

x : n x 1 input data point.
y : n x 1 output data point.
μ : n x 1 mean of the input data.
R : n x n rotation matrix whose rows are the eigenvectors found by PCA. R(x - μ) projects the zero-mean input data into the new eigenspace.
Sc : n x n diagonal scaling matrix. Its diagonal elements are of the form 1/σ_i, where σ_i is the standard deviation of the corresponding variable in the eigenspace. Multiplication by Sc gives the variables in the eigenspace unit variance.
Rᵀ : projects the data back to the original space (Rᵀ = R⁻¹, since a rotation matrix satisfies Rᵀ R = I). The reprojected data still has unit variance.
St : n x n diagonal stretching matrix. Its diagonal elements are of the form target_σ_i, where target_σ_i is the desired standard deviation of the corresponding variable in the input data space. Multiplication by St sets the variances in the input data space to the desired values.
target_μ : the back-projected data still has zero mean; adding target_μ shifts the mean of each variable to the desired value.
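
To make the matrix algebra concrete, here is a minimal standalone sketch of the transform: it builds T = St Rᵀ Sc R with cv::PCA for a small synthetic 2-D dataset (the six points and the unit target sigma are chosen arbitrarily for illustration) and prints the covariance before and after, showing that the correlation is removed.

#include <opencv2/opencv.hpp>
#include <iostream>

using namespace cv;

int main()
{
   // six strongly correlated 2-D points, one point per row (N x 2)
   Mat pts = (Mat_<float>(6, 2) <<
      1.0f, 1.1f,
      2.0f, 2.3f,
      3.0f, 2.9f,
      4.0f, 4.2f,
      5.0f, 4.8f,
      6.0f, 6.1f);

   // R = pca.eigenvectors (eigenvectors stored as rows);
   // pca.eigenvalues are the variances along those eigenvectors
   PCA pca(pts, Mat(), PCA::DATA_AS_ROW);

   // Sc: diagonal matrix holding 1/sigma of each eigenspace variable
   Mat sigmaEig;
   sqrt(pca.eigenvalues, sigmaEig);
   Mat Sc = Mat::diag(1 / sigmaEig);

   // St: diagonal matrix of desired sigmas; identity here (target sigma = 1)
   Mat St = Mat::eye(2, 2, CV_32F);

   // points are stored as rows, so the transform is applied from the right:
   // y' = (x' - mu') * R^T * Sc * R * St
   Mat T = pca.eigenvectors.t() * Sc * pca.eigenvectors * St;
   Mat zm = pts - repeat(pca.mean, pts.rows, 1);
   Mat y = zm * T;

   // covariance before and after: the off-diagonal entries (correlation) vanish
   // and the diagonal entries become equal (about 1, since the target sigma is 1)
   Mat covIn, covOut, mu;
   calcCovarMatrix(pts, covIn, mu, COVAR_NORMAL | COVAR_ROWS | COVAR_SCALE, CV_32F);
   calcCovarMatrix(y, covOut, mu, COVAR_NORMAL | COVAR_ROWS | COVAR_SCALE, CV_32F);
   std::cout << "input covariance:\n" << covIn << std::endl;
   std::cout << "transformed covariance:\n" << covOut << std::endl;

   return 0;
}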

The dstretch function below takes as input a multi-channel image and two optional column vectors specifying the desired mean and standard deviation for each channel of the output decorrelation-stretched image.

 
#include <opencv2/opencv.hpp>

using namespace cv;

/*
input       : p x q x n multi-channel image.
targetMean  : n x 1 vector containing desired mean for each channel of the dstretched image. If empty, mean of the input data is used.
targetSigma : n x 1 vector containing desired sigma for each channel of the dstretched image. If empty, sigma of the input data is used.

returns floating point dstretched image
*/

Mat dstretch(const Mat& input, const Mat& targetMean = Mat(), const Mat& targetSigma = Mat())
{
   CV_Assert(input.channels() > 1);
 
   Mat dataMu, dataSigma, eigDataSigma, scale, stretch;

   Mat data = input.reshape(1, input.rows*input.cols);
   /*
   data stored as rows. 
   if x(i) = [xi1 xi2 .. xik]' is vector representing an input point, 
   data is now an N x k matrix:
   data = [x(1)' ; x(2)' ; .. ; x(N)']
   */

   // take the mean and standard deviation of input data
   meanStdDev(input, dataMu, dataSigma);

   /*
   perform PCA that gives us an eigenspace.
   eigenvectors matrix (R) lets us project input data into the new eigenspace.
   square root of eigenvalues gives us the standard deviation of projected data.
   */
   PCA pca(data, Mat(), CV_PCA_DATA_AS_ROW);

   /*
   prepare scaling (Sc) and stretching (St) matrices.
   we use the relation var(a.X) = a^2.var(X) for a random variable X and 
   set
   scaling factor a = 1/(sigma of X) for diagonal entries of scaling matrix.
   stretching factor a = desired_sigma for diagonal entries of stretching matrix.
   */

   // scaling matrix (Sc)
   sqrt(pca.eigenvalues, eigDataSigma);
   scale = Mat::diag(1/eigDataSigma);

   // stretching matrix (St)
   // if targetSigma is empty, set sigma of transformed data equal to that of original data
   if (targetSigma.empty())
   {
      stretch = Mat::diag(dataSigma);
   }
   else
   {
      CV_Assert((1 == targetSigma.cols) &&  (1 == targetSigma.channels()) && 
         (input.channels() == targetSigma.rows));

      stretch = Mat::diag(targetSigma);
   }
   // convert to 32F
   stretch.convertTo(stretch, CV_32F);
 
   // subtract the mean from input data
   Mat zmudata;
   Mat repMu = repeat(dataMu.t(), data.rows, 1);
   subtract(data, repMu, zmudata, Mat(), CV_32F);

   // if targetMean is empty, set mean of transformed data equal to that of original data
   if (!targetMean.empty())
   {
      CV_Assert((1 == targetMean.cols) && (1 == targetMean.channels()) && 
         (input.channels() == targetMean.rows));

      repMu = repeat(targetMean.t(), data.rows, 1);
   }

   /*
   project zero mean data to the eigenspace, normalize the variance and reproject,
   then stretch it so that it has the desired sigma: StR'ScR(x(i) - mu), R'R = I.
   since the x(i)s are organized as rows in data, take the transpose of the above
   expression: (x(i)' - mu')R'Sc'(R')'St' = (x(i)' - mu')R'ScRSt,
   then add the desired mean:
   (x(i)' - mu')R'ScRSt + mu_desired
   */
   Mat transformed = zmudata*(pca.eigenvectors.t()*scale*pca.eigenvectors*stretch);
   add(transformed, repMu, transformed, Mat(), CV_32F);

   // reshape transformed data
   Mat dstr32f = transformed.reshape(input.channels(), input.rows);

   return dstr32f;
}  
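
A quick usage sketch, assuming the dstretch function above is in the same source file and used in place of the demo's main below (the file name is a placeholder): when the optional target vectors are left empty, each output channel keeps the mean and sigma of the corresponding input channel.

// minimal usage of dstretch with the default (empty) target mean and sigma
int main()
{
   // placeholder file name
   Mat img = imread("input.jpg");
   if (img.empty())
      return 1;

   // leaving the optional arguments empty keeps the original per-channel
   // mean and sigma; the result is a floating point image
   Mat stretched32f = dstretch(img);

   // saturate back to 8-bit for display
   Mat stretched8u;
   stretched32f.convertTo(stretched8u, CV_8U);

   imshow("dstretched (default targets)", stretched8u);
   waitKey();

   return 0;
}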

The demo program below shows how dstretch can be used to highlight different regions of a Landsat image.

int main(int argc, char* argv[])
{
   // input image (loaded by imread in BGR channel order)
   Mat bgr = imread("14_L.jpg");
   // target mean and sigma
   Mat mean = Mat::ones(3, 1, CV_32F) * 120;
   Mat sigma = Mat::ones(3, 1, CV_32F) * 50;
   // convert to CIE L*a*b* color space
   Mat lab;
   cvtColor(bgr, lab, CV_BGR2Lab);
   // dstretch Lab data. dstretch outputs a floating point matrix
   Mat dstrlab32f = dstretch(lab, mean, sigma);
   // convert to uchar
   Mat dstrlab8u;
   dstrlab32f.convertTo(dstrlab8u, CV_8UC3);
   // convert the stretched Lab image to rgb color space
   Mat dstrlab2bgr;
   cvtColor(dstrlab8u, dstrlab2bgr, CV_Lab2BGR);
   // dstretch RGB data
   Mat dstrbgr32f = dstretch(bgr, mean, sigma);
   // convert to uchar
   Mat dstrbgr8u;
   dstrbgr32f.convertTo(dstrbgr8u, CV_8UC3);

   imshow("RGB", bgr);
   imshow("dstretched CIE L*a*b* converted to RGB", dstrlab2bgr);
   imshow("dstretched RGB", dstrbgr8u);
   waitKey();

   return 0;
} 
Input image: Philadelphia, Pennsylvania, USA (image courtesy of the U.S. Geological Survey)
dstretched CIE L*a*b* image converted to RGB
dstretched RGB image
Data distribution of the images (plotted using the Matplotlib mplot3d toolkit)

References
[1] Matlab decorrstretch documentation, http://in.mathworks.com/help/images/ref/decorrstretch.html
[2] http://www.dstretch.com/DecorrelationStretch.pdf
[3] A. R. Gillespie, A. B. Kahle, and R. E. Walker, “Color enhancement of highly correlated images. I. Decorrelation and HSI contrast stretches,” Remote Sensing of Environment, vol. 20, no. 3, pp. 209–235, Dec. 1986.