Monday, September 18, 2006

QCQP craziness

I'm trying to demo a CCA program that uses a QCQP optimizer instead of the spectral method typically used to solve the problem. I'm doing this to see whether the two methods yield equivalent, or at least similar, solutions. I've run into problems with the QCQP, however: the program depends on matrices which are not positive semidefinite. Mathematically, the problem is to minimize

-w'Sw

subject to

w'Aw - 1 <= 0
w'Bw - 1 <= 0

where (writing A and B for the constraint matrices, since X and Y already name the two data views):

S = [ 0     X'Y
      Y'X   0   ]

A = [ X'X   0
      0     0   ]

B = [ 0     0
      0     Y'Y ]

and X'X, Y'Y are the within-view covariance matrices and X'Y, Y'X the cross-covariance matrices between the two views of the data.


S is symmetric, but it is not generally positive semidefinite (PSD): its eigenvalues are not all non-negative. In fact, the way S is set up, it always has both positive and negative eigenvalues (its trace is zero). The problem is that since S is indefinite, the quadratic objective -w'Sw is not convex, so a QCQP solver will not solve the problem.
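The indefiniteness is easy to check numerically. Here is a minimal sketch on made-up random data (a toy example I'm constructing here, not the actual program): build S from two views and look at its eigenvalue range.

```python
import numpy as np

# Toy two-view data; names n, dx, dy and the random views are assumptions
# for illustration only.
rng = np.random.default_rng(0)
n, dx, dy = 100, 3, 4
X = rng.standard_normal((n, dx))   # view 1, rows are samples
Y = rng.standard_normal((n, dy))   # view 2

Cxy = X.T @ Y                      # cross-covariance block X'Y
S = np.block([[np.zeros((dx, dx)), Cxy],
              [Cxy.T, np.zeros((dy, dy))]])

# S is symmetric, so eigvalsh applies. Its trace is zero, so as long as
# Cxy is nonzero, S must have eigenvalues of both signs -- indefinite.
evals = np.linalg.eigvalsh(S)
print(evals.min(), evals.max())
```

The zero-trace argument is the quick way to see it: the eigenvalues of this block-antidiagonal S sum to zero, so they can't all be non-negative unless S is the zero matrix.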

I'm not exactly sure how to resolve this issue. There should be some way of recasting the problem so that it becomes convex.

Thoughts:

  • The spectral method requires solving a generalized eigenvalue problem of the form Ax = cBx. This problem has 2d eigenvalues, which come in corresponding positive/negative pairs. Finding the largest eigenvalue can itself be posed as an optimization problem. (I think the largest eigenvalue problem is an SDP, though.)

  • Another option is to skip the QCQP and go directly to formulating an SDP problem.

  • I have a feeling that I could spend a lot of time on the QCQP, and if it is actually impossible to formulate the problem that way, I'd be wasting that time. This seems like the type of issue that Gert or another grad student may have some valuable insight on.
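For comparison, the spectral route mentioned above can be sketched in a few lines: whiten each view and take an SVD, whose singular values are the canonical correlations (the positive generalized eigenvalues of the pair (S, B)). This is a hedged sketch on made-up data, not the actual program being debugged.

```python
import numpy as np

# Toy data; sizes and the random views are assumptions for illustration.
rng = np.random.default_rng(1)
n, dx, dy = 200, 3, 4
X = rng.standard_normal((n, dx))
Y = rng.standard_normal((n, dy))

Cxx, Cyy, Cxy = X.T @ X, Y.T @ Y, X.T @ Y

def inv_sqrt(C):
    """Inverse matrix square root of a symmetric positive definite matrix."""
    vals, vecs = np.linalg.eigh(C)
    return vecs @ np.diag(vals ** -0.5) @ vecs.T

# Whitened cross-covariance: its singular values are the canonical
# correlations, each in [0, 1]; the generalized eigenvalues come in
# +/- pairs, matching the 2d-eigenvalues observation above.
M = inv_sqrt(Cxx) @ Cxy @ inv_sqrt(Cyy)
corrs = np.linalg.svd(M, compute_uv=False)
print(corrs)
```

Checking the QCQP's output against these correlations would be one concrete way to test whether the two methods agree.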
