
In sparse optimization, I am trying to solve the problem $$ \min_{x\in \mathbb R^{n}} \quad f(x) + \|x\|_1 $$ and at optimality, $x^*$ may be sparse. If I define the sparse manifold as $\mathcal M = \{z : z_i = 0 \text{ whenever } x_i^* = 0\}$, there exist methods which, for certain choices of $f$, can identify the sparse manifold: although $x^{(k)} \to x^*$ and there exists no finite $k_0$ for which $x^{(k)} = x^*$, there does exist a finite $k_0$ such that, for all $k > k_0$, $x^{(k)} \in \mathcal M$.
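To make this concrete, here is a minimal toy sketch of the phenomenon (the data $b$, regularization weight, and step size are all made up for illustration): running ISTA (proximal gradient with soft-thresholding) on $\min_x \tfrac12\|x-b\|^2 + \lambda\|x\|_1$, the iterates only converge to $x^*$ asymptotically, yet their support — i.e. membership in $\mathcal M$ — stabilizes after finitely many iterations.

```python
import numpy as np

# Toy instance (made up for illustration): f(x) = 0.5*||x - b||^2,
# regularizer lam*||x||_1, solved by ISTA / proximal gradient.
b = np.array([3.0, -1.5, 0.2, 0.0, 0.0])
lam = 0.5
t = 0.5  # step size

def soft_threshold(v, tau):
    """Prox of tau*||.||_1: shrink each entry toward zero by tau."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(5)
supports = []
for k in range(100):
    x = soft_threshold(x - t * (x - b), t * lam)  # one ISTA step
    supports.append(np.flatnonzero(x).tolist())   # track manifold membership

print(supports[-1])  # support of the final iterate
print(x[:2])         # nonzero entries, converging to (2.5, -1.0)
```

Here the support $\{0, 1\}$ is identified after the very first iteration, while $x^{(k)}$ itself only approaches $x^* = (2.5, -1.0, 0, 0, 0)$ geometrically.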

Now, is it possible to extend this to low-rank matrices? Specifically, I am now trying to solve some problem $$ \min_{X\in \mathbb R^{m\times n}} \quad f(X) + \|X\|_* $$ where $\|X\|_*$ is the sum of the singular values of $X$, and at optimality, $\text{rank}(X^*) = r < \min\{m,n\}$. Then I define the low-rank manifold as $$ \mathcal M = \left\{\sum_{i=1}^r s_i u_iv_i^T: s_i \in \mathbb R\right\} $$ where $X^* = \sum_{i=1}^r \sigma_i u_i v_i^T$ is the "economy" SVD of $X^*$.

Are there any existing methods in which $X^{(k)} \to X^*$, and for some finite $k_0$, for all $k \geq k_0$, $X^{(k)}\in \mathcal M$?

Y. S.

1 Answer


Yes. You can see results like this in the thesis of Jingwei Liang titled "Convergence Rates of First-Order Operator Splitting Methods" where they are called finite identification results (ch. 6, 7, 8).

The nuclear norm is a partly smooth function with respect to the manifold you've identified; the thesis provides several manifold identification results for different algorithms applied to problems involving this class of functions.
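As a toy illustration of such finite rank identification (the target matrix, weight, and step size below are assumptions for the sketch, not taken from the thesis): proximal gradient on $\min_X \tfrac12\|X-M\|_F^2 + \lambda\|X\|_*$, whose prox step is singular value thresholding (SVT), produces iterates whose rank stabilizes at $\text{rank}(X^*)$ after finitely many steps, even though $X^{(k)} \neq X^*$ for all $k$.

```python
import numpy as np

# Toy instance (made up for illustration): rank-2 target with singular
# values 3 and 1.5, nuclear-norm weight lam, proximal-gradient step size t.
M = np.zeros((8, 6))
M[0, 0], M[1, 1] = 3.0, 1.5
lam = 0.3
t = 0.5

def svt(Y, tau):
    """Prox of tau*||.||_*: soft-threshold the singular values of Y."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

X = np.zeros((8, 6))
ranks = []
for k in range(100):
    X = svt(X - t * (X - M), t * lam)  # one proximal-gradient step
    ranks.append(int(np.linalg.matrix_rank(X, tol=1e-10)))

print(ranks[-1])  # rank of the final iterate: 2 = rank of X*
```

The rank hits 2 on the first iteration and stays there, while the leading singular value of $X^{(k)}$ only converges to its limit $2.7$ geometrically — the matrix analogue of support identification in the sparse case.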

TSF
  • Thanks, that is actually a great line of work. I believe you are right, and I suspect their follow-up work contains some similar goodies as well. – Y. S. Feb 21 '22 at 16:16