Questions tagged [convex-optimization]
38 questions
10
votes
1 answer
Ideas on Matrix Factorization / Transformations for $ {L}_{1} $ Minimization
I am starting with a typical $\ell_1$ basis pursuit problem:
$$
\min_{\mathbf{x}} \Vert \mathbf{x} \Vert_1 \quad \mathrm{s.t.} \quad \Vert \mathbf{ERx} - \mathbf{y} \Vert_2 \leq \epsilon,
$$
where $\mathbf{R}\in\mathbb{C}^{M \times P}$, and…
AnonSubmitter85
- 709
- 4
- 11
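A minimal CVXPY sketch of this constrained formulation (not the asker's matrix-factorization idea) is below; the sizes, the random stand-ins for $\mathbf{E}$, $\mathbf{R}$, $\mathbf{y}$, and the value of $\epsilon$ are all assumptions, and the real problem is complex-valued, which CVXPY also supports via `cp.Variable(P, complex=True)`.

```python
import numpy as np
import cvxpy as cp

K, M, P = 40, 30, 80                      # made-up sizes: E is K x M, R is M x P
rng = np.random.default_rng(0)
E = rng.standard_normal((K, M))           # stand-in for E
R = rng.standard_normal((M, P))           # stand-in for R (real here for brevity)
y = rng.standard_normal(K)                # stand-in measurements
eps = 1.0

x = cp.Variable(P)
prob = cp.Problem(cp.Minimize(cp.norm1(x)),
                  [cp.norm2(E @ R @ x - y) <= eps])
prob.solve()
print(prob.status, np.count_nonzero(np.abs(x.value) > 1e-6))
```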
9
votes
1 answer
How Can I Use MATLAB to Solve a Total Variation Denoising / Deblurring Problem?
The Total Variation Denoising Problem is given by:
$$ \arg \min_{x} \frac{1}{2} {\left\| A x - y \right\|}_{2}^{2} + \lambda \operatorname{TV} \left( x \right) $$
where $ \operatorname{TV} \left( \cdot \right) $ is the Total Variation norm.
How…
CEB12
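The question asks for MATLAB; as an equivalent illustration only, here is the same objective in Python/CVXPY (CVX's Python counterpart) for a 1-D signal, with a synthetic step signal, $A = I$ (pure denoising), and an arbitrary $\lambda$, all of which are assumptions.

```python
import numpy as np
import cvxpy as cp

n = 200
rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0, -0.5, 0.5], n // 4)   # piecewise-constant signal
y = clean + 0.2 * rng.standard_normal(n)           # noisy observation
A = np.eye(n)                                      # identity -> denoising; use a blur matrix for deblurring
lam = 1.0

x = cp.Variable(n)
objective = 0.5 * cp.sum_squares(A @ x - y) + lam * cp.tv(x)
cp.Problem(cp.Minimize(objective)).solve()
x_hat = x.value                                    # denoised estimate
```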
8
votes
4 answers
Solving Convex Optimization Problem Used for High Quality Denoising
The highest voted answer to this question suggests that to denoise a signal while preserving sharp transitions one should
minimize the objective function:
$$ |x-y|^2 + b|f(y)| $$
where $x$ is the noisy signal, $y$ is the denoised signal, $b$ is…
John Robertson
- 1,082
- 1
- 8
- 13
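The function $f$ is not shown in the truncated excerpt; assuming, purely for illustration, that $f(y)$ is a first-order difference of $y$ (which turns the objective into a 1-D total-variation model), a CVXPY sketch looks like this:

```python
import numpy as np
import cvxpy as cp

n = 150
rng = np.random.default_rng(2)
x_noisy = np.repeat([0.0, 1.0, 0.3], n // 3) + 0.1 * rng.standard_normal(n)
b = 0.5                                   # made-up regularization weight

y = cp.Variable(n)                        # the denoised signal, in the excerpt's notation
objective = cp.sum_squares(y - x_noisy) + b * cp.norm1(cp.diff(y))
cp.Problem(cp.Minimize(objective)).solve()
y_hat = y.value
```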
7
votes
1 answer
Least Angle Regression (LARS) without Matrix Inversion
Sorry if this is too damned long. I did what I could to abbreviate it.
The question is about Least Angle Regression (LARS).
I'm new to numerical work with matrices.
I believe I have a way to compute Least Angle Regression without explicit matrix inversion. I'm…
MackTuesday
- 433
- 2
- 7
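The asker's inversion-free derivation isn't shown in the excerpt; for a reference point, a standard LARS fit is available in scikit-learn, used here on synthetic data:

```python
import numpy as np
from sklearn.linear_model import Lars

rng = np.random.default_rng(3)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = [2.0, -1.5, 1.0]                       # sparse ground truth
y = X @ beta_true + 0.05 * rng.standard_normal(100)

model = Lars(n_nonzero_coefs=3).fit(X, y)              # stop after 3 active variables
print(np.flatnonzero(model.coef_))                     # indices of selected coefficients
```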
5
votes
1 answer
Solving a Weighted Basis Pursuit Denoising Problem (BPDN) with MATLAB / CVX
Following up on an answer by @Royi about adding weights to the BPDN problem, I would like to use CVX to test this approach. How can we formulate in CVX the regularized LS $\ell_1$-norm objective with weights given by a vector $c$, as follows:
$$ \arg…
bla
- 576
- 1
- 3
- 14
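Since the excerpt's formula is cut off, the exact form below, $\frac{1}{2}\|Ax-y\|_2^2 + \lambda \| c \circ x \|_1$, is an assumption; the question targets MATLAB/CVX, so this CVXPY sketch with placeholder data is only an equivalent illustration.

```python
import numpy as np
import cvxpy as cp

m, n = 40, 80
rng = np.random.default_rng(4)
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)
c = rng.uniform(0.5, 2.0, n)              # per-coefficient weights
lam = 0.1

x = cp.Variable(n)
objective = 0.5 * cp.sum_squares(A @ x - y) + lam * cp.norm1(cp.multiply(c, x))
cp.Problem(cp.Minimize(objective)).solve()
```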
5
votes
2 answers
Quadratic Programming with Linear Equality Constraints
I need to solve an equality constrained minimization problem as given below:
$$\min_{\textbf{w}} \mathbf{w}^TR\mathbf{w} $$
such that
$$X\mathbf{w} = \mathbf{1}$$
where $R\in \mathbb{R}^{n\times n}$ is a covariance matrix (hence positive…
user5045
- 191
- 1
- 1
- 11
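Because the objective is quadratic and the constraints are linear equalities, the KKT conditions give a single linear system; a numpy sketch with random stand-ins for $R$ and $X$:

```python
import numpy as np

n, m = 6, 2
rng = np.random.default_rng(5)
B = rng.standard_normal((n, n))
R = B @ B.T + 1e-3 * np.eye(n)            # positive-definite stand-in covariance
X = rng.standard_normal((m, n))           # constraint matrix, X w = 1

# KKT system: [2R  X^T] [w ]   [0]
#             [X    0 ] [nu] = [1]
KKT = np.block([[2 * R, X.T],
                [X, np.zeros((m, m))]])
rhs = np.concatenate([np.zeros(n), np.ones(m)])
w = np.linalg.solve(KKT, rhs)[:n]
print(np.allclose(X @ w, 1.0))            # constraint satisfied
```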
5
votes
1 answer
Convex Optimization with $ {L}_{1, 2} $ Regularization Term
I have an optimization problem as follows:
$$\underset{X}{\operatorname{argmin}}\sum _s \left \| T_sX_{:,s} - Y_{:,s} \right \|^2_2 +\lambda\left \| GX \right \|_{2,1} \tag{1}$$
I have introduced a new…
strahd
- 99
- 5
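A CVXPY sketch of problem (1): the per-column quadratic data terms plus the $\ell_{2,1}$ penalty written as a sum of row $\ell_2$ norms of $GX$; all sizes and the random $T_s$, $Y$, $G$ are assumptions.

```python
import numpy as np
import cvxpy as cp

n, S, m = 20, 5, 15                       # X is n x S, each T_s is m x n
rng = np.random.default_rng(6)
T = [rng.standard_normal((m, n)) for _ in range(S)]
Y = rng.standard_normal((m, S))
G = rng.standard_normal((n - 1, n))       # stand-in for the operator G
lam = 0.1

X = cp.Variable((n, S))
data = sum(cp.sum_squares(T[s] @ X[:, s] - Y[:, s]) for s in range(S))
reg = cp.sum(cp.norm(G @ X, 2, axis=1))   # the l_{2,1} norm of G X
cp.Problem(cp.Minimize(data + lam * reg)).solve()
```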
5
votes
1 answer
How to Formulate a Constraint Which Ensures All Variables Have the Same Sign
I'm trying to include a constraint in my problem (to be solved by any convex optimization solver). Let {a, b, c, d, ...} be a finite set of continuous variables. How can I formulate a constraint which ensures that all of these variables, at the same time…
Mohan Lal
- 51
- 1
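Worth noting: the same-sign set is the union of two orthants and is not convex, so any exact formulation leaves the purely convex setting. A common workaround (an assumption here, not taken from the excerpt) is a big-M constraint driven by one binary variable, which CVXPY can express for mixed-integer-capable solvers:

```python
import cvxpy as cp

n = 4
M = 100.0                        # must upper-bound the magnitude of the variables
a = cp.Variable(n)
z = cp.Variable(boolean=True)    # z = 1 -> all nonnegative, z = 0 -> all nonpositive

same_sign = [a <= M * z, a >= -M * (1 - z)]
# ... add the objective and remaining constraints of the actual model, then solve
# with a mixed-integer-capable solver, e.g.:
# cp.Problem(objective, same_sign + other_constraints).solve(solver=cp.GLPK_MI)
```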
4
votes
1 answer
Adding Variance \ Weights Information When Solving a Basis Pursuit Denoising Problem (BPDN)
Having a "measured" vector $\mathbf{y}$ with its statistics (counts or variance per element), one can use weighted least squares approach to solve the linear system $$\mathbf{A}\mathbf{x} = \mathbf{y}$$ by minimizing $$(\mathbf{y} -…
bla
- 576
- 1
- 3
- 14
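The truncated formula isn't shown, so the combination below, an inverse-variance-weighted data term plus an $\ell_1$ penalty, is an assumption, and $A$, $y$, the per-element variances, and $\lambda$ are synthetic:

```python
import numpy as np
import cvxpy as cp

m, n = 50, 100
rng = np.random.default_rng(7)
A = rng.standard_normal((m, n))
y = rng.standard_normal(m)
var = rng.uniform(0.5, 2.0, m)            # per-element measurement variance
lam = 0.1

x = cp.Variable(n)
r = cp.multiply(1.0 / np.sqrt(var), A @ x - y)   # whitened residual, i.e. W = diag(1/var)
cp.Problem(cp.Minimize(cp.sum_squares(r) + lam * cp.norm1(x))).solve()
```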
4
votes
1 answer
How Could One Accelerate the Convergence of the Least Mean Squares (LMS) Filter?
How can the convergence of an LMS filter be accelerated?
Can we do better than the vanilla algorithm?
Thomas
- 626
- 1
- 1
- 14
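One standard acceleration, and only one of several possible answers, is the normalized LMS (NLMS), which divides the step size by the instantaneous input power; a numpy sketch identifying a synthetic FIR system:

```python
import numpy as np

rng = np.random.default_rng(8)
n_taps, n_samples = 8, 2000
h_true = rng.standard_normal(n_taps)               # unknown FIR system to identify
x = rng.standard_normal(n_samples)
d = np.convolve(x, h_true)[:n_samples] + 0.01 * rng.standard_normal(n_samples)

w = np.zeros(n_taps)
mu, delta = 0.5, 1e-8                              # step size and regularizer
for k in range(n_taps - 1, n_samples):
    u = x[k - n_taps + 1:k + 1][::-1]              # current input vector [x_k, ..., x_{k-n_taps+1}]
    e = d[k] - w @ u                               # a-priori error
    w += (mu / (delta + u @ u)) * e * u            # NLMS: power-normalized LMS step
print(np.round(w - h_true, 3))                     # should be close to zero
```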
4
votes
1 answer
Converting Hadamard Product into Matrix Multiplication in Image Deconvolution with Total Variation (TV) Using ADMM
I would like to solve the following image deconvolution problem with ADMM:
$$ \min_{x} \frac{1}{2} \Vert C x - b \Vert_2^2 + \Vert w \circ (D x) \Vert_1 \tag{1} $$
where $x$ is a vector of unknown pixel values, $b$ is the measurement vector, and $C$ is the point…
Sushi man in Japan
- 95
- 6
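The conversion itself rests on the identity $w \circ (Dx) = \operatorname{diag}(w)\,D\,x$, so the weighted term can be absorbed into the single matrix $\operatorname{diag}(w)D$; a quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(9)
n = 6
D = rng.standard_normal((n, n))           # stand-in difference operator
w = rng.standard_normal(n)
x = rng.standard_normal(n)

lhs = w * (D @ x)                         # Hadamard product w o (D x)
rhs = np.diag(w) @ D @ x                  # ordinary matrix product (diag(w) D) x
print(np.allclose(lhs, rhs))              # True, so ||w o (Dx)||_1 = ||diag(w) D x||_1
```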
4
votes
1 answer
Is Sum of Absolute Value / $ {L}_{1} $ Norm of Differences Convex?
I'm not sure how to approach this exercise.
One idea is to differentiate it with respect to $z$, show that there is a minimum at $z = f_k$, and then show that for each value to the right and to the left of it the loss function is positive, which will prove that it…
Ilya.K.
- 165
- 4
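The exact loss is cut off in the excerpt; assuming it has the form $g(z) = \sum_k \lvert z - f_k \rvert$, convexity follows without any derivative argument, since each $\lvert z - f_k \rvert$ is the absolute value composed with an affine map and a nonnegative sum of convex functions is convex. A numerical midpoint spot-check:

```python
import numpy as np

rng = np.random.default_rng(10)
f = rng.standard_normal(5)
g = lambda z: np.sum(np.abs(z - f))          # assumed form of the loss

z1, z2 = rng.standard_normal(2)
mid = 0.5 * (z1 + z2)
print(g(mid) <= 0.5 * g(z1) + 0.5 * g(z2))   # True: convexity at the midpoint
```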
4
votes
2 answers
Justification for Squared $ {L}_{2} $ Data and Smoothness Term as an Error Bound
Often in variational methods (and not only there) we have an energy of the form:
$$E(u) = \frac{1}{2}\|f-u\|^2_2 + \frac{\alpha}{2}\|\psi(u)\|^2_2,$$
where the first term is referred to as the data term, and the second as the smoothness term. I…
lightxbulb
- 165
- 4
4
votes
1 answer
Solving LASSO (Basis Pursuit Denoising Form) with LARS
I'm now working on using the LARS (Least Angle Regression) algorithm to solve a LASSO problem in Basis Pursuit Denoising form:
$$ \arg \min_{\beta} {\left\| y - X\beta \right\|}_{2}^{2} + \lambda {\left\| \beta \right\|}_{1} \quad …$$
queuer
- 43
- 4
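For comparison with a hand-rolled implementation, scikit-learn's `LassoLars` traces the LARS path for exactly this $\lambda$-penalized form (note that scikit-learn scales the squared-error term by $1/(2 n_{\text{samples}})$); the data here is synthetic:

```python
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(11)
X = rng.standard_normal((100, 30))
beta_true = np.zeros(30)
beta_true[:4] = [1.0, -2.0, 0.5, 1.5]                  # sparse ground truth
y = X @ beta_true + 0.05 * rng.standard_normal(100)

model = LassoLars(alpha=0.01).fit(X, y)                # alpha plays the role of lambda
print(np.flatnonzero(model.coef_))                     # recovered support
```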
4
votes
1 answer
How to Solve a TV Minimization Problem with ADMM for Different Sizes of $A$ and $x$ in $Ax=b$
I have a matrix $A$ that is an $(M \times M)$ square matrix, $x$ is an $(M \times N)$ matrix, and $b$ is an $(M \times N)$ matrix. Knowing $A$ and $b$, I would like to obtain $x$ from the equation $Ax=b$. $N = p \times q$, so consider $x$ as an $M$ number of $p \times…
johanson
- 65
- 5
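The ADMM splitting for the TV term is not shown here, but the $Ax = b$ part with a matrix-valued $x$ typically reduces to a linear solve with a matrix right-hand side, which numpy handles column by column in a single call; $A$ and $b$ below are placeholders:

```python
import numpy as np

M, N = 8, 12
rng = np.random.default_rng(12)
A = rng.standard_normal((M, M)) + M * np.eye(M)        # well-conditioned stand-in for A
B = rng.standard_normal((M, N))                        # stand-in for b (M x N)

X = np.linalg.solve(A, B)                              # solves A X = B for all N columns at once
print(np.allclose(A @ X, B))                           # True
```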