Note: this question is about a common data science problem, but I am solving it with a specific piece of software. I believe the problem is common enough that the principles will carry across solvers, but I understand this question may be deemed too dependent on this particular piece of software to be relevant to Cross Validated.
I am using OSQP to solve a QP problem. When a solution is found, the solver outputs two vectors:

x: the primal solution
y: the dual solution
The solver minimizes the usual QP objective $\tfrac{1}{2}x^T P x + q^T x$, subject to the two-sided inequality constraints $l \le Ax \le u$.
I want to know which constraints have the most influence on the solution. My understanding is that the dual solution is the vector of Lagrange multipliers, which can be interpreted as the sensitivity of the optimal objective value to each constraint. It is not clear to me whether the sensitivity refers to the lower or the upper bound, though. I am assuming that negative dual values refer to lower bounds and positive values to upper bounds.
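To sanity-check the sensitivity interpretation independently of any solver, here is a toy demonstration I put together (my own illustration, not OSQP code): for an equality-constrained QP the KKT system can be solved directly, and by the envelope theorem the derivative of the optimal objective with respect to the constraint right-hand side equals minus the multiplier, under the sign convention used below. Sign conventions vary between solvers, which is part of what I am asking about.

```python
import numpy as np

# Tiny equality-constrained QP: minimize 1/2 x'Px + q'x  s.t.  a'x = b.
# KKT system (with Lagrangian L = f(x) + y*(a'x - b)):
#   [P  a] [x]   [-q]
#   [a' 0] [y] = [ b]
P = np.array([[4.0, 1.0], [1.0, 2.0]])
q = np.array([1.0, 1.0])
a = np.array([1.0, 1.0])

def solve(b):
    K = np.block([[P, a[:, None]], [a[None, :], np.zeros((1, 1))]])
    sol = np.linalg.solve(K, np.concatenate([-q, [b]]))
    x, y = sol[:2], sol[2]
    obj = 0.5 * x @ P @ x + q @ x
    return x, y, obj

x0, y0, f0 = solve(1.0)
eps = 1e-6
_, _, f1 = solve(1.0 + eps)

# Envelope theorem: d(optimal objective)/db = dL/db = -y,
# so the finite-difference slope should match -y0 (both ~= 2.75 here).
print((f1 - f0) / eps, -y0)
```

This confirms the sensitivity reading of the multiplier for an equality constraint; my question is how it extends to OSQP's two-sided bounds.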
So as an experiment I found an element of the primal solution that was at the lower bound of a simple constraint, and that constraint had a Lagrange multiplier of -1.39. I then relaxed the lower bound of that constraint and ran the solver again. The optimal solution remained unchanged, and this particular constraint now has a multiplier of -0.6. If the optimum is sensitive to this binding constraint, why didn't it move when I relaxed the constraint? Is my interpretation of the dual correct? Is there something missing from my understanding? Or is this question too dependent on this particular software to be answered here?