It does not make sense to use reduced-rank regression with a binary dependent variable.
Reduced-rank regression is ordinary regression with a rank constraint on the coefficient matrix. It only makes sense for multivariate regression, i.e. regression with multiple response variables. If there are $p$ predictor variables and $q$ response variables, then the matrix of regression coefficients is $p\times q$, and a rank constraint on it can be non-trivial. If there is only a single response variable, then the "matrix" of regression coefficients is just a $p\times 1$ vector, which already has rank at most $1$, so its rank cannot be constrained any further. Compare:
$$\mathbf Y = \mathbf X \mathbf B + \epsilon,\quad \mathbf B\in\mathbb R^{p\times q}$$
$$\mathbf y = \mathbf X \boldsymbol \beta + \epsilon,\quad \boldsymbol\beta\in\mathbb R^p$$
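To make the multivariate case concrete, here is a minimal sketch of the classical reduced-rank estimator: fit unconstrained OLS, then project onto the leading singular subspace of the fitted values. All data below are synthetic, and the variable names are my own:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 100, 6, 4, 2  # samples, predictors, responses, target rank

X = rng.normal(size=(n, p))
B_true = rng.normal(size=(p, r)) @ rng.normal(size=(r, q))  # rank-r coefficients
Y = X @ B_true + 0.1 * rng.normal(size=(n, q))

# Unconstrained OLS solution (full rank min(p, q) in general)
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Project the fitted values onto their top-r right singular subspace;
# this yields the rank-r reduced-rank estimate
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
V = Vt[:r].T
B_rrr = B_ols @ V @ (V.T)  # rank <= r by construction
```

With $q=1$ this projection step is vacuous: the only non-trivial choice is rank $1$, which returns $\boldsymbol\beta$ unchanged.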
If your outcome variable is binary, it can be coded as a sequence of $0$s and $1$s, making the response one-dimensional. So RRR cannot add anything here; you would just be running ordinary regression.
(Which can be fine: for two classes, least-squares regression on a $0/1$-coded response yields a coefficient vector proportional to the linear discriminant analysis (LDA) direction. You just need to make sure that you are not overfitting, and use regularization if needed.)
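The equivalence is easy to check numerically: the OLS and LDA coefficient vectors agree up to scaling, so their cosine similarity is (up to sign) $1$. A quick sketch on synthetic data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LinearRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Toy two-class data (purely illustrative)
X, y = make_classification(n_samples=200, n_features=5, n_informative=5,
                           n_redundant=0, random_state=0)

beta_ols = LinearRegression().fit(X, y).coef_          # OLS on 0/1 coding
beta_lda = LinearDiscriminantAnalysis().fit(X, y).coef_.ravel()

# The two coefficient vectors are parallel (same direction up to scale)
cos = beta_ols @ beta_lda / (np.linalg.norm(beta_ols) * np.linalg.norm(beta_lda))
```

Only the direction matters for classification; the scale and intercept differ between the two fits.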
Regarding alternatives: nowadays one of the most standard approaches in your situation would be logistic regression regularized with an elastic net penalty.
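In scikit-learn this is a one-liner; a minimal sketch on synthetic data (the `C` and `l1_ratio` values are placeholders, not recommendations):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy binary-classification data (illustrative only)
X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           random_state=0)

# Elastic net mixes L1 and L2 penalties; l1_ratio=0 is pure ridge,
# l1_ratio=1 is pure lasso. The 'saga' solver is required for this penalty.
clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=5000)
clf.fit(X, y)
```

In practice you would tune `C` and `l1_ratio` by cross-validation, e.g. with `LogisticRegressionCV`.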