Yes, logistic regression is capable of learning a rule of the form "if any of the input variables are 1, then predict success." Such a model might look something like:
$$\hat{p} = \sigma(-10 + 20x_1 + 20x_2 + 20x_3)$$
where $\sigma$ is the sigmoid function. The 10 is arbitrary and chosen for concreteness: all that matters is that the intercept is a large negative number and the coefficients are positive numbers large enough to overwhelm it. When all three IVs are zero, this is $\sigma(-10) = \epsilon$ (where $\epsilon$ is a very small number), and if even one IV is 1 then the prediction is at least $\sigma(10) = 1 - \epsilon$. If two or more IVs are 1, it simply gets closer to 1. Because the decision threshold for logistic regression is usually 0.5, every observation's DV class will be predicted correctly.
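You can verify this arithmetic directly. Here is a minimal sketch in Python/NumPy that plugs the hand-picked parameters above into the sigmoid and prints the predicted probability for all eight input combinations:

```python
import numpy as np
from itertools import product

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# All 8 combinations of three binary inputs
X = np.array(list(product([0, 1], repeat=3)))

# Hand-picked parameters from the formula above
intercept, coef = -10.0, np.array([20.0, 20.0, 20.0])
p_hat = sigmoid(intercept + X @ coef)

for x, p in zip(X, p_hat):
    print(x, f"{p:.6f}")  # ~0 for (0,0,0), ~1 for everything else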
Notice that this rule is of the form $x_1=1 \lor x_2=1 \lor x_3=1$; in other words, it is an "OR" expression. Logistic regression can also learn "AND" expressions. Interestingly enough, however, it cannot learn "XOR" rules unless you provide interaction terms in the design matrix, and in general it cannot learn arbitrary logical expressions. Compare this to, say, a decision tree, which can. This limitation was one of the early motivations for exploring the multilayer perceptrons that eventually led to deep learning. But I digress.
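To see the XOR failure concretely, here's a quick check using scikit-learn (assumed available; the large `C` just weakens the default regularization so the model can fit the separable case freely):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# 2-input XOR truth table
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])

# Plain logistic regression: no linear boundary separates XOR,
# so it cannot classify all four points correctly.
plain = LogisticRegression(C=1e6, max_iter=10_000).fit(X, y)
print("without interaction:", plain.score(X, y))  # < 1.0

# Adding the interaction column x1*x2 makes the classes separable.
X_int = np.column_stack([X, X[:, 0] * X[:, 1]])
with_int = LogisticRegression(C=1e6, max_iter=10_000).fit(X_int, y)
print("with interaction:   ", with_int.score(X_int, y))  # 1.0
```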
If the rule is true for literally every row in your data set, we say the DV classes are "linearly separable." Try to fit a vanilla logistic regression to linearly separable data with your favorite statistical software and the parameters will not converge: the fit keeps getting better as the parameters get arbitrarily bigger, so it will "blow up." That is to say, eventually the program will hit a maximum number of iterations or the limits of its floating-point representation and quit with an error message. These numerical issues can easily be fixed by adding a regularization term to the model. Regularization is widely supported and is typically as easy as checking a box or passing an optional argument into a function.
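As a rough illustration of both the blow-up and the fix, you can fit the separable "OR" data twice in scikit-learn (whose logistic regression is L2-regularized by default, with `C` the inverse penalty strength) and compare the coefficient sizes; exact numbers will vary by solver and version, but the unregularized-ish fit produces far larger coefficients:

```python
import numpy as np
from itertools import product
from sklearn.linear_model import LogisticRegression

# Linearly separable "OR" data: y = 1 if any input is 1
X = np.array(list(product([0, 1], repeat=3)))
y = (X.sum(axis=1) > 0).astype(int)

# Nearly unregularized: the likelihood keeps improving as the
# coefficients grow, so the optimizer chases ever-larger values.
weak = LogisticRegression(C=1e12, max_iter=10_000).fit(X, y)
print("C=1e12 coefs:", weak.coef_.round(1))

# Default L2 penalty keeps the coefficients finite and modest.
reg = LogisticRegression(C=1.0).fit(X, y)
print("C=1.0  coefs:", reg.coef_.round(2))
```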