Based on your phrasing, it seems you are equating thermodynamic entropy with information entropy. The concepts are related, but you have to be careful because they are used differently in the two fields.
Shannon entropy measures unpredictability. You are correct that entropy is maximized when the outcome is most uncertain: an unbiased coin has maximum entropy (among coins), while a coin that comes up Heads with probability 0.9 has less entropy. Contrary to your next statement, however, maximum entropy means maximum information content.
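If it helps to see the numbers, here is a minimal sketch (the `shannon_entropy` helper is just illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin:   1.0 bit (the maximum for two outcomes)
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.469 bits
```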
Suppose we flip a coin 20 times. If the coin is unbiased, the sequence might look like this:
TTHTHHTTHHTHHTHTHTTH
If the coin comes up Heads with probability 0.9, it might look more like this:
HHHHHHHHHHTHHHHHHTHH
The second signal contains less information. Suppose we encode it using run-length encoding, like this:
10T6T2
which we interpret as "10 heads, then a tail, then 6 heads, then a tail, then 2 heads". Compare this to the same encoding method applied to the first signal:
TT1T2TT2T2T1T1TT1
We can't compress the signal from the maximum entropy coin as much, because it contains more information.
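As a rough sanity check, here is a minimal sketch of the run-length scheme described above (the `rle_heads` helper is just for illustration, not a standard codec):

```python
def rle_heads(seq):
    """Encode a coin-flip string: write the length of each run of H's, and a literal 'T' for each tail."""
    out = []
    run = 0
    for c in seq:
        if c == "H":
            run += 1
        else:  # a tail ends the current run of heads
            if run:
                out.append(str(run))
                run = 0
            out.append("T")
    if run:
        out.append(str(run))
    return "".join(out)

fair   = "TTHTHHTTHHTHHTHTHTTH"
biased = "HHHHHHHHHHTHHHHHHTHH"
print(rle_heads(biased), len(rle_heads(biased)))  # 10T6T2            -> 6 characters
print(rle_heads(fair),   len(rle_heads(fair)))    # TT1T2TT2T2T1T1TT1 -> 17 characters
```

The biased sequence shrinks from 20 characters to 6, while the near-maximum-entropy sequence barely compresses at all.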
As for your specific questions:
A thermodynamic system in equilibrium has maximum entropy in the sense that its microstate is maximally uncertain given its macrostate (e.g. its temperature, pressure, etc.). From our perspective as observers, this means we know less about the microstate than we did before the system reached equilibrium. But the system in equilibrium contains more information, precisely because its microstate is maximally unpredictable. The quantity that has decreased is the mutual information between the macrostate and the microstate; this is the sense in which we "lose (mutual) information" when entropy increases. The loss is relative to the observer.
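To make "mutual information between the macrostate and the microstate" concrete, here is a toy sketch: 4 fair coins stand in for the system, the exact H/T pattern is the microstate, and the head count is the macrostate. This is purely illustrative, not a physical model.

```python
from itertools import product
from math import log2
from collections import Counter

# Toy "system" of 4 fair coins: the microstate is the exact H/T pattern,
# the macrostate is just the number of heads.
microstates = list(product("HT", repeat=4))          # 16 equally likely microstates
macro = Counter(m.count("H") for m in microstates)   # multiplicity of each macrostate

H_micro = log2(len(microstates))                                        # 4 bits
# Within each macrostate the microstates are uniform, so
# H(micro | macro) = sum_k P(k) * log2(multiplicity of k).
H_micro_given_macro = sum((n / 16) * log2(n) for n in macro.values())   # ~1.97 bits
mutual_info = H_micro - H_micro_given_macro                             # ~2.03 bits

print(H_micro, H_micro_given_macro, mutual_info)
```

In this toy, knowing the macrostate accounts for about 2 of the 4 bits of the microstate; the remaining conditional entropy is the part the observer is missing.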
As long as the process is random, each new symbol adds information to the sequence. The symbols are random variables, so each one has a distribution for which we can calculate entropy. The information content of the whole sequence is measured by the joint entropy, and for independent symbols the joint entropy is just the sum of the per-symbol entropies, so it grows with every new flip.
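A quick sketch of that additivity for independent flips of the biased coin (names and the choice of 5 flips are just illustrative):

```python
from math import log2
from itertools import product

def entropy(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

p_heads = 0.9
per_symbol = entropy([p_heads, 1 - p_heads])   # ~0.469 bits per flip

# Joint entropy of n independent flips, computed from the full joint distribution.
n = 5
joint = [
    (p_heads ** seq.count("H")) * ((1 - p_heads) ** seq.count("T"))
    for seq in product("HT", repeat=n)
]
print(entropy(joint))  # ~2.345 bits
print(n * per_symbol)  # the same: each new flip adds H(X) bits of joint entropy
```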