Each LSTM cell has four weight components: the candidate cell state (sometimes called the cell input), the input gate, the forget gate, and the output gate. Each component has a weight for every input from the previous layer, plus a weight for every recurrent input from the layer's own output at the previous time step. So if there are $n_i$ cells in an LSTM layer and $n_{i-1}$ units in the layer before it, each component of each cell has $n_{i-1} + n_i$ inputs. With four components, that gives $4(n_{i-1} + n_i)$ weights per cell, and with $n_i$ cells, $4 n_i (n_{i-1} + n_i)$ weights for the layer. (Each component also has a bias, adding another $4 n_i$ parameters per layer, which is negligible at these sizes.)
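As a quick sketch, the per-layer count can be written as a small helper (the function name is mine; bias terms are ignored to match the formula above):

```python
def lstm_layer_weights(n_prev: int, n_cells: int) -> int:
    # Four components (candidate cell state, input/forget/output gates),
    # each with n_prev feed-forward + n_cells recurrent inputs per cell.
    return 4 * n_cells * (n_prev + n_cells)

# A layer of 1024 cells fed by 39 inputs:
print(lstm_layer_weights(39, 1024))  # 4 * 1024 * (39 + 1024) = 4354048
```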
In your first example, we have $n_0 = 39$, $n_1 = n_2 = n_3 = 1024$, and $n_4 = 34$. So the overall number of weights is about 21M.
In the second example we have $n_0 = 205$, $n_1 = \dots = n_5 = 700$, and $n_6 = 205$. The same calculation gives a total of about 18M weights.
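To check the arithmetic, here is a quick tally of both networks (a sketch under my assumptions: the final layer is a plain dense projection with $n_{i-1} \times n_i$ weights, and biases are ignored as above):

```python
def lstm_layer_weights(n_prev, n_cells):
    # Four gate/cell components, each with feed-forward + recurrent inputs.
    return 4 * n_cells * (n_prev + n_cells)

def total_weights(sizes):
    # sizes = [inputs, lstm_1, ..., lstm_N, outputs];
    # the last transition is treated as a dense output layer.
    *lstm_sizes, n_out = sizes
    total = sum(lstm_layer_weights(a, b)
                for a, b in zip(lstm_sizes, lstm_sizes[1:]))
    return total + lstm_sizes[-1] * n_out

print(total_weights([39, 1024, 1024, 1024, 34]))           # 21166080, ~21M
print(total_weights([205, 700, 700, 700, 700, 700, 205]))  # 18357500, ~18M
```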