Note: for the proof below to work you need to assume that the function $g$ is monotonic (and note that for non-monotonic transformations the argument may not go through).
Proof using the chain rule
Let's consider, for simplicity, the likelihood function as a function of a single parameter:
$$\mathcal{L}(\theta \vert x_1,x_2, \dots, x_n) = h(\theta)$$
If instead of $\theta$ we use a different parameter $\eta$, and they are related by $\theta = g(\eta)$, then the new likelihood is
$$\mathcal{L}(\eta \vert x_1,x_2, \dots, x_n) = h(g(\eta)) = H(\eta)$$
And its derivative is found with the chain rule:
$$ H'(\eta) = h'(g(\eta)) \cdot g'(\eta)$$
And this is zero when $g'(\eta)$ is zero (a possibility we exclude by restricting ourselves to monotonic functions $g$ as the transformation), or when $h'(g(\eta))$ is zero.
So if $\theta_{ML}$ is the parameter value such that $h'(\theta_{ML}) = 0$, then $h'(g(\eta))$ is zero when $g(\eta) = \theta_{ML}$. In other words, the maximum of $H$ is attained at $\eta_{ML} = g^{-1}(\theta_{ML})$, or equivalently $g(\eta_{ML}) = \theta_{ML}$.
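As a quick numerical check of this argument, here is a minimal sketch (my own illustration, not part of the original answer): it maximizes the same exponential log-likelihood in two parameterizations, the rate $\theta$ and the mean $\eta$, related by $\theta = g(\eta) = 1/\eta$. The simulated data and the choice of $g$ are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=1000)   # simulated sample (assumption)

def h(theta):
    # exponential log-likelihood as a function of the rate theta
    return len(x) * np.log(theta) - theta * np.sum(x)

def g(eta):
    # reparameterization: theta = g(eta) = 1/eta, where eta is the mean
    return 1.0 / eta

def H(eta):
    # the same log-likelihood expressed in the new parameter eta
    return h(g(eta))

theta_ml = minimize_scalar(lambda t: -h(t), bounds=(1e-6, 10), method="bounded").x
eta_ml   = minimize_scalar(lambda e: -H(e), bounds=(1e-6, 10), method="bounded").x

# both printed values are close to 1/mean(x): the maximizers match through g
print(theta_ml, g(eta_ml))
```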
Intuitive graph
The following graph may help.
When we express the function $f(x)$ in terms of a different parameter $t$ (in the example $x = 0.1/t$), it is like stretching and reshaping the graph along the x-axis, but the value at the peak remains the same.
The stretching changes the slope according to the chain rule used above, but at the peak the slope, which is equal to zero, remains zero.
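In case the image does not display, a rough sketch like the following reproduces the idea (an assumption on my part: a simple bell-shaped curve $f$ stands in for the likelihood, combined with the transformation $x = 0.1/t$ mentioned above). The horizontal axis is warped, but the height of the peak, and the zero slope there, are unchanged.

```python
import numpy as np
import matplotlib.pyplot as plt

def f(x):
    # an arbitrary unimodal "likelihood", peaking at x = 0.5 (assumption)
    return np.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)

x = np.linspace(0.05, 1.0, 500)
t = 0.1 / x                      # the same points expressed in the new parameter

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))
ax1.plot(x, f(x));        ax1.set_xlabel("x"); ax1.set_title("f(x)")
ax2.plot(t, f(0.1 / t));  ax2.set_xlabel("t"); ax2.set_title("f(0.1/t)")
plt.tight_layout()
plt.show()
```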

This graph is inspired by this q&a. That question is about the transformation of the probability density function. The probability density function does not transform like the likelihood function: it picks up an additional factor (the Jacobian of the transformation), which means the peak can end up at a different location.
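To see the difference numerically, here is a small sketch (my own illustration, not taken from the linked q&a) comparing the reparameterized likelihood $f(0.1/t)$ with the density of the transformed variable $T = 0.1/X$, which carries the extra Jacobian factor $|dx/dt| = 0.1/t^2$. The Gamma(2, 1) density is an arbitrary choice.

```python
import numpy as np

def f(x):
    # Gamma(2, 1) density (up to a constant), x * exp(-x), with mode at x = 1
    return x * np.exp(-x)

t = np.linspace(0.001, 1.0, 100000)
x_of_t = 0.1 / t

like_t = f(x_of_t)                  # reparameterized likelihood: no extra factor
dens_t = f(x_of_t) * 0.1 / t**2     # density of T = 0.1/X: includes the Jacobian

print(t[np.argmax(like_t)])   # ~0.1   (maps back to the original mode x = 1)
print(t[np.argmax(dens_t)])   # ~0.033 (the Jacobian factor has moved the peak)
```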