Yes to (1) and no to (2). Let me explain.
- The reasoning comes from the transformation theorem. Here it is in general. Suppose you have two original random variables $X_1$ and $X_2$ with joint density $f_{X_1,X_2}(x_1,x_2)$. The transformation theorem gives you the joint density of two new random variables $Y_1 = g_1(X_1,X_2)$ and $Y_2 = g_2(X_1,X_2)$. Assume the map $(x_1,x_2) \mapsto (y_1,y_2)$ is one-to-one and that the $g_i$s are smooth enough to possess the derivatives written below:
$$
g_{Y_1,Y_2}(y_1,y_2) = f_{X_1,X_2}(x_1[y_1,y_2],x_2[y_1,y_2])|\det(J)|
$$
where
$J = \left( \begin{array}{cc}
\frac{\partial x_1}{\partial y_1} & \frac{\partial x_1}{\partial y_2} \\
\frac{\partial x_2}{\partial y_1} & \frac{\partial x_2}{\partial y_2} \end{array} \right).
$
- Now assume that $X_1$ and $X_2$ start off independent, which is the case you're dealing with. That means $f_{X_1,X_2}(x_1,x_2) = f_{X_1}(x_1)f_{X_2}(x_2)$. Also, if you make $Y_1$ a function of $X_1$ alone and $Y_2$ a function of $X_2$ alone, then $J$ is diagonal. Plug that into the general formula above and you'll see why $Y_1$ and $Y_2$ are still independent:
\begin{align*}
g_{Y_1,Y_2}(y_1,y_2) &= f_{X_1,X_2}(x_1[y_1,y_2],x_2[y_1,y_2])\,|\det(J)| \\
&= f_{X_1}(x_1[y_1,y_2])\,f_{X_2}(x_2[y_1,y_2]) \left|\frac{\partial x_1}{\partial y_1}\frac{\partial x_2}{\partial y_2}\right| \\
&= f_{X_1}(x_1[y_1,y_2])\left|\frac{\partial x_1}{\partial y_1}\right| f_{X_2}(x_2[y_1,y_2]) \left|\frac{\partial x_2}{\partial y_2}\right|
\end{align*}
This still factors: in this case $x_1$ depends only on $y_1$ and $x_2$ only on $y_2$, so the density is a function of $y_1$ alone times a function of $y_2$ alone. Hence $Y_1$ is independent of $Y_2$.
- The functions $g_i$ don't need to be one-to-one, but then you'd have to use the more general, multi-branch version of the transformation theorem to justify it. The same idea applies, though.
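A quick simulation makes both bullets concrete. The setup below is my own illustrative choice, not from the question: $X_1, X_2$ i.i.d. standard normal, $Y_1 = e^{X_1}$ (a function of $X_1$ alone) and $Y_2 = X_2^3$ (a function of $X_2$ alone). The first check applies the transformation theorem to $Y_1$ by itself ($x_1 = \log y_1$, $|dx_1/dy_1| = 1/y_1$, giving the lognormal density, hence CDF $\Phi(\log t)$); the second checks that the joint distribution of $(Y_1, Y_2)$ factors into the product of its marginals.

```python
import numpy as np
from math import erf, sqrt, log

rng = np.random.default_rng(0)
n = 500_000

# Illustrative choice (not from the original post): X1, X2 i.i.d. standard normal,
# Y1 = exp(X1) depends on X1 alone, Y2 = X2**3 depends on X2 alone.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
y1 = np.exp(x1)
y2 = x2 ** 3

# 1) Transformation theorem for Y1 alone: x1 = log(y1), |dx1/dy1| = 1/y1,
#    so Y1 is lognormal and P(Y1 <= t) = Phi(log t). Compare with the
#    empirical CDF at t = 1.5.
Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))
t = 1.5
cdf_emp = np.mean(y1 <= t)
cdf_thy = Phi(log(t))
print(abs(cdf_emp - cdf_thy))  # ≈ 0

# 2) Independence: bin both variables at their empirical octiles and compare
#    the joint cell probabilities with the product of the marginals.
bins = 8
edges1 = np.quantile(y1, np.linspace(0, 1, bins + 1))
edges2 = np.quantile(y2, np.linspace(0, 1, bins + 1))
i = np.digitize(y1, edges1[1:-1])
j = np.digitize(y2, edges2[1:-1])
joint = np.zeros((bins, bins))
np.add.at(joint, (i, j), 1.0)
joint /= n
max_dev = np.max(np.abs(joint - np.outer(joint.sum(axis=1), joint.sum(axis=0))))
print(max_dev)  # ≈ 0
```

Both discrepancies shrink toward zero as $n$ grows; a simulation can't prove independence, of course, but it shows the factorization holding to within Monte Carlo error.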
Edit: @Whuber linked to a good thread containing one of his answers, which demonstrates the same result using sigma fields. That approach is more elegant and more generally applicable: it works for any measurable transformations, while mine only works for continuous random variables and certain kinds of transformations.
Regarding your second example, where you ask about a random variable $X$ and its square $X^2$: "[s]ince[sic] functions of independent random variables are independent"...neither of the answers above applies there, because $X$ and $X^2$ are functions of the *same* random variable, not of independent ones. You'd also need to qualify what type of functions you're talking about.
The way I understood your question, your second point seemed to be an attempted counterexample to help you better understand your situation, which is why I didn't really address it. The reason the claim is false there is that $p(X^2 \mid X=x)$ is degenerate, with all of its mass at the single point $x^2$, while the marginal $p(X^2)$ is continuous (chi-square with one degree of freedom, if $X$ is standard normal). So they're obviously very different.
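You can see this dependence directly in a simulation. This is a sketch under the assumption that $X$ is standard normal (so $X^2$ is chi-square with one degree of freedom, marginal mean 1): conditioning on $X$ being near 2 pins $X^2$ near 4, far from its marginal mean, which could not happen if the two were independent.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.standard_normal(200_000)  # X standard normal, so X^2 is chi-square(1)
x2 = x ** 2

# If X and X^2 were independent, conditioning on X would not move the
# distribution of X^2. But given X near 2, X^2 sits near 4, while the
# marginal mean of a chi-square(1) variable is 1.
cond_mean = x2[np.abs(x - 2.0) < 0.05].mean()
marg_mean = x2.mean()
print(cond_mean)  # ≈ 4
print(marg_mean)  # ≈ 1
```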