GLMs model the conditional distribution of a response given
a set of predictors $[Y|\underline{x}]$,
where $\underline{x}=(x_1,x_2,...,x_p)'$.
One of the main components of the model is the link function, which relates the conditional expectation $E(Y|\,\underline{x}\,)=\mu(\,\underline x\,)$ to the linear predictor $\eta=\underline{x}'\beta$ through the link $g$; that is, it specifies that the model for the conditional mean is $g(\mu_i)=\underline{x}_i'\beta$ (the $i$ subscript denotes the $i$th observation).
For example, the Poisson family has a default log link, $\log(\mu) = \eta = \underline{x}_i'\beta$, while the default link for the Gaussian is the identity link, $\mu= \eta =\underline{x}_i'\beta$.
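You can confirm the default links directly in R, since family objects carry their link name along with the link function and its inverse:

```r
# Each family object records its default link
poisson()$link     # "log"
gaussian()$link    # "identity"

# linkfun maps mu to eta; linkinv maps eta back to mu
poisson()$linkfun(10)           # log(10)
poisson()$linkinv(log(10))      # 10
```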
In the case of the power link, some power of the conditional mean is linear in the predictors:
$$\mu^\lambda=\underline{x}_i'\beta\,.$$
The Poisson family in R includes the square-root link, which is an example of a power link. The power-link function lets you specify a power that isn't otherwise available.
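For instance, `power(0.5)` is just another spelling of the built-in square-root link, so the two specifications fit the same model. A quick check on simulated Poisson data (the simulated variables here are purely illustrative):

```r
set.seed(1)
x <- runif(100)
# True mean is quadratic in x, so the sqrt link is a natural choice
y <- rpois(100, lambda = (1 + 2 * x)^2)

# Same model, specified two ways
fit_sqrt  <- glm(y ~ x, family = poisson(link = "sqrt"))
fit_power <- glm(y ~ x, family = poisson(link = power(0.5)))

all.equal(coef(fit_sqrt), coef(fit_power))  # should be TRUE
```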
In short, the power link is used so you can have a mean model like $E(Y|\underline{x})^\lambda = \beta_0+\beta_1x_1+\cdots+\beta_px_p$ for whichever value of $\lambda$ you choose.
An example:
glm(mpg ~ disp, data = mtcars, family = Gamma(link = power(1/3)))
Here $Y$ is the variable mpg and $x$ is disp. This model says that $E(Y|x)=\mu(x)$, where $\mu(x)^{\frac13}=\beta_0+\beta_1x$, and that the conditional distribution of $Y$ is gamma with mean $\mu$ (so the variance function is $V(\mu)=\mu^2$). Here's the resulting fit:

(This isn't a suitable model for several reasons, but it will do to demonstrate the point.)
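A minimal sketch of fitting this model and recovering the fitted mean curve, using only base R:

```r
fit <- glm(mpg ~ disp, data = mtcars, family = Gamma(link = power(1/3)))

# Coefficients are on the cube-root scale:
#   E(mpg | disp)^(1/3) = b0 + b1 * disp
coef(fit)

# type = "response" inverts the link, returning mu = (b0 + b1 * disp)^3
mu_hat <- predict(fit, type = "response")
head(mu_hat)
```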
The Tweedie family (which is not implemented in the vanilla glm function in R) has a form of power link as its canonical link, $\eta=\frac{\mu^{1-p}}{1-p}$, as well as a power variance function, $V(\mu)=\mu^p$.
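Tweedie fits are available from contributed packages; for example, the statmod package supplies a `tweedie()` family object that plugs into `glm` (having statmod installed is an assumption here, and the fit below is only a sketch reusing the mtcars example):

```r
library(statmod)  # assumed to be installed; provides tweedie()

# var.power = 1.5 gives the power variance function V(mu) = mu^1.5;
# link.power = 0 requests a log link (link.power = 1 - var.power would
# correspond to the canonical power link described above)
fit <- glm(mpg ~ disp, data = mtcars,
           family = tweedie(var.power = 1.5, link.power = 0))
coef(fit)
```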