I refer to my answer to the question 'How to calculate decision boundary from support vectors?'. In the case of a linear kernel, $K(x,y)= x \cdot y$, where '$\cdot$' is the inner product; so if you have $n$ features for each observation, then $x \cdot y=\sum_{i=1}^n x_i y_i$.
In the answer referred to supra, you can see that the equation for the boundary (the separating hyperplane) is $f(x)=\sum_{k \in SV} \alpha_k y_k s_k \cdot x + b$. To compute $b$, take one observation for which the Lagrange multiplier is strictly positive and strictly smaller than $C$. Assume this is observation $m$ and use it to compute $b$ as $b =\frac{1}{y_m} - \sum_{k \in SV} \alpha_k y_k x_m \cdot s_k$. (It could be that your software computes $b$ for you.)
These are the same equations as in the answer that I referred to, with $K(x,y)=x \cdot y$.
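To make this concrete, here is a minimal NumPy sketch of these formulas. The arrays `alphas`, `labels`, and `support_vectors` are hypothetical stand-ins for whatever your software returns (the Lagrange multipliers $\alpha_k$, the labels $y_k \in \{-1,+1\}$, and the support vectors $s_k$), and `C` is the regularization parameter used in training:

```python
import numpy as np

# Hypothetical output of a trained linear-kernel SVM:
# alphas[k]          : Lagrange multiplier alpha_k of support vector k
# labels[k]          : class label y_k in {-1, +1}
# support_vectors[k] : support vector s_k (one row per support vector)
alphas = np.array([0.3, 0.7, 1.0])
labels = np.array([1, -1, 1])
support_vectors = np.array([[1.0, 2.0],
                            [2.0, 0.5],
                            [0.0, 1.5]])
C = 1.0  # regularization parameter used during training

def f_without_b(x):
    """sum_k alpha_k y_k (s_k . x), i.e. f(x) without the offset b."""
    return np.sum(alphas * labels * (support_vectors @ x))

# Pick an observation m with 0 < alpha_m < C (strictly) and solve for b.
m = np.flatnonzero((alphas > 0) & (alphas < C))[0]
b = 1.0 / labels[m] - f_without_b(support_vectors[m])

def f(x):
    """The boundary equation f(x) = sum_k alpha_k y_k (s_k . x) + b."""
    return f_without_b(x) + b
```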
From the equation $f(x)$ of the hyperplane, it follows that the normal vector of the separating hyperplane is $w=\sum_{k \in SV} \alpha_k y_k s_k$ (as the $\alpha_k$ and $y_k$ are scalars and the $s_k$ are vectors, the result is a vector). (It could be that your software computes $w$ for you.)
(Note that, for a hyperplane with equation $n \cdot x + c = 0$ (where $n$ and $x$ are vectors and $c$ is a scalar), $n$ is the normal vector to the plane.)
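Continuing the same hypothetical sketch, $w$ is just this weighted sum of the support vectors:

```python
# Normal vector w = sum_k alpha_k y_k s_k, computed from the arrays above;
# (alphas * labels) gives one weight per support vector, so the matrix
# product yields a vector with one entry per feature.
w = (alphas * labels) @ support_vectors
```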
If you want to compute the distance from a point $x_0$ (in your case, the feature vector of an observation) to the separating hyperplane, then this distance $D(x_0)$ is given by $D(x_0)=\frac{|f(x_0)|}{\sqrt{w \cdot w}}$ (the vertical bars mean 'take the absolute value', and $\sqrt{w \cdot w}$ is the norm $\|w\|$ of the normal vector).
You can compute $D(x_0)$ for every observation $x_0$ that you have: just take the absolute value of the function value $f(x_0)$, with $f$ as supra, and divide it by the norm of the normal vector.
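As a usage example, continuing the same hypothetical sketch, the distance of any observation to the boundary is then:

```python
def distance(x0):
    """D(x0) = |f(x0)| / sqrt(w . w): distance of x0 to the hyperplane."""
    return abs(f(x0)) / np.sqrt(w @ w)

# Distance of one (hypothetical) observation to the separating hyperplane:
x0 = np.array([1.5, 1.0])
print(distance(x0))
```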