What I want to do is take a 360-degree stereo video (Oculus Rift etc.) together with a description of how the viewing direction changes over time (rather than reading it from gyroscopes), and render that to a plain side-by-side rectangular video, so that the result can be copied to a smartphone and watched with the built-in video player.
This will presumably involve something like sphere-mapping the source and rendering the result, much as the smartphone viewer apps do, except performed offline. Is there existing software that already does this? If not, which libraries would be most suitable if I program it myself?
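To make the question concrete, here is a rough numpy sketch of the per-frame reprojection I have in mind, assuming the source is in the usual equirectangular layout (the function name, rotation conventions, and nearest-neighbour sampling are my own simplifications, not from any particular library):

```python
import numpy as np

def rectilinear_from_equirect(equi, yaw, pitch, fov_deg, out_w, out_h):
    """Extract a perspective (rectilinear) view from one equirectangular frame.

    equi: H x W x C image; yaw/pitch in radians give the viewing direction.
    """
    H, W = equi.shape[:2]
    # Focal length in pixels from the horizontal field of view
    f = (out_w / 2) / np.tan(np.radians(fov_deg) / 2)
    # Pixel grid centred on the optical axis
    x = np.arange(out_w) - out_w / 2 + 0.5
    y = np.arange(out_h) - out_h / 2 + 0.5
    xv, yv = np.meshgrid(x, y)
    # Ray directions in camera space (z forward, x right, y down)
    dirs = np.stack([xv, yv, np.full_like(xv, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    # Rotate by pitch (about x), then yaw (about y)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    d = dirs @ (Ry @ Rx).T
    # Direction -> longitude/latitude -> equirectangular pixel coordinates
    lon = np.arctan2(d[..., 0], d[..., 2])       # -pi .. pi
    lat = np.arcsin(np.clip(d[..., 1], -1, 1))   # -pi/2 .. pi/2
    u = ((lon / np.pi + 1) / 2 * W).astype(int) % W
    v = np.clip(((lat / (np.pi / 2) + 1) / 2 * H).astype(int), 0, H - 1)
    return equi[v, u]  # nearest-neighbour lookup
```

Run once per eye per frame, with yaw/pitch taken from the scripted direction track, then place the two outputs side by side and feed the frames to an encoder. I assume the real thing would want interpolated sampling and roll as well; I am mainly asking whether software or a library already covers this pipeline.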