We’re currently hacking together a proof of concept for an IMU-based VR camera using two cannibalized Xi video cameras and a 9-axis IMU, and we’re seeking help integrating the IMU data to reposition our video frame-by-frame within a stereo-panoramic space. We’d like to stick with the emerging standard of L/R equirectangular 360 video (though we’re open to alternatives such as twin spheres in WebVR), so we’re looking into using After Effects expressions or Mocha.
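To give a feel for the kind of help we’re after, here’s a rough sketch of one possible approach (an illustration, not our actual pipeline): converting an IMU orientation into a per-pixel remap of an equirectangular frame. The `yaw_pitch_roll_to_matrix` helper and the Z-Y-X rotation convention are assumptions for illustration; a real IMU would likely hand you a quaternion instead.

```python
import numpy as np

def yaw_pitch_roll_to_matrix(yaw, pitch, roll):
    # Build a rotation matrix from Euler angles (Z-Y-X intrinsic order).
    # The convention here is an assumption; match it to your IMU's output.
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def equirect_remap(width, height, R):
    # For each output pixel, compute where to sample in the source frame
    # after rotating the viewing sphere by R (from the IMU, per frame).
    j, i = np.meshgrid(np.arange(width), np.arange(height))
    lon = (j / width) * 2 * np.pi - np.pi      # longitude: -pi .. pi
    lat = np.pi / 2 - (i / height) * np.pi     # latitude:  pi/2 .. -pi/2
    # Pixel -> unit direction vector on the sphere
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    v = np.stack([x, y, z], axis=-1) @ R.T     # rotate all directions by R
    # Rotated direction -> back to equirectangular coordinates
    lon2 = np.arctan2(v[..., 1], v[..., 0])
    lat2 = np.arcsin(np.clip(v[..., 2], -1.0, 1.0))
    src_j = ((lon2 + np.pi) / (2 * np.pi)) * width
    src_i = ((np.pi / 2 - lat2) / np.pi) * height
    return src_i, src_j
```

The returned `src_i`/`src_j` maps would then be fed to a bilinear sampler (e.g., OpenCV’s `remap`) once per frame and per eye; a pure yaw simply scrolls the frame horizontally, while pitch and roll produce the characteristic equirectangular warp.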
We have world-class expertise in 3D printing, equirectangular video, and spatial audio, but we need specific help with this task.
We also have an organic arugula farmer who wants to use our camera to cover the Iowa caucuses in February as a citizen reporter. :) We think that, with some mods and pivots, we might have the seeds of an interesting live mobile VR camera in time for the political conventions this summer.
Please contact me directly if you’re interested in learning more. Thanks, and Happy New Year!
Michael Naimark <michael@naimark.net>