{"id":17378,"date":"2016-01-04T10:26:02","date_gmt":"2016-01-04T15:26:02","guid":{"rendered":"https:\/\/itp.nyu.edu\/opportunities\/?p=17378"},"modified":"2016-01-04T10:26:02","modified_gmt":"2016-01-04T15:26:02","slug":"job-seeking-ae-coding-help-for-vr-camera-project","status":"publish","type":"post","link":"https:\/\/itp.nyu.edu\/opportunities\/2016\/01\/04\/job-seeking-ae-coding-help-for-vr-camera-project\/","title":{"rendered":"[JOB] seeking AE coding help for VR camera project"},"content":{"rendered":"<p>We\u2019re currently hacking together a proof of concept for an\u00a0<a href=\"http:\/\/www.naimark.net\/projects\/imuvrcam.html\" target=\"_blank\">IMU VR camera<\/a>\u00a0using two cannibalized\u00a0<a href=\"https:\/\/www.google.com\/webhp?sourceid=chrome-instant&amp;ion=1&amp;espv=2&amp;ie=UTF-8#q=Xiaomi+camera\" target=\"_blank\">Xi video cameras<\/a>\u00a0and a\u00a0<a href=\"http:\/\/riembedded.com\/sld-ahr-logger\/\" target=\"_blank\">9-axis\u00a0IMU<\/a>, and we\u2019re seeking help integrating the IMU data to reposition our video frame-by-frame within a stereo-panoramic space. We\u2019d like to stick with the emerging standard of L\/R equirectangular 360 video (though we\u2019re open to alternatives such as twin spheres in WebVR), so we\u2019re looking into using After Effects Expressions or Mocha.<\/p>\n<div><\/div>\n<div>We have world-class expertise with 3D printing, equirectangular\u00a0video, and spatial audio, but need specific help with this task.<\/div>\n<div><\/div>\n<div>We also have an organic arugula farmer who wants to use our camera to cover the Iowa caucuses in February as a citizen reporter. :) We think, with some mods and pivots, we might have the seeds of an interesting live mobile VR camera in time for the political conventions this summer.<\/div>\n<div><\/div>\n<div>Please contact me directly if you\u2019re interested in learning more. 
Thanks, and Happy New Year!<\/div>\n<div><\/div>\n<div><b class=\"gmail_sendername\">Michael Naimark<\/b> <span dir=\"ltr\">&lt;<a href=\"mailto:michael@naimark.net\" target=\"_blank\">michael@naimark.net<\/a>&gt;<\/span><\/div>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>We\u2019re right now hacking together a proof of concept for an\u00a0IMU VR camera\u00a0using 2 cannibalized\u00a0Xi video cameras\u00a0and a\u00a09-axis\u00a0IMU\u00a0and are seeking help integrating the IMU data&#8230;<\/p>\n","protected":false},"author":122,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[3],"tags":[],"class_list":["post-17378","post","type-post","status-publish","format-standard","hentry","category-job","entry"],"_links":{"self":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts\/17378","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/users\/122"}],"replies":[{"embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/comments?post=17378"}],"version-history":[{"count":0,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/posts\/17378\/revisions"}],"wp:attachment":[{"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/media?parent=17378"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/categories?post=17378"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/itp.nyu.edu\/opportunities\/wp-json\/wp\/v2\/tags?post=17378"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}