Abstract
Virtual teleportation is a potentially transformative technology that could change the way we interact with the world and with each other. It lets users simulate physical presence in remote locations and experience them in a highly immersive, interactive way through VR, AR, and haptic feedback. Its potential applications are vast, ranging from virtual tourism to remote collaboration and telemedicine. Foresight is essential for understanding the implications of this technology and anticipating its impact on society.
Over my time at ITP, I have investigated how Virtual Reality and Artificial Intelligence could help me overcome one of the major challenges VR developers and researchers face: virtual locomotion, i.e., techniques that let a user navigate a virtual environment efficiently without experiencing VR sickness. For my thesis, I am trying to resolve one of the biggest issues facing VR as a platform: the limitation of the user's physical space. Not everyone has a lab, an empty warehouse, or open space in their home or office, and even when they do, the hardware further limits how much of that physical space the user can take advantage of. Fitting an entire virtual environment within a few square meters is a strict limitation for many applications. A method of moving the user through a larger virtual space is needed, but current methods come with drawbacks. Developing a new movement method that avoids these drawbacks will help ensure a better experience for the user.
I came up with a solution that aims to make the virtual reality experience as effortless and unobtrusive as possible by creating low-latency applications.
Research
I had an enriching discussion with a Meta Product Engineer and a Microsoft Research Engineer about Holoportation, a 3D capture technology that uses VR and AR to let users interact with each other as holograms. One of the advancements we discussed was an AI workmate that can assist users with daily tasks such as scheduling and translation. I also focused on VR motion sickness, which can cause nausea, dizziness, and headaches. As part of my research I interviewed Dr. Raj Tandon, who suggested preventive measures such as fresh air, aromatherapy, ginger supplements, low-latency VR games, and avoiding heavy meals before playing.
Technical Details
3D Model: Created the 3D model in Blender (why Blender? because it is open source)
1. Created a room model.
2. UV-unwrapped the room model for texturing.
3. Exported to FBX format for Unity.
Unity Setup:
Imported the 3D model (FBX)
Assigned materials
Configured build and player settings for macOS
Saved the scene – Hypercinema Assignment
Created a folder structure (Materials, 3D Model, Scripts, Animations)
Lighting in Unity:
URP – Downloaded the Universal Render Pipeline package; used URP for better lighting (global illumination, soft shadows)
Directional light setup, shadow controls on the directional light
Lighting settings – GI controls, shadow controls, shadow map controls (shadow map resolution: 1024, 2048, or 4096 for shadow map quality)
Converted all materials to URP
If a material is not converted to URP, the object renders in solid magenta/purple (see the sketch below).
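As an illustration only, here is a minimal sketch of switching a material to the URP Lit shader from a script so it no longer renders magenta. In the project I used Unity's built-in URP material converter instead; the class name here is hypothetical and not part of the actual project scripts.

using UnityEngine;

public class UrpMaterialFix : MonoBehaviour
{
    void Start()
    {
        // Find the URP Lit shader and assign it to every material on this renderer.
        Shader urpLit = Shader.Find("Universal Render Pipeline/Lit");
        if (urpLit == null) return; // URP package not installed

        foreach (Material mat in GetComponent<Renderer>().materials)
        {
            mat.shader = urpLit;
        }
    }
}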
Door Animations:
Created an Animator
Created the animation (door open/close animation)
Animation controlled with a script (see the sketch below)
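A minimal sketch of how the door animation can be driven from a script, assuming the Animator Controller has a bool parameter named "IsOpen" that transitions between the closed and open clips (the parameter and class names are illustrative, not the exact project code):

using UnityEngine;

public class DoorAnimator : MonoBehaviour
{
    private Animator animator;

    void Awake()
    {
        animator = GetComponent<Animator>();
    }

    public void Open()
    {
        animator.SetBool("IsOpen", true);   // plays the door-open clip
    }

    public void Close()
    {
        animator.SetBool("IsOpen", false);  // plays the door-close clip
    }
}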
Scripts: C# scripting using the Unity API
Player control with the keyboard arrow keys to move left, right, forward, and back; rotation controlled with the mouse (see Camera Control below).
When the camera collider collides with a door, the door opens automatically.
Close button to quit the application.
Video and audio player controls – the video and audio play only when the door opens (see the sketch below).
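A minimal sketch of the trigger logic described above, assuming the door has a trigger collider, the player camera is tagged "MainCamera", and the VideoPlayer and AudioSource on the 360 sphere are assigned in the Inspector (field and tag names are illustrative):

using UnityEngine;
using UnityEngine.Video;

public class DoorTrigger : MonoBehaviour
{
    public DoorAnimator door;          // door script from the sketch above
    public VideoPlayer videoPlayer;    // 360 video on the sphere behind the door
    public AudioSource audioSource;    // matching audio track

    void OnTriggerEnter(Collider other)
    {
        // Open the door and start playback only when the player camera enters.
        if (other.CompareTag("MainCamera"))
        {
            door.Open();
            videoPlayer.Play();
            audioSource.Play();
        }
    }

    void OnTriggerExit(Collider other)
    {
        // Close the door and pause playback when the player camera leaves.
        if (other.CompareTag("MainCamera"))
        {
            door.Close();
            videoPlayer.Pause();
            audioSource.Pause();
        }
    }

    // Hooked up to the Close button's OnClick event to quit the application.
    public void QuitApplication()
    {
        Application.Quit();
    }
}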
360 Video Setup:
Attached a sphere to each of the 3 doors.
Attached a video player to each sphere and assigned a 360 video (see the sketch below).
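A minimal sketch of the 360 video setup, assuming the VideoPlayer renders onto the sphere's material via Material Override and the clip is assigned in the Inspector (names are illustrative); the sphere's normals face inward so the video is visible from inside:

using UnityEngine;
using UnityEngine.Video;

[RequireComponent(typeof(VideoPlayer))]
public class SphereVideo : MonoBehaviour
{
    public VideoClip clip360;   // the 360 video clip assigned in the Inspector

    void Start()
    {
        VideoPlayer player = GetComponent<VideoPlayer>();
        player.clip = clip360;
        player.renderMode = VideoRenderMode.MaterialOverride;      // draw onto the sphere's material
        player.targetMaterialRenderer = GetComponent<Renderer>();  // the sphere attached to the door
        player.isLooping = true;
        player.playOnAwake = false;  // playback is started by the door trigger script
    }
}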
Camera Control:
Player movement with the keyboard arrow keys (left, right, forward, back) and rotation with the mouse, as sketched below.
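A minimal sketch of the keyboard/mouse controls, assuming the script sits on the player camera rig and uses the legacy Input axes ("Horizontal", "Vertical", "Mouse X"); the speed values are illustrative:

using UnityEngine;

public class PlayerController : MonoBehaviour
{
    public float moveSpeed = 3f;    // meters per second
    public float lookSpeed = 120f;  // degrees per second

    void Update()
    {
        // Arrow keys (and WASD) map to the Horizontal and Vertical axes.
        float strafe  = Input.GetAxis("Horizontal") * moveSpeed * Time.deltaTime;
        float forward = Input.GetAxis("Vertical")   * moveSpeed * Time.deltaTime;
        transform.Translate(strafe, 0f, forward);

        // Horizontal mouse movement rotates the player around the vertical axis.
        float yaw = Input.GetAxis("Mouse X") * lookSpeed * Time.deltaTime;
        transform.Rotate(0f, yaw, 0f);
    }
}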
Build for Mac:
In the player settings, assigned the application icon and set the application name.
Built the macOS application (an optional scripted build is sketched below).
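As an alternative to clicking Build in the Build Settings window, the macOS build can also be triggered from an Editor menu item. This is only a sketch: the scene path and output path are placeholders, and the script must live in an Editor folder.

using UnityEditor;

public static class MacBuild
{
    [MenuItem("Build/Build macOS App")]
    public static void Build()
    {
        BuildPipeline.BuildPlayer(
            new[] { "Assets/Scenes/HypercinemaAssignment.unity" }, // scene saved earlier
            "Builds/VirtualTeleportation.app",                     // output .app bundle
            BuildTarget.StandaloneOSX,
            BuildOptions.None);
    }
}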