Augmented Spaces (ITPG-GT 2356)

With recent advances in hardware and software, millions of us now carry unprecedented spatial computing and world-sensing technologies in our pockets. With these technologies in hand, how do we design AR experiences that are contextual at the core – that are sensitive to the spaces we inhabit and the behaviors of people in those spaces? How do we augment this richer understanding of reality? This course will be a hands-on workshop where we create spatially aware, contextually driven AR applications unique to particular situations. We will examine the opportunities and challenges of designing for site-specific experiences – museums, live events, retail, medical settings, industrial environments, schools, and others. Topics will include image and object recognition, world mapping, people tracking, location anchors, the ARKit Depth API (LiDAR-enabled features), spatial audio, scene understanding and semantics, and more. For design and development, we’ll primarily use Apple technologies – ARKit, RealityKit, and Reality Composer. We’ll also tap a variety of cloud services to store, move, process, and bring intelligence to the data generated and consumed in our experiences. 3D modeling skills are helpful but not required. While we’ll cover the basics, students should expect to spend additional time outside of class learning Swift and related programming concepts. Full-time access to an iOS device (LiDAR-enabled is ideal but not required) and a Mac laptop running the latest operating systems are required. As part of the design process, we’ll host workshops and guest critiques with designers from top studios around New York City and interface directly with various teams at Apple.
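
To give a flavor of the ARKit and RealityKit APIs named above, the short Swift sketch below (not part of the course materials; the class name and option choices are illustrative assumptions) shows one way a RealityKit ARView might run an ARKit world-tracking session that opts into LiDAR scene reconstruction when the device supports it:

    import UIKit
    import ARKit
    import RealityKit

    // Illustrative sketch only: a view controller hosting a RealityKit ARView
    // that runs an ARKit world-tracking session, enabling LiDAR-based scene
    // reconstruction when the hardware supports it.
    final class AugmentedSpaceViewController: UIViewController {

        private let arView = ARView(frame: .zero)

        override func viewDidLoad() {
            super.viewDidLoad()

            // Fill the screen with the AR view.
            arView.frame = view.bounds
            arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(arView)

            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = [.horizontal, .vertical]

            // Scene reconstruction (real-world meshing) requires a LiDAR device.
            if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
                configuration.sceneReconstruction = .mesh
                // Let virtual content be occluded by, and collide with, the real world.
                arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .physics])
            }

            arView.session.run(configuration)
        }
    }

On devices without LiDAR, the same session still runs with plane detection only, which mirrors the course requirement that a LiDAR-enabled device is ideal but not required. (A real app would also declare camera-usage permission in its Info.plist.)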

Interactive Telecommunications (Graduate)
4 credits – 15 Weeks

Sections (Fall 2021)

ITPG-GT 2356-000 (23992)
09/02/2021 – 12/14/2021 Wed
3:00 PM – 6:00 PM (Late afternoon)
at Brooklyn Campus
Instructed by