Augmented Hacking (ITPG-GT 2356)

With recent advances in hardware and software, millions of us now carry unprecedented spatial computing and world-sensing technologies in our pockets. With these technologies in hand, how do we design AR experiences that are contextual at the core – sensitive to the spaces we inhabit and the behaviors of the people in those spaces? How do we augment this richer understanding of reality? This course is a hands-on workshop in which we create spatially aware, contextually driven AR applications unique to particular situations. We will examine the opportunities and challenges of designing site-specific experiences – museums, live events, retail, medical settings, industrial environments, schools, and others. Topics include image and object recognition, world mapping, people tracking, location anchors, the ARKit Depth API (LiDAR-enabled features), spatial audio, scene understanding and semantics, and more. For design and development, we’ll primarily use Apple technologies – ARKit, RealityKit, and Reality Composer. We’ll also tap a variety of cloud services to store, move, process, and bring intelligence to the data generated and consumed in our experiences. 3D modeling skills are helpful but not required. While we’ll cover the basics, students should expect to spend additional time outside of class learning Swift and related programming concepts. Full-time access to an iOS device (LiDAR-enabled is ideal but not required) and a Mac laptop running the latest operating systems are required. As part of the design process, we’ll host workshops and guest critiques with designers from top studios around New York City, and interface directly with various teams at Apple.
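To give a flavor of the scene-understanding topics listed above, here is a minimal Swift sketch of configuring an ARKit session with LiDAR-driven features (scene reconstruction, the Depth API, and people occlusion), with capability checks so it degrades gracefully on non-LiDAR devices. The surrounding view-controller context is assumed; the function name is illustrative.

```swift
import ARKit
import RealityKit

// Sketch: set up world tracking with the LiDAR-enabled features
// covered in class, enabling each one only if the device supports it.
func configureSession(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a classified 3D mesh of the
    // environment (requires a LiDAR-equipped device).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }

    // The Depth API exposes per-pixel depth from the LiDAR sensor.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // People segmentation lets virtual content pass behind people.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        config.frameSemantics.insert(.personSegmentationWithDepth)
    }

    arView.session.run(config)
}
```

This pattern of feature-gating with `supports…` checks is how a single app can target both LiDAR and non-LiDAR iOS devices, which is why a LiDAR device is ideal but not required for the course.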

Interactive Telecommunications (Graduate)
4 credits – 14 Weeks

Sections (Fall 2023)


ITPG-GT 2356-000 (21868)
09/07/2023 – 12/14/2023 Thu
3:00 PM – 5:00 PM (Late afternoon)
at Brooklyn Campus
Instructed by Lam, Nien · Buys, Sebastian

Mobile Lab (ITPG-GT 2180)

One of the most transformative consumer products in history, the iPhone remains the standard-bearer for great design and user experience. With the latest version of iOS and the introduction of the iPhone X, Apple puts depth sensing and augmented reality in our pockets. How do we take advantage of this incredible platform to produce our own compelling experiences? This course is a hands-on workshop in which we explore the world beyond generic apps and push the boundaries of what’s possible on iOS hardware. We will cover both the design and technical elements of end-to-end mobile app development. Topics include augmented reality, the Internet of Things, new interfaces for musical expression, voice interaction, wearables, and others. While the class involves iOS programming instruction, the instructors will facilitate coding with an established code base so that technical skills are not a bottleneck to implementation. Students should expect to spend additional time outside of class learning Swift and related programming concepts. Full-time access to an iOS device and a Mac laptop running the latest operating systems are required. As part of the design process, we’ll host workshops and guest critiques with designers from top studios around New York City. At the end of the class, students will have a working app to add to their portfolios and a strong basis for future mobile application work.
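As a hedged illustration of the kind of starting point an established code base provides, here is a minimal SwiftUI view of the sort students extend into a full app; the type and state names are illustrative, not from any course repository.

```swift
import SwiftUI

// Sketch: a self-contained SwiftUI view with a single piece of
// state, the typical "hello world" from which app work grows.
struct ContentView: View {
    @State private var tapped = false

    var body: some View {
        Button(tapped ? "Hello, Mobile Lab!" : "Tap me") {
            tapped.toggle()  // flip the state; SwiftUI re-renders
        }
        .font(.title)
        .padding()
    }
}
```

Declarative views like this are why a shared code base keeps technical skill from bottlenecking design work: most iteration happens in small, readable view structs rather than in app plumbing.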

Interactive Telecommunications (Graduate)
4 credits – 15 Weeks

Sections (Spring 2020)


ITPG-GT 2180-000 (23066)
01/27/2020 – 05/11/2020 Thu
3:00 PM – 5:00 PM (Late afternoon)
at Brooklyn Campus
Instructed by Lam, Nien · Buys, Sebastian