Machine Envisioned Stories: The Future of Storytelling with Augmented Reality
The stories we consume today are authored not just by humans but by algorithmically tuned machines. If a story is a series of events that are given priority, importance, and structure, who or what decides this? Hosted in ArtCenter's Immersion Lab and created in collaboration with Snap Inc. Research, BBC Research, and Microsoft Research, Machine Envisioned Stories is a research studio that will explore the next phase of narrative design using cameras, machine vision, machine learning, augmented reality, and game development software. Through presentations from Snap and creative prototyping in the Immersion Lab, students will propose new ways to identify, author, and share events in collaboration with autonomous machines. Together, we will examine how the camera and machine vision (a key component of AR), through feature, pattern, object, and facial recognition, can co-author stories, revealing new types of events and details that once went unnoticed. How might these new, machine-envisioned stories change how we understand and relate to one another?
We will build upon my research and the insights developed in the spring TDS studio currently running. Students will complete one to three projects. Students will also have the option to attend IMX 2020 in Barcelona, where we will participate in a collaborative workshop with R&D groups from Snap, BBC, and Microsoft, as well as the other schools taking part in this shared research challenge.