York University, Future Cinema Lab
Project: Development of a Media Artist's Interface to Augmented Reality: "MAXTag"

Development Team:
Andrew Roth – A.R. System Design
Andrei Rotenstein – Lead Software Developer
Mikhail Sizintsev – Software Developer
White Paper

AR Marker Tracking System by Mark Fiala

Project Coordination:
Caitlin Fisher
Geoffrey Alan Rhodes

MAXTag initial project: 52 Card Psycho

The incorporation of optical marker tracking into Max/MSP came from a desire to rapidly prototype Augmented Reality while creating a workflow familiar to media artists. Only a few platforms make AR simple enough for artists and students to experiment with their own content, and we hope to add another powerful tool to that end. MAXTag also represents the most powerful tool of its kind to date, allowing real-time tracking of large numbers of markers in standard ambient lighting conditions. This is not the first time Max/MSP has been used for Augmented Reality; however, we have taken a novel approach by incorporating ARTag as a standardized Max object, so that users can take full advantage of the Max environment.

Max, by Cycling ’74, is a graphical programming language widely used as a platform for experimental multimedia performances and installations. The Audio Signal Processing library, MSP, and the Video Signal Processing library, Jitter, are tightly integrated into the Max environment; the three are commonly referred to as simply Max/MSP.

The biggest advantage of using MAXTag is its scalability. A first-time user can build an application quickly and begin playing with their own content; all that is needed is a computer with a webcam and a QuickTime video. It is then easy to transfer the same application to a more powerful computer with an industrial camera and run it at a higher resolution and frame rate. Initially conceived to produce the 52 Card Psycho project, in which a deck of cards is mapped with 52 individual videos, MAXTag has been designed to map two-dimensional video content onto markers; we hope to develop 3D object functionality in the future.
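MAXTag itself does this work inside a Max/Jitter patch, so there is no textual code from the object to show here. As a rough illustration of the underlying technique (detect a fiducial marker in the camera image, compute a homography, and warp a video frame onto the marker), the following minimal Python sketch uses OpenCV's ArUco module (4.7+ API) purely as a stand-in for ARTag; the file name, marker dictionary, and camera index are assumptions for illustration and are not part of the MAXTag interface.

# Hypothetical sketch of the "video onto marker" technique, with OpenCV's
# ArUco module standing in for ARTag. Not the MAXTag object's actual API.
import cv2
import numpy as np

camera = cv2.VideoCapture(0)             # live webcam feed
video = cv2.VideoCapture("clip.mov")     # hypothetical QuickTime clip to map

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

while True:
    ok_cam, frame = camera.read()
    ok_vid, clip = video.read()
    if not ok_cam:
        break
    if not ok_vid:                       # loop the clip when it ends
        video.set(cv2.CAP_PROP_POS_FRAMES, 0)
        continue

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is not None:
        for quad in corners:             # four corner points per detected marker
            h, w = clip.shape[:2]
            src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
            dst = quad.reshape(4, 2).astype(np.float32)
            H = cv2.getPerspectiveTransform(src, dst)
            warped = cv2.warpPerspective(clip, H, (frame.shape[1], frame.shape[0]))
            mask = np.zeros(frame.shape[:2], dtype=np.uint8)
            cv2.fillConvexPoly(mask, dst.astype(np.int32), 255)
            frame[mask > 0] = warped[mask > 0]   # composite clip over the marker

    cv2.imshow("augmented", frame)
    if cv2.waitKey(1) == 27:             # Esc to quit
        break

camera.release()
video.release()
cv2.destroyAllWindows()

In the MAXTag workflow the equivalent steps are wired graphically in a Max patch, with Jitter handling video playback and compositing.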

At this time, the MAXTag object is already extremely robust, largely due to the innovations of Mark Fiala's Marker Tracking System. Running on a MacBook laptop, MAXTag can recognize and map 52 markers, each with its own 320×240 video, over a live camera feed and output at 640×480, at no less than 15 frames per second (on a desktop computer the resolution and frame rate are higher). This preserves the goal of realistic, effectively instantaneous marker tracking.

The Future Cinema/AR Lab is currently in discussions regarding outside licensing of this Max object, and continues to develop and update it.

For more information, see futurestories.ca