GOAL: To experiment with non-traditional interface input mechanisms, such as motion tracking.

OBJECTIVE: Build a Max program that plays audio and/or video based on user motion.  Use presentation mode to design a functional and aesthetically pleasing user interface for your application.

PROCESS:

In brief, we will be looking at how to detect motion within a camera’s field of view.  So the first thing you need to ask yourself is: what kind of app do you want to build?  Things like an ‘air’ drum kit, a swipe-based video switcher, a burglar alarm, an art installation, etc. are all within the realm of possibility.  Brainstorm and sketch some ideas about how you could use user motion to drive an audio/visual system and why you would want to do so.  You should make at least one role-based prototype for your idea as well (storyboard or scenario).

To get rolling with the code we’ll need to get camera input and detect motion. Of course, there’s an updated super patch.
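The patching itself happens in Max, but the arithmetic behind motion detection is easy to sketch in ordinary code.  Here is a minimal Python illustration (not Max code) of frame differencing between two consecutive grayscale frames; the frame values and sizes are made up for demonstration:

```python
# Sketch: per-pixel absolute difference between two consecutive
# grayscale frames -- the same computation [jit.op @op absdiff]
# performs on a jit.matrix. Frames here are plain lists of pixel rows.

def frame_absdiff(prev, curr):
    """Return a matrix of |curr - prev| values, one per pixel."""
    return [[abs(c - p) for p, c in zip(prow, crow)]
            for prow, crow in zip(prev, curr)]

def motion_amount(diff):
    """Collapse the difference matrix to a single number (mean change),
    roughly what you'd read off the mean output of [jit.3m]."""
    total = sum(sum(row) for row in diff)
    count = sum(len(row) for row in diff)
    return total / count

prev_frame = [[10, 10], [10, 10]]
curr_frame = [[10, 60], [10, 10]]   # one pixel changed by 50
diff = frame_absdiff(prev_frame, curr_frame)
print(motion_amount(diff))  # → 12.5
```

The single motion number is what you would then map onto playback parameters.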

While motion detection will let us know if something (anything) in the camera’s view has changed, it works only moment to moment.  Standing still can trick it into thinking there is nothing there.  Presence detection instead collects a running average of the environment to use as a subtraction frame.  This will let us know if something new (a person, cat, package, etc.) is in the environment, even if it is still.  For our presence detection we’ll also need to install a package or library.
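The idea behind the running average can also be sketched in plain code.  In this Python illustration (not Max code; the 0.05 adaptation rate is an assumed value), a still object keeps showing up in the difference against the slowly adapting background, where plain frame differencing would report nothing:

```python
# Sketch: presence detection via a running-average background frame.
# Each new frame nudges the stored background toward itself by a small
# factor, then we subtract the background from the current frame.

ALPHA = 0.05  # background adaptation rate (assumed value)

def update_background(background, frame, alpha=ALPHA):
    """Blend the new frame into the stored background."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

def presence(background, frame):
    """Per-pixel difference between the frame and the background."""
    return [[abs(f - b) for b, f in zip(brow, frow)]
            for brow, frow in zip(background, frame)]

background = [[0.0, 0.0]]
still_object = [[0.0, 100.0]]   # something new sits in the right pixel
for _ in range(3):              # the object stays put for several frames
    diff = presence(background, still_object)
    background = update_background(background, still_object)
print(diff[0][1] > 50)  # → True: still detected, though nothing is moving
```

Eventually the background "absorbs" the object, which is why the adaptation rate matters: slower rates hold onto new objects longer.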

Both those techniques work on the camera’s entire field of view, but what if we want to monitor a specific area, like a doorway, or multiple areas? We can use [jit.scissors] to cut up the camera frame and run our tracking on each section of it:
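Conceptually, cutting and tracking per region looks like this Python sketch (not Max code; the 4 × 4 frames are toy values), analogous to running [jit.scissors] with two rows and two columns and watching each outlet separately:

```python
# Sketch: cut a frame into a 2 x 2 grid and measure motion in each
# cell independently, so a doorway (one cell) can trigger without the
# rest of the view mattering. Frames are lists of pixel rows.

def split_2x2(frame):
    """Return the four quadrants as [top-left, top-right,
    bottom-left, bottom-right]."""
    h, w = len(frame) // 2, len(frame[0]) // 2
    return [
        [row[:w] for row in frame[:h]],   # top-left
        [row[w:] for row in frame[:h]],   # top-right
        [row[:w] for row in frame[h:]],   # bottom-left
        [row[w:] for row in frame[h:]],   # bottom-right
    ]

def cell_motion(prev, curr):
    """Mean absolute pixel change within one cell."""
    diffs = [abs(c - p) for prow, crow in zip(prev, curr)
             for p, c in zip(prow, crow)]
    return sum(diffs) / len(diffs)

prev = [[0, 0, 0, 0]] * 4
curr = [[0, 0, 80, 80], [0, 0, 80, 80], [0, 0, 0, 0], [0, 0, 0, 0]]
motions = [cell_motion(p, c)
           for p, c in zip(split_2x2(prev), split_2x2(curr))]
print(motions)  # → [0.0, 80.0, 0.0, 0.0] -- only the top-right cell moved
```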

You already know how to play video and audio files; for a refresher, see the previous modules.  But what about images?  Or UI buttons?

One thing that may also be helpful is some data smoothing.  You’ll notice that the motion data jumps around quite a bit due to signal noise in the system.  We can smooth some of this out with filtering.  Finally, let’s make our code look good with presentation mode.
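The kind of filtering [slide] applies can be sketched in Python (not Max code; the slide values and input list below are illustrative).  Each output moves toward the input by a fraction of the remaining distance, using one rate when rising and another when falling, so a jittery signal settles into a smoother curve:

```python
# Sketch: one-pole smoothing in the style of [slide].
# y = y + (x - y) / slide, with slide_up used on rising input and
# slide_down on falling input (larger values = heavier smoothing).

def make_slide(slide_up=4.0, slide_down=4.0):
    state = {"y": 0.0}
    def step(x):
        slide = slide_up if x > state["y"] else slide_down
        state["y"] += (x - state["y"]) / slide
        return state["y"]
    return step

smooth = make_slide(slide_up=2.0, slide_down=2.0)
noisy = [0, 100, 0, 100, 0, 100]   # jittery motion readings
print([smooth(x) for x in noisy])  # → [0.0, 50.0, 25.0, 62.5, 31.25, 65.625]
```

Note that a slide value of 1 passes the input through unchanged, which is handy for comparing smoothed and raw data while tuning.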

Requirements:

  • Brainstorm and make a role prototype for your motion/presence detection project.
  • Use [jit.qt.grab] or the Vizzie Grabbr for input.
  • Use [jit.scissors] to break your video grab into at least a 2 x 2 grid.
  • Use [jit.movie], [sfplay~], and/or [jit.playlist] for output.  All media files should auto-load using filename attributes or loadbangs.  All files in total must be less than 250MB.
  • Use motion/presence detection via frame differencing to control parameters of the output using [jit.op @absdiff].
  • Implement a user-adjustable threshold on the motion/presence data with [jit.op @op >].
  • Use encapsulation to keep your code organized.
  • Use comments to explain the functionality of the code.  All user input elements should be labeled.
  • Use comments to explain the concept behind your piece.  This should be a full paragraph.  Include a question about your work that you’d like the peer reviewers to address.
  • Use presentation mode to show the user a clean version of your UI with just the elements they need to interact with it.  Anything that is not user facing, such as all your encapsulated code, should not be in presentation mode.  You can make your patch open in presentation mode by going to the inspector > click the ‘P’ icon at the top center of the inspector pane > select the Basic tab > check the Open in Presentation box.
  • Be interesting.
  • Create a 1-2 minute video screen capture demonstration of your code with audio narration explaining how it works and who it is for.

Optional Considerations

  • Use [send] & [receive] to help clean up your patch (see the super patch)
  • Control the flow of data using gates and switches (see the super patch)
  • Try implementing data smoothing with [slide] (see the super patch)
  • Consider camera placement.  What would it mean for a camera to be ceiling-mounted and thus tracking movement across the floor of a room or stage?  A camera mounted over a desk tracking a user’s hands?  A face-on, seated user does not have to be the default.

Save your Max patch and all the video and/or audio clips in the same folder.  Add a scan/photo/copy of your role prototype to the folder as ‘role.pdf’.  Place your video explanation in this folder as ‘demo.mp4’.  Name this folder MotionYourName.  Submit a zip archive of this folder here.  (You can create a zip archive on Mac by right-clicking the folder and selecting ‘Compress’.)

As your Max license may be expiring soon, you might want to create an executable of your projects so that you can run them in the future.  This essentially turns a patch into a standalone app.  There is a great video series on this (parts 1, 2, 3, and 4) that gets into the details of optimization, creating icons, reducing file size, etc.  But really you get everything you need in part 1.  This is entirely optional.

STUDENT WORK