3rd October
written by Andrew

We use Macs as our presentation systems for EEG experiments at the Canadian Centre for Behavioural Neuroscience. My thesis work revolves around auditory attention, and part of that has been developing visual and auditory systems that run on a tight schedule. We use OpenGL for the majority of our visual experiments, and Core Audio for our acoustic ones.

To understand timing on OS X, we need to discuss mach time. You’re going to run into mach time in every corner of the operating system, so you might as well get comfortable with it. Mach time is processor-dependent and doesn’t represent wall-clock time. You can convert it into processor-independent units like nanoseconds, and to do that I suggest reading the Q&A article. I’ll wait. Core Audio also provides routines for manipulating time stamps.

Prior to 10.4, NSTimer was often used to drive rendering, but the practice is no longer recommended. According to the NSTimer documentation “[…] the effective resolution of the time interval for a timer is limited to on the order of 50-100 milliseconds.” That is simply not good enough.

I’ve seen people enable vsync and count elapsed frames, or grab time stamps immediately after a swap-buffers command. This can work, but not particularly well. What we really want to know is when our frame will actually be displayed, so we can accurately place time stamps on the recorded EEG data.

Instead, this Q&A article shows you how to set up a CVDisplayLink, which lets you prepare a frame for a given time stamp. As of 10.4 this is the preferred way of scheduling rendering. Note that your drawing code will be running on a separate thread, so plan accordingly.
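As a sketch of the setup (MyRenderCallback is a name I made up; on macOS this compiles against the CoreVideo framework):

```
#import <CoreVideo/CoreVideo.h>

// Called on a display-link thread, not the main thread.
// inOutputTime->hostTime is the mach time at which the frame
// you are about to draw is expected to hit the glass.
static CVReturn MyRenderCallback(CVDisplayLinkRef displayLink,
                                 const CVTimeStamp *inNow,
                                 const CVTimeStamp *inOutputTime,
                                 CVDisplayLinkOptionFlags flagsIn,
                                 CVDisplayLinkOptionFlags *flagsOut,
                                 void *context)
{
    // Render the frame scheduled for inOutputTime->hostTime here.
    return kCVReturnSuccess;
}

// Somewhere in your setup code:
CVDisplayLinkRef link;
CVDisplayLinkCreateWithActiveCGDisplays(&link);
CVDisplayLinkSetOutputCallback(link, MyRenderCallback, NULL);
CVDisplayLinkStart(link);
```

The hostTime in the output time stamp is what you log against your EEG record, since it tells you when the frame should appear rather than when you drew it.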

In the above Q&A article, the call to [[self openGLContext] setValues:&swapInt forParameter:NSOpenGLCPSwapInterval] turns on vertical sync, which makes the display link’s requests line up with vertical sync times.

With just these Q&As you’ve got nearly everything you need. There is still one more component: input. You could place a button on a view and grab a time stamp when your selector gets called, but what you really want is the time the mouse click occurred. We can get this by subclassing NSButton and overriding - (void)mouseDown:(NSEvent *)event (don’t forget to call [super mouseDown:event]). NSEvent has a method named timestamp that gives us back the time the event occurred, in seconds. This is really the mach time of the event, converted to seconds and provided as a floating-point value.
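A minimal subclass might look like this (TimedButton is a name of my own):

```
#import <Cocoa/Cocoa.h>

@interface TimedButton : NSButton
@end

@implementation TimedButton

- (void)mouseDown:(NSEvent *)event
{
    // -timestamp is seconds since system startup, derived from the
    // mach time at which the event occurred.
    NSTimeInterval when = [event timestamp];
    NSLog(@"click at %f s", when);

    // Don't forget this, or the button stops behaving like a button.
    [super mouseDown:event];
}

@end
```

Because the timestamp and your display-link host times both come from the same mach clock, you can relate clicks and frames directly once both are converted to seconds.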

You should now have everything you need to write accurately timed experiments. I’ll post code at some point in the future, but it may take some time; I’m thesising.

Good luck.