I am writing an audio player for OS X. One view is a custom view that displays a waveform. The waveform is stored in an instance variable of type NSImage, backed by an NSBitmapImageRep. The view also displays a progress indicator (a thick red line), so it is updated/redrawn every 30 milliseconds.
Since recalculating the image takes rather long, I do that in a background thread after every window resize and swap in the new image once it is ready. In the meantime, the existing image is scaled to fit the view like this:
// The drawing rectangle is slightly smaller than the view, inset by
// the two margins.
NSRect drawingRect;
drawingRect.origin = NSMakePoint(sideEdgeMarginWidth, topEdgeMarginHeight);
drawingRect.size   = NSMakeSize([self bounds].size.width  - 2 * sideEdgeMarginWidth,
                                [self bounds].size.height - 2 * topEdgeMarginHeight);
[waveform drawInRect:drawingRect
            fromRect:NSZeroRect
           operation:NSCompositeSourceOver
            fraction:1.0];
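For reference, the background recalculation is dispatched roughly like this (simplified sketch; `recalculateWaveformImageForSize:` stands in for my actual rendering code):

```objc
// Simplified sketch of the background recalculation. The actual
// rendering in -recalculateWaveformImageForSize: is not shown here.
- (void)viewDidEndLiveResize
{
    NSSize newSize = [self bounds].size;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSImage *newImage = [self recalculateWaveformImageForSize:newSize];
        dispatch_async(dispatch_get_main_queue(), ^{
            // Swap in the freshly rendered image on the main thread
            // and trigger a redraw.
            waveform = newImage;
            [self setNeedsDisplay:YES];
        });
    });
}
```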
The view makes up the biggest part of the window. During live resize, the audio starts choking. Selecting the "big" graphics card on my MacBook Pro makes it less bad, but not by much. CPU utilization is around 20-40% during live resizes.
Instruments suggests that rescaling/redrawing the image is the problem. Once I stop resizing the window, CPU utilization drops and the audio stops glitching.
I already tried disabling image interpolation to speed up the drawing, like this:
[[NSGraphicsContext currentContext]
    setImageInterpolation:NSImageInterpolationNone];
That helps, but the audio still chokes during live resizes.
Does anyone have an idea how to improve this? The main goal is to keep the audio from choking.