DSP - Filter sweep effect

Posted by Trap on Stack Overflow
Published on 2010-06-16T17:45:41Z

I'm implementing a 'filter sweep' effect (I don't know if that's the right name for it). What I do is basically create a low-pass filter and make its cut-off frequency 'move' along a certain frequency range.

To calculate the filter cut-off frequency at a given moment I use a user-provided linear function, which yields values between 0 and 1.

My first attempt was to directly map the values returned by the linear function to the range of frequencies, as in cf = freqRange * lf(x). Although it worked OK, it sounded as if the sweep ran much faster while moving through the low frequencies and then slowed down on its way to the high-frequency zone. I'm not sure why this is, but I guess it has something to do with human hearing perceiving changes in frequency non-linearly.
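To illustrate the problem, here is a minimal sketch of that linear mapping (the bounds `f_min`/`f_max` and the identity `lf` are assumptions for the example, not from the original post). Equal steps in x give equal steps in Hz, but the early steps span far more octaves than the late ones, which is why the sweep sounds fast at the low end:

```python
import numpy as np

# Hypothetical sweep bounds (assumptions for illustration).
f_min, f_max = 20.0, 20000.0
freq_range = f_max - f_min

def lf(x):
    """Stand-in for the user-provided linear function: [0, 1] -> [0, 1]."""
    return x

# Direct linear mapping: cf = f_min + freqRange * lf(x).
positions = np.linspace(0.0, 1.0, 5)
cutoffs = f_min + freq_range * lf(positions)

# Equal steps in Hz, but wildly unequal steps in octaves:
octaves = np.log2(cutoffs[1:] / cutoffs[:-1])
print(cutoffs)   # [   20.  5015. 10010. 15005. 20000.]
print(octaves)   # first quarter ~8 octaves, last quarter ~0.4 octave
```

The first quarter of the sweep covers roughly eight octaves while the last quarter covers well under one, so perceptually almost the whole sweep happens at the start.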

My next attempt was to move the filter's cut-off frequency logarithmically. It works much better now, but I still feel that the filter doesn't move at a constant perceived speed across the range of frequencies.

How should I divide the frequency range so that the sweep moves at a constant perceived speed?

Thanks in advance.

© Stack Overflow or respective owner
