DSP - Filtering frequencies using DFT
- by Trap
I'm trying to implement a DFT-based 8-band equalizer, purely as a learning exercise. To prove that my DFT implementation works, I fed it an audio signal, analyzed it, and then resynthesized it with no modifications made to the frequency spectrum. So far so good.
I'm using the so-called 'standard way of calculating the DFT', which is by correlation. This method produces real and imaginary parts, each N/2 + 1 samples in length. To attenuate a frequency I'm just doing:
float atnFactor = 0.6;
Re[k] *= atnFactor;
Im[k] *= atnFactor;
where 'k' is an index in the range 0 to N/2, but what I get after resynthesis is a slightly distorted signal, especially at low frequencies.
The input signal sample rate is 44.1 kHz, and since I just want an 8-band equalizer I'm feeding the DFT 16 samples at a time, so I have 8 frequency bins to play with.
Can someone show me what I'm doing wrong? I tried to find info on this subject on the internet but couldn't find any.
Thanks in advance.