Small-o(n^2) implementation of Polynomial Multiplication
Posted by AlanTuring on Stack Overflow, 2012-11-12.
I'm having a little trouble with this problem, which is listed at the back of my book. I'm currently in the middle of test prep, but I can't seem to find anything about it in the book.
Anyone got an idea?
A real polynomial of degree n is a function of the form f(x) = a_n x^n + ... + a_1 x + a_0, where a_n, ..., a_1, a_0 are real numbers. In computational situations, such a polynomial is represented by a sequence of its coefficients (a_0, a_1, ..., a_n). Assuming that any two real numbers can be added/multiplied in O(1) time, design an o(n^2)-time algorithm to compute, given two real polynomials f(x) and g(x) both of degree n, the product h(x) = f(x)g(x). Your algorithm should **not** be based on the Fast Fourier Transform (FFT) technique.
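For concreteness, here is a minimal sketch of the straightforward schoolbook multiplication, the Theta(n^2) baseline the exercise asks us to beat. The function name and the list-of-coefficients representation are my own choices, not from the book:

```python
def poly_mult_naive(a, b):
    """Schoolbook product of two coefficient lists (a[i] is the coefficient
    of x^i): h[k] = sum of a[i] * b[j] over all i + j = k.  Theta(n^2)."""
    h = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            h[i + j] += ai * bj
    return h
```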
Please note it needs to be small-o(n^2), meaning its complexity must be strictly sub-quadratic.
The obvious solution I keep finding is indeed the FFT, but of course I can't use that. Another method I have found is convolution: treat polynomial A as a signal and polynomial B as a filter; passing A through B yields a shifted, "smoothed" signal, and the result is A*B. This is supposed to work in O(n log n) time. Of course I am completely unsure of the implementation.
If anyone has any ideas on how to achieve a small-o(n^2) implementation, please do share. Thanks.
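For what it's worth, the classic non-FFT way to get below quadratic time is Karatsuba-style divide and conquer, which runs in O(n^log2(3)), roughly O(n^1.585), and hence is o(n^2). Below is a rough sketch reusing poly_mult_naive from above; the helper names (_poly_add, _poly_sub) and the cutoff of 16 are my own choices, so treat this as an illustration of the idea rather than a polished implementation:

```python
def poly_mult_karatsuba(a, b):
    """Karatsuba product of two coefficient lists (a[i] is the coefficient
    of x^i).  Three half-size products plus O(n) bookkeeping gives
    T(n) = 3 T(n/2) + O(n), i.e. O(n^log2(3)) ~ O(n^1.585) = o(n^2).
    The result may carry trailing zero coefficients."""
    n = max(len(a), len(b))
    if n <= 16:                           # small inputs: schoolbook is fine
        return poly_mult_naive(a, b)
    # Pad both operands to the same length so the split below lines up.
    a = a + [0.0] * (n - len(a))
    b = b + [0.0] * (n - len(b))
    m = n // 2
    a_lo, a_hi = a[:m], a[m:]             # a = a_lo + a_hi * x^m
    b_lo, b_hi = b[:m], b[m:]

    low  = poly_mult_karatsuba(a_lo, b_lo)    # a_lo * b_lo
    high = poly_mult_karatsuba(a_hi, b_hi)    # a_hi * b_hi
    # One extra product replaces the two cross products:
    # (a_lo + a_hi)(b_lo + b_hi) - low - high = a_lo*b_hi + a_hi*b_lo
    mid   = poly_mult_karatsuba(_poly_add(a_lo, a_hi), _poly_add(b_lo, b_hi))
    cross = _poly_sub(_poly_sub(mid, low), high)

    # Recombine: low + cross * x^m + high * x^(2m)
    h = [0.0] * (2 * n - 1)
    for i, c in enumerate(low):
        h[i] += c
    for i, c in enumerate(cross):
        h[i + m] += c
    for i, c in enumerate(high):
        h[i + 2 * m] += c
    return h


def _poly_add(p, q):
    """Coefficient-wise sum of two polynomials of possibly different lengths."""
    length = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0.0) + (q[i] if i < len(q) else 0.0)
            for i in range(length)]


def _poly_sub(p, q):
    """Coefficient-wise difference of two polynomials of possibly different lengths."""
    length = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0.0) - (q[i] if i < len(q) else 0.0)
            for i in range(length)]
```

As a quick check, poly_mult_karatsuba([1, 2, 3], [4, 5]) should give [4, 13, 22, 15], the coefficients of (1 + 2x + 3x^2)(4 + 5x); on larger inputs the recursive path may leave trailing zero coefficients in the result.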