Seminar: July 8, 4:30pm, Ry 277
Martin Fürer, Pennsylvania State University
How fast can we multiply?
In 1962, Karatsuba presented an integer multiplication algorithm that multiplies two binary or decimal numbers in less than quadratic time, showing that school multiplication is not optimal. Faster methods followed quickly. All are based on some version of the Chinese Remainder Theorem. Most of them reduce integer multiplication to fast polynomial multiplication, which is itself a scheme of polynomial evaluation, followed by pointwise multiplication of the values and interpolation.
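To make the sub-quadratic idea concrete, here is a minimal Python sketch of Karatsuba's scheme. It splits each number at a power of two rather than working digit by digit, and the function name and base-case cutoff are illustrative choices, not part of the talk. The point is that one recursive call is saved: three half-size multiplications replace four, giving the familiar O(n^{log2 3}) bound.

    def karatsuba(x, y):
        """Multiply nonnegative integers x and y with Karatsuba's divide-and-conquer."""
        if x < 10 or y < 10:                  # small inputs: fall back to built-in multiplication
            return x * y
        half = max(x.bit_length(), y.bit_length()) // 2
        mask = (1 << half) - 1
        x_hi, x_lo = x >> half, x & mask      # x = x_hi * 2^half + x_lo
        y_hi, y_lo = y >> half, y & mask      # y = y_hi * 2^half + y_lo
        a = karatsuba(x_hi, y_hi)             # high product
        b = karatsuba(x_lo, y_lo)             # low product
        c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b   # cross terms from one extra multiplication
        return (a << (2 * half)) + (c << half) + b

    assert karatsuba(123456789, 987654321) == 123456789 * 987654321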
For more than 35 years, the fastest known multiplication method has been the Schönhage-Strassen algorithm. It is based on the fast Fourier transform (FFT) and runs in time O(n log n log log n). Only in 2007 did a faster algorithm come closer to the conjectured optimal running time of O(n log n). The running time bounds hold for multi-tape Turing machines; the same bounds are valid for the size of Boolean circuits. Interestingly, this latest improvement would have come much earlier and more naturally if there were an integer k > 0 such that for every m > 0, there is a Fermat prime between the Fermat numbers F_m and F_{2^{m+k}}.
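The sketch below only illustrates the evaluate-multiply-interpolate pattern behind the FFT-based methods; it is not the Schönhage-Strassen algorithm itself, which works with exact arithmetic in a suitable ring rather than floating-point numbers. The digit encoding and the function name fft_multiply are assumptions made for the example, and rounding limits this toy to moderately sized inputs.

    import numpy as np

    def fft_multiply(x, y):
        """Multiply nonnegative integers by treating their decimal digits as
        polynomial coefficients, convolving them with a floating-point FFT,
        and then propagating carries."""
        a = [int(d) for d in str(x)][::-1]    # least-significant digit first
        b = [int(d) for d in str(y)][::-1]
        size = 1
        while size < len(a) + len(b):
            size *= 2
        fa = np.fft.rfft(a, size)             # evaluate both polynomials at roots of unity
        fb = np.fft.rfft(b, size)
        coeffs = np.rint(np.fft.irfft(fa * fb, size)).astype(np.int64)  # multiply values, interpolate
        result, carry = 0, 0
        for i, c in enumerate(coeffs):        # carry propagation turns coefficients into an integer
            carry += int(c)
            result += (carry % 10) * 10**i
            carry //= 10
        return result + carry * 10**size

    assert fft_multiply(123456789, 987654321) == 123456789 * 987654321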