
Fast Fourier Transform (FFT) and Big Data


The FFT was a forerunner of asymptotic algorithmic analysis and of the drive to reduce computational complexity.


The most direct way to compute a Fourier transform numerically takes O(n²) operations. The Fast Fourier Transform (FFT) can compute the same result in O(n log n) operations. If n is large, this can be a huge improvement.
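To make the contrast concrete, here is a minimal sketch in Python (not from the original article) comparing a direct O(n²) DFT with a radix-2 Cooley-Tukey FFT, assuming the input length is a power of two:

```python
import cmath

def dft(x):
    """Direct DFT: O(n^2) operations."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def fft(x):
    """Radix-2 Cooley-Tukey FFT: O(n log n) operations; len(x) must be a power of 2."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])   # transform of even-indexed samples
    odd = fft(x[1::2])    # transform of odd-indexed samples
    w = [cmath.exp(-2j * cmath.pi * k / n) for k in range(n // 2)]  # twiddle factors
    return ([even[k] + w[k] * odd[k] for k in range(n // 2)] +
            [even[k] - w[k] * odd[k] for k in range(n // 2)])

# Sanity check: both approaches agree up to rounding error.
x = [1.0, 2.0, 3.0, 4.0, 0.0, 0.0, 0.0, 0.0]
print(all(abs(a - b) < 1e-9 for a, b in zip(dft(x), fft(x))))
```

The saving comes from the recursion: each level does O(n) work, and there are only log₂ n levels, versus n full passes over the data in the direct method.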

James Cooley and John Tukey (re)discovered the FFT in 1965. It was thought to be an original discovery at the time. Only later did someone find a sketch of the algorithm in the papers of Gauss.

Daniel Rockmore wrote the article on the Fast Fourier Transform in The Princeton Companion to Applied Mathematics:

[Cooley] told me that he believed that the Fast Fourier transform could be thought of as one of the inspirations for asymptotic algorithmic analysis and the study of computational complexity...

And in the new world of 1960s 'Big Data,' a clever reduction in computational complexity could make a tremendous difference.
Topics: big data analytics, mathematical programming, complexity, algorithms

Published at DZone with permission of John Cook, DZone MVB. See the original article here.

