[Figure: The first four partial sums of the Fourier series for a square wave.]
In mathematics, a Fourier series (English pronunciation: /ˈfɔərieɪ/) decomposes periodic functions or periodic signals into the sum of a (possibly infinite) set of simple oscillating functions, namely sines and cosines (or complex exponentials). The study of Fourier series is a branch of Fourier analysis.
Definition
In this section, s(x) denotes a function of the real variable x, and s is integrable on an interval [x0, x0 + P], for real numbers x0 and P. We will attempt to represent s in that interval as an infinite sum, or series, of harmonically related sinusoidal functions. Outside the interval, the series is periodic with period P. It follows that if s also has that property, the approximation is valid on the entire real line. The case P = 2π is prominently featured in the literature, presumably because it affords a minor simplification, but at the expense of generality.
[Figure: Function s(x) (in red) is a sum of six sine functions of different amplitudes and harmonically related frequencies. Their summation is called a Fourier series. The Fourier transform, S(f) (in blue), which depicts amplitude vs frequency, reveals the 6 frequencies and their amplitudes.]

For integers N > 0, the following summation is a periodic function with period P:

    s_N(x) = \frac{A_0}{2} + \sum_{n=1}^{N} A_n \sin\left(\frac{2\pi n x}{P} + \phi_n\right).

Using the identities

    \sin\left(\frac{2\pi n x}{P} + \phi_n\right) = \sin(\phi_n)\cos\left(\frac{2\pi n x}{P}\right) + \cos(\phi_n)\sin\left(\frac{2\pi n x}{P}\right),

    \sin\left(\frac{2\pi n x}{P} + \phi_n\right) = \operatorname{Re}\left\{\frac{1}{i}\, e^{i\left(\frac{2\pi n x}{P} + \phi_n\right)}\right\} = \frac{1}{2i}\, e^{i\left(\frac{2\pi n x}{P} + \phi_n\right)} + \overline{\left(\frac{1}{2i}\, e^{i\left(\frac{2\pi n x}{P} + \phi_n\right)}\right)},

we can also write the function in these equivalent forms:

    s_N(x) = \frac{a_0}{2} + \sum_{n=1}^{N} \left( a_n \cos\left(\frac{2\pi n x}{P}\right) + b_n \sin\left(\frac{2\pi n x}{P}\right) \right) = \sum_{n=-N}^{N} c_n\, e^{i \frac{2\pi n x}{P}},

where:

    c_n = \begin{cases} \dfrac{A_n}{2i}\, e^{i\phi_n} = \dfrac{1}{2}(a_n - i b_n) & \text{for } n > 0, \\ \dfrac{1}{2} a_0 & \text{for } n = 0, \\ \overline{c_{|n|}} & \text{for } n < 0. \end{cases}
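As a quick numerical illustration of the equivalence of the three forms above (an added sketch, not part of the original article; NumPy, the random seed, and the names A, phi, c are arbitrary choices), the following script builds the amplitude-phase form, the sine-cosine form with a_n = A_n sin(φ_n) and b_n = A_n cos(φ_n), and the complex-exponential form with the c_n defined above, and checks that they agree:

import numpy as np

# Added illustration: check numerically that the three forms of s_N(x) agree.
rng = np.random.default_rng(0)
P, N = 2.0, 5
A = rng.standard_normal(N + 1)            # amplitudes A_0 .. A_N
phi = rng.uniform(0, 2 * np.pi, N + 1)    # phases phi_1 .. phi_N (phi[0] unused)
x = np.linspace(0.0, P, 1000)

# Amplitude-phase form: s_N(x) = A_0/2 + sum_n A_n sin(2 pi n x / P + phi_n)
s1 = A[0] / 2 + sum(A[n] * np.sin(2 * np.pi * n * x / P + phi[n])
                    for n in range(1, N + 1))

# Sine-cosine form with a_n = A_n sin(phi_n), b_n = A_n cos(phi_n), a_0 = A_0
a = np.array([A[0]] + [A[n] * np.sin(phi[n]) for n in range(1, N + 1)])
b = np.array([0.0] + [A[n] * np.cos(phi[n]) for n in range(1, N + 1)])
s2 = a[0] / 2 + sum(a[n] * np.cos(2 * np.pi * n * x / P)
                    + b[n] * np.sin(2 * np.pi * n * x / P)
                    for n in range(1, N + 1))

# Complex-exponential form with the c_n defined above
c = {0: A[0] / 2}
for n in range(1, N + 1):
    c[n] = A[n] / (2j) * np.exp(1j * phi[n])
    c[-n] = np.conj(c[n])
s3 = sum(c[n] * np.exp(2j * np.pi * n * x / P) for n in range(-N, N + 1)).real

assert np.allclose(s1, s2) and np.allclose(s1, s3)

Each rewriting only regroups the same sinusoidal terms, so all three evaluations coincide.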
When the coefficients (known as Fourier coefficients) are computed as follows:[7]

    a_n = \frac{2}{P} \int_{x_0}^{x_0+P} s(x)\cos\left(\frac{2\pi n x}{P}\right)\,dx,

    b_n = \frac{2}{P} \int_{x_0}^{x_0+P} s(x)\sin\left(\frac{2\pi n x}{P}\right)\,dx,

    c_n = \frac{1}{P} \int_{x_0}^{x_0+P} s(x)\, e^{-i \frac{2\pi n x}{P}}\,dx,

s_N(x) approximates s(x) on [x_0, x_0 + P], and the approximation improves as N → ∞. The infinite sum, s_∞(x), is called the Fourier series representation of s. The Fourier series does not always converge, and even when it does converge at a particular value of x, the sum of the series may differ there from the value s(x) of the function.
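The coefficient formulas are easy to evaluate numerically. The sketch below is an added illustration rather than part of the article: it approximates a_n and b_n for a square wave (the function of the figure in the lead) with a simple Riemann sum over one period and then forms the partial sums s_N; NumPy and the names fourier_coefficients and partial_sum are arbitrary choices.

import numpy as np

# Added illustration: Fourier coefficients of a square wave by discretizing
# the integrals over one period, and the resulting partial sums s_N(x).
P = 2 * np.pi
x = np.linspace(0.0, P, 4096, endpoint=False)   # one period, uniform grid
dx = x[1] - x[0]
s = np.sign(np.sin(x))                          # square wave with period P

def fourier_coefficients(n):
    """a_n and b_n from the integral formulas, via a Riemann sum."""
    a_n = (2 / P) * np.sum(s * np.cos(2 * np.pi * n * x / P)) * dx
    b_n = (2 / P) * np.sum(s * np.sin(2 * np.pi * n * x / P)) * dx
    return a_n, b_n

def partial_sum(N):
    """s_N(x) = a_0/2 + sum_{n=1}^{N} (a_n cos + b_n sin)."""
    a0, _ = fourier_coefficients(0)
    total = np.full_like(x, a0 / 2)
    for n in range(1, N + 1):
        a_n, b_n = fourier_coefficients(n)
        total += a_n * np.cos(2 * np.pi * n * x / P) + b_n * np.sin(2 * np.pi * n * x / P)
    return total

# The approximation improves with N away from the jump discontinuities.
interior = (x > 0.5) & (x < np.pi - 0.5)
for N in (1, 3, 7, 15, 31):
    err = np.max(np.abs(partial_sum(N) - s)[interior])
    print(f"N = {N:2d}   max interior error = {err:.4f}")

For this square wave the computed coefficients approach a_n = 0 and b_n = 4/(nπ) for odd n, so the printed errors fall as N grows; near the jumps themselves the overshoot of the Gibbs phenomenon persists.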