
Dini test


In mathematics, the Dini and Dini–Lipschitz tests are highly precise tests that can be used to prove that the Fourier series of a function converges at a given point. These tests are named after Ulisse Dini and Rudolf Lipschitz.[1]

Definition


Let f be a function on [0, 2π], let t be some point and let δ be a positive number. We define the local modulus of continuity at the point t by

    \omega_f(\delta; t) = \max_{|\varepsilon| \le \delta} \left| f(t) - f(t + \varepsilon) \right|.

Notice that we consider here f to be a periodic function; for example, if t = 0 and ε is negative, then we define f(ε) = f(2π + ε).

The global modulus of continuity (or simply the modulus of continuity) is defined by

    \omega_f(\delta) = \max_{t} \omega_f(\delta; t).
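For instance, take f to be the 2π-periodic triangle wave given by f(x) = |x| for x ∈ [−π, π]. Since f is 1-Lipschitz, |f(t) − f(t + ε)| ≤ |ε| for every t and ε, and equality is attained at t = 0 with ε = ±δ, so for this f

    \omega_f(\delta; 0) = \delta \quad \text{and} \quad \omega_f(\delta) = \delta \qquad (0 < \delta \le \pi).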

With these definitions we may state the main results:

Theorem (Dini's test): Assume a function f satisfies at a point t that

    \int_0^\pi \frac{1}{\delta}\, \omega_f(\delta; t) \, d\delta < \infty.

Then the Fourier series of f converges at t to f(t).

For example, the theorem holds with ω_f(δ) = log^{−2}(1/δ) but does not hold with ω_f(δ) = log^{−1}(1/δ).
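Only the behaviour of the integrand near δ = 0 matters here; substituting u = log(1/δ), so that dδ/δ = −du, turns the Dini integral over (0, 1/2) into an integral over (log 2, ∞):

    \int_0^{1/2} \frac{d\delta}{\delta \log^{2}(1/\delta)} = \int_{\log 2}^{\infty} \frac{du}{u^{2}} < \infty,
    \qquad
    \int_0^{1/2} \frac{d\delta}{\delta \log(1/\delta)} = \int_{\log 2}^{\infty} \frac{du}{u} = \infty,

so the first modulus satisfies the Dini condition while the second does not.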

Theorem (the Dini–Lipschitz test): Assume a function f satisfies

    \omega_f(\delta) = o\!\left( \bigl(\log(1/\delta)\bigr)^{-1} \right) \quad \text{as } \delta \to 0^{+}.

Then the Fourier series of f converges uniformly to f.

In particular, any function that obeys a Hölder condition satisfies the Dini–Lipschitz test.
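Indeed, if f satisfies a Hölder condition of order α > 0, say ω_f(δ) ≤ C δ^α for some constant C, then

    \omega_f(\delta) \, \log(1/\delta) \le C \, \delta^{\alpha} \log(1/\delta) \longrightarrow 0 \quad \text{as } \delta \to 0^{+},

so ω_f(δ) = o((log(1/δ))^{−1}). Such a function also satisfies the Dini test at every point, since ω_f(δ; t) ≤ C δ^α and ∫_0^π C δ^{α−1} dδ = C π^α / α < ∞.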

Precision


Both tests are the best of their kind. For the Dini–Lipschitz test, it is possible to construct a function f whose modulus of continuity satisfies the test with O instead of o, i.e.

    \omega_f(\delta) = O\!\left( \bigl(\log(1/\delta)\bigr)^{-1} \right),

and the Fourier series of f diverges. For the Dini test, the statement of precision is slightly longer: it says that for any function Ω such that

    \int_0^\pi \frac{1}{\delta}\, \Omega(\delta) \, d\delta = \infty,

there exists a function f such that

    \omega_f(\delta; 0) \le \Omega(\delta)
and the Fourier series of f diverges at 0.


References

  1. Gustafson, Karl E. (1999), Introduction to Partial Differential Equations and Hilbert Space Methods, Courier Dover Publications, p. 121, ISBN 978-0-486-61271-3.