Dispersion of electromagnetic waves is usually described by an integrodifferential equation. We show that whenever a differential operator can be found that annihilates the susceptibility kernel of the medium, dispersion can instead be modeled by a partial differential equation containing no nonlocal operators.
© 1998 Optical Society of America
Original Manuscript: December 15, 1997
Revised Manuscript: March 23, 1998
Manuscript Accepted: March 24, 1998
Published: August 1, 1998
R. L. Ochs and G. Kristensson, "Using local differential operators to model dispersion in dielectric media," J. Opt. Soc. Am. A 15, 2208-2215 (1998)
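The idea in the abstract can be sketched with a standard textbook example (an assumption for illustration, not necessarily the model treated in the paper): the Debye susceptibility kernel χ(t) = (χ₀/τ)e^(−t/τ) is annihilated by the differential operator d/dt + 1/τ, so the nonlocal polarization P(t) = ∫₀ᵗ χ(t−s)E(s) ds satisfies the local equation τ dP/dt + P = χ₀E(t). A minimal numerical check that the two routes agree:

```python
import numpy as np

# Hedged sketch: for the Debye kernel chi(t) = (chi0/tau) * exp(-t/tau),
# the operator L = d/dt + 1/tau annihilates chi, so the convolution
#   P(t) = integral_0^t chi(t-s) E(s) ds
# obeys the local ODE  tau * dP/dt + P = chi0 * E(t).
# All constants below are illustrative, not taken from the paper.

chi0, tau = 2.0, 0.5          # susceptibility strength and relaxation time
dt = 1e-4
t = np.arange(0.0, 2.0, dt)
E = np.sin(3.0 * t)           # arbitrary smooth driving field

# Nonlocal route: evaluate the convolution integral directly.
chi = (chi0 / tau) * np.exp(-t / tau)
P_conv = np.convolve(chi, E)[: t.size] * dt

# Local route: integrate tau * P' + P = chi0 * E with explicit Euler.
P_ode = np.zeros_like(t)
for n in range(t.size - 1):
    P_ode[n + 1] = P_ode[n] + dt * (chi0 * E[n] - P_ode[n]) / tau

# The two polarizations differ only by discretization error.
print(np.max(np.abs(P_conv - P_ode)))
```

The same replacement works for any kernel annihilated by a constant-coefficient operator; the order of the resulting local equation equals the order of that operator.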