Appendix A
Distribution Theory

We expect the reader to have some familiarity with the topics of distribution theory and Fourier transforms, as well as the theory of functions of a complex variable. However, to provide a bridge between texts that deal with these topics and this textbook, we have included several appendices. It is our intention that this appendix, and those that follow, perform the function of outlining our notational conventions, as well as providing a guide that the reader may follow in finding relevant topics to study to supplement this text. As a result, some precision in language has been sacrificed in favor of understandability through heuristic descriptions. Our main goals in this appendix are to provide background material as well as mathematical justifications for our use of "singular functions" and "bandlimited delta functions," which are distributional objects not normally discussed in textbooks.

A.1 Introduction

Distributions provide a mathematical framework that can be used to satisfy two important needs in the theory of partial differential equations. First, quantities that have point duration and/or act at a point location may be described through the use of the traditional "Dirac delta function" representation. Such quantities find a natural place in representing energy sources as the forcing functions of partial differential equations. In this usage, distributions can "localize" the values of functions to specific spatial and/or temporal values, representing the position and time at which the source acts in the domain of a problem under consideration.

The second important role of distributions is to extend the process of differentiation to functions that fail to be differentiable in the classical sense at isolated points in their domains of definition. In this role, distributions may localize the singular behavior of the derivatives of functions, for example, the singularities of the derivative caused by jump discontinuities. In a related role, the process of differentiation itself may be localized via distributions to specific positions and/or temporal values in a domain of interest. For example, dipole sources may be distributionally represented by localizing the derivative operation to a point. Or, an "exploding reflector" may be represented by localizing the derivative normal to a given surface to points on that surface.

A.2 Localization via Dirac Delta functions

The classical definition of the Dirac delta "function" originally used by physicists was

$$
\delta(x) = \begin{cases} 0, & |x| \neq 0, \\ \infty, & x = 0, \end{cases}
\tag{A.2.1}
$$

with the property that the "delta function" would not introduce any scaling of its own if it appeared in an integral. This second property is expressed by the integral

$$
\int_{-\infty}^{\infty} \delta(x)\, dx = 1,
$$

where we may think of $\delta(x)$ as being integrated with a function (not shown) whose value is unity near $x = 0$. The definition and property above are contradictory as written, because any function that is nonzero at only a finite number of isolated points has an integral that vanishes, even if the nonzero values of the function are arbitrarily large. The modern approach to distributions, based on the work of Laurent Schwartz [Schwartz, 1950-1951], follows from a consideration of the theory of topological vector spaces. One result of this theory identifies $\delta(x)$ as a "measure." (A measure is a weight applied in the Lebesgue formulation of the integral.)
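The contradiction just described can be made concrete numerically. The following sketch (the grid sizes, the spike height, and the pulse width are our own illustrative choices, not taken from the text) compares a spike of fixed, arbitrarily large height supported at a single grid point, whose Riemann sum shrinks toward zero as the grid is refined, with a narrow pulse normalized to unit area, whose integral stays at one however narrow it becomes. It is the second behavior, captured by a whole family of shrinking pulses, that the delta "function" is meant to encode.

```python
import numpy as np

def riemann_integral(values, dx):
    """Plain Riemann-sum approximation of an integral on a uniform grid."""
    return float(np.sum(values) * dx)

for npts in (1_001, 10_001, 100_001):
    x = np.linspace(-1.0, 1.0, npts)
    dx = x[1] - x[0]

    # "Classical" delta in the spirit of (A.2.1): a huge but *fixed* value
    # at x = 0, zero everywhere else.
    spike = np.zeros(npts)
    spike[npts // 2] = 1.0e9

    # Normalized pulse of width 0.01: height chosen so the area is one.
    width = 0.01
    pulse = np.where(np.abs(x) <= width / 2, 1.0 / width, 0.0)

    print(f"npts={npts:7d}  spike integral={riemann_integral(spike, dx):.3e}  "
          f"pulse integral={riemann_integral(pulse, dx):.3f}")

# The spike column is roughly 1e9 * dx and shrinks toward zero as the grid is
# refined; the pulse column stays close to 1. Only a family of shrinking,
# area-one pulses (a delta sequence) reproduces both properties demanded of
# delta(x).
```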
Such modern treatments may be found in other, more advanced texts [Friedlander, 1982; Petersen, 1983; Hörmander, 1983; Treves, 1967; Treves, 1980]. While this material is far outside the scope of our book, the exposition we present does, in fact, depend on these more advanced results. In particular, we need to have the ability to approximate functions and distributions, to an arbitrary degree of precision, with sequences of smooth functions. This ability is guaranteed by theorems that are beyond the scope of our text.

We will take a more classical approach through so-called "delta sequences." Such sequences of functions are discussed in mathematical physics and applied mathematics texts, such as Butkov [1968], Nussenzveig [1972], and Stakgold [1979]. This is the "generalized function" approach to describing distributions, and is also largely founded on the work of Schwartz.

Distributions As the Limits of Delta Sequences

The basic idea is to consider distributions to be a generalization of the concept of function. The way this is done is to consider a distribution, or rather the result of integration of a function with a distribution, as being a "linear functional." A functional is an operator that assigns a number to each function of a particular class of test functions. (The operation is linear because the linear combination of values assigned to several functions is the same as the value assigned to a linear combination of the same functions.) We then create a sequence of linear functionals, defined as integration with elements of a sequence of functions $\{S_n(x)\}$, in such a way that as $n \to \infty$, the limiting functional behaves the same way that the distribution in question would behave. In this way, the limiting functional can make sense even though the pointwise limit of the sequence $\{S_n(x)\}$ as $n \to \infty$ does not exist as an ordinary function. This is one way to arrive at distributions such as $\delta(x)$. The properties of distributions can be determined only through integration with a "sufficiently well behaved" test function out of an appropriate class.

For example, the reader may have already encountered the "box sequence." The graph of each function in the sequence is a box, with the integral of each function being equal to one. As $n$ increases, the width of each successive box decreases towards zero, while the height, necessarily, must increase beyond all bounds. If we allow infinity as a limiting value, then this sequence leads back to the "function" introduced in (A.2.1). Furthermore,

$$
\lim_{n \to \infty} \int_{-\infty}^{\infty} f(x)\, S_n(x)\, dx = f(0),
$$

for any $f(x)$ in an appropriate class of test functions. This is the "sifting property" of the Dirac delta function.

As a second example, and one that arises in various forms in the text, we introduce the sequence

$$
S_n(x) = \begin{cases} \dfrac{\sin(n\pi x)}{\pi x}, & x \neq 0, \\[4pt] n, & x = 0, \end{cases}
\qquad n = 1, 2, \cdots.
$$

This sequence of functions does not have a pointwise limit (except for infinity at $x = 0$), yet it can be shown that this sequence has the same sifting property as did the sequence of the previous example [Sneddon, 1972].

Let us now address the question of the appropriate class of test functions. From the standard theory of distributions, these are defined to be those functions that are infinitely differentiable and have "compact support," meaning that each function is nonzero on a closed and bounded interval, but is zero everywhere else.
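Before giving the formal name for this class of test functions, we pause for a small numerical check of the sifting property for the two delta sequences introduced above, the box sequence and $S_n(x) = \sin(n\pi x)/(\pi x)$. The test function, the truncated integration interval, and the grid spacing below are our own illustrative choices; the sketch only suggests the limiting behavior, it does not prove it.

```python
import numpy as np

def box_seq(n):
    """Box of width 1/n and height n, so its area is one for every n."""
    return lambda x: np.where(np.abs(x) <= 0.5 / n, float(n), 0.0)

def sinc_seq(n):
    """S_n(x) = sin(n*pi*x)/(pi*x); note that numpy's sinc(t) = sin(pi*t)/(pi*t)."""
    return lambda x: n * np.sinc(n * x)

def f(x):
    """A smooth, rapidly decaying stand-in for a test function, with f(0) = 1."""
    return np.cos(x) * np.exp(-x**2)

x = np.linspace(-50.0, 50.0, 1_000_001)   # wide interval, fine grid
dx = x[1] - x[0]
fx = f(x)

for n in (1, 10, 100):
    box_val = float(np.sum(fx * box_seq(n)(x)) * dx)
    sinc_val = float(np.sum(fx * sinc_seq(n)(x)) * dx)
    print(f"n={n:4d}   box: {box_val:.5f}   sinc: {sinc_val:.5f}")

# Both columns tend to f(0) = 1 as n grows, even though sin(n*pi*x)/(pi*x)
# has no pointwise limit away from x = 0.
```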
Formally, test functions having both of these properties (infinite differentiability and compact support) are called $C_0^\infty$ test functions. Such a function and all of its derivatives will vanish smoothly (that is to say, without singularities) outside of some closed and bounded interval. In fact, the name "test function" is reserved exclusively for the $C_0^\infty$ functions in much of the mathematical literature that deals with the subject of partial differential equations. By restricting the test functions to be extraordinarily well-behaved, an extremely wide range of entities can act as "distributions," because a broad class of functions, and other mathematical objects, including all delta sequence functions and their limits, can be integrated with a $C_0^\infty$ function.

The reader may have some question as to the validity of this approach, however. Is it not our interest to create distributions that will operate on ordinary functions? The answer to this question is that it is a standard trick in functional analysis to derive results for extremely well-behaved functions and then extend those results to "ordinary" functions by showing that ordinary functions may be approximated by the well-behaved ones. The validity of such approximations depends on powerful theorems from functional analysis that show that well-behaved functions exist "arbitrarily closely" to any ordinary function that we may choose. One such result is that any continuous function $f(x)$ with compact support can be uniformly approximated on a finite interval by $C_0^\infty$ functions. For example, if $\phi_n(x)$ represents a delta sequence, that is, it behaves like $\delta(x)$ as $n \to \infty$, and is nonzero in the domain of support of $f(x)$, then it is possible to write

$$
\psi_n(x) = \int_{-\infty}^{\infty} f(\xi)\, \phi_n(x - \xi)\, d\xi.
\tag{A.2.2}
$$

This is just the convolution of our original function with an appropriately chosen test function. There exists an $n$ such that the $C_0^\infty$ function $\psi_n(x)$ approximates $f(x)$ to any desired order of accuracy. This process of building smoother approximate functions through convolution with a test function is called "regularization" in the language of mathematical analysis.

In the language of signal processing, this process is equivalent to applying a "smoothing filter" that preserves the values of the function on some part of its domain. That is, in the frequency domain, we multiply by a function that is (nearly) equal to unity over some bandwidth and then tapers smoothly to zero. We use the term "neutralizer" for such functions in several places in the text. In the mathematical literature, these functions are often referred to as "mollifiers," "approximate identity functions," or "partitions of unity."

All of the properties of the delta function (really, the Dirac distribution) that are implied by the physicist's original formal definition may now be derived, with the primary property being the sifting property, introduced above,

$$
\phi(0) = \int_{-\infty}^{\infty} dx\, \phi(x)\, \delta(x).
$$
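As an illustration of the regularization (A.2.2), the following sketch convolves the continuous but non-differentiable function $f(x) = |x|$ with a rescaled $C_0^\infty$ bump. The particular bump $\exp(-1/(1 - x^2))$, the grid, and the error measure are our own choices for the purpose of illustration and are not prescribed by the text.

```python
import numpy as np

x = np.linspace(-2.0, 2.0, 4001)           # uniform grid
dx = x[1] - x[0]

def bump(t):
    """A standard C_0^infinity bump: exp(-1/(1 - t^2)) on |t| < 1, zero outside."""
    out = np.zeros_like(t)
    inside = np.abs(t) < 1.0
    out[inside] = np.exp(-1.0 / (1.0 - t[inside] ** 2))
    return out

def mollify(f_vals, n):
    """Discrete version of (A.2.2): psi_n = f convolved with phi_n,
    where phi_n(t) is the bump scaled to half-width 1/n and unit area."""
    m = int(round(1.0 / (n * dx)))         # half-width of phi_n in grid points
    t = np.linspace(-m * dx, m * dx, 2 * m + 1)
    phi_n = bump(n * t)
    phi_n /= np.sum(phi_n) * dx            # enforce unit area numerically
    return np.convolve(f_vals, phi_n, mode="same") * dx

f = np.abs(x)                              # continuous, not differentiable at 0
interior = np.abs(x) <= 1.0                # stay away from the convolution boundary

for n in (2, 10, 50):
    psi = mollify(f, n)
    err = np.max(np.abs(psi - f)[interior])
    print(f"n={n:3d}   max |psi_n - f| on [-1, 1] = {err:.4f}")

# The uniform error shrinks as n grows, and each psi_n inherits the smoothness
# of the bump; this is the "smoothing filter" picture of regularization.
```

In frequency-domain terms, the transform of the normalized bump is close to unity near zero frequency and tapers smoothly to zero, which is the role played by the "neutralizer" functions mentioned above.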