- Source: Differential operator
In mathematics, a differential operator is an operator defined as a function of the differentiation operator. It is helpful, as a matter of notation first, to consider differentiation as an abstract operation that accepts a function and returns another function (in the style of a higher-order function in computer science).
This article considers mainly linear differential operators, which are the most common type. However, non-linear differential operators also exist, such as the Schwarzian derivative.
Definition
Given a nonnegative integer m, an order-$m$ linear differential operator is a map $P$ from a function space $\mathcal{F}_1$ on $\mathbb{R}^n$ to another function space $\mathcal{F}_2$ that can be written as
$$P = \sum_{|\alpha| \leq m} a_\alpha(x) D^\alpha,$$
where $\alpha = (\alpha_1, \alpha_2, \ldots, \alpha_n)$ is a multi-index of non-negative integers, $|\alpha| = \alpha_1 + \alpha_2 + \cdots + \alpha_n$, and for each $\alpha$, $a_\alpha(x)$ is a function on some open domain in n-dimensional space. The operator $D^\alpha$ is interpreted as
$$D^\alpha = \frac{\partial^{|\alpha|}}{\partial x_1^{\alpha_1}\, \partial x_2^{\alpha_2} \cdots \partial x_n^{\alpha_n}}.$$
Thus for a function $f \in \mathcal{F}_1$:
$$Pf = \sum_{|\alpha| \leq m} a_\alpha(x) \frac{\partial^{|\alpha|} f}{\partial x_1^{\alpha_1}\, \partial x_2^{\alpha_2} \cdots \partial x_n^{\alpha_n}}.$$
The notation $D^\alpha$ is justified (i.e., independent of the order of differentiation) because of the symmetry of second derivatives.
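The following sketch illustrates this definition concretely by applying an order-2 operator to a symbolic function. It is a minimal example assuming the sympy library; the particular coefficients $a_\alpha$ used here are illustrative choices, not anything fixed by the article.

```python
# Minimal sketch (assumes sympy): apply an order-2 operator
# P = sum_{|alpha| <= 2} a_alpha(x, y) D^alpha to a function f(x, y).
import sympy as sp

x, y = sp.symbols('x y')
f = sp.Function('f')(x, y)

# multi-index alpha = (i, j)  ->  coefficient a_alpha(x, y)   (illustrative choices)
coeffs = {
    (2, 0): x,          # a_(2,0) = x
    (1, 1): 1,          # a_(1,1) = 1
    (0, 1): sp.sin(x),  # a_(0,1) = sin(x)
    (0, 0): 1,          # a_(0,0) = 1
}

def D_alpha(u, i, j):
    """Apply the mixed partial d^(i+j) u / (dx^i dy^j)."""
    for _ in range(i):
        u = sp.diff(u, x)
    for _ in range(j):
        u = sp.diff(u, y)
    return u

def apply_P(u):
    """Return P u = sum_alpha a_alpha * D^alpha u."""
    return sum(a * D_alpha(u, i, j) for (i, j), a in coeffs.items())

print(apply_P(f))                     # P applied to a generic f(x, y)
print(sp.expand(apply_P(x**2 * y)))   # a concrete value: P(x^2 * y)
```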
The polynomial p obtained by replacing the partials $\frac{\partial}{\partial x_i}$ by variables $\xi_i$ in P is called the total symbol of P; i.e., the total symbol of P above is
$$p(x, \xi) = \sum_{|\alpha| \leq m} a_\alpha(x) \xi^\alpha,$$
where $\xi^\alpha = \xi_1^{\alpha_1} \cdots \xi_n^{\alpha_n}$. The highest homogeneous component of the symbol, namely
$$\sigma(x, \xi) = \sum_{|\alpha| = m} a_\alpha(x) \xi^\alpha,$$
is called the principal symbol of P. While the total symbol is not intrinsically defined, the principal symbol is intrinsically defined (i.e., it is a function on the cotangent bundle).
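As a small illustration of these definitions, the sketch below forms the total and principal symbols of an order-2 operator by substituting $\xi_i$ for $\partial/\partial x_i$. It assumes sympy and reuses the same illustrative coefficients as above.

```python
# Minimal sketch (assumes sympy): total and principal symbol of the order-2 operator
# P = x*Dx^2 + Dx*Dy + sin(x)*Dy + 1, obtained by replacing d/dx_i with xi_i.
import sympy as sp

x, y, xi1, xi2 = sp.symbols('x y xi1 xi2')

coeffs = {(2, 0): x, (1, 1): 1, (0, 1): sp.sin(x), (0, 0): 1}
m = 2   # order of the operator

total_symbol = sum(a * xi1**i * xi2**j for (i, j), a in coeffs.items())
principal_symbol = sum(a * xi1**i * xi2**j
                       for (i, j), a in coeffs.items() if i + j == m)

print(total_symbol)       # x*xi1**2 + xi1*xi2 + sin(x)*xi2 + 1
print(principal_symbol)   # x*xi1**2 + xi1*xi2
```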
More generally, let E and F be vector bundles over a manifold X. Then the linear operator
$$P : C^\infty(E) \to C^\infty(F)$$
is a differential operator of order $k$ if, in local coordinates on X, we have
$$Pu(x) = \sum_{|\alpha| = k} P^\alpha(x) \frac{\partial^\alpha u}{\partial x^\alpha} + \text{lower-order terms},$$
where, for each multi-index α, $P^\alpha(x) : E \to F$ is a bundle map, symmetric on the indices α.
The kth order coefficients of P transform as a symmetric tensor
$$\sigma_P : S^k(T^* X) \otimes E \to F,$$
whose domain is the tensor product of the kth symmetric power of the cotangent bundle of X with E, and whose codomain is F. This symmetric tensor is known as the principal symbol (or just the symbol) of P.
The coordinate system $x^i$ permits a local trivialization of the cotangent bundle by the coordinate differentials $dx^i$, which determine fiber coordinates $\xi_i$. In terms of a basis of frames $e_\mu$, $f_\nu$ of E and F, respectively, the differential operator P decomposes into components
$$(Pu)_\nu = \sum_\mu P_{\nu\mu} u_\mu$$
on each section u of E. Here $P_{\nu\mu}$ is the scalar differential operator defined by
$$P_{\nu\mu} = \sum_\alpha P_{\nu\mu}^\alpha \frac{\partial}{\partial x^\alpha}.$$
With this trivialization, the principal symbol can now be written
$$(\sigma_P(\xi) u)_\nu = \sum_{|\alpha| = k} \sum_\mu P_{\nu\mu}^\alpha(x)\, \xi_\alpha\, u_\mu.$$
In the cotangent space over a fixed point x of X, the symbol $\sigma_P$ defines a homogeneous polynomial of degree k in $T_x^* X$ with values in $\operatorname{Hom}(E_x, F_x)$.
Fourier interpretation
A differential operator P and its symbol appear naturally in connection with the Fourier transform as follows. Let ƒ be a Schwartz function. Then by the inverse Fourier transform,
$$Pf(x) = \frac{1}{(2\pi)^{d/2}} \int_{\mathbf{R}^d} e^{i x \cdot \xi}\, p(x, i\xi)\, \hat{f}(\xi)\, d\xi.$$
This exhibits P as a Fourier multiplier. A more general class of functions p(x, ξ), which satisfy at most polynomial growth conditions in ξ so that this integral is well-behaved, comprises the pseudo-differential operators.
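A quick numerical check of the multiplier picture, for the simplest case P = d/dx with symbol p(iξ) = iξ: differentiating a rapidly decaying function via the discrete Fourier transform agrees with the exact derivative. This is only a sketch, assuming numpy and using numpy's FFT normalization (which differs from the symmetric (2π)^{-d/2} convention above by constant factors that cancel in the forward/inverse round trip).

```python
# Minimal sketch (assumes numpy): d/dx acts as the Fourier multiplier i*xi.
import numpy as np

n, L = 1024, 40.0
x = np.linspace(-L / 2, L / 2, n, endpoint=False)
f = np.exp(-x**2)                               # Schwartz-like test function

xi = 2 * np.pi * np.fft.fftfreq(n, d=L / n)     # angular frequency grid
Pf = np.fft.ifft(1j * xi * np.fft.fft(f)).real  # multiply f-hat by i*xi, invert

exact = -2 * x * np.exp(-x**2)                  # exact derivative of f
print(np.max(np.abs(Pf - exact)))               # close to machine precision
```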
Examples
The differential operator $P$ is elliptic if its symbol is invertible; that is, for each nonzero $\theta \in T^* X$ the bundle map $\sigma_P(\theta, \dots, \theta)$ is invertible. On a compact manifold, it follows from the elliptic theory that P is a Fredholm operator: it has finite-dimensional kernel and cokernel.
In the study of hyperbolic and parabolic partial differential equations, zeros of the principal symbol correspond to the characteristics of the partial differential equation.
In applications to the physical sciences, operators such as the Laplace operator play a major role in setting up and solving partial differential equations.
In differential topology, the exterior derivative and Lie derivative operators have intrinsic meaning.
In abstract algebra, the concept of a derivation allows for generalizations of differential operators, which do not require the use of calculus. Frequently such generalizations are employed in algebraic geometry and commutative algebra. See also Jet (mathematics).
In the development of holomorphic functions of a complex variable z = x + i y, sometimes a complex function is considered to be a function of two real variables x and y. Use is made of the Wirtinger derivatives, which are partial differential operators:
$$\frac{\partial}{\partial z} = \frac{1}{2}\left(\frac{\partial}{\partial x} - i \frac{\partial}{\partial y}\right), \quad \frac{\partial}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial}{\partial x} + i \frac{\partial}{\partial y}\right).$$
This approach is also used to study functions of several complex variables and functions of a motor variable.
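A short symbolic check of the Wirtinger operators, assuming sympy: applied to a holomorphic function of z = x + iy, ∂/∂z recovers the complex derivative while ∂/∂z̄ annihilates it.

```python
# Minimal sketch (assumes sympy): Wirtinger derivatives applied to f = (x + i*y)**2.
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = (x + sp.I * y)**2          # a holomorphic function of z = x + i*y

d_dz    = lambda u: (sp.diff(u, x) - sp.I * sp.diff(u, y)) / 2
d_dzbar = lambda u: (sp.diff(u, x) + sp.I * sp.diff(u, y)) / 2

print(sp.simplify(d_dz(f)))      # 2*x + 2*I*y, i.e. 2*z
print(sp.simplify(d_dzbar(f)))   # 0: the conjugate Wirtinger derivative vanishes
```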
The differential operator del, also called nabla, is an important vector differential operator. It appears frequently in physics in places like the differential form of Maxwell's equations. In three-dimensional Cartesian coordinates, del is defined as
$$\nabla = \mathbf{\hat{x}}\, \frac{\partial}{\partial x} + \mathbf{\hat{y}}\, \frac{\partial}{\partial y} + \mathbf{\hat{z}}\, \frac{\partial}{\partial z}.$$
Del defines the gradient, and is used to calculate the curl, divergence, and Laplacian of various objects.
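The sketch below, assuming sympy.vector, shows del producing the gradient of a scalar field and the divergence, curl, and Laplacian built from it; the specific fields are illustrative.

```python
# Minimal sketch (assumes sympy.vector): del in 3D Cartesian coordinates.
import sympy as sp
from sympy.vector import CoordSys3D, Del

N = CoordSys3D('N')
delop = Del()

phi = N.x**2 * N.y + N.z                   # an illustrative scalar field
F = N.x * N.i + N.y * N.j + N.z * N.k      # an illustrative vector field

grad_phi = delop.gradient(phi).doit()
print(grad_phi)                            # gradient of phi
print(delop.dot(F).doit())                 # divergence of F  -> 3
print(delop.cross(F).doit())               # curl of F        -> zero vector
print(delop.dot(grad_phi).doit())          # Laplacian of phi -> 2*y
```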
Chiral differential operators form a further class of examples.
History
The conceptual step of writing a differential operator as something free-standing is attributed to Louis François Antoine Arbogast in 1800.
Notations
The most common differential operator is the action of taking the derivative. Common notations for taking the first derivative with respect to a variable x include:
$\frac{d}{dx}$, $D$, $D_x$, and $\partial_x$.
When taking higher, nth order derivatives, the operator may be written:
$\frac{d^n}{dx^n}$, $D^n$, $D_x^n$, or $\partial_x^n$.
The derivative of a function f of an argument x is sometimes given as either of the following:
$[f(x)]'$ or $f'(x)$.
The D notation's use and creation is credited to Oliver Heaviside, who considered differential operators of the form
$$\sum_{k=0}^{n} c_k D^k$$
in his study of differential equations.
One of the most frequently seen differential operators is the Laplacian operator, defined by
$$\Delta = \nabla^2 = \sum_{k=1}^{n} \frac{\partial^2}{\partial x_k^2}.$$
Another differential operator is the Θ operator, or theta operator, defined by
$$\Theta = z \frac{d}{dz}.$$
This is sometimes also called the homogeneity operator, because its eigenfunctions are the monomials in z:
$$\Theta(z^k) = k z^k, \quad k = 0, 1, 2, \dots$$
In n variables the homogeneity operator is given by
$$\Theta = \sum_{k=1}^{n} x_k \frac{\partial}{\partial x_k}.$$
As in one variable, the eigenspaces of Θ are the spaces of homogeneous functions. (Euler's homogeneous function theorem)
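The following sketch, assuming sympy, checks the eigenvalue statement in both forms: Θ(z^k) = k z^k in one variable, and Θh = 3h for a degree-3 homogeneous polynomial in two variables.

```python
# Minimal sketch (assumes sympy): the theta / homogeneity operator.
import sympy as sp

z, x1, x2 = sp.symbols('z x1 x2')

theta_1d = lambda u: z * sp.diff(u, z)
print([sp.simplify(theta_1d(z**k) / z**k) for k in range(5)])   # [0, 1, 2, 3, 4]

theta_2d = lambda u: x1 * sp.diff(u, x1) + x2 * sp.diff(u, x2)
h = x1**3 + x1 * x2**2                    # homogeneous of degree 3
print(sp.simplify(theta_2d(h) - 3 * h))   # 0 (Euler's homogeneous function theorem)
```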
Following common mathematical convention, the argument of a differential operator is usually written to the right of the operator itself. Sometimes an alternative notation is used: the result of applying the operator to the function on its left, the result of applying it to the function on its right, and the difference obtained by applying the operator to both sides are denoted by arrows as follows:
$$f\, \overleftarrow{\partial_x}\, g = g \cdot \partial_x f$$
$$f\, \overrightarrow{\partial_x}\, g = f \cdot \partial_x g$$
$$f\, \overleftrightarrow{\partial_x}\, g = f \cdot \partial_x g - g \cdot \partial_x f.$$
Such a bidirectional-arrow notation is frequently used for describing the probability current of quantum mechanics.
Adjoint of an operator
Given a linear differential operator $T$,
$$Tu = \sum_{k=0}^{n} a_k(x) D^k u,$$
the adjoint of this operator is defined as the operator $T^*$ such that
$$\langle Tu, v \rangle = \langle u, T^* v \rangle,$$
where the notation $\langle \cdot, \cdot \rangle$ is used for the scalar product or inner product. This definition therefore depends on the definition of the scalar product (or inner product).
Formal adjoint in one variable
In the functional space of square-integrable functions on a real interval (a, b), the scalar product is defined by
$$\langle f, g \rangle = \int_a^b \overline{f(x)}\, g(x)\, dx,$$
where the line over f(x) denotes the complex conjugate of f(x). If one moreover adds the condition that f or g vanishes as $x \to a$ and $x \to b$, one can also define the adjoint of T by
$$T^* u = \sum_{k=0}^{n} (-1)^k D^k \left[\overline{a_k(x)}\, u\right].$$
This formula does not explicitly depend on the definition of the scalar product. It is therefore sometimes chosen as a definition of the adjoint operator. When $T^*$ is defined according to this formula, it is called the formal adjoint of T.
A (formally) self-adjoint operator is an operator equal to its own (formal) adjoint.
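As a sanity check of the formal adjoint formula, the sketch below verifies ⟨Tu, v⟩ = ⟨u, T*v⟩ symbolically for an illustrative real-coefficient operator on (0, 1), with u and v chosen to vanish at both endpoints so the boundary terms from integration by parts drop out. It assumes sympy and ignores complex conjugation since everything is real.

```python
# Minimal sketch (assumes sympy): check <Tu, v> = <u, T*v> on (0, 1)
# for the illustrative operator T u = x*u'' + u' + x**2*u.
import sympy as sp

x = sp.symbols('x')
u = sp.sin(sp.pi * x)        # u(0) = u(1) = 0
v = x * (1 - x)              # v(0) = v(1) = 0

a = {0: x**2, 1: 1, 2: x}    # coefficients a_k(x) of T = sum a_k D^k

T     = lambda w: sum(a_k * sp.diff(w, x, k) for k, a_k in a.items())
Tstar = lambda w: sum((-1)**k * sp.diff(a_k * w, x, k) for k, a_k in a.items())

lhs = sp.integrate(T(u) * v, (x, 0, 1))
rhs = sp.integrate(u * Tstar(v), (x, 0, 1))
print(sp.simplify(lhs - rhs))   # 0
```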
Several variables
If Ω is a domain in Rn, and P a differential operator on Ω, then the adjoint of P is defined in L2(Ω) by duality in the analogous manner:
$$\langle f, P^* g \rangle_{L^2(\Omega)} = \langle Pf, g \rangle_{L^2(\Omega)}$$
for all smooth L2 functions f, g. Since smooth functions are dense in L2, this defines the adjoint on a dense subset of L2: P* is a densely defined operator.
Example
The Sturm–Liouville operator is a well-known example of a formally self-adjoint operator. This second-order linear differential operator L can be written in the form
$$Lu = -(pu')' + qu = -(pu'' + p'u') + qu = -pu'' - p'u' + qu = (-p)D^2 u + (-p')Du + (q)u.$$
This property can be proven using the formal adjoint definition above.
This operator is central to Sturm–Liouville theory where the eigenfunctions (analogues to eigenvectors) of this operator are considered.
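One way to see the formal self-adjointness symbolically: feed the coefficients a2 = −p, a1 = −p′, a0 = q of L into the formal-adjoint formula and check that the same operator comes back. A minimal sketch assuming sympy, with p and q treated as real so conjugation can be ignored:

```python
# Minimal sketch (assumes sympy): the Sturm-Liouville operator equals its formal adjoint.
import sympy as sp

x = sp.symbols('x')
p = sp.Function('p')(x)
q = sp.Function('q')(x)
u = sp.Function('u')(x)

L = lambda w: -sp.diff(p * sp.diff(w, x), x) + q * w   # L w = -(p w')' + q w

a = {2: -p, 1: -sp.diff(p, x), 0: q}                   # L = sum a_k D^k
Lstar = lambda w: sum((-1)**k * sp.diff(a_k * w, x, k) for k, a_k in a.items())

print(sp.simplify(L(u) - Lstar(u)))   # 0: L is formally self-adjoint
```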
Properties
Differentiation is linear, i.e.
$$D(f + g) = (Df) + (Dg), \quad D(af) = a(Df),$$
where f and g are functions, and a is a constant.
Any polynomial in D with function coefficients is also a differential operator. We may also compose differential operators by the rule
$$(D_1 \circ D_2)(f) = D_1(D_2(f)).$$
Some care is then required: first, any function coefficients in the operator D2 must be differentiable as many times as the application of D1 requires. To get a ring of such operators, we must assume derivatives of all orders of the coefficients used. Second, this ring will not be commutative: an operator gD is not in general the same as Dg. For example, we have the relation, basic in quantum mechanics:
$$Dx - xD = 1.$$
The subring of operators that are polynomials in D with constant coefficients is, by contrast, commutative. It can be characterised another way: it consists of the translation-invariant operators.
The differential operators also obey the shift theorem.
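The non-commutativity relation Dx − xD = 1 can be checked by letting both sides act on an arbitrary test function, as in the sketch below (assuming sympy).

```python
# Minimal sketch (assumes sympy): (D x - x D) acts as the identity on any f.
import sympy as sp

x = sp.symbols('x')
f = sp.Function('f')(x)

D   = lambda u: sp.diff(u, x)   # differentiation operator
mul = lambda u: x * u           # multiplication-by-x operator

print(sp.simplify(D(mul(f)) - mul(D(f))))   # f(x): the commutator is the identity
```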
Ring of polynomial differential operators
Ring of univariate polynomial differential operators
If R is a ring, let
$R\langle D, X \rangle$ be the non-commutative polynomial ring over R in the variables D and X, and I the two-sided ideal generated by DX − XD − 1. Then the ring of univariate polynomial differential operators over R is the quotient ring $R\langle D, X \rangle / I$. This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form $X^a D^b \bmod I$. It supports an analogue of Euclidean division of polynomials.
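The normal form $X^a D^b$ and the relation DX = XD + 1 can be made concrete with a small illustrative implementation (not a library API): elements are stored as dictionaries mapping (a, b) to coefficients, and products are reduced to normal form with the identity $D^b X^c = \sum_i \binom{b}{i}\, \frac{c!}{(c-i)!}\, X^{c-i} D^{b-i}$.

```python
# Minimal sketch (illustrative, not a library API): the Weyl algebra R<D,X>/I
# with elements stored in normal form {(a, b): coeff} meaning sum coeff * X^a * D^b.
from math import comb, perm
from collections import defaultdict

def weyl_mul(p, q):
    """Product of two normally ordered elements, returned in normal form."""
    out = defaultdict(int)
    for (a, b), cp in p.items():
        for (c, d), cq in q.items():
            # reorder the inner factor D^b X^c, then attach X^a on the left and D^d on the right
            for i in range(min(b, c) + 1):
                out[(a + c - i, b + d - i)] += cp * cq * comb(b, i) * perm(c, i)
    return {k: v for k, v in out.items() if v != 0}

X = {(1, 0): 1}
D = {(0, 1): 1}

print(weyl_mul(D, X))   # {(1, 1): 1, (0, 0): 1}, i.e. DX = XD + 1 in normal form
print(weyl_mul(X, D))   # {(1, 1): 1}, i.e. XD
```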
Differential modules over $R[X]$ (for the standard derivation) can be identified with modules over $R\langle D, X \rangle / I$.
Ring of multivariate polynomial differential operators
If R is a ring, let
$R\langle D_1, \ldots, D_n, X_1, \ldots, X_n \rangle$ be the non-commutative polynomial ring over R in the variables $D_1, \ldots, D_n, X_1, \ldots, X_n$, and I the two-sided ideal generated by the elements
$$(D_i X_j - X_j D_i) - \delta_{i,j}, \quad D_i D_j - D_j D_i, \quad X_i X_j - X_j X_i$$
for all $1 \leq i, j \leq n$, where $\delta$ is the Kronecker delta. Then the ring of multivariate polynomial differential operators over R is the quotient ring $R\langle D_1, \ldots, D_n, X_1, \ldots, X_n \rangle / I$.
This is a non-commutative simple ring. Every element can be written in a unique way as an R-linear combination of monomials of the form $X_1^{a_1} \cdots X_n^{a_n} D_1^{b_1} \cdots D_n^{b_n}$.
Coordinate-independent description
In differential geometry and algebraic geometry it is often convenient to have a coordinate-independent description of differential operators between two vector bundles. Let E and F be two vector bundles over a differentiable manifold M. An R-linear mapping of sections P : Γ(E) → Γ(F) is said to be a kth-order linear differential operator if it factors through the jet bundle Jk(E).
In other words, there exists a linear mapping of vector bundles
$$i_P : J^k(E) \to F$$
such that
$$P = i_P \circ j^k,$$
where $j^k : \Gamma(E) \to \Gamma(J^k(E))$ is the prolongation that associates to any section of E its k-jet.
This just means that for a given section s of E, the value of P(s) at a point x ∈ M is fully determined by the kth-order infinitesimal behavior of s in x. In particular this implies that P(s)(x) is determined by the germ of s in x, which is expressed by saying that differential operators are local. A foundational result is the Peetre theorem showing that the converse is also true: any (linear) local operator is differential.
Relation to commutative algebra
An equivalent, but purely algebraic description of linear differential operators is as follows: an R-linear map P is a kth-order linear differential operator, if for any k + 1 smooth functions
$f_0, \ldots, f_k \in C^\infty(M)$ we have
$$[f_k, [f_{k-1}, [\cdots [f_0, P] \cdots]]] = 0.$$
Here the bracket $[f, P] : \Gamma(E) \to \Gamma(F)$ is defined as the commutator
$$[f, P](s) = P(f \cdot s) - f \cdot P(s).$$
This characterization of linear differential operators shows that they are particular mappings between modules over a commutative algebra, allowing the concept to be seen as a part of commutative algebra.
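For a concrete check of this characterization in the simplest setting, take E = F to be the trivial line bundle over R, so sections are just smooth functions, and let P be a first-order scalar operator. The sketch below (assuming sympy; the operator is illustrative) shows that a single bracket [f0, P] is generally nonzero, while the double bracket [f1, [f0, P]] vanishes identically, as the k = 1 case of the criterion requires.

```python
# Minimal sketch (assumes sympy): bracket criterion for P u = a(x) u' + b(x) u.
import sympy as sp

x = sp.symbols('x')
a, b, f0, f1, s = [sp.Function(n)(x) for n in ('a', 'b', 'f0', 'f1', 's')]

P = lambda u: a * sp.diff(u, x) + b * u

def bracket(f, Q):
    """[f, Q](s) = Q(f*s) - f*Q(s), returned as a new operator."""
    return lambda u: Q(f * u) - f * Q(u)

print(sp.simplify(bracket(f0, P)(s)))                # a(x)*s(x)*f0'(x): nonzero in general
print(sp.simplify(bracket(f1, bracket(f0, P))(s)))   # 0: P has order at most 1
```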
Variants
A differential operator of infinite order
A differential operator of infinite order is (roughly) a differential operator whose total symbol is a power series instead of a polynomial.
Bidifferential operator
A differential operator acting on two functions $D(g, f)$
is called a bidifferential operator. The notion appears, for instance, in an associative algebra structure on a deformation quantization of a Poisson algebra.
Microdifferential operator
A microdifferential operator is a type of operator on an open subset of a cotangent bundle, as opposed to an open subset of a manifold. It is obtained by extending the notion of a differential operator to the cotangent bundle.
References
Freed, Daniel S. (1987), Geometry of Dirac operators, p. 8, CiteSeerX 10.1.1.186.8445
Hörmander, L. (1983), The analysis of linear partial differential operators I, Grundl. Math. Wissenschaft., vol. 256, Springer, doi:10.1007/978-3-642-96750-4, ISBN 3-540-12104-8, MR 0717035.
Schapira, Pierre (1985). Microdifferential Systems in the Complex Domain. Grundlehren der mathematischen Wissenschaften. Vol. 269. Springer. doi:10.1007/978-3-642-61665-5. ISBN 978-3-642-64904-2.
Wells, R.O. (1973), Differential analysis on complex manifolds, Springer-Verlag, ISBN 0-387-90419-0.
Further reading
Fedosov, Boris; Schulze, Bert-Wolfgang; Tarkhanov, Nikolai (2002). "Analytic index formulas for elliptic corner operators". Annales de l'Institut Fourier. 52 (3): 899–982. doi:10.5802/aif.1906. ISSN 1777-5310.
https://mathoverflow.net/questions/451110/reference-request-inverse-of-differential-operators
External links
Media related to Differential operators at Wikimedia Commons
"Differential operator", Encyclopedia of Mathematics, EMS Press, 2001 [1994]