- Source: Orthogonality (mathematics)
In mathematics, orthogonality is the generalization of the geometric notion of perpendicularity to the linear algebra of bilinear forms.
Two elements u and v of a vector space with bilinear form $B$ are orthogonal when $B(\mathbf{u}, \mathbf{v}) = 0$. Depending on the bilinear form, the vector space may contain null vectors, i.e. non-zero self-orthogonal vectors, in which case perpendicularity is replaced with hyperbolic orthogonality.
In the case of function spaces, families of functions are used to form an orthogonal basis, such as in the contexts of orthogonal polynomials, orthogonal functions, and combinatorics.
Definitions
In geometry, two Euclidean vectors are orthogonal if they are perpendicular, i.e. they form a right angle.
Two vectors u and v in an inner product space $V$ are orthogonal if their inner product $\langle \mathbf{u}, \mathbf{v} \rangle$ is zero. This relationship is denoted $\mathbf{u} \perp \mathbf{v}$.
An orthogonal matrix is a matrix whose column vectors are orthonormal to each other.
An orthonormal basis is a basis whose vectors are both orthogonal and normalized (they are unit vectors).
A conformal linear transformation preserves angles and distance ratios, meaning that transforming orthogonal vectors by the same conformal linear transformation will keep those vectors orthogonal.
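As a minimal illustration of these definitions, the following sketch (Python with NumPy; the specific vectors and the rotation angle are arbitrary choices, not from the article) checks orthogonality through dot products and verifies that the columns of an orthogonal matrix are orthonormal:

    import numpy as np

    # Two vectors are orthogonal when their inner (dot) product is zero.
    u = np.array([1.0, 2.0, -1.0])
    v = np.array([3.0, -1.0, 1.0])
    print(np.dot(u, v))          # 0.0, so u is orthogonal to v

    # A matrix is orthogonal when its columns form an orthonormal set,
    # i.e. Q^T Q equals the identity matrix.
    theta = 0.3
    Q = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    print(np.allclose(Q.T @ Q, np.eye(2)))   # True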
Two vector subspaces $A$ and $B$ of an inner product space $V$ are called orthogonal subspaces if each vector in $A$ is orthogonal to each vector in $B$. The largest subspace of $V$ that is orthogonal to a given subspace is its orthogonal complement.
Given a module $M$ and its dual $M^{*}$, an element $m'$ of $M^{*}$ and an element $m$ of $M$ are orthogonal if their natural pairing is zero, i.e. $\langle m', m \rangle = 0$. Two sets $S' \subseteq M^{*}$ and $S \subseteq M$ are orthogonal if each element of $S'$ is orthogonal to each element of $S$.
A term rewriting system is said to be orthogonal if it is left-linear and is non-ambiguous. Orthogonal term rewriting systems are confluent.
A set of vectors in an inner product space is called pairwise orthogonal if each pairing of them is orthogonal. Such a set is called an orthogonal set.
In certain cases, the word normal is used to mean orthogonal, particularly in the geometric sense as in the normal to a surface. For example, the y-axis is normal to the curve $y = x^{2}$ at the origin. However, normal may also refer to the magnitude of a vector. In particular, a set is called orthonormal (orthogonal plus normal) if it is an orthogonal set of unit vectors. As a result, use of the term normal to mean "orthogonal" is often avoided. The word "normal" also has a different meaning in probability and statistics.
A vector space with a bilinear form generalizes the case of an inner product. When the bilinear form applied to two vectors results in zero, then they are orthogonal. The case of a pseudo-Euclidean plane uses the term hyperbolic orthogonality; there, the axes x′ and t′ obtained by a hyperbolic rotation are hyperbolic-orthogonal for any given $\phi$.
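A hedged sketch of this case (Python; it assumes the standard bilinear form $B(u, v) = u_0 v_0 - u_1 v_1$ of the pseudo-Euclidean plane, which is not spelled out above) shows a non-zero self-orthogonal (null) vector and a pair of hyperbolically orthogonal axes:

    import numpy as np

    # Bilinear form of the pseudo-Euclidean (Minkowski) plane: B(u, v) = u0*v0 - u1*v1
    def B(u, v):
        return u[0] * v[0] - u[1] * v[1]

    # A non-zero null vector: orthogonal to itself under B.
    n = np.array([1.0, 1.0])
    print(B(n, n))                     # 0.0

    # For any phi, the axes x' = (cosh phi, sinh phi) and t' = (sinh phi, cosh phi)
    # are hyperbolic-orthogonal: B(x', t') = 0.
    phi = 0.7
    x_axis = np.array([np.cosh(phi), np.sinh(phi)])
    t_axis = np.array([np.sinh(phi), np.cosh(phi)])
    print(np.isclose(B(x_axis, t_axis), 0.0))   # True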
Euclidean vector spaces
In Euclidean space, two vectors are orthogonal if and only if their dot product is zero, i.e. they make an angle of 90° ($\tfrac{\pi}{2}$ radians), or one of the vectors is zero. Hence orthogonality of vectors is an extension of the concept of perpendicular vectors to spaces of any dimension.
The orthogonal complement of a subspace is the space of all vectors that are orthogonal to every vector in the subspace. In a three-dimensional Euclidean vector space, the orthogonal complement of a line through the origin is the plane through the origin perpendicular to it, and vice versa.
Note that the geometric concept of two planes being perpendicular does not correspond to the orthogonal complement, since in three dimensions a pair of vectors, one from each of a pair of perpendicular planes, might meet at any angle.
In four-dimensional Euclidean space, the orthogonal complement of a line is a hyperplane and vice versa, and that of a plane is a plane.
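A small sketch of the three-dimensional case (Python/NumPy; the direction vector is a hypothetical example): the orthogonal complement of a line through the origin is computed as the null space of the line's direction, and it comes out as a plane (a 2-dimensional subspace):

    import numpy as np

    # Direction of a line through the origin (an arbitrary example vector).
    d = np.array([[1.0, 2.0, 2.0]])

    # The orthogonal complement of span{d} is the null space of d viewed as a 1x3 matrix.
    # The rows of Vt beyond the first span that null space.
    _, s, Vt = np.linalg.svd(d)
    complement = Vt[1:]            # two basis vectors -> a plane through the origin

    print(complement.shape)                      # (2, 3): a 2-dimensional subspace
    print(np.allclose(complement @ d.T, 0.0))    # True: each basis vector is orthogonal to d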
Orthogonal functions
By using integral calculus, it is common to use the following to define the inner product of two functions $f$ and $g$ with respect to a nonnegative weight function $w$ over an interval $[a, b]$:

$$\langle f, g \rangle_{w} = \int_{a}^{b} f(x)\, g(x)\, w(x)\, dx.$$

In simple cases, $w(x) = 1$.
We say that functions $f$ and $g$ are orthogonal if their inner product (equivalently, the value of this integral) is zero:

$$\langle f, g \rangle_{w} = 0.$$
Orthogonality of two functions with respect to one inner product does not imply orthogonality with respect to another inner product.
We write the norm with respect to this inner product as

$$\|f\|_{w} = \sqrt{\langle f, f \rangle_{w}}.$$
The members of a set of functions $\{ f_{i} \mid i \in \mathbb{N} \}$ are orthogonal with respect to $w$ on the interval $[a, b]$ if

$$\langle f_{i}, f_{j} \rangle_{w} = 0 \quad \text{for all } i \neq j.$$
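A short symbolic sketch of this condition (Python with SymPy, using the first three Legendre polynomials and the trivial weight $w(x) = 1$ on $[-1, 1]$ purely as an illustration; these choices are not prescribed by the text above) checks that the off-diagonal inner products vanish:

    import sympy as sp

    x = sp.symbols('x')
    w = sp.Integer(1)                       # trivial weight on [-1, 1]
    f = [sp.Integer(1), x, (3*x**2 - 1)/2]  # first three Legendre polynomials

    def inner(f_i, f_j):
        # Weighted inner product <f_i, f_j>_w = integral of f_i(x) f_j(x) w(x) over [-1, 1]
        return sp.integrate(f_i * f_j * w, (x, -1, 1))

    # Off-diagonal inner products vanish, so the family is orthogonal (not yet orthonormal).
    for i in range(3):
        for j in range(3):
            print(i, j, inner(f[i], f[j]))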
The members of such a set of functions are orthonormal with respect to $w$ on the interval $[a, b]$ if

$$\langle f_{i}, f_{j} \rangle_{w} = \delta_{ij},$$

where

$$\delta_{ij} = \begin{cases} 1, & i = j \\ 0, & i \neq j \end{cases}$$

is the Kronecker delta.
In other words, every pair of them (excluding pairing of a function with itself) is orthogonal, and the norm of each is 1. See in particular the orthogonal polynomials.
Examples
The vectors $(1, 3, 2)^{\mathrm{T}}$, $(3, -1, 0)^{\mathrm{T}}$, $(1, 3, -5)^{\mathrm{T}}$ are orthogonal to each other, since

$$(1)(3) + (3)(-1) + (2)(0) = 0,$$
$$(3)(1) + (-1)(3) + (0)(-5) = 0,$$
$$(1)(1) + (3)(3) + (2)(-5) = 0.$$
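These pairwise dot products can also be confirmed mechanically; a brief NumPy check (not part of the original article):

    import numpy as np
    from itertools import combinations

    vectors = [np.array([1, 3, 2]), np.array([3, -1, 0]), np.array([1, 3, -5])]
    # Every pairwise dot product is zero, so the set is pairwise orthogonal.
    print([int(np.dot(a, b)) for a, b in combinations(vectors, 2)])   # [0, 0, 0]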
The vectors $(1, 0, 1, 0, \ldots)^{\mathrm{T}}$ and $(0, 1, 0, 1, \ldots)^{\mathrm{T}}$ are orthogonal to each other. The dot product of these vectors is zero. We can then make the generalization to consider the vectors in $\mathbb{Z}_{2}^{n}$:

$$\mathbf{v}_{k} = \sum_{i = 0 \atop ai + k < n}^{n/a} \mathbf{e}_{ai + k}$$

for some positive integer $a$, and for $1 \leq k \leq a - 1$, these vectors are orthogonal; for example, $\begin{bmatrix}1 & 0 & 0 & 1 & 0 & 0 & 1 & 0\end{bmatrix}$, $\begin{bmatrix}0 & 1 & 0 & 0 & 1 & 0 & 0 & 1\end{bmatrix}$, $\begin{bmatrix}0 & 0 & 1 & 0 & 0 & 1 & 0 & 0\end{bmatrix}$ are orthogonal.
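A sketch in Python of this construction (assuming the reading of the formula above in which $\mathbf{v}_{k}$ has a 1 in every coordinate of the form $ai + k$) builds these vectors for $n = 8$ and $a = 3$ and checks that distinct shifts are orthogonal:

    import numpy as np

    def v(n, a, k):
        # v_k has a 1 in each coordinate a*i + k that is less than n, and 0 elsewhere.
        vec = np.zeros(n, dtype=int)
        vec[k::a] = 1
        return vec

    n, a = 8, 3
    vs = [v(n, a, k) for k in range(a)]
    print(vs)   # [1 0 0 1 0 0 1 0], [0 1 0 0 1 0 0 1], [0 0 1 0 0 1 0 0]

    # Distinct shifts never share a nonzero coordinate, so their dot product is 0 (also mod 2).
    print([int(np.dot(vs[i], vs[j])) for i in range(a) for j in range(i + 1, a)])   # [0, 0, 0]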
The functions $2t + 3$ and $45t^{2} + 9t - 17$ are orthogonal with respect to a unit weight function on the interval from −1 to 1:

$$\int_{-1}^{1} \left(2t + 3\right) \left(45t^{2} + 9t - 17\right)\, dt = 0.$$
The functions $1$, $\sin(nx)$, and $\cos(nx)$ for $n \in \mathbb{N}$ are orthogonal with respect to Riemann integration on the intervals $[0, 2\pi]$, $[-\pi, \pi]$, or any other closed interval of length $2\pi$. This fact is a central one in Fourier series.
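A brief symbolic check of a few of these relations on $[0, 2\pi]$ (Python/SymPy; the particular choices $n = 1, 2$ are illustrative only):

    import sympy as sp

    x = sp.symbols('x')
    fam = [sp.Integer(1), sp.sin(x), sp.cos(x), sp.sin(2*x), sp.cos(2*x)]

    # All inner products between distinct members over one period vanish.
    vals = [sp.integrate(fam[i] * fam[j], (x, 0, 2*sp.pi))
            for i in range(len(fam)) for j in range(i + 1, len(fam))]
    print(vals)   # all zero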
Orthogonal polynomials
Various polynomial sequences named for mathematicians of the past are sequences of orthogonal polynomials. In particular:
The Hermite polynomials are orthogonal with respect to the Gaussian distribution with zero mean value.
The Legendre polynomials are orthogonal with respect to the uniform distribution on the interval $[-1, 1]$.
The Laguerre polynomials are orthogonal with respect to the exponential distribution. Somewhat more general Laguerre polynomial sequences are orthogonal with respect to gamma distributions.
The Chebyshev polynomials of the first kind are orthogonal with respect to the measure $\tfrac{1}{\sqrt{1 - x^{2}}}$.
The Chebyshev polynomials of the second kind are orthogonal with respect to the Wigner semicircle distribution.
Combinatorics
In combinatorics, two $n \times n$ Latin squares are said to be orthogonal if their superimposition yields all possible $n^{2}$ combinations of entries.
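A small Python sketch (using a hypothetical pair of 3×3 squares, not taken from the article) constructs two Latin squares and confirms they are orthogonal by checking that superimposition produces all $n^{2}$ ordered pairs:

    n = 3
    # Two 3x3 Latin squares: L1[i][j] = (i + j) mod 3, L2[i][j] = (i + 2j) mod 3.
    L1 = [[(i + j) % n for j in range(n)] for i in range(n)]
    L2 = [[(i + 2 * j) % n for j in range(n)] for i in range(n)]

    # Superimpose the squares cell by cell; orthogonality means every ordered pair appears once.
    pairs = {(L1[i][j], L2[i][j]) for i in range(n) for j in range(n)}
    print(len(pairs) == n * n)   # True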
Completely orthogonal
Two flat planes $A$ and $B$ of a Euclidean four-dimensional space are called completely orthogonal if and only if every line in $A$ is orthogonal to every line in $B$. In that case the planes $A$ and $B$ intersect at a single point $O$, so that if a line in $A$ intersects with a line in $B$, they intersect at $O$. $A$ and $B$ are perpendicular and Clifford parallel.
In 4-dimensional space we can construct 4 perpendicular axes and 6 perpendicular planes through a point. Without loss of generality, we may take these to be the axes and orthogonal central planes of a $(w, x, y, z)$ Cartesian coordinate system. In 4 dimensions we have the same 3 orthogonal planes $(xy, xz, yz)$ that we have in 3 dimensions, and also 3 others $(wx, wy, wz)$. Each of the 6 orthogonal planes shares an axis with 4 of the others, and is completely orthogonal to just one of the others: the only one with which it does not share an axis. Thus there are 3 pairs of completely orthogonal planes: $xy$ and $wz$ intersect only at the origin; $xz$ and $wy$ intersect only at the origin; $yz$ and $wx$ intersect only at the origin.
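A short NumPy sketch checking the first of these pairs (coordinates ordered $(w, x, y, z)$ as above): every vector of the xy-plane is orthogonal to every vector of the wz-plane, since all cross inner products between their basis vectors vanish:

    import numpy as np

    # Basis of the xy-plane and of the wz-plane in (w, x, y, z) coordinates.
    xy = np.array([[0, 1, 0, 0],
                   [0, 0, 1, 0]], dtype=float)
    wz = np.array([[1, 0, 0, 0],
                   [0, 0, 0, 1]], dtype=float)

    # Arbitrary vectors of each plane are linear combinations of these bases;
    # all cross inner products vanish, so every line in one plane is orthogonal
    # to every line in the other.
    u = 2.0 * xy[0] - 1.5 * xy[1]
    v = 0.5 * wz[0] + 3.0 * wz[1]
    print(np.allclose(xy @ wz.T, 0.0), np.isclose(np.dot(u, v), 0.0))   # True True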
More generally, two flat subspaces $S_{1}$ and $S_{2}$ of dimensions $M$ and $N$ of a Euclidean space $S$ of at least $M + N$ dimensions are called completely orthogonal if every line in $S_{1}$ is orthogonal to every line in $S_{2}$. If $\dim(S) = M + N$ then $S_{1}$ and $S_{2}$ intersect at a single point $O$. If $\dim(S) > M + N$ then $S_{1}$ and $S_{2}$ may or may not intersect. If $\dim(S) = M + N$ then a line in $S_{1}$ and a line in $S_{2}$ may or may not intersect; if they intersect then they intersect at $O$.
See also
Imaginary number
Orthogonal complement
Orthogonal group
Orthogonal matrix
Orthogonal polynomials
Orthogonal trajectory
Orthogonalization
Gram–Schmidt process
Orthonormal basis
Orthonormality
Pan-orthogonality occurs in coquaternions
Up tack