- Source: Law (mathematics)
In mathematics, a law is a formula that is always true within a given context. Laws describe a relationship between two or more expressions or terms (which may contain variables), usually using equality or inequality, or between formulas themselves, for instance in mathematical logic. For example, the formula
a^{2} \geq 0
is true for all real numbers a, and is therefore a law. Laws expressed as an equality are called identities. For example,
(a + b)^{2} = a^{2} + 2ab + b^{2}
and
\cos^{2}\theta + \sin^{2}\theta = 1
are identities. Mathematical laws are distinguished from scientific laws, which are based on observations and try to describe or predict a range of natural phenomena. The more significant laws are often called theorems.
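As an illustrative aside (not part of the original article), both identities above can be spot-checked numerically. The Python sketch below evaluates each side at randomly sampled values; the sample size, range, and tolerance are arbitrary choices made only for this demonstration.

import math
import random

def holds_everywhere(lhs, rhs, samples, tol=1e-9):
    # Returns True when lhs(x) and rhs(x) agree (within tol) at every sample.
    return all(abs(lhs(x) - rhs(x)) <= tol for x in samples)

samples = [random.uniform(-10, 10) for _ in range(1000)]

# (a + b)^2 = a^2 + 2ab + b^2, checked here with b fixed at 3.7 for simplicity
b = 3.7
print(holds_everywhere(lambda a: (a + b) ** 2,
                       lambda a: a ** 2 + 2 * a * b + b ** 2,
                       samples))

# cos^2(theta) + sin^2(theta) = 1 for every real theta
print(holds_everywhere(lambda t: math.cos(t) ** 2 + math.sin(t) ** 2,
                       lambda t: 1.0,
                       samples))

Both checks print True; of course a finite sample only illustrates the identities, it does not prove them.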
Notable examples
Geometric laws
Triangle inequality: If a, b, and c are the lengths of the sides of a triangle, then the triangle inequality states that
c \leq a + b,
with equality only in the degenerate case of a triangle with zero area. In Euclidean geometry and some other geometries, the triangle inequality is a theorem about vectors and vector lengths (norms):
\|\mathbf{u} + \mathbf{v}\| \leq \|\mathbf{u}\| + \|\mathbf{v}\|,
where the length of the third side has been replaced by the length of the vector sum u + v. When u and v are real numbers, they can be viewed as vectors in ℝ¹, and the triangle inequality expresses a relationship between absolute values.
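As a small illustrative sketch (an addition, not from the original text), the vector form of the inequality can be tested numerically with the Euclidean norm; the dimension, coordinate range, and number of trials below are arbitrary.

import math
import random

def norm(v):
    # Euclidean length of a vector given as a list of coordinates
    return math.sqrt(sum(x * x for x in v))

for _ in range(1000):
    u = [random.uniform(-5, 5) for _ in range(3)]
    v = [random.uniform(-5, 5) for _ in range(3)]
    w = [a + b for a, b in zip(u, v)]          # the vector sum u + v
    # ||u + v|| <= ||u|| + ||v|| must hold for every pair
    assert norm(w) <= norm(u) + norm(v) + 1e-12

print("the triangle inequality held for every sampled pair")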
Pythagorean theorem: It states that the area of the square whose side is the hypotenuse (the side opposite the right angle) is equal to the sum of the areas of the squares on the other two sides. The theorem can be written as an equation relating the lengths of the sides a, b and the hypotenuse c, sometimes called the Pythagorean equation:
a^{2} + b^{2} = c^{2}.
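For a concrete instance, the 3-4-5 right triangle satisfies the equation, since 3² + 4² = 9 + 16 = 25 = 5². The short Python sketch below (illustrative only) recovers the hypotenuse from the two legs:

import math

def hypotenuse(a, b):
    # c = sqrt(a^2 + b^2), the Pythagorean equation solved for c
    return math.sqrt(a ** 2 + b ** 2)

print(hypotenuse(3, 4))   # 5.0
print(hypotenuse(5, 12))  # 13.0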
Trigonometric identities
Geometrically, trigonometric identities are identities involving certain functions of one or more angles. They are distinct from triangle identities, which are identities involving both angles and side lengths of a triangle. Only the former are covered in this article.
These identities are useful whenever expressions involving trigonometric functions need to be simplified. Another important application is the integration of non-trigonometric functions: a common technique which involves first using the substitution rule with a trigonometric function, and then simplifying the resulting integral with a trigonometric identity.
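As a worked example of this technique (added here for illustration), the substitution x = sin θ together with the Pythagorean identity handles an integrand that is not itself trigonometric:

\int \frac{dx}{\sqrt{1 - x^{2}}}
  = \int \frac{\cos\theta \, d\theta}{\sqrt{1 - \sin^{2}\theta}}
  = \int \frac{\cos\theta \, d\theta}{\cos\theta}
  = \int d\theta
  = \theta + C
  = \arcsin x + C,

using dx = cos θ dθ and 1 − sin²θ = cos²θ (with cos θ ≥ 0 on the range of θ used by the substitution).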
One of the most prominent examples of trigonometric identities involves the equation
\sin^{2}\theta + \cos^{2}\theta = 1,
which is true for all real values of θ. On the other hand, the equation cos θ = 1 is only true for certain values of θ, not all. For example, this equation is true when θ = 0, but false when θ = 2.
Another group of trigonometric identities concerns the so-called addition/subtraction formulas (e.g. the double-angle identity sin(2θ) = 2 sin θ cos θ, the addition formula for tan(x + y)), which can be used to break down expressions of larger angles into those with smaller constituents.
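As a hedged numerical illustration (not part of the source text), the double-angle identity and the tangent addition formula can both be checked on random angles; the ranges below are restricted only to keep tan away from its poles, and the tolerances are arbitrary.

import math
import random

for _ in range(1000):
    theta = random.uniform(-10, 10)
    # double-angle identity: sin(2*theta) = 2*sin(theta)*cos(theta)
    assert math.isclose(math.sin(2 * theta),
                        2 * math.sin(theta) * math.cos(theta),
                        abs_tol=1e-12)

    # addition formula: tan(x + y) = (tan x + tan y) / (1 - tan x * tan y)
    x = random.uniform(-0.7, 0.7)
    y = random.uniform(-0.7, 0.7)
    lhs = math.tan(x + y)
    rhs = (math.tan(x) + math.tan(y)) / (1 - math.tan(x) * math.tan(y))
    assert math.isclose(lhs, rhs, rel_tol=1e-9, abs_tol=1e-9)

print("both identities held on all sampled angles")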
Algebraic laws
Cauchy–Schwarz inequality: An upper bound on the absolute value of the inner product between two vectors in an inner product space, in terms of the product of the vector norms. It is considered one of the most important and widely used inequalities in mathematics.
The Cauchy–Schwarz inequality states that for all vectors u and v of an inner product space
\left|\langle \mathbf{u}, \mathbf{v} \rangle\right|^{2} \leq \langle \mathbf{u}, \mathbf{u} \rangle \cdot \langle \mathbf{v}, \mathbf{v} \rangle
where ⟨⋅,⋅⟩ is the inner product. Examples of inner products include the real and complex dot product; see the examples in inner product. Every inner product gives rise to a Euclidean l₂ norm, called the canonical or induced norm, where the norm of a vector u is denoted and defined by
\|\mathbf{u}\| := \sqrt{\langle \mathbf{u}, \mathbf{u} \rangle},
where ⟨u, u⟩ is always a non-negative real number (even if the inner product is complex-valued). By taking the square root of both sides of the above inequality, the Cauchy–Schwarz inequality can be written in its more familiar form in terms of the norm:
\left|\langle \mathbf{u}, \mathbf{v} \rangle\right| \leq \|\mathbf{u}\| \cdot \|\mathbf{v}\|
Moreover, the two sides are equal if and only if u and v are linearly dependent.
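As an illustrative sketch (added, not from the original article), the norm form of the inequality can be checked for the standard real dot product; the dimension and sample count are arbitrary, and the final lines show the equality case for two linearly dependent vectors.

import math
import random

def dot(u, v):
    # standard real dot product, one example of an inner product
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    # induced norm ||u|| = sqrt(<u, u>)
    return math.sqrt(dot(u, u))

for _ in range(1000):
    u = [random.uniform(-5, 5) for _ in range(4)]
    v = [random.uniform(-5, 5) for _ in range(4)]
    # |<u, v>| <= ||u|| * ||v||
    assert abs(dot(u, v)) <= norm(u) * norm(v) + 1e-9

# equality holds when u and v are linearly dependent, e.g. v = 2u
u = [1.0, 2.0, 3.0, 4.0]
v = [2.0 * x for x in u]
print(abs(dot(u, v)), norm(u) * norm(v))   # the two printed values coincide (up to rounding)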
Combinatorial laws
Pigeonhole principle: If n items are put into m containers, with n > m, then at least one container must contain more than one item. For example, of three gloves (none of which is ambidextrous/reversible), at least two must be right-handed or at least two must be left-handed, because there are three objects but only two categories of handedness to put them into.
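A tiny demonstration (an illustrative addition, with the numbers chosen arbitrarily): however ten items are placed into seven containers, some container ends up with more than one item.

import random
from collections import Counter

n_items, m_containers = 10, 7   # n > m

for _ in range(1000):
    # place each item into a randomly chosen container
    placement = [random.randrange(m_containers) for _ in range(n_items)]
    counts = Counter(placement)
    # the pigeonhole principle guarantees a container with at least two items
    assert max(counts.values()) > 1

print("every trial produced a container holding at least two items")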
Logical laws
De Morgan's laws: In propositional logic and Boolean algebra, De Morgan's laws, also known as De Morgan's theorem, are a pair of transformation rules that are both valid rules of inference. They are named after Augustus De Morgan, a 19th-century British mathematician. The rules allow the expression of conjunctions and disjunctions purely in terms of each other via negation. The rules can be expressed in English as:
not (A or B) = (not A) and (not B)
not (A and B) = (not A) or (not B)
where "A or B" is an "inclusive or", meaning at least one of A or B, rather than an "exclusive or" that means exactly one of A or B. In formal language, the rules are written as
\neg (P \lor Q) \iff (\neg P) \land (\neg Q), \quad \text{and}
\neg (P \land Q) \iff (\neg P) \lor (\neg Q),
where P and Q are propositions, ¬ is the negation logic operator (NOT), ∧ is the conjunction logic operator (AND), ∨ is the disjunction logic operator (OR), and ⟺ is a metalogical symbol meaning "can be replaced in a logical proof with", often read as "if and only if". For any combination of true/false values for P and Q, the left and right sides of the arrow will hold the same truth value after evaluation.
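Because P and Q each take only the values true or false, the equivalences can be verified exhaustively; the short Python check below (an illustrative addition) runs through all four assignments.

from itertools import product

# exhaustive truth-table check of both De Morgan laws
for P, Q in product([False, True], repeat=2):
    assert (not (P or Q)) == ((not P) and (not Q))
    assert (not (P and Q)) == ((not P) or (not Q))

print("De Morgan's laws hold for every combination of P and Q")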
The three Laws of thought are:
The law of identity: 'Whatever is, is.' For all a: a = a.
The law of non-contradiction (alternately the 'law of contradiction'): 'Nothing can both be and not be.'
The law of excluded middle: 'Everything must either be or not be.' In accordance with the law of excluded middle or excluded third, for every proposition, either its positive or negative form is true: A∨¬A.
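The last two laws can likewise be checked exhaustively over the two truth values; the snippet below (an illustrative addition in the same style as the check above) verifies non-contradiction and the excluded middle.

# for every proposition A, not (A and not A) holds (non-contradiction)
# and A or not A holds (excluded middle)
for A in (False, True):
    assert not (A and not A)
    assert A or not A

print("non-contradiction and excluded middle hold for both truth values")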
Phenomenological laws
Benford's law is an observation that in many real-life sets of numerical data, the leading digit is likely to be small. In sets that obey the law, the number 1 appears as the leading significant digit about 30% of the time, while 9 appears as the leading significant digit less than 5% of the time. Uniformly distributed digits would each occur about 11.1% of the time.
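As an illustrative sketch (not part of the original text), the powers of 2 are a sequence known to obey Benford's law; the Python snippet below compares their empirical leading-digit frequencies with the predicted proportion log10(1 + 1/d). The choice of sequence and the number of terms are arbitrary.

import math
from collections import Counter

N = 10000
# leading (most significant) digit of 2^k for k = 1 .. N
leading = Counter(int(str(2 ** k)[0]) for k in range(1, N + 1))

for d in range(1, 10):
    observed = leading[d] / N
    predicted = math.log10(1 + 1 / d)   # Benford's predicted frequency
    print(d, round(observed, 3), round(predicted, 3))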
The strong law of small numbers states, in a humorous way, that any given small number appears in far more contexts than may seem reasonable, leading to many apparently surprising coincidences in mathematics, simply because small numbers appear so often and yet are so few.
See also
Formula
List of inequalities
List of mathematical identities
List of laws
Statement (logic)
Tautology (logic)
External links
The Encyclopedia of Equation, an online encyclopedia of mathematical identities (archived)
A Collection of Algebraic Identities