- Source: Contraharmonic mean
In mathematics, a contraharmonic mean is a function complementary to the harmonic mean. The contraharmonic mean is a special case of the Lehmer mean, $L_p$, where p = 2.
Definition
The contraharmonic mean of a set of positive real numbers is defined as the arithmetic mean of the squares of the numbers divided by the arithmetic mean of the numbers:
$$\operatorname{C}\left(x_{1},x_{2},\dots,x_{n}\right) = \frac{\tfrac{1}{n}\left(x_{1}^{2}+x_{2}^{2}+\cdots+x_{n}^{2}\right)}{\tfrac{1}{n}\left(x_{1}+x_{2}+\cdots+x_{n}\right)} = \frac{x_{1}^{2}+x_{2}^{2}+\cdots+x_{n}^{2}}{x_{1}+x_{2}+\cdots+x_{n}}.$$
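As a quick illustration of this definition, here is a minimal Python sketch (the function name `contraharmonic_mean` is an illustrative choice, not a standard library function):

```python
def contraharmonic_mean(values):
    """Contraharmonic mean: sum of the squares divided by the sum of the values."""
    if not values or any(v <= 0 for v in values):
        raise ValueError("expects a non-empty collection of positive numbers")
    return sum(v * v for v in values) / sum(values)

# Example: C(1, 2, 3) = (1 + 4 + 9) / (1 + 2 + 3) = 14/6 ≈ 2.33
print(contraharmonic_mean([1, 2, 3]))
```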
Two-variable formulae
From the formulas for the arithmetic mean and harmonic mean of two variables we have:
$$\begin{aligned}
\operatorname{A}(a,b) &= \frac{a+b}{2} \\
\operatorname{H}(a,b) &= \frac{1}{\frac{1}{2}\left(\frac{1}{a}+\frac{1}{b}\right)} = \frac{2ab}{a+b} \\
\operatorname{C}(a,b) &= 2\cdot\operatorname{A}(a,b)-\operatorname{H}(a,b) \\
&= a+b-\frac{2ab}{a+b} = \frac{(a+b)^{2}-2ab}{a+b} \\
&= \frac{a^{2}+b^{2}}{a+b}
\end{aligned}$$
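A short numerical check of the identity C(a, b) = 2·A(a, b) − H(a, b) derived above, using small helper functions written purely for this sketch:

```python
def A(a, b):  # arithmetic mean
    return (a + b) / 2

def H(a, b):  # harmonic mean
    return 2 * a * b / (a + b)

def C(a, b):  # contraharmonic mean
    return (a * a + b * b) / (a + b)

a, b = 3.0, 7.0
print(C(a, b))                 # 5.8
print(2 * A(a, b) - H(a, b))   # 5.8, matching the derivation above
```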
Notice that for two variables the average of the harmonic and contraharmonic means is exactly equal to the arithmetic mean:

$$\operatorname{A}(\operatorname{H}(a,b),\operatorname{C}(a,b)) = \frac{1}{2}\left(\frac{2ab}{a+b}+\frac{a^{2}+b^{2}}{a+b}\right) = \frac{(a+b)^{2}}{2(a+b)} = \frac{a+b}{2} = \operatorname{A}(a,b)$$
As a gets closer to 0, H(a, b) also gets closer to 0: the harmonic mean is very sensitive to low values. On the other hand, the contraharmonic mean is sensitive to larger values, so as a approaches 0, C(a, b) approaches b (and their average remains A(a, b)).
There are two other notable relationships between 2-variable means. First, the geometric mean of the arithmetic and harmonic means is equal to the geometric mean of the two values:
$$\operatorname{G}(\operatorname{A}(a,b),\operatorname{H}(a,b)) = \operatorname{G}\left(\frac{a+b}{2},\frac{2ab}{a+b}\right) = \sqrt{\frac{a+b}{2}\cdot\frac{2ab}{a+b}} = \sqrt{ab} = \operatorname{G}(a,b)$$
The second relationship is that the geometric mean of the arithmetic and contraharmonic means is the root mean square:
$$\operatorname{G}\left(\operatorname{A}(a,b),\operatorname{C}(a,b)\right) = \operatorname{G}\left(\frac{a+b}{2},\frac{a^{2}+b^{2}}{a+b}\right) = \sqrt{\frac{a+b}{2}\cdot\frac{a^{2}+b^{2}}{a+b}} = \sqrt{\frac{a^{2}+b^{2}}{2}} = \operatorname{R}(a,b)$$
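Both geometric-mean identities are easy to confirm numerically; the sketch below reuses the same illustrative two-variable helpers:

```python
from math import sqrt

def A(a, b): return (a + b) / 2                 # arithmetic mean
def H(a, b): return 2 * a * b / (a + b)         # harmonic mean
def C(a, b): return (a * a + b * b) / (a + b)   # contraharmonic mean
def G(a, b): return sqrt(a * b)                 # geometric mean
def R(a, b): return sqrt((a * a + b * b) / 2)   # root mean square

a, b = 2.0, 8.0
print(G(A(a, b), H(a, b)), G(a, b))  # both 4.0
print(G(A(a, b), C(a, b)), R(a, b))  # both ≈ 5.831
```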
The contraharmonic mean of two variables can be constructed geometrically using a trapezoid.
Additional constructions
The contraharmonic mean can be constructed on a circle similar to the way the Pythagorean means of two variables are constructed. The contraharmonic mean is the remainder of the diameter on which the harmonic mean lies.
History
The contraharmonic mean was discovered by the Greek mathematician Eudoxus in the 4th century BCE.
Properties
It is easy to show that this satisfies the characteristic properties of a mean of some list of values $\mathbf{x}$:
$$\min\left(\mathbf{x}\right) \leq \operatorname{C}\left(\mathbf{x}\right) \leq \max\left(\mathbf{x}\right)$$
$$\operatorname{C}\left(t\cdot x_{1},t\cdot x_{2},\dots,t\cdot x_{n}\right) = t\cdot\operatorname{C}\left(x_{1},x_{2},\dots,x_{n}\right)\quad\text{for } t>0$$
The first property implies the fixed point property, that for all k > 0,

$$\operatorname{C}(k,k,\dots,k) = k.$$
The contraharmonic mean is higher in value than the arithmetic mean and also higher than the root mean square:
$$\min(\mathbf{x}) \leq \operatorname{H}(\mathbf{x}) \leq \operatorname{G}(\mathbf{x}) \leq \operatorname{L}(\mathbf{x}) \leq \operatorname{A}(\mathbf{x}) \leq \operatorname{R}(\mathbf{x}) \leq \operatorname{C}(\mathbf{x}) \leq \max(\mathbf{x})$$
where x is a list of values, H is the harmonic mean, G is the geometric mean, L is the logarithmic mean, A is the arithmetic mean, R is the root mean square and C is the contraharmonic mean. Unless all values of x are the same, the ≤ signs above can be replaced by <.
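For two distinct positive values the entire chain can be verified numerically; the sketch below assumes the standard two-variable logarithmic mean (b − a)/(ln b − ln a):

```python
from math import sqrt, log

a, b = 2.0, 5.0                      # any two distinct positive values
H = 2 * a * b / (a + b)              # harmonic mean
G = sqrt(a * b)                      # geometric mean
L = (b - a) / (log(b) - log(a))      # logarithmic mean (two-variable form)
A = (a + b) / 2                      # arithmetic mean
R = sqrt((a * a + b * b) / 2)        # root mean square
C = (a * a + b * b) / (a + b)        # contraharmonic mean

chain = [min(a, b), H, G, L, A, R, C, max(a, b)]
print(all(x < y for x, y in zip(chain, chain[1:])))  # True: strict inequalities for a != b
```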
The name contraharmonic may be due to the fact that when taking the mean of only two variables, the contraharmonic mean is as high above the arithmetic mean as the arithmetic mean is above the harmonic mean (i.e., the arithmetic mean of the two variables is equal to the arithmetic mean of their harmonic and contraharmonic means).
Relationship to arithmetic mean and variance
The contraharmonic mean of a random variable is equal to the sum of the arithmetic mean and the variance divided by the arithmetic mean. Since the variance is always ≥ 0, the contraharmonic mean is always greater than or equal to the arithmetic mean.
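Equivalently, C(x) = A(x) + Var(x)/A(x) when the variance is computed in its population form (dividing by n); a quick check:

```python
data = [2.0, 3.0, 7.0, 8.0]
n = len(data)

mean = sum(data) / n
var = sum((x - mean) ** 2 for x in data) / n          # population variance (divide by n)
contraharmonic = sum(x * x for x in data) / sum(data)

print(contraharmonic)        # 6.3
print(mean + var / mean)     # 6.3, the same value
```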
The ratio of the variance and the mean was proposed as a test statistic by Clapham. This statistic is the contraharmonic mean less the arithmetic mean.
Other relationships
Any integer contraharmonic mean of two different positive integers is the hypotenuse of a Pythagorean triple, while any hypotenuse of a Pythagorean triple is a contraharmonic mean of two different positive integers.
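For example, C(3, 6) = (9 + 36)/(3 + 6) = 5, the hypotenuse of the (3, 4, 5) triple. The brute-force sketch below checks the forward direction for small integer pairs:

```python
def is_hypotenuse(c):
    # True if some positive integers p, q satisfy p^2 + q^2 = c^2
    return any(c * c - p * p == round((c * c - p * p) ** 0.5) ** 2
               for p in range(1, c))

for a in range(1, 30):
    for b in range(a + 1, 30):
        num, den = a * a + b * b, a + b
        if num % den == 0:                    # integer contraharmonic mean C(a, b)
            assert is_hypotenuse(num // den), (a, b)
print("every integer contraharmonic mean found was a hypotenuse")
```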
It is also related to Katz's statistic

$$J_{n} = \sqrt{\frac{n}{2}}\,\frac{s^{2}-m}{m}$$
where m is the mean, s² is the variance and n is the sample size.
$J_n$ is asymptotically normally distributed with a mean of zero and variance of 1.
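A direct transcription of $J_n$ as a sketch (the source does not specify whether s² uses the n or n − 1 divisor; n is assumed here):

```python
from math import sqrt

def katz_statistic(data):
    """J_n = sqrt(n/2) * (s^2 - m) / m, with m the mean and s^2 the variance."""
    n = len(data)
    m = sum(data) / n
    s2 = sum((x - m) ** 2 for x in data) / n   # divisor n assumed; the text does not say
    return sqrt(n / 2) * (s2 - m) / m

print(katz_statistic([2, 3, 1, 4, 2, 3, 2, 5]))
```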
Uses in statistics
The problem of a size-biased sample was discussed by Cox in 1969 in the context of sampling fibres. The expectation of a size-biased sample is equal to its contraharmonic mean, and the contraharmonic mean is also used to estimate bias fields in multiplicative models, rather than the arithmetic mean as used in additive models.
The contraharmonic mean can be used to average the intensity values of neighbouring pixels in image processing, so as to reduce noise in images and make them clearer to the eye.
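A minimal sketch of such a filter on a 2-D grayscale array, assuming NumPy is available; each pixel is replaced by the plain contraharmonic mean of its neighbourhood (the function name and window size are illustrative choices):

```python
import numpy as np

def contraharmonic_filter(image, size=3, eps=1e-12):
    """Replace each pixel by the contraharmonic mean of its size x size neighbourhood."""
    pad = size // 2
    padded = np.pad(image.astype(float), pad, mode="edge")
    out = np.empty(image.shape, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            window = padded[i:i + size, j:j + size]
            out[i, j] = (window ** 2).sum() / (window.sum() + eps)  # sum of squares / sum
    return out

noisy = np.random.default_rng(0).integers(0, 256, size=(8, 8))
print(contraharmonic_filter(noisy).round(1))
```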
The probability of a fibre being sampled is proportional to its length. Because of this, the usual sample mean (arithmetic mean) is a biased estimator of the true mean. To see this, consider
$$g(x) = \frac{xf(x)}{m}$$
where f(x) is the true population distribution, g(x) is the length-weighted distribution and m is the sample mean. Taking the usual expectation of the mean here gives the contraharmonic mean rather than the usual (arithmetic) mean of the sample. This problem can be overcome by taking instead the expectation of the harmonic mean (1/x). The expectation of 1/x is
$$\operatorname{E}\left[\frac{1}{x}\right] = \frac{1}{m}$$
and its variance is
$$\operatorname{Var}\left(\frac{1}{x}\right) = \frac{m\,\operatorname{E}\left[\frac{1}{x}-1\right]}{nm^{2}}$$
where E is the expectation operator. Asymptotically E[1/x] is distributed normally.
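A small simulation of this fibre-sampling argument, assuming an exponential population purely for illustration (for an exponential population with mean m, the length-weighted density x·f(x)/m is a gamma density with shape 2): the ordinary mean of a length-biased sample drifts to the contraharmonic mean of the population, while the reciprocal of the mean of 1/x recovers the true mean.

```python
import numpy as np

rng = np.random.default_rng(1)
mu = 2.0                                     # true population mean (exponential population)

# Drawing from the length-weighted density x*f(x)/mu of an exponential
# population is equivalent to drawing from a gamma(shape=2, scale=mu).
biased_sample = rng.gamma(shape=2.0, scale=mu, size=100_000)

print(biased_sample.mean())                  # ≈ 4.0: the population contraharmonic mean, 2*mu
print(1.0 / (1.0 / biased_sample).mean())    # ≈ 2.0: the harmonic-mean correction recovers mu
```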
The asymptotic efficiency of length-biased sampling compared to random sampling depends on the underlying distribution. If f(x) is log-normal the efficiency is 1, while if the population is gamma distributed with index b, the efficiency is b/(b − 1). This distribution has been used in modelling consumer behaviour as well as quality sampling.
It has been used alongside the exponential distribution in transport planning in the form of its inverse.
See also
Harmonic mean
Lehmer mean
Pythagorean means
References
External links
Contraharmonic Proportion at PlanetMath.