# Weight function

A weight function is a mathematical device used when performing a sum, integral, or average to give some elements more "weight" or influence on the result than other elements in the same set. They occur frequently in statistics and analysis, and are closely related to the concept of a measure. Weight functions can be employed in both discrete and continuous settings. They can be used to construct systems of calculus called "weighted calculus" and "meta-calculus".

## Discrete weights

### General definition

In the discrete setting, a weight function $w\colon A\to {\mathbb {R} }^{+}$ is a positive function defined on a discrete set $A$ , which is typically finite or countable. The weight function $w(a):=1$ corresponds to the unweighted situation in which all elements have equal weight. One can then apply this weight to various concepts.

If $f\colon A\to {\mathbb {R} }$ is a real-valued function, then the unweighted sum of $f$ on $A$ is defined as

$\sum _{a\in A}f(a);$

but given a weight function $w\colon A\to {\mathbb {R} }^{+}$, the weighted sum or conical combination is defined as

$\sum _{a\in A}f(a)w(a).$

One common application of weighted sums arises in numerical integration.

If B is a finite subset of A, one can replace the unweighted cardinality |B| of B by the weighted cardinality

$\sum _{a\in B}w(a).$

If A is a finite non-empty set, one can replace the unweighted mean or average

${\frac {1}{|A|}}\sum _{a\in A}f(a)$

by the weighted mean or weighted average

${\frac {\sum _{a\in A}f(a)w(a)}{\sum _{a\in A}w(a)}}.$

In this case only the relative weights are relevant.
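The discrete formulas above can be sketched in a few lines of Python. The data values here are hypothetical, chosen only to illustrate the definitions:

```python
# Weighted sum (conical combination) and weighted mean over a finite set,
# following the formulas above. Data values are illustrative only.

def weighted_sum(values, weights):
    """Conical combination: sum of f(a) * w(a) over all elements."""
    return sum(f * w for f, w in zip(values, weights))

def weighted_mean(values, weights):
    """Weighted average: weighted sum divided by the total weight."""
    return weighted_sum(values, weights) / sum(weights)

values = [2.0, 4.0, 6.0]
weights = [1.0, 1.0, 2.0]

print(weighted_sum(values, weights))   # 2*1 + 4*1 + 6*2 = 18.0
print(weighted_mean(values, weights))  # 18 / 4 = 4.5

# Only relative weights matter: rescaling all weights leaves the mean unchanged.
print(weighted_mean(values, [10 * w for w in weights]))  # 4.5
```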

### Statistics

Weighted means are commonly used in statistics to compensate for the presence of bias. For a quantity $f$ measured several independent times $f_{i}$ with variance $\sigma _{i}^{2}$, the best estimate of the signal is obtained by averaging all the measurements with weight $w_{i}={\frac {1}{\sigma _{i}^{2}}}$; the variance of the resulting weighted mean, $\sigma ^{2}=1/\sum _{i}w_{i}$, is smaller than the variance $\sigma _{i}^{2}$ of any individual measurement. The maximum likelihood method weights the difference between fit and data using the same weights $w_{i}$.
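A minimal sketch of this inverse-variance weighting, using hypothetical measurement values:

```python
# Inverse-variance weighting of independent measurements f_i with
# standard deviations sigma_i. Measurement values are hypothetical.

def inverse_variance_mean(measurements, sigmas):
    """Return (best estimate, variance of the estimate)."""
    weights = [1.0 / s**2 for s in sigmas]
    estimate = sum(w * f for w, f in zip(weights, measurements)) / sum(weights)
    variance = 1.0 / sum(weights)  # sigma^2 = 1 / sum(w_i)
    return estimate, variance

measurements = [10.2, 9.8, 10.5]
sigmas = [0.1, 0.2, 0.4]
est, var = inverse_variance_mean(measurements, sigmas)

# The combined variance is below the smallest single-measurement variance.
assert var < min(s**2 for s in sigmas)
```

Note how the most precise measurement (smallest $\sigma_i$) dominates the estimate, since its weight $1/\sigma_i^2$ is largest.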

The expected value of a random variable is the weighted average of the possible values it might take on, with the weights being the respective probabilities. More generally, the expected value of a function of a random variable is the probability-weighted average of the values the function takes on for each possible value of the random variable.
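As a concrete instance, the expected value of a fair six-sided die is the probability-weighted average of its outcomes:

```python
# Expected value as a probability-weighted average, illustrated with a
# fair six-sided die (weights are the probabilities, each 1/6).
outcomes = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

ev = sum(x * p for x, p in zip(outcomes, probs))        # E[X] = 3.5
# Expected value of a function of the random variable, here g(x) = x^2:
ev_sq = sum(x**2 * p for x, p in zip(outcomes, probs))  # E[X^2] = 91/6
```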

### Mechanics

The terminology weight function arises from mechanics: if one has a collection of $n$ objects on a lever, with weights $w_{1},\dotsc ,w_{n}$ (where weight is now interpreted in the physical sense) and locations ${\boldsymbol {x}}_{1},\dotsc ,{\boldsymbol {x}}_{n}$, then the lever will be in balance if the fulcrum of the lever is at the center of mass

${\frac {\sum _{i=1}^{n}w_{i}{\boldsymbol {x}}_{i}}{\sum _{i=1}^{n}w_{i}}},$

which is also the weighted average of the positions ${\boldsymbol {x}}_{i}$.
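For a one-dimensional lever this reduces to a single weighted average of positions. A small sketch with made-up weights and locations:

```python
# Center of mass of point weights on a line: the balance point is the
# weighted average of the positions. Values are illustrative only.
weights = [2.0, 1.0, 1.0]     # physical weights w_i
positions = [0.0, 2.0, 4.0]   # locations x_i along the lever

com = sum(w * x for w, x in zip(weights, positions)) / sum(weights)
print(com)  # (2*0 + 1*2 + 1*4) / 4 = 1.5
```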

## Continuous weights

### General definition

In the continuous setting, a weight is a positive measure such as $w(x)\,dx$ on some domain $\Omega$, which is typically a subset of a Euclidean space ${\mathbb {R} }^{n}$, where $w\colon \Omega \to {\mathbb {R} }^{+}$ is a non-negative measurable function. In this setting, the unweighted integral of $f$ on $\Omega$,

$\int _{\Omega }f(x)\ dx,$

can be generalized to the weighted integral

$\int _{\Omega }f(x)w(x)\,dx.$

Note that one may need to require $f$ to be absolutely integrable with respect to the weight $w(x)dx$ in order for this integral to be finite.
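A numerical sketch of a weighted integral, with a hypothetical choice of $f$ and $w$: taking $f(x)=x$ and $w(x)=x$ on $[0,1]$, the weighted integral is $\int_0^1 x\cdot x\,dx = 1/3$, which a simple midpoint rule recovers closely:

```python
# Midpoint-rule approximation of the weighted integral of f(x) = x with
# weight w(x) = x on [0, 1]. Exact value: integral of x^2 on [0, 1] = 1/3.
# The choices of f and w here are illustrative, not canonical.

def midpoint_integral(g, a, b, n=10_000):
    """Composite midpoint-rule approximation of the integral of g on [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

f = lambda x: x
w = lambda x: x
weighted = midpoint_integral(lambda x: f(x) * w(x), 0.0, 1.0)
print(weighted)  # close to 1/3
```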

### Weighted volume

If E is a subset of $\Omega$ , then the volume vol(E) of E can be generalized to the weighted volume

$\int _{E}w(x)\ dx.$

### Weighted average

If $\Omega$ has finite non-zero weighted volume, then we can replace the unweighted average

${\frac {1}{{\mathrm {vol} }(\Omega )}}\int _{\Omega }f(x)\ dx$

by the weighted average

${\frac {\int _{\Omega }f(x)\ w(x)\,dx}{\int _{\Omega }w(x)\ dx}}.$

### Inner product

If $f$ and $g$ are two real-valued functions on $\Omega$, one can generalize the unweighted inner product

$\langle f,g\rangle :=\int _{\Omega }f(x)g(x)\ dx$

to a weighted inner product

$\langle f,g\rangle :=\int _{\Omega }f(x)g(x)\ w(x)\ dx.$

See the entry on Orthogonality for more details.
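As an illustration, with the Chebyshev weight $w(x)=1/{\sqrt{1-x^{2}}}$ on $(-1,1)$, the Chebyshev polynomials $T_{1}(x)=x$ and $T_{2}(x)=2x^{2}-1$ are orthogonal in the weighted inner product. The sketch below checks this numerically; substituting $x=\cos\theta$ turns the weighted integral into $\int_0^\pi T_m(\cos\theta)\,T_n(\cos\theta)\,d\theta$, which avoids the endpoint singularity of the weight:

```python
# Weighted inner product with the Chebyshev weight w(x) = 1/sqrt(1 - x^2)
# on (-1, 1), evaluated via the substitution x = cos(theta) and a
# midpoint rule on [0, pi].
import math

def chebyshev_inner(p, q, n=100_000):
    """Approximate the integral of p(x) q(x) / sqrt(1 - x^2) over (-1, 1)."""
    h = math.pi / n
    total = 0.0
    for i in range(n):
        x = math.cos((i + 0.5) * h)
        total += p(x) * q(x)
    return total * h

T1 = lambda x: x
T2 = lambda x: 2 * x * x - 1

print(chebyshev_inner(T1, T2))  # ~0: T1 and T2 are orthogonal in this weight
print(chebyshev_inner(T1, T1))  # ~pi/2: the squared norm of T1
```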