## Orthogonality

In mathematics, orthogonality is the relation of two lines at right angles to one another (perpendicularity), together with the generalization of this relation to n dimensions. The term is also applied to a variety of mathematical relations thought of as describing non-overlapping, uncorrelated, or independent objects of some kind.

The concept of orthogonality has been broadly generalized in mathematics (including in the areas of mathematical functions, calculus and linear algebra), as well as in areas such as chemistry and engineering.

The word comes from the Greek ὀρθός (orthos), meaning "upright", and γωνία (gonia), meaning "angle". The ancient Greek ὀρθογώνιον orthogōnion (< ὀρθός orthos 'upright' + γωνία gōnia 'angle') and classical Latin orthogonium originally denoted a rectangle. Later, they came to mean a right triangle. In the 12th century, the post-classical Latin word orthogonalis came to mean a right angle or something related to a right angle.

A set of vectors is called pairwise orthogonal if each pairing of them is orthogonal. Such a set is called an orthogonal set. Nonzero pairwise orthogonal vectors are always linearly independent.
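This is easy to check numerically. Below is a minimal sketch (the helper names `dot` and `pairwise_orthogonal` are our own, not from the text) that verifies pairwise orthogonality and notes why nonzero orthogonal vectors are linearly independent:

```python
# Minimal sketch: test pairwise orthogonality of a set of vectors.
# Helper names (dot, pairwise_orthogonal) are illustrative, not standard.

def dot(u, v):
    # Euclidean inner product of two same-length vectors.
    return sum(a * b for a, b in zip(u, v))

def pairwise_orthogonal(vectors):
    # True when every distinct pair has inner product zero.
    return all(dot(vectors[i], vectors[j]) == 0
               for i in range(len(vectors))
               for j in range(i + 1, len(vectors)))

# An orthogonal set (it also appears as an example in this article):
vs = [(1, 3, 2), (3, -1, 0), (1, 3, -5)]
print(pairwise_orthogonal(vs))  # True

# Why nonzero orthogonal vectors are linearly independent: if
# c1*v1 + c2*v2 + c3*v3 = 0, taking the inner product with v_j kills
# every term except c_j * ||v_j||^2, forcing each c_j to be 0.
```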

In certain cases, the word normal is used to mean orthogonal, particularly in the geometric sense as in the normal to a surface. For example, the y-axis is normal to the curve y = x² at the origin. However, normal may also refer to the magnitude of a vector. In particular, a set is called orthonormal (orthogonal plus normal) if it is an orthogonal set of unit vectors. As a result, use of the term normal to mean "orthogonal" is often avoided. The word "normal" also has a different meaning in probability and statistics.
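As a small illustration of "orthonormal = orthogonal plus unit norm", the sketch below normalizes an orthogonal set; the particular vectors and the helper name are illustrative choices, not taken from the text:

```python
import math

def normalize(v):
    # Divide a nonzero vector by its Euclidean norm.
    n = math.sqrt(sum(x * x for x in v))
    return tuple(x / n for x in v)

# An illustrative orthogonal set in R^3:
ortho = [(1, 1, 0), (1, -1, 0), (0, 0, 2)]
orthonormal = [normalize(v) for v in ortho]

# Each normalized vector now has norm 1, and orthogonality is preserved,
# so the set is orthonormal:
print(all(abs(sum(x * x for x in v) - 1.0) < 1e-12 for v in orthonormal))  # True
```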

In function spaces, orthogonality is defined through an inner product. For real-valued functions on a closed interval [a, b] with a nonnegative weight function w, the weighted inner product is

${\displaystyle \langle f,g\rangle _{w}=\int _{a}^{b}f(x)g(x)w(x)\,dx.}$

Two functions f and g are orthogonal with respect to w when this inner product vanishes:

${\displaystyle \int _{a}^{b}f(x)g(x)w(x)\,dx=0.}$

The norm induced by this inner product is

${\displaystyle \|f\|_{w}={\sqrt {\langle f,f\rangle _{w}}}}$

Members of a set of functions { f_i : i = 1, 2, 3, ... } are orthogonal on [a, b] if

${\displaystyle \langle f_{i},f_{j}\rangle =\int _{a}^{b}f_{i}(x)f_{j}(x)w(x)\,dx=\|f_{i}\|^{2}\delta _{i,j}=\|f_{j}\|^{2}\delta _{i,j}}$

and orthonormal on [a, b] if

${\displaystyle \langle f_{i},f_{j}\rangle =\int _{a}^{b}f_{i}(x)f_{j}(x)w(x)\,dx=\delta _{i,j}}$

where the Kronecker delta is

${\displaystyle \delta _{i,j}=\left\{{\begin{matrix}1&\mathrm {if} \ i=j\\0&\mathrm {if} \ i\neq j\end{matrix}}\right.}$
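The weighted inner product can be approximated numerically. Here is a sketch under illustrative assumptions (midpoint-rule integration, unit weight, and the Legendre polynomials P1 and P2, which are orthogonal on [−1, 1]; none of these choices come from the text):

```python
def inner_product(f, g, a, b, w=lambda x: 1.0, n=100_000):
    # Midpoint-rule approximation of <f, g>_w = integral of f(x)g(x)w(x) dx.
    h = (b - a) / n
    total = 0.0
    for k in range(n):
        x = a + (k + 0.5) * h
        total += f(x) * g(x) * w(x) * h
    return total

# Legendre polynomials P1(x) = x and P2(x) = (3x^2 - 1)/2 are orthogonal
# on [-1, 1] with unit weight:
def p1(x): return x
def p2(x): return (3 * x * x - 1) / 2

print(abs(inner_product(p1, p2, -1.0, 1.0)) < 1e-9)  # True: the exact integral is 0
print(round(inner_product(p1, p1, -1.0, 1.0), 4))    # 0.6667, i.e. ||P1||^2 = 2/3
```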
For vectors in ${\displaystyle \mathbf {Z} _{2}^{n}}$ with standard basis vectors ${\displaystyle \mathbf {e} _{0},\ldots ,\mathbf {e} _{n-1}}$, define

${\displaystyle \mathbf {v} _{k}=\sum _{i=0 \atop ai+k<n}\mathbf {e} _{ai+k}}$

for some positive integer a. For 0 ≤ k ≤ a − 1, these vectors are pairwise orthogonal; for example, with a = 3 and n = 8, the vectors (1, 0, 0, 1, 0, 0, 1, 0)T, (0, 1, 0, 0, 1, 0, 0, 1)T, and (0, 0, 1, 0, 0, 1, 0, 0)T are orthogonal.
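A quick check of this construction (assuming 0-indexed coordinates, so that v_k has a 1 exactly in positions k, a + k, 2a + k, …; the helper names are ours):

```python
def v(k, a, n):
    # Binary vector in Z_2^n with 1s in positions k, a+k, 2a+k, ... (0-indexed).
    return tuple(1 if pos % a == k else 0 for pos in range(n))

def dot(u, w):
    return sum(x * y for x, y in zip(u, w))

# a = 3, n = 8 reproduces the three example vectors, which are pairwise
# orthogonal because their supports are disjoint:
vectors = [v(k, 3, 8) for k in range(3)]
print(vectors[0])  # (1, 0, 0, 1, 0, 0, 1, 0)
print(all(dot(u, w) == 0 for u in vectors for w in vectors if u is not w))  # True
```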
• In geometry, two Euclidean vectors are orthogonal if they are perpendicular, i.e., they form a right angle.
• Two vectors, x and y, in an inner product space, V, are orthogonal if their inner product ${\displaystyle \langle x,y\rangle }$ is zero. This relationship is denoted ${\displaystyle x\,\bot \,y}$.
• Two vector subspaces, A and B, of an inner product space, V, are called orthogonal subspaces if each vector in A is orthogonal to each vector in B. The largest subspace of V that is orthogonal to a given subspace is its orthogonal complement.
• Given a module M and its dual M∗, an element m′ of M∗ and an element m of M are orthogonal if their natural pairing is zero, i.e. ⟨m′, m⟩ = 0. Two sets S′ ⊆ M∗ and S ⊆ M are orthogonal if each element of S′ is orthogonal to each element of S.
• A term rewriting system is said to be orthogonal if it is left-linear and is non-ambiguous. Orthogonal term rewriting systems are confluent.
• Members of a set of functions { f_i : i = 1, 2, 3, ... } are called:
• orthogonal on the closed interval [a, b] if ${\displaystyle \langle f_{i},f_{j}\rangle =\int _{a}^{b}f_{i}(x)f_{j}(x)w(x)\,dx=0}$ whenever i ≠ j;
• orthonormal on the interval [a, b] if, in addition, each f_i has unit norm, i.e. ${\displaystyle \langle f_{i},f_{j}\rangle =\delta _{i,j}}$.
• The vectors (1, 3, 2)T, (3, −1, 0)T, (1, 3, −5)T are orthogonal to each other, since (1)(3) + (3)(−1) + (2)(0) = 0, (3)(1) + (−1)(3) + (0)(−5) = 0, and (1)(1) + (3)(3) + (2)(−5) = 0.
• The vectors (1, 0, 1, 0, ...)T and (0, 1, 0, 1, ...)T are orthogonal to each other, since their dot product is 0; they are the a = 2 case of the generalization to vectors in ${\displaystyle \mathbf {Z} _{2}^{n}}$ given above.
• The functions 2t + 3 and 45t2 + 9t − 17 are orthogonal with respect to a unit weight function on the interval from −1 to 1:
${\displaystyle \int _{-1}^{1}\left(2t+3\right)\left(45t^{2}+9t-17\right)\,dt=0}$
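This integral can be verified exactly with a small polynomial-arithmetic sketch (the coefficient-list representation and helper names are our own):

```python
from fractions import Fraction

def poly_mul(p, q):
    # Multiply polynomials given as coefficient lists [c0, c1, c2, ...].
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def integrate(p, a, b):
    # Exact definite integral: each c_k t^k contributes
    # c_k * (b^(k+1) - a^(k+1)) / (k+1).
    return sum(Fraction(c, k + 1) * (Fraction(b) ** (k + 1) - Fraction(a) ** (k + 1))
               for k, c in enumerate(p))

prod = poly_mul([3, 2], [-17, 9, 45])   # (2t + 3)(45t^2 + 9t - 17)
print(integrate(prod, -1, 1))           # 0
```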
• The set of functions {1, sin(nx), cos(nx) : n = 1, 2, 3, ...} is orthogonal with respect to Riemann integration on [0, 2π], [−π, π], or any other closed interval of length 2π. This fact is central to the theory of Fourier series.
• In quantum mechanics, a sufficient (but not necessary) condition that two eigenstates of a Hermitian operator, ${\displaystyle \psi _{m}}$ and ${\displaystyle \psi _{n}}$, are orthogonal is that they correspond to different eigenvalues. This means, in Dirac notation, that ${\displaystyle \langle \psi _{m}|\psi _{n}\rangle =0}$ unless ${\displaystyle \psi _{m}}$ and ${\displaystyle \psi _{n}}$ correspond to the same eigenvalue. This follows from the fact that Schrödinger's equation is a Sturm–Liouville equation (in Schrödinger's formulation) or that observables are given by hermitian operators (in Heisenberg's formulation).
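For a real symmetric (hence Hermitian) operator this is easy to see concretely. Below is a sketch using a closed-form eigendecomposition of an illustrative 2×2 symmetric matrix (the matrix and all names are our own assumptions, not from the text):

```python
import math

def eig_sym_2x2(p, q, r):
    # Eigenpairs of the symmetric matrix [[p, q], [q, r]], assuming q != 0.
    mean = (p + r) / 2
    radius = math.hypot((p - r) / 2, q)
    lam1, lam2 = mean + radius, mean - radius
    # From (A - lam*I) v = 0, one eigenvector is v = (q, lam - p).
    return (lam1, (q, lam1 - p)), (lam2, (q, lam2 - p))

(l1, v1), (l2, v2) = eig_sym_2x2(2.0, 1.0, 2.0)
print(l1, l2)  # 3.0 1.0  (distinct eigenvalues)
# Eigenvectors belonging to distinct eigenvalues are orthogonal:
print(v1[0] * v2[0] + v1[1] * v2[1])  # 0.0
```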
Wikipedia