Orthogonal

In mathematics, orthogonality is the relation of two lines at right angles to one another (perpendicularity), and the generalization of this relation to n dimensions. The term is also applied to a variety of mathematical relations thought of as describing non-overlapping, uncorrelated, or independent objects of some kind.
The concept of orthogonality has been broadly generalized in mathematics (including in the areas of mathematical functions, calculus and linear algebra), as well as in areas such as chemistry, and engineering.
The word comes from the Greek ὀρθός (orthos), meaning "upright", and γωνία (gonia), meaning "angle". The ancient Greek ὀρθογώνιον orthogōnion (< ὀρθός orthos 'upright' + γωνία gōnia 'angle') and classical Latin orthogonium originally denoted a rectangle. Later, they came to mean a right triangle. In the 12th century, the postclassical Latin word orthogonalis came to mean a right angle or something related to a right angle.
A set of vectors is called pairwise orthogonal if each pairing of them is orthogonal. Such a set is called an orthogonal set. Nonzero pairwise orthogonal vectors are always linearly independent.
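The pairwise condition can be sketched in a few lines of code; this is an illustrative check (the helper names `dot` and `is_pairwise_orthogonal` are not from the text):

```python
def dot(u, v):
    """Standard dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def is_pairwise_orthogonal(vectors):
    """True if every distinct pair of vectors has zero dot product."""
    return all(
        dot(vectors[i], vectors[j]) == 0
        for i in range(len(vectors))
        for j in range(i + 1, len(vectors))
    )

# The standard basis of R^3 is an orthogonal set.
basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(is_pairwise_orthogonal(basis))  # True
```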
In certain cases, the word normal is used to mean orthogonal, particularly in the geometric sense as in the normal to a surface. For example, the y-axis is normal to the curve y = x^{2} at the origin. However, normal may also refer to the magnitude of a vector. In particular, a set is called orthonormal (orthogonal plus normal) if it is an orthogonal set of unit vectors. As a result, use of the term normal to mean "orthogonal" is often avoided. The word "normal" also has a different meaning in probability and statistics.
 In Z_{2}^{n}, the vectors v_{k} = Σ_{i} e_{ai+k} (for some positive integer a and 0 ≤ k ≤ a − 1) are pairwise orthogonal, since their supports are disjoint; for example, (1, 0, 0, 1, 0, 0, 1, 0)^{T}, (0, 1, 0, 0, 1, 0, 0, 1)^{T}, (0, 0, 1, 0, 0, 1, 0, 0)^{T} are orthogonal.
 In geometry, two Euclidean vectors are orthogonal if they are perpendicular, i.e., they form a right angle.
 Two vectors, x and y, in an inner product space, V, are orthogonal if their inner product ⟨x, y⟩ is zero. This relationship is denoted x ⊥ y.
 Two vector subspaces, A and B, of an inner product space, V, are called orthogonal subspaces if each vector in A is orthogonal to each vector in B. The largest subspace of V that is orthogonal to a given subspace is its orthogonal complement.
 Given a module M and its dual M^{∗}, an element m′ of M^{∗} and an element m of M are orthogonal if their natural pairing is zero, i.e. ⟨m′, m⟩ = 0. Two sets S′ ⊆ M^{∗} and S ⊆ M are orthogonal if each element of S′ is orthogonal to each element of S.
 A term rewriting system is said to be orthogonal if it is left-linear and non-ambiguous. Orthogonal term rewriting systems are confluent.
 Two functions f and g are orthogonal on the closed interval [a, b] with respect to a nonnegative weight function w if ∫_{a}^{b} f(x) g(x) w(x) dx = 0, and orthonormal on the interval [a, b] if, in addition, ∫_{a}^{b} f(x)^{2} w(x) dx = ∫_{a}^{b} g(x)^{2} w(x) dx = 1.
 The vectors (1, 3, 2)^{T}, (3, −1, 0)^{T}, (1, 3, −5)^{T} are orthogonal to each other, since (1)(3) + (3)(−1) + (2)(0) = 0, (3)(1) + (−1)(3) + (0)(−5) = 0, and (1)(1) + (3)(3) + (2)(−5) = 0.
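The three dot products above can be checked mechanically (a quick sketch, not part of the original text):

```python
# Verify that each pair of the example vectors has zero dot product.
u, v, w = (1, 3, 2), (3, -1, 0), (1, 3, -5)
dot = lambda a, b: sum(x * y for x, y in zip(a, b))
print(dot(u, v), dot(u, w), dot(v, w))  # 0 0 0
```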
 The vectors (1, 0, 1, 0, ...)^{T} and (0, 1, 0, 1, ...)^{T} are orthogonal to each other, since their dot product is 0. The same notion generalizes to vectors in Z_{2}^{n}, with the dot product taken mod 2.
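Over Z_{2} the dot product is reduced mod 2; a minimal sketch of that check (helper name `dot_mod2` is illustrative):

```python
# Orthogonality of vectors over Z_2: dot product taken modulo 2.
def dot_mod2(u, v):
    return sum(a * b for a, b in zip(u, v)) % 2

# The three Z_2^8 vectors from the example have disjoint supports,
# so every pairwise product term is 0.
u = (1, 0, 0, 1, 0, 0, 1, 0)
v = (0, 1, 0, 0, 1, 0, 0, 1)
w = (0, 0, 1, 0, 0, 1, 0, 0)
print(dot_mod2(u, v), dot_mod2(u, w), dot_mod2(v, w))  # 0 0 0
```

Note that over Z_{2}, unlike over the reals, a nonzero vector can be orthogonal to itself (e.g. (1, 1)^{T}), since 1 + 1 = 0 mod 2.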
 The functions 2t + 3 and 45t^{2} + 9t − 17 are orthogonal with respect to a unit weight function on the interval from −1 to 1, since the integral of their product, ∫_{−1}^{1} (2t + 3)(45t^{2} + 9t − 17) dt = ∫_{−1}^{1} (90t^{3} + 153t^{2} − 7t − 51) dt, equals 0.
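The integral of the product polynomial can be evaluated exactly by integrating term by term, since ∫_{−1}^{1} t^{k} dt vanishes for odd k and equals 2/(k + 1) for even k. A sketch (helper names are illustrative):

```python
from fractions import Fraction

def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists [c0, c1, ...]."""
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += Fraction(a) * Fraction(b)
    return out

def integrate_sym(p):
    """Integrate a polynomial over [-1, 1]: odd powers vanish,
    the even power t^k contributes 2*c_k/(k+1)."""
    return sum(2 * c / (k + 1) for k, c in enumerate(p) if k % 2 == 0)

# (2t + 3)(45t^2 + 9t - 17), coefficients in ascending order
product = poly_mul([3, 2], [-17, 9, 45])
print(integrate_sym(product))  # 0
```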
 The functions 1, sin(nx), cos(nx) : n = 1, 2, 3, ... are orthogonal with respect to Riemann integration on the intervals [0, 2π], [−π, π], or any other closed interval of length 2π. This fact is a central one in Fourier series.
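These trigonometric orthogonality relations can be checked numerically; here is a sketch using a midpoint-rule quadrature (the helper `integrate` is illustrative, and the result is only approximate up to floating-point error):

```python
import math

def integrate(f, a, b, n=10000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# sin(x) and cos(x) are orthogonal over a full period [0, 2*pi].
val = integrate(lambda x: math.sin(x) * math.cos(x), 0.0, 2 * math.pi)
print(abs(val) < 1e-9)  # True
```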
 Various polynomial sequences named for mathematicians of the past are sequences of orthogonal polynomials. In particular:
 The Hermite polynomials are orthogonal with respect to the Gaussian distribution with zero mean value.
 The Legendre polynomials are orthogonal with respect to the uniform distribution on the interval [−1, 1].
 The Laguerre polynomials are orthogonal with respect to the exponential distribution. Somewhat more general Laguerre polynomial sequences are orthogonal with respect to gamma distributions.
 The Chebyshev polynomials of the first kind are orthogonal with respect to the measure dx/√(1 − x^{2}) on the interval [−1, 1].
 The Chebyshev polynomials of the second kind are orthogonal with respect to the Wigner semicircle distribution.
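As one concrete instance of the list above, Legendre orthogonality can be verified exactly for low degrees, e.g. ∫_{−1}^{1} P_{1}(x) P_{2}(x) dx = 0 with P_{1}(x) = x and P_{2}(x) = (3x^{2} − 1)/2. A pure-Python sketch (helper names are illustrative):

```python
from fractions import Fraction

def integrate(p):
    """Exact integral of a polynomial over [-1, 1]:
    odd powers vanish, even power t^k contributes 2*c_k/(k+1)."""
    return sum(Fraction(2) * c / (k + 1) for k, c in enumerate(p) if k % 2 == 0)

P1 = [0, 1]                                # x
P2 = [Fraction(-1, 2), 0, Fraction(3, 2)]  # (3x^2 - 1)/2

# Multiply P1 * P2, coefficients in ascending order.
product = [Fraction(0)] * (len(P1) + len(P2) - 1)
for i, a in enumerate(P1):
    for j, b in enumerate(P2):
        product[i + j] += Fraction(a) * b

print(integrate(product))  # 0
```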
 In quantum mechanics, a sufficient (but not necessary) condition that two eigenstates of a Hermitian operator, ψ_{m} and ψ_{n}, are orthogonal is that they correspond to different eigenvalues. This means, in Dirac notation, that ⟨ψ_{m}|ψ_{n}⟩ = 0 unless ψ_{m} and ψ_{n} correspond to the same eigenvalue. This follows from the fact that Schrödinger's equation is a Sturm–Liouville equation (in Schrödinger's formulation) or that observables are given by Hermitian operators (in Heisenberg's formulation).
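The finite-dimensional analogue is that eigenvectors of a Hermitian (here, real symmetric) matrix belonging to different eigenvalues are orthogonal. A minimal sketch with a hand-picked 2×2 example (the matrix and helper names are illustrative):

```python
# A = [[2, 1], [1, 2]] is symmetric with eigenvalues 3 and 1,
# whose eigenvectors (1, 1) and (1, -1) must be orthogonal.
A = [[2, 1], [1, 2]]

def matvec(M, v):
    """Matrix-vector product."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

v3, v1 = [1, 1], [1, -1]
assert matvec(A, v3) == [3 * x for x in v3]  # eigenvalue 3
assert matvec(A, v1) == [1 * x for x in v1]  # eigenvalue 1
print(sum(a * b for a, b in zip(v3, v1)))    # 0  (orthogonal)
```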
