3-Calculus-Vector-Tensor-Operations

Bianchi symmetry

Cyclic sum of curvature-tensor components is 0 if there is no torsion {first Bianchi identity, symmetry} {Bianchi symmetry, tensor}.

Bianchi identity

Cyclic sum of curvature-tensor covariant derivatives is 0 if there is no torsion {Bianchi identity, tensor} {second Bianchi identity, tensor}.

contravariant

Coordinates can depend directly on dimensions {contravariant, tensor}. Coefficients a times basis vectors e, summed over dimensions, result in vector x: x = sum over i of a(i) * e(i).
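The sum of coefficient-weighted basis vectors can be sketched in plain Python (a minimal sketch; the two-dimensional basis is an illustrative assumption):

```python
# Reconstruct a vector x from contravariant coefficients a(i) and basis
# vectors e(i): x = sum over i of a(i) * e(i).
def combine(coeffs, basis):
    """Sum coefficient-weighted basis vectors componentwise."""
    dim = len(basis[0])
    return [sum(a * e[k] for a, e in zip(coeffs, basis)) for k in range(dim)]

# Standard 2-D basis: e1 = (1, 0), e2 = (0, 1).
basis = [[1.0, 0.0], [0.0, 1.0]]
x = combine([2.0, 3.0], basis)   # 2*e1 + 3*e2
print(x)  # [2.0, 3.0]
```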

covariant

Covariant components relate to contravariant components. If basis vectors are orthonormal, covariant components and contravariant components are equal. Covariant components are scalar products with basis vectors: a(i) = x * e(i).

If basis vectors are curved coordinates, then a(i) = sum over j of g(i,j) * a(j), where metric g(i,j) = e(i) * e(j) is the scalar product of basis vectors e(i) and e(j). The g(i,j) tensor lowers indexes, converting contravariant components to covariant components, and its inverse raises them. g(i,j) tensor relates basis vectors. g(i,j) elements are functions of curved-space positions.
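The index-lowering relation a(i) = sum over j of g(i,j) * a(j) can be sketched in plain Python (the skewed basis is a made-up example chosen so g has off-diagonal elements):

```python
# Metric from basis vectors: g(i,j) = e(i) . e(j); lowering an index:
# a_i = sum over j of g(i,j) * a^j.
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def metric(basis):
    return [[dot(ei, ej) for ej in basis] for ei in basis]

def lower(g, a):
    """Convert contravariant components a^j to covariant components a_i."""
    n = len(a)
    return [sum(g[i][j] * a[j] for j in range(n)) for i in range(n)]

# Skewed (non-orthonormal) 2-D basis, so g has off-diagonal terms.
basis = [[1.0, 0.0], [1.0, 1.0]]
g = metric(basis)            # [[1.0, 1.0], [1.0, 2.0]]
print(lower(g, [2.0, 3.0]))  # covariant components differ from [2.0, 3.0]
```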

Euclidean space

g(i,j) elements are 1 (same index) or 0 (different index) for flat space with orthonormal basis vectors.

coordinate transformation

Physical quantities or coordinates can transform from one coordinate system to another {coordinate transformation}. First-coordinate-system vector components are linear functions of second-coordinate-system components.
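A coordinate transformation as a linear map can be sketched in plain Python (the 90-degree rotation matrix is a hypothetical example):

```python
# Coordinate change as a linear map: new components are linear
# functions of old components, v'(i) = sum over j of T[i][j] * v[j].
def transform(T, v):
    return [sum(T[i][j] * v[j] for j in range(len(v))) for i in range(len(T))]

# Hypothetical transformation: rotate axes by 90 degrees.
T = [[0.0, -1.0], [1.0, 0.0]]
print(transform(T, [1.0, 0.0]))  # [0.0, 1.0]
```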

tensor

Tensor coefficients are weights by which to multiply old components to get new components. Tensor-term number is old-component number times new-component number. Scalar product of outer-product tensor and old basis vectors obtains new basis-vector scalars.

projection

Linear transformation projects old components onto new components. Linear transformations preserve affine geometry: lines map to lines.

contravariant

Contravariant component {contravariant}, such as differential dx, multiplies with tensor. New contravariant components are old components weighted by partial derivatives of new coordinates with respect to old coordinates. For orthonormal bases, terms with same index, such as ii, have coefficient one, and terms with different indexes, such as ij, have coefficient zero, so only diagonal terms remain. Contravariant component sum is vector expressed in old basis vectors. Diagonal terms are scalars for new basis vectors.

covariant

Covariant component {covariant}, such as partial derivative, is contravariant component times metric tensor. New covariant components are old components weighted by partial derivatives of old coordinates with respect to new coordinates. Covariant components transform the same way as basis vectors; contravariant components transform oppositely.

covariance and contravariance

Covariant means that components vary the same way as basis vectors under coordinate change. Contravariant means that components vary oppositely to basis vectors, so the paired sum of covariant and contravariant components is invariant. Neither transformation changes component number. If basis vectors are orthonormal, as in Euclidean space, covariant and contravariant components are the same. Interchanging indexes does not change symmetric-tensor components and changes skew-symmetric-tensor sign for an odd number of interchanges.
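The invariance of the covariant-contravariant pairing can be checked in plain Python (a minimal sketch; the basis-change matrix B and the sample components are made-up values):

```python
# When the basis changes by matrix B, contravariant components transform
# with the inverse of B and covariant components with the transpose of B,
# so the pairing sum over i of a_i * v^i is invariant.
def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def inverse2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

B = [[2.0, 1.0], [0.0, 1.0]]        # hypothetical basis change
v = [1.0, 2.0]                      # contravariant components
a = [3.0, 4.0]                      # covariant components

v_new = matvec(inverse2(B), v)      # contravariant: inverse transform
Bt = [[B[j][i] for j in range(2)] for i in range(2)]
a_new = matvec(Bt, a)               # covariant: transpose transform
pairing_old = sum(x * y for x, y in zip(a, v))
pairing_new = sum(x * y for x, y in zip(a_new, v_new))
print(pairing_old, pairing_new)     # equal: the pairing is invariant
```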

covariant

Linear functions can find coefficients of scalar products from original variables and basis vectors {covariant, tensor}: a(i) = x * e(i). Covariant components relate to contravariant components by relations between basis vectors. If basis vectors are orthonormal, covariant components and contravariant components are equal. If basis vectors are curved coordinates, then a(i) = sum over j of g(i,j) * a(j), where g(i,j) depends on basis vectors e(i) and e(j). The g(i,j) tensor relates basis vectors, and its elements are functions of curved-space positions.

covariant transformation

In covariant transformation, new components are sums of old components weighted by the basis-change coefficients, and component number stays the same.

Einstein summation

Notation conventions {Einstein summation convention} can denote tensors: an index repeated once up and once down implies summation over that index, so a(i) * e(i) means sum over i of a(i) * e(i).

projection by tensor

Coordinates have unit vectors, such as u for x-axis and v for y-axis. Vector from origin can have coordinates: (2,3) = 2*u + 3*v, where u and v are unit vectors for two dimensions. Straight vector has constant slope.

Lines and surfaces can curve. At point, line or surface has curvature. Slope change indicates curvature amount. For two dimensions, two orthogonal directions, u and v, can change slope: du and dv, where d is differential. Total change has coefficients that depend on dimensions and is sum of changes along dimensions: (Df(u,v) / Du) * du + (Df(u,v) / Dv) * dv, where D is partial derivative and d is differential.
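The total-differential sum can be checked numerically in plain Python (a minimal sketch; the surface f(u,v) = u^2 * v is a hypothetical example):

```python
# Total differential: df ~= (Df/Du) * du + (Df/Dv) * dv, checked
# numerically for the sample surface f(u, v) = u**2 * v.
def f(u, v):
    return u * u * v

def df_du(u, v):
    return 2 * u * v      # partial derivative with respect to u

def df_dv(u, v):
    return u * u          # partial derivative with respect to v

u, v, du, dv = 1.0, 2.0, 1e-6, 2e-6
exact_change = f(u + du, v + dv) - f(u, v)
linear_estimate = df_du(u, v) * du + df_dv(u, v) * dv
print(abs(exact_change - linear_estimate) < 1e-10)  # True
```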

linear

Linear functions depend on one variable raised only to first power: C * x. Bilinear functions depend on product of first-power variables: C * x * y.

symmetry

Functions have symmetry if function variables can interchange without changing function value, for example, if variables are Euclidean-space dimensions.

tensor

Tensors are linear functions of coefficients times any number of variables: c * v1 * v2 * ... * vN, where c is coefficient, vi are variables, and N can be infinite. Tensors can be sums of these terms: c1 * i1 * j1 * ... * n1 + c2 * i2 * j2 * ... * n2 + ... + cN * iN * jN * ... * nN, where N can be infinite. Vectors are tensors: Cu * u or Cu * u + Cv * v.

scalar product

Bilinear forms are tensors: Cuv * du * dv. Bilinear tensors can be sums over all i and j of g(i,j) * u(i) * v(j), where i and j are dimensions and g is function of dimensions. Scalar product of two vectors (a,b) and (c,d) is a*c*(i*i) + b*d*(j*j) + a*d*(i*j) + b*c*(j*i) = a*c + b*d, which is symmetric bilinear tensor. In scalar products, terms with same index, such as ii, have coefficient one, and terms with different indexes, such as ij, have coefficient zero {contravariant transformation}. Unit-vector projection onto itself makes 100% = 1. Vector projection onto perpendicular makes zero.
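The scalar product as a symmetric bilinear form with an identity metric can be sketched in plain Python:

```python
# Scalar product as a bilinear form: same-index terms keep coefficient 1,
# different-index terms get coefficient 0 (identity metric, orthonormal basis).
def scalar_product(p, q):
    g = [[1.0, 0.0], [0.0, 1.0]]  # g(i,j) = 1 if i == j else 0
    return sum(g[i][j] * p[i] * q[j] for i in range(2) for j in range(2))

print(scalar_product([1.0, 2.0], [3.0, 4.0]))  # 1*3 + 2*4 = 11.0
print(scalar_product([1.0, 0.0], [0.0, 1.0]))  # perpendicular: 0.0
```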

scalar product: symmetry

Tensor projection and scalar product are symmetric, because answer is the same if either vector projects onto other vector.

scalar product: projection

Scalar product projects one vector onto another and scales by the other vector's length. Tensor transformations can project {projection, tensor} one vector onto another vector, to give length.

quadratic

If two vectors are the same, scalar product is quadratic. For vector (a,b), scalar product with itself is a*a*(i*i) + b*b*(j*j) = a^2 + b^2, the squared length.

cross product

Vector cross products are vectors and tensors: (a,b) and (c,d) make a*c*(i*i) + b*d*(j*j) + a*d*(i*j) + b*c*(j*i) = (a*d - b*c)*k, where j*i = -i*j = -k, because opposite direction, and i*i = j*j = 0. In cross products, terms with same index, such as ii, have coefficient zero, and terms with different indexes, such as ij, have coefficient one {covariant transformation}. Cross product of vector with itself is zero. Cross product of perpendicular unit vectors makes 100% = 1 magnitude. Tensors can be scalar, vector, matrix, and tensor products.
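The two-dimensional cross-product magnitude a*d - b*c can be sketched in plain Python:

```python
# 2-D cross product magnitude: an antisymmetric bilinear form, so
# same-index terms get coefficient 0 and i*j = -j*i gives a*d - b*c.
def cross_2d(p, q):
    a, b = p
    c, d = q
    return a * d - b * c

print(cross_2d([2.0, 3.0], [2.0, 3.0]))  # vector with itself: 0.0
print(cross_2d([1.0, 0.0], [0.0, 1.0]))  # perpendicular unit vectors: 1.0
```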

tensor contraction

Tensor order can reduce by two {tensor contraction}. If tensor has a contravariant index and a covariant index, summing over equal values of the two indexes at the same time eliminates both indexes.
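Contraction of a mixed second-order tensor can be sketched in plain Python; summing over the paired indexes leaves a scalar, the trace:

```python
# Tensor contraction: summing a mixed second-order tensor T[i][j] over
# i == j reduces its order by two, leaving a scalar (the trace).
def contract(T):
    return sum(T[i][i] for i in range(len(T)))

T = [[1.0, 2.0], [3.0, 4.0]]
print(contract(T))  # 1 + 4 = 5.0
```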

trace of tensor

Contracting a second-order tensor sums its diagonal elements {trace, tensor} {spur, tensor}. Contracting the outer product of a vector with itself gives the sum of squares along the diagonal, the scalar product.

volume using tensors

Volumes {volume, tensor} are determinants of three vectors, contractions with the third-order skew-symmetric covariant tensor.
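The volume as a determinant can be sketched in plain Python; skew-symmetry shows as a sign flip when two edge vectors swap:

```python
# Signed volume of the parallelepiped spanned by three vectors: the 3x3
# determinant, a skew-symmetric (antisymmetric) trilinear form.
def det3(u, v, w):
    return (u[0] * (v[1] * w[2] - v[2] * w[1])
            - u[1] * (v[0] * w[2] - v[2] * w[0])
            + u[2] * (v[0] * w[1] - v[1] * w[0]))

# Unit-cube edges give volume 1; swapping two edges flips the sign.
print(det3([1, 0, 0], [0, 1, 0], [0, 0, 1]))  # 1
print(det3([0, 1, 0], [1, 0, 0], [0, 0, 1]))  # -1
```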

Related Topics in Table of Contents

3-Calculus-Vector-Tensor


Date Modified: 2022.0225