Saturday, August 18, 2018

special relativity - Is the matrix of the Lorentz transformation a tensor or not?


The matrix of the Lorentz transformation isn't a tensor, because it switches the sign of its off-diagonal components under the inverse transformation, right? So it isn't 'basis independent', but the Minkowski metric is a tensor?


I haven't found this discussed anywhere and I am slightly confused about tensors. I know a tensor can be recognised by how it transforms, but it feels strange that such an important matrix is just an ordinary matrix of a linear transformation.



Answer



It's not a tensor, it's a co-ordinate transformation between the global co-ordinates of two inertial observers in flat Minkowski spacetime. So its "job" is to transform the components of vectors and tensors.
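For concreteness, here is a minimal numpy sketch (the boost speed $\beta = 0.6$ and the four-momentum values are made up for illustration) of a boost acting on a vector's components; note that its inverse is simply the boost with the opposite velocity, which is exactly the off-diagonal sign switch the question mentions:

```python
import numpy as np

def boost_x(beta):
    """Lorentz boost along x (units with c = 1): a 4x4 co-ordinate
    transformation matrix, not a tensor."""
    gamma = 1.0 / np.sqrt(1.0 - beta**2)
    L = np.eye(4)
    L[0, 0] = L[1, 1] = gamma
    L[0, 1] = L[1, 0] = -gamma * beta
    return L

beta = 0.6
L = boost_x(beta)

# The inverse transformation is the boost with opposite velocity:
# only the off-diagonal components switch sign.
assert np.allclose(np.linalg.inv(L), boost_x(-beta))

# Its "job": transforming components, e.g. of a four-momentum (E, px, py, pz).
p = np.array([2.0, 1.0, 0.0, 0.0])
p_prime = L @ p   # -> [1.75, -0.25, 0.0, 0.0]
```

The matrix carries two frames' labels rather than describing a frame-independent geometric object, which is the informal reason it has no tensorial transformation law of its own.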


Many heavily used objects we meet in physics are not tensors or vectors. The lack of such a character usually simply means that the object in question does not have a co-ordinate-free definition and is not co-ordinate independent, so you shouldn't "feel strange" that the transformation is not a tensor. This lack is quite natural when we want to calculate with specific co-ordinates, as opposed to discussing co-ordinate-free aspects of a problem, and it doesn't mean anything awry or pathological. Another important example that you will probably meet is the Christoffel symbol, or connexion coefficient, array: it simply tells us how the basis vectors of a given co-ordinate system change under parallel transport, and it is used to calculate the corresponding change in a tensor field's components so that we can form the covariant derivative.
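As a small illustration of why the Christoffel symbols cannot be tensor components, here is a sympy sketch (assuming 2D flat space in polar co-ordinates, with the textbook formula $\Gamma^i_{jk} = \tfrac{1}{2} g^{il}(\partial_j g_{lk} + \partial_k g_{lj} - \partial_l g_{jk})$): the symbols are nonzero here but vanish identically in Cartesian co-ordinates, and a tensor that vanishes in one co-ordinate system vanishes in all.

```python
import sympy as sp

r, th = sp.symbols('r theta', positive=True)
x = [r, th]
g = sp.Matrix([[1, 0], [0, r**2]])   # flat metric in polar co-ordinates
g_inv = g.inv()

def christoffel(i, j, k):
    """Gamma^i_{jk} built from metric derivatives: not a tensor."""
    return sp.Rational(1, 2) * sum(
        g_inv[i, l] * (sp.diff(g[l, k], x[j]) + sp.diff(g[l, j], x[k])
                       - sp.diff(g[j, k], x[l]))
        for l in range(2))

# Nonzero in polar co-ordinates, identically zero in Cartesian ones:
assert sp.simplify(christoffel(0, 1, 1) + r) == 0        # Gamma^r_{theta theta} = -r
assert sp.simplify(christoffel(1, 0, 1) - 1 / r) == 0    # Gamma^theta_{r theta} = 1/r
```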


A rank $k$ tensor is simply a homogeneous multilinear scalar function of $k$ vector arguments; its components must transform in a way that "compensates" for the changes that the components of the vector arguments undergo under the co-ordinate transformation, so that the tensor's scalar value does not change.


In the simplest case, a homogeneous linear scalar function of one vector argument, we have a one-form, aka a covector. So if a vector $V=X^j\,\hat{e}_j$ with components $X^j$ becomes the argument of the one-form $K$, the scalar value $K(V)$ must not change under a co-ordinate transformation. In components, $K(V)=K_j\,X^j$, so when the co-ordinate transformation acts on $V$'s components by $X^j\mapsto \Lambda^j_{\ k} X^k$, we must transform the $K_j$ by a mapping that keeps the value of $K(V)$ fixed; this must be a mapping with matrix $\tilde{\Lambda}$ such that $\tilde{\Lambda}_j^{\ k}\Lambda^i_{\ k}=\delta^i_j$, so $\tilde{\Lambda}$ must be the inverse matrix of $\Lambda$.
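This "compensation" is easy to check numerically. Here is a sketch (the transformation matrix and component values are random, just for illustration): covector components are pushed through the inverse of the matrix that acts on vector components, and the scalar $K(V) = K_j X^j$ comes out unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
Lam = np.eye(4) + 0.1 * rng.standard_normal((4, 4))  # any invertible transformation
Lam_tilde = np.linalg.inv(Lam)                       # the "compensating" map

X = rng.standard_normal(4)   # vector components X^j
K = rng.standard_normal(4)   # one-form components K_j

X_new = Lam @ X              # X^j -> Lam^j_k X^k
K_new = Lam_tilde.T @ K      # K_j -> K_k (Lam^{-1})^k_j

# K(V) = K_j X^j is the same scalar in both co-ordinate systems.
assert np.isclose(K @ X, K_new @ X_new)
```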


The rank $k$ tensor is then a generalization of the above where our function $K$ has $k$ slots for vector arguments, instead of only one.


To finish the idea off, we have the idea that vectors can be mapped bijectively to covectors and contrariwise; in Riemannian and Lorentzian geometry this mapping is done by the metric tensor: the vector $V$ is paired by this correspondence with the unique covector $W(\_) = G(V,\,\_)$ that evaluates the inner product $G(V,\,U)$ of $V$ with the covector's argument $U$. So we can conceive of a generalization of the tensor idea above where we allow covector arguments for some or all of a tensor's slots. This is why we have to keep track of whether indices are lowered or raised, and a rank $k$ tensor is now simply a homogeneous multilinear scalar function of $k$ vector and / or covector arguments.
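The index-lowering correspondence can likewise be sketched in a few lines (the component values are made up for illustration; signature $+,-,-,-$ assumed): the covector paired with $V$ has components $W_j = G_{jk} V^k$, and evaluating it on any vector $U$ reproduces the inner product $G(V,\,U)$.

```python
import numpy as np

G = np.diag([1.0, -1.0, -1.0, -1.0])   # Minkowski metric, signature (+,-,-,-)

V = np.array([3.0, 1.0, 0.0, 2.0])
U = np.array([1.0, 4.0, 1.0, 0.0])

W = G @ V                    # lower the index: W_j = G_{jk} V^k

# W(U) is the same number as the inner product G(V, U).
assert np.isclose(W @ U, V @ G @ U)
```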



