Saturday, August 27, 2016

Importance of Kronecker product in quantum computation


To form the product state of two states $|\phi \rangle$ and $|\psi \rangle$, we use the Kronecker product $|\phi \rangle \otimes |\psi \rangle$. Could we instead use the Cartesian product, or any other product available in the literature? We do not, and the Kronecker product seems to work better than any of the alternatives. My question is: why the Kronecker product? Is there a physical reason, or something in the mathematical formulation, that makes the Kronecker product so important? The founders of quantum physics did not choose it arbitrarily; they must have had some idea that convinced them of its suitability. What was it?


I asked this question in my very first class on quantum information theory. So far I have not received a satisfactory answer, and the course is about to finish. Thank you for your help.



Answer



ACuriousMind's answer pretty much sums up the reasons, which are essentially mathematical.


If you want to grasp the "physical significance", then I suggest you work through an example: think of two quantum systems, each with three base states $|1\rangle$, $|2\rangle$ and $|3\rangle$. The set of linear superpositions in one of these state spaces is the set of unit-magnitude vectors of the form $\alpha_1\,|1\rangle+\alpha_2\,|2\rangle+\alpha_3\,|3\rangle$, where $|\alpha_1|^2+|\alpha_2|^2+|\alpha_3|^2=1$. Your states are $3$-component vectors, and they live in three-dimensional spaces.
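For concreteness, here is a minimal NumPy sketch of one such state written as a normalised 3-component array; the amplitudes are arbitrary placeholder values chosen only for illustration:

```python
import numpy as np

# A superposition a1|1> + a2|2> + a3|3> of a single three-state system,
# stored as a 3-component vector of amplitudes (placeholder values).
alpha = np.array([0.6, 0.0, 0.8])

# Unit magnitude: |a1|^2 + |a2|^2 + |a3|^2 = 1
print(np.isclose(np.linalg.norm(alpha), 1.0))   # True
```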


Now, when we combine these two systems, the base states don't combine as a Cartesian product to give a six-dimensional space. Instead, each quantum system stays in its own space spanned by $\{|1\rangle,\,|2\rangle,\,|3\rangle\}$, whilst the other one can be in any state in its own space spanned by its own versions of $\{|1\rangle,\,|2\rangle,\,|3\rangle\}$.


So, with system 1 in state $|1\rangle$, system 2 can be in any state of the form $\alpha_1\,|1\rangle+\alpha_2\,|2\rangle+\alpha_3\,|3\rangle$. The set of combined quantum states where system 1 is in state $|1\rangle$ is therefore a three-dimensional vector space. A different three-dimensional vector space of combined states arises if system 1 is in state $|2\rangle$ with system 2 in an arbitrary state $\alpha_1\,|1\rangle+\alpha_2\,|2\rangle+\alpha_3\,|3\rangle$, and likewise for the set of combined states with system 1 in state $|3\rangle$.
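A quick NumPy sketch of this point (the amplitudes are arbitrary placeholders, and the nine-slot stacking used to print the result is the one spelled out in the next paragraph):

```python
import numpy as np

e1  = np.array([1.0, 0.0, 0.0])    # system 1 held in base state |1>
chi = np.array([0.6, 0.0, 0.8])    # system 2 in a1|1> + a2|2> + a3|3> (placeholders)

# Combined state with system 1 fixed in |1>:
combined = np.kron(e1, chi)
print(combined)          # [0.6 0.  0.8 0.  0.  0.  0.  0.  0. ]

# Only three of the nine slots can ever be populated this way, so these
# combined states form a three-dimensional subspace of the full space.
```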


So our combined system has nine base states: it is a vector space of 9 dimensions, not 6. Let's write our base states for the moment as $|i,\,j\rangle$, meaning system 1 in base state $i$ and system 2 in base state $j$. Now write a superposition of these states as a nine-dimensional column vector stacked up as three lots of three: the first three elements are the superposition weights of the $|1,\,j\rangle$ states, the next three the weights of the $|2,\,j\rangle$ states, and the last three the weights of the $|3,\,j\rangle$ states. This is the matrix representation of a general combined state.
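Here is a minimal NumPy sketch of that stacking, building the combined base states $|i,\,j\rangle$ with np.kron; the ordering convention is exactly the one just described:

```python
import numpy as np

# Base states |1>, |2>, |3> of each factor system as standard unit vectors.
e = [np.eye(3)[:, i] for i in range(3)]

# The combined base state |i, j> is the Kronecker product of the factors.
ket_2_3 = np.kron(e[1], e[2])      # |2, 3>
print(ket_2_3.size)                # 9 -- nine dimensions, not six

# The single nonzero entry sits in the sixth slot: the first three slots
# hold the |1, j> weights, the next three the |2, j> weights, and so on.
print(np.argmax(ket_2_3))          # 5 (zero-based), i.e. the |2, 3> slot
```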



Now suppose we have a linear operator $T_1$ that acts on the first system alone, and a linear operator $T_2$ that acts on the second alone. These operators on the individual states have $3\times 3$ matrices, while an operator on the combined system has a $9\times 9$ matrix. If you form the matrix Kronecker product $T_1\otimes T_2$, then this is the matrix of the operator that imparts the same $T_1$ to the three $|i,\,1\rangle$ components, the three $|i,\,2\rangle$ components and the three $|i,\,3\rangle$ components, and likewise imparts the same $T_2$ to the three $|1,\,j\rangle$ components, the three $|2,\,j\rangle$ components and the three $|3,\,j\rangle$ components. This is what ACuriousMind means when he says:



we want every action of an operator (which are linear maps) on the individual states to define an action on the combined state - and the tensor product is exactly that, since, for every pair of linear maps $ T_i : \mathcal{H}_i \to \mathcal{H}$ (which is a bilinear map $(T_1,T_2) : \mathcal{H}_1 \times \mathcal{H}_2 \to \mathcal{H}$) there is a unique linear map $T_1 \otimes T_2 : \mathcal{H}_1 \otimes \mathcal{H}_2 \to \mathcal{H}$.
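A short NumPy check of this factorisation property, with $T_1$, $T_2$ and the two factor states taken as arbitrary random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary placeholder operators on the two 3-dimensional factor spaces...
T1 = rng.standard_normal((3, 3))
T2 = rng.standard_normal((3, 3))
# ...and arbitrary placeholder states of the two systems.
psi = rng.standard_normal(3)
phi = rng.standard_normal(3)

# (T1 (x) T2) acting on |psi> (x) |phi> equals (T1|psi>) (x) (T2|phi>),
# so operators on the individual systems act consistently on the combined state.
lhs = np.kron(T1, T2) @ np.kron(psi, phi)
rhs = np.kron(T1 @ psi, T2 @ phi)
print(np.allclose(lhs, rhs))       # True
```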



I work through a further detailed example for two coupled oscillators in my answer here.

