>>7588366
>Notation. Same as when you differentiate with respect to a vector.
Ahh okay, just looked it up, but then either both vectors should be transposed, or none, since the tensor product of two vectors is defined as (x \otimes y)(z) := \left\langle y, z \right\rangle x
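As a sanity check, that definition can be verified numerically: as a matrix, x \otimes y is just the outer product x y^T. A minimal NumPy sketch (the vectors are arbitrary illustrative values):

```python
import numpy as np

# Tensor (outer) product of two vectors acting on a third:
# (x ⊗ y)(z) := <y, z> x
x = np.array([1.0, 2.0])
y = np.array([3.0, 4.0])
z = np.array([5.0, 6.0])

outer = np.outer(x, y)      # the matrix x y^T, i.e. |x><y|
lhs = outer @ z             # (x ⊗ y) applied to z
rhs = np.dot(y, z) * x      # <y, z> x

assert np.allclose(lhs, rhs)
```

So in matrix language the "one transposed, one not" pattern x y^T is exactly the tensor product, which is the point being argued above.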
It basically depends on whether the matrix derivative operates on the vector space or on its dual.
>>7588451
The ket wouldn't turn into a bra. Like people have been saying, only one should be transposed. I wouldn't call it handwaving - it's more that you prove how these things behave, and then the notation makes things simpler. Or doesn't.
>>7588456
>I wouldn't call it handwaving - it's more that you prove how these things behave, and then the notation makes things simpler. Or doesn't.
This explains you physicists' mindset so well. What normal human beings see as retarded, senseless handwaving, you people see as helpful memorisation tools. It's why everyone hates your stupid fucking field.
>>7588456
Only one would be transposed if it were x^T x^T, not x^T \otimes x^T.
Just what I've got from the Wikipedia article on matrix calculus: https://en.wikipedia.org/wiki/Matrix_calculus#Scalar-by-matrix
>Notice that the indexing of the gradient with respect to X is transposed as compared with the indexing of X
So it seems the matrix derivative is an operator that acts on the dual space (or, in the finite-dimensional case, on transposed vectors). I guess I made a mistake: it should be |x \rangle \langle x |.
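The quoted indexing convention can also be checked numerically. For the scalar function f(X) = tr(AX) one has df/dX_ij = A_ji, so the gradient comes out as A^T, i.e. indexed "transposed" relative to X. A sketch with finite differences (matrix sizes and the trace example are my own choice, not from the thread):

```python
import numpy as np

# For f(X) = tr(A X), the scalar-by-matrix gradient is A^T:
# df/dX_ij = A_ji, so the gradient's indexing is transposed
# relative to X. Verify by central finite differences.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 3))

f = lambda M: np.trace(A @ M)

eps = 1e-6
grad = np.zeros_like(X)
for i in range(3):
    for j in range(3):
        E = np.zeros_like(X)
        E[i, j] = eps
        grad[i, j] = (f(X + E) - f(X - E)) / (2 * eps)

assert np.allclose(grad, A.T, atol=1e-5)
```

And the corrected object |x\rangle\langle x| is, in the same matrix language, simply `np.outer(x, x)` = x x^T: one factor lives in the space, the other in its dual.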
This is a 4chan archive; all of the content originated from 4chan, and this website shows only archived content.