Ankur Handa
Quaternions and Euler angles are discontinuous and difficult for neural networks to learn. They show that 3D rotations have continuous representations in 5D and 6D, which are more suitable for learning, i.e. regress two vectors and apply Graham-Schmidt (GS).
Ankur Handa Jan 26
Replying to @ankurhandos
They do a series of experiments to highlight the accuracy of this 6D representation. Source code in PyTorch is here:
Ankur Handa Jan 26
Replying to @ankurhandos
This is how they obtain the matrix from two 3D vectors.
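The construction was shown as an image in the original tweet; here is a minimal PyTorch sketch of the 6D-to-rotation-matrix map (the function name and batching convention are mine, not the paper's reference code):

```python
# Sketch of the 6D -> SO(3) map: Gram-Schmidt on two regressed 3D vectors.
import torch
import torch.nn.functional as F

def rotation_from_6d(x: torch.Tensor) -> torch.Tensor:
    """x: (..., 6) tensor holding two stacked 3D vectors; returns (..., 3, 3) rotations."""
    a1, a2 = x[..., :3], x[..., 3:]
    b1 = F.normalize(a1, dim=-1)                         # first column: normalize a1
    b2 = a2 - (b1 * a2).sum(dim=-1, keepdim=True) * b1   # remove the b1 component of a2
    b2 = F.normalize(b2, dim=-1)                         # second column
    b3 = torch.cross(b1, b2, dim=-1)                     # third column completes a right-handed frame
    return torch.stack((b1, b2, b3), dim=-1)             # stack as matrix columns b1, b2, b3
```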
Maurice Weiler Jan 26
Replying to @ankurhandos @lcfalors and 2 others
Interesting paper on manifold-valued regression. The proposed solution is equivalent to that found in the Homeomorphic VAE paper (-> Eq. 33).
Ankur Handa Jan 26
Replying to @maurice_weiler @lcfalors and 2 others
very interesting, thanks for highlighting this to me!
Klaus H. Strobl Jan 26
Replying to @ankurhandos
Note that, since NNs are not shy of high-dimensional parametrizations, you can go wild with this rotation vector, e.g. if you suffer from occlusions or ambiguous rotations, see
Ankur Handa Jan 26
Replying to @5trobl
Thanks. Yes, this is another way of doing things: they do everything in embedding space and use an L2 or cosine loss.
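A rough sketch of such an embedding-space loss (this formulation is my reading of the tweet, not the cited work's exact code):

```python
# Sketch: compare predicted and target rotation embeddings directly,
# with either an L2 or a cosine loss.
import torch
import torch.nn.functional as F

def embedding_loss(pred: torch.Tensor, target: torch.Tensor, cosine: bool = False) -> torch.Tensor:
    """pred, target: (N, D) rotation embeddings."""
    if cosine:
        return (1.0 - F.cosine_similarity(pred, target, dim=-1)).mean()
    return F.mse_loss(pred, target)  # plain L2
```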
Pascal Kwanten | No/thingverse, Un/universe Jan 27
Replying to @ankurhandos @revprez
I suppose you refer to Gram-Schmidt? One of the key procedures of linear algebra for orthonormalizing vectors in an inner-product space.
Pedro F. Proença 🇪🇺 Jan 28
Replying to @ankurhandos
An alternative that works well is to do soft classification and then quaternion fitting, e.g. This way you can model orientation ambiguity. But yes, to get high precision you naively end up with a huge net!
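One way to read "quaternion fitting" here, as a hedged sketch: Markley-style weighted quaternion averaging over the soft bin scores (the cited method's details may differ):

```python
# Sketch: turn soft bin scores into a single quaternion by taking the
# principal eigenvector of the weighted quaternion outer-product matrix.
import torch

def fit_quaternion(bin_quats: torch.Tensor, weights: torch.Tensor) -> torch.Tensor:
    """bin_quats: (K, 4) unit quaternions of the orientation bins;
    weights: (K,) soft classification scores (e.g. a softmax)."""
    M = (weights[:, None, None] * bin_quats[:, :, None] * bin_quats[:, None, :]).sum(dim=0)
    _, eigvecs = torch.linalg.eigh(M)   # eigenvalues in ascending order
    return eigvecs[:, -1]               # eigenvector of the largest eigenvalue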
Howard Mahé Jan 31
Replying to @ankurhandos
The Lie algebra representation is perfectly differentiable; I never thought it was discontinuous. What a surprise.
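For context, the discontinuity the paper refers to is in the global map from SO(3) to the representation, not in local differentiability. A quick numerical illustration, using SciPy's minimal axis-angle convention (the epsilon and axis are arbitrary choices of mine):

```python
# Two nearly identical rotations whose axis-angle (log map) vectors are far apart.
import numpy as np
from scipy.spatial.transform import Rotation as R

eps = 1e-3
r1 = R.from_rotvec([0, 0, np.pi - eps])  # rotate by pi - eps about +z
r2 = R.from_rotvec([0, 0, np.pi + eps])  # rotate by pi + eps about the same axis
print(np.linalg.norm((r1.inv() * r2).as_rotvec()))      # ~0.002: the rotations are close
print(np.linalg.norm(r1.as_rotvec() - r2.as_rotvec()))  # ~6.28: their rotation vectors are not
```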