Ankur Handa
@ankurhandos
Quaternions and Euler angles are discontinuous as regression targets and thus difficult for neural networks to learn. The authors show that 3D rotations have continuous representations in 5D and 6D, which are more suitable for learning.
i.e. regress two vectors and apply Graham-Schmidt (GS).
arxiv.org/abs/1812.07035 pic.twitter.com/fXUF3sgkTT
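(A minimal numpy sketch of the discontinuity in question, not from the paper; the helper names are mine. Quaternions come in ± pairs, so any canonical choice of sign, which a regression target needs, must jump somewhere:)

```python
import numpy as np

def quat_z(theta):
    # Unit quaternion (w, x, y, z) for a rotation by theta about the z-axis.
    return np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])

def canonical(q):
    # q and -q encode the same rotation; pick one sign (w >= 0) as the target.
    return -q if q[0] < 0 else q

eps = 1e-3
q_a = canonical(quat_z(np.pi - eps))
q_b = canonical(quat_z(np.pi + eps))
# The two rotations are almost identical, but the targets are ~2 apart:
print(np.linalg.norm(q_a - q_b))  # ~2.0
```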
Ankur Handa
@ankurhandos
26 Jan
They do a series of experiments to highlight the accuracy of this 6D representation.
Source code in PyTorch is here: github.com/papagina/Rotat… pic.twitter.com/2FJzv26frl
Ankur Handa
@ankurhandos
26 Jan
This is how they obtain the matrix from two 3D vectors. pic.twitter.com/pqHrAXFROs
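(Since the linked image isn't reproduced here, a small numpy sketch of that construction: normalize the first vector, Gram-Schmidt the second against it, and take a cross product for the third. The batched PyTorch version lives in the repo linked above.)

```python
import numpy as np

def rotation_from_6d(a1, a2):
    # Normalize the first vector.
    b1 = a1 / np.linalg.norm(a1)
    # Gram-Schmidt: strip the b1 component from a2, then normalize.
    b2 = a2 - np.dot(b1, a2) * b1
    b2 = b2 / np.linalg.norm(b2)
    # Cross product completes the right-handed orthonormal frame.
    b3 = np.cross(b1, b2)
    # Columns b1, b2, b3 form a rotation matrix in SO(3).
    return np.stack([b1, b2, b3], axis=1)

R = rotation_from_6d(np.random.randn(3), np.random.randn(3))
assert np.allclose(R.T @ R, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)
```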
Maurice Weiler
@maurice_weiler
26 Jan
Interesting paper on manifold-valued regression. The proposed solution is equivalent to that found in the Homeomorphic VAE paper of @lcfalors, @pimdehaan and @im_td
arxiv.org/pdf/1807.04689… (-> Eq. 33) pic.twitter.com/Jvu38Eu7j3
Ankur Handa
@ankurhandos
26 Jan
Very interesting, thanks for highlighting this to me!
Klaus H. Strobl
@5trobl
26 Jan
Note that, since NNs are not shy of high-dimensional parametrizations, you can go wild with this rotation vector, e.g. if you suffer from occlusions or ambiguous rotations; see openaccess.thecvf.com/content_ECCV_2…
Ankur Handa
@ankurhandos
26 Jan
Thanks. Yes, this is another way of doing things: they do everything in embedding space and use an L2 or cosine loss.
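(For concreteness, a sketch of such an embedding-space loss in PyTorch; `pred` and `target` are [B, D] rotation embeddings, and the function name is illustrative, not from the ECCV paper linked above:)

```python
import torch
import torch.nn.functional as F

def embedding_loss(pred, target, cosine=True):
    # pred, target: [B, D] rotation embeddings.
    if cosine:
        # 1 - cosine similarity, minimized when embeddings point the same way.
        return (1.0 - F.cosine_similarity(pred, target, dim=-1)).mean()
    # Plain L2 alternative on the raw embeddings.
    return F.mse_loss(pred, target)
```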
Pascal Kwanten | No/thingverse, Un/universe
@pascalkwanten
27 Jan
I suppose you refer to Gram-Schmidt? One of the key procedures of linear algebra, for orthonormalizing vectors in an inner-product space.
Pedro F. Proença 🇪🇺
@pe_proenza
28 Jan
An alternative that works well is to do soft classification and then quaternion fitting, e.g. arxiv.org/abs/1907.04298
This way you can model orientation ambiguity. But yes, to get high precision you naively end up with a huge net!
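(I haven't checked the exact fitting step in that paper; a standard way to fit one quaternion to soft weights over template rotations is the eigenvector form of weighted quaternion averaging, sketched here with illustrative names:)

```python
import numpy as np

def fit_quaternion(templates, weights):
    # templates: [K, 4] unit quaternions (the discretized orientations),
    # weights:   [K] soft classification scores.
    # The weighted sum of outer products q q^T is sign-invariant (q and -q agree).
    M = np.einsum('k,ki,kj->ij', weights, templates, templates)
    # The best-fit quaternion is the eigenvector of the largest eigenvalue.
    vals, vecs = np.linalg.eigh(M)
    return vecs[:, -1]
```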
Howard Mahé
@Howard_Mahe
31 Jan
The Lie algebra representation is perfectly differentiable; I never thought it was discontinuous. What a surprise.
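(The subtlety is the direction of the map: exp from the Lie algebra to SO(3) is smooth, but a network regressing axis-angle must realize the inverse log map, and that is discontinuous at rotations by π. A small numpy sketch of the paper's point, restricted to z-axis rotations for simplicity; names are mine:)

```python
import numpy as np

def log_map_z(theta):
    # Axis-angle (so(3)) vector for a rotation by theta about z, with the
    # angle wrapped into [0, pi] as any canonical log map must do.
    theta = np.mod(theta, 2 * np.pi)
    if theta <= np.pi:
        return np.array([0.0, 0.0, theta])
    # Same rotation expressed about the -z axis with angle 2*pi - theta.
    return np.array([0.0, 0.0, -(2 * np.pi - theta)])

eps = 1e-3
v_a = log_map_z(np.pi - eps)      # ~ (0, 0,  pi)
v_b = log_map_z(np.pi + eps)      # ~ (0, 0, -pi)
print(np.linalg.norm(v_a - v_b))  # ~ 2*pi, for two nearly identical rotations
```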