🌌LUSIFER: Language Universal Space…
Utilising shared language subspaces as a proxy for training task-aligned embedding models, without the explicit need for language-specific data