Hi, thanks so much for the great project!

At the moment, `TranslationModel::translate` takes a slice of texts to translate, a source language, and a target language, and produces a slice of translated texts.
For models like M2M-100 or NLLB, it would be convenient to have an additional method, perhaps `translate_multi_lang`, that takes a slice of texts to translate, a slice of source languages, and possibly a slice of target languages. This way, batched inference could still be used while covering multiple source/target pairs.
At least for multiple source languages, this would be very easy to implement.
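A rough sketch of what such a method could look like, assuming nothing about the real rust-bert API: `Language`, the inner `translate` stub, and `translate_multi_lang` below are all hypothetical placeholders. The idea is simply to group texts that share a (source, target) pair, run each group through one batched call, and restore the original ordering.

```rust
use std::collections::HashMap;

// Placeholder for rust-bert's Language enum (hypothetical subset).
#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum Language {
    English,
    French,
    German,
}

/// Stand-in for the existing batched translate method, which handles a
/// single source/target pair per call. Here it just tags each input so
/// the routing is visible.
fn translate(texts: &[&str], src: Language, tgt: Language) -> Vec<String> {
    texts
        .iter()
        .map(|t| format!("[{:?}->{:?}] {}", src, tgt, t))
        .collect()
}

/// Proposed convenience method: one (source, target) pair per text.
/// Texts sharing a pair are translated together in one batched call.
fn translate_multi_lang(
    texts: &[&str],
    srcs: &[Language],
    tgts: &[Language],
) -> Vec<String> {
    assert_eq!(texts.len(), srcs.len());
    assert_eq!(texts.len(), tgts.len());

    // Group input indices by language pair; each group becomes one batch.
    let mut groups: HashMap<(Language, Language), Vec<usize>> = HashMap::new();
    for i in 0..texts.len() {
        groups.entry((srcs[i], tgts[i])).or_default().push(i);
    }

    // Translate each group and scatter results back to their original slots.
    let mut out = vec![String::new(); texts.len()];
    for ((src, tgt), indices) in groups {
        let batch: Vec<&str> = indices.iter().map(|&i| texts[i]).collect();
        for (i, translated) in indices.into_iter().zip(translate(&batch, src, tgt)) {
            out[i] = translated;
        }
    }
    out
}

fn main() {
    let out = translate_multi_lang(
        &["Hello", "Bonjour", "World"],
        &[Language::English, Language::French, Language::English],
        &[Language::French, Language::German, Language::French],
    );
    println!("{:?}", out);
}
```

With this grouping approach, the single-pair fast path is unchanged, and the number of model invocations equals the number of distinct language pairs rather than the number of texts.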
Apologies for the delayed reply. This may indeed be useful - would you like to draft a pull request? If not, I will try to have a look at it this weekend.