The label mapping is loaded from the config.json file provided to initialize the model. Do you have an instance of a malformed model configuration that does not contain the label information? While creating labels "on the fly" when they are missing would allow the code to compile and run, the output is not properly formed (what does LABEL_0 mean to the downstream application?).
I'd be in favor of keeping the current set-up to encourage users to provide a valid configuration; maybe additional documentation/hints for the error thrown would be helpful?
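To make the "keep the current set-up, but improve the error" idea concrete, here is a minimal sketch (hypothetical helper name and error message, not the library's actual code) of loading the mapping strictly from config.json and failing with a descriptive hint instead of inventing placeholder labels:

```python
import json


def load_id2label(config_path: str) -> dict[int, str]:
    """Load the id2label mapping from a model's config.json.

    Raises a descriptive error instead of silently generating
    LABEL_0, LABEL_1, ... placeholders.
    """
    with open(config_path) as f:
        config = json.load(f)
    id2label = config.get("id2label")
    if id2label is None:
        raise ValueError(
            "config.json has no 'id2label' mapping; downstream code cannot "
            "interpret raw class indices. Add for example "
            '"id2label": {"0": "NEGATIVE", "1": "POSITIVE"} to the config.'
        )
    # JSON object keys are strings; convert them to integer class indices.
    return {int(k): v for k, v in id2label.items()}
```

A valid config then yields a usable mapping, while a config missing the labels fails loudly at load time rather than producing opaque output later.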
As opposed to `transformers`, where labels are generated ad hoc.
To resolve, we might want to add a label mapping into `SequenceClassificationConfig` with some defaults, but that might be too radical a change. Another possible fix is to do the same thing as `transformers` and generate placeholder labels on the fly instead of erroring out on a missing mapping. And then `num_labels`, when no mapping is specified, is... magic number 2: https://github.com/huggingface/transformers/blob/95b374952dc27d8511541d6f5a4e22c9ec11fb24/src/transformers/configuration_utils.py#L331
Well, not so much magic if you assume a classifier with no other information provided is always binary, which is what the Python lib seems to do.
Any thoughts?