Hi @sqxccdy, thanks for opening this feature request.
Could you provide more details? Without a reproducible code snippet, an explanation of the observed and expected behaviour, and all the relevant information about the error (e.g. the full error traceback), there's not much we can do.
Feature request
I'm trying to run transformers models using DirectML, and it works fine in most cases. However, since Microsoft's DirectML backend does not support tensor storage, an error is raised whenever an `untyped_storage` method is called.
Package versions installed:
torch~=2.3.1
torch_directml==0.2.2.dev240614
transformers==4.42.3
Motivation
Compatibility with DirectML devices, which do not support tensor storage.
Your contribution
I hope there is a way to provide a custom device context method without modifying the transformers source code. In my current use case, I can write storage-compatible code myself and skip this part.