
DirectML storage compatibility issues #31823

Open
sqxccdy opened this issue Jul 7, 2024 · 1 comment
Labels
Feature request Request for a new feature

Comments


sqxccdy commented Jul 7, 2024

Feature request

I'm trying to run transformers models using DirectML, and it works fine in most cases. However, because Microsoft's DirectML backend does not expose tensor storage, an error is raised whenever a code path calls an untyped_storage method.

Package versions installed:

torch~=2.3.1
torch_directml==0.2.2.dev240614
transformers==4.42.3

Motivation

Improve compatibility with DirectML devices that do not support tensor storage.

Your contribution

I hope there is a way to provide a custom device-context method without modifying the transformers source code. In my current use case, I could write storage-compatible code myself and skip this part.
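One way to sketch the workaround being described, without touching the transformers source, is to temporarily monkey-patch the failing method with a safe fallback. The snippet below is a minimal, hypothetical illustration of that pattern: `FakeTensor` and `safe_untyped_storage` are stand-ins invented for this example, not real torch or torch_directml APIs.

```python
import contextlib

@contextlib.contextmanager
def patched_attr(obj, name, replacement):
    """Temporarily replace obj.<name>, restoring the original on exit."""
    original = getattr(obj, name)
    setattr(obj, name, replacement)
    try:
        yield
    finally:
        setattr(obj, name, original)

# Hypothetical stand-in for a tensor class whose storage method
# fails on a device without storage support.
class FakeTensor:
    def untyped_storage(self):
        raise RuntimeError("untyped_storage is not supported on this device")

def safe_untyped_storage(self):
    # Fallback used for illustration: report no storage instead of raising.
    return None

t = FakeTensor()
with patched_attr(FakeTensor, "untyped_storage", safe_untyped_storage):
    result = t.untyped_storage()  # returns None inside the patch
```

In a real setup, the same context manager could wrap the library call that touches storage, so the patch is scoped to that call and the original behaviour is restored afterwards.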

@sqxccdy sqxccdy added the Feature request Request for a new feature label Jul 7, 2024
@amyeroberts (Collaborator) commented

Hi @sqxccdy, thanks for opening this feature request.

Could you provide some more details? Without a reproducible code snippet, an explanation of the observed and expected behaviour, and all the relevant information about the error (e.g. the full traceback), there's not much we can do.
