Commit
Fix AVX512 error when using low or med vram with ipex
Showing 1 changed file with 1 addition and 1 deletion.
8022de7
00:16:54-870571 ERROR Error running pip: install --upgrade torch==1.13.0a0 torchvision==0.14.1a0 intel_extension_for_pytorch==1.13.120+xpu -f https://developer.intel.com/ipex-whl-stable-xpu
00:16:54-871927 ERROR Could not load torch: No module named 'torch'
Installed directly on WSL2 using these wheels:
https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-1.13.0a0%2Bgit6c9b55e-cp310-cp310-linux_x86_64.whl
https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-1.13.120%2Bxpu-cp310-cp310-linux_x86_64.whl
https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.14.1a0%2B5e8e2f1-cp310-cp310-linux_x86_64.whl
Open an issue instead of commenting on unrelated commits.

This is an issue on Intel's servers. It seems like they removed the latest version of ipex.

Switch to Automatic's virtual environment and install them manually:

source venv/bin/activate

And then use pip install --upgrade for each package in your message.
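The manual fix described above can be sketched as a shell session. The wheel URLs are taken from the earlier comment in this thread, and the venv path assumes the default layout of Automatic's repository:

```shell
# Activate the web UI's virtual environment (default path assumed: ./venv)
source venv/bin/activate

# Install each wheel directly from Intel's S3 bucket; --upgrade replaces
# any partial installs left behind by the failed automatic step.
pip install --upgrade \
    "https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torch-1.13.0a0%2Bgit6c9b55e-cp310-cp310-linux_x86_64.whl" \
    "https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/torchvision-0.14.1a0%2B5e8e2f1-cp310-cp310-linux_x86_64.whl" \
    "https://intel-extension-for-pytorch.s3.amazonaws.com/ipex_stable/xpu/intel_extension_for_pytorch-1.13.120%2Bxpu-cp310-cp310-linux_x86_64.whl"
```

Pinning the exact wheel URLs sidesteps the broken index at https://developer.intel.com/ipex-whl-stable-xpu until Intel restores it.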
@Disty0 how fast is torch.xpu.optimize? Should it be enabled by default or behind a flag?
It doesn't seem to change much for me, but it added 10-20% extra performance for other users. It takes less than a second to apply torch.xpu.optimize.

We could make this another flag, but I enabled it with ipex by default since we are already using --use-ipex.
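As a rough illustration of what the flag toggles, here is a minimal sketch of applying the optimize pass at model-load time. It assumes an Intel XPU build of PyTorch with intel_extension_for_pytorch installed; `MyModel` is a hypothetical placeholder for whatever module is being loaded:

```python
import torch
import intel_extension_for_pytorch as ipex  # noqa: F401 - registers the torch.xpu backend

# Hypothetical placeholder; in practice this is the already-loaded model.
model = MyModel().eval().to("xpu")

# One-time optimization pass; per this thread it takes under a second
# and gave some users 10-20% extra throughput.
model = torch.xpu.optimize(model, dtype=torch.float16)
```

Since the pass is applied once and is effectively free at load time, defaulting it on under --use-ipex (as done here) avoids adding yet another flag.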