feat(python): support moe #208
Conversation
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
TODO: revisit allreduce for moe.gate (bagua/bagua/torch_api/moe/experts.py, lines 20 to 25 in 28bc3e2)
This comment was generated by the todo bot.
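The TODO above concerns a subtlety of expert parallelism: the gate is replicated on every rank, so its gradients need a data-parallel allreduce, while each rank's experts are local and must not be averaged across expert-parallel ranks. A minimal, hypothetical sketch of the gate-gradient allreduce, using plain `torch.distributed` with a single-process gloo group so it runs standalone (the PR itself would use bagua's own communication primitives, not this helper):

```python
import os
import torch
import torch.distributed as dist


def allreduce_gate_grads(gate: torch.nn.Module) -> None:
    """Average the replicated gate's gradients across data-parallel ranks.

    Illustrative helper, not bagua's API: expert parameters are deliberately
    excluded because each rank owns different experts.
    """
    world = dist.get_world_size()
    for p in gate.parameters():
        if p.grad is not None:
            dist.all_reduce(p.grad)  # sum gradients across ranks
            p.grad /= world          # then average


# Single-process gloo group so the sketch is runnable standalone; real
# training would reuse the existing data-parallel process group.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
if not dist.is_initialized():
    dist.init_process_group("gloo", rank=0, world_size=1)

gate = torch.nn.Linear(8, 4)
gate(torch.randn(2, 8)).sum().backward()
allreduce_gate_grads(gate)
```

With `world_size == 1` the allreduce-then-average is a no-op, which makes the sketch easy to verify locally before running it under a real multi-rank launch.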
examples/mnist/main.py (Outdated)
@@ -13,14 +13,20 @@
 class Net(nn.Module):
-    def __init__(self):
+    def __init__(self, num_local_experts):
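The diff above threads a `num_local_experts` argument into the example's `Net`. A self-contained, hypothetical sketch of what such a constructor argument might control: a classifier head made of several small experts combined by a softmax gate. The real example wires in bagua's MoE layer; the dense gating below is a stand-in for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class Net(nn.Module):
    """MNIST classifier with a minimal mixture-of-experts head (sketch)."""

    def __init__(self, num_local_experts):
        super().__init__()
        self.fc1 = nn.Linear(784, 128)
        # One gate scores every expert; each expert maps features to logits.
        self.gate = nn.Linear(128, num_local_experts)
        self.experts = nn.ModuleList(
            [nn.Linear(128, 10) for _ in range(num_local_experts)]
        )

    def forward(self, x):
        h = F.relu(self.fc1(x.flatten(1)))
        # Dense gating: weight every expert's output by its gate probability.
        weights = F.softmax(self.gate(h), dim=-1)                 # (batch, experts)
        outs = torch.stack([e(h) for e in self.experts], dim=1)   # (batch, experts, 10)
        logits = (weights.unsqueeze(-1) * outs).sum(dim=1)        # (batch, 10)
        return F.log_softmax(logits, dim=-1)
```

Production MoE layers route each token to only the top-k experts instead of weighting all of them, which is what makes the technique cheap at scale; the dense form above is just the easiest version to read.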
move moe example into separate dir, for example examples/mnist-moe
done
bagua/torch_api/moe/sharded_moe.py (Outdated)
# Git commit hash: bff6126f0ddbd1a03da66867571ac87b11c21ac1
# We retain the following license from the original files:
# Copyright 2020 The Microsoft DeepSpeed Team
Also add our license line
done
bagua/torch_api/moe/layer.py (Outdated)
# Git commit hash: bff6126f0ddbd1a03da66867571ac87b11c21ac1
# We retain the following license from the original files:
# Copyright 2020 The Microsoft DeepSpeed Team
add our license
done
bagua/torch_api/moe/experts.py (Outdated)
# Git commit hash: bff6126f0ddbd1a03da66867571ac87b11c21ac1
# We retain the following license from the original files:
# Copyright 2020 The Microsoft DeepSpeed Team
add our license
done
bagua/torch_api/moe/__init__.py (Outdated)
@@ -0,0 +1 @@
+from .layer import MoE  # noqa: F401
move the whole moe directory to bagua/torch_api/model_parallel/moe
done
see comments
TODO: Create param groups to handle expert + data case (e.g. param.group = moe_group) (bagua/bagua/torch_api/model_parallel/moe/experts.py, lines 27 to 32 in 04658ce)
This comment was generated by the todo bot.
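The "param groups" TODO asks for a way to tell replicated (data-parallel) parameters apart from expert-local ones, since the two need different gradient synchronization. A hedged sketch of that partitioning; the name-based heuristic and the `split_moe_param_groups` helper are illustrative assumptions, not bagua's actual mechanism (the TODO itself suggests tagging parameters, e.g. `param.group = moe_group`):

```python
import torch.nn as nn


def split_moe_param_groups(model: nn.Module):
    """Partition parameters into replicated vs. expert-local groups.

    Illustrative sketch: replicated parameters (gate, shared layers) get a
    data-parallel allreduce; expert-local ones are synced only within their
    expert-parallel group, or not at all when each rank owns its experts.
    """
    replicated, expert_local = [], []
    for name, p in model.named_parameters():
        # Assumption: expert submodules live under a module named "experts".
        target = expert_local if "experts" in name else replicated
        target.append((name, p))
    return replicated, expert_local
```

A tagging scheme (setting an attribute like `param.group` on each parameter at construction time, as the TODO suggests) is more robust than name matching, since it survives module renames and nesting; the sketch above only shows the simplest possible split.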
No description provided.