
[Feature Request] Add Bloom to the Auto Policy #36

Open
airsplay opened this issue Aug 13, 2022 · 2 comments
Labels
enhancement New feature or request

Comments

@airsplay

Add Bloom to the Auto Policy

It would be great if the recent Bloom model from BigScience could be added to the auto policy. Bloom is another auto-regressive large language model, so its policy could likely be inherited from the existing ones; a rough sketch of what such a policy might look like is included after the expected-behavior snippet below.

Expected behavior

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("bigscience/bloom-2b5")
tokenizer = AutoTokenizer.from_pretrained("bigscience/bloom-2b5")

from parallelformers import parallelize
parallelize(model, num_gpus=2, fp16=True, verbose='detail')

inputs = tokenizer("Parallelformers is", return_tensors="pt")

outputs = model.generate(
    **inputs,
    num_beams=5,
    no_repeat_ngram_size=4,
    max_length=15,
)

print(f"Output: {tokenizer.batch_decode(outputs)[0]}")
@airsplay airsplay added the enhancement New feature or request label Aug 13, 2022
@csinva

csinva commented Sep 3, 2022

+1!!!

@seopbo

seopbo commented Feb 27, 2023

This is great.
