
[Bug]: Gemini failed in Vertex AI workbench #778

Open
LeoMai2024 opened this issue Mar 31, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@LeoMai2024

Describe the bug
Hi AutoLabel team,
Amazing project! I am trying to use autolabel in a Vertex AI Workbench notebook. I installed the autolabel library successfully and I have Gemini Pro access. However, the notebook raises an error when I try to use Gemini. The only difference between my setup and the documentation is that I don't have an API key in the workbench, since I use the Gemini access granted to my project. Could you help me understand how I can use it?
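On the missing-API-key point (general Vertex AI behavior, not something stated in this thread): the Vertex AI SDK authenticates with Application Default Credentials (ADC) rather than an API key, and a Workbench notebook normally picks up its attached service account automatically. A minimal sketch that only inspects the local environment to see which credential source would be used (the helper name is made up for illustration):

```python
import os

def adc_hint() -> str:
    """Report which Application Default Credentials source appears active.

    Vertex AI uses ADC, not API keys; inside a Workbench notebook the
    attached service account usually provides credentials automatically,
    so no key needs to be set in the autolabel config.
    """
    if os.environ.get("GOOGLE_APPLICATION_CREDENTIALS"):
        return "explicit service-account key file"
    return "ambient credentials (Workbench service account / gcloud ADC), if any"

print(adc_hint())
```

If the notebook's service account lacks the Vertex AI User role, calls will fail with a permission error rather than the ValidationError shown below, so the two problems can be told apart.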
use Gemini config:
"model": {
    "provider": "google",
    "name": "gemini-pro",  # "gemini-1.0-pro" raises the same error
    "params": {}
},
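For reference, the same model block as a runnable Python fragment; the validation helper below is purely illustrative and not part of autolabel:

```python
# The "model" block reported in this issue; other autolabel config
# sections (task_name, task_type, prompt, ...) are omitted here.
model_config = {
    "provider": "google",
    # "gemini-1.0-pro" reportedly raises the same ValidationError
    "name": "gemini-pro",
    "params": {},
}

def has_required_model_keys(cfg: dict) -> bool:
    """Illustrative check: the model block needs at least provider and name."""
    return {"provider", "name"}.issubset(cfg)

print(has_required_model_keys(model_config))  # True
```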

Error Screenshots

ValidationError Traceback (most recent call last)
Cell In[18], line 2
1 # create an agent for labeling
----> 2 agent = LabelingAgent(config=config)

File /opt/conda/lib/python3.10/site-packages/autolabel/labeler.py:96, in LabelingAgent.__init__(self, config, cache, example_selector, create_task, console_output, generation_cache, transform_cache)
92 self.config = (
93 config if isinstance(config, AutolabelConfig) else AutolabelConfig(config)
94 )
95 self.task = TaskFactory.from_config(self.config)
---> 96 self.llm: BaseModel = ModelFactory.from_config(
97 self.config, cache=self.generation_cache
98 )
99 score_type = "logprob_average"
100 if self.config.task_type() == TaskType.ATTRIBUTE_EXTRACTION:

File /opt/conda/lib/python3.10/site-packages/autolabel/models/__init__.py:47, in ModelFactory.from_config(config, cache)
45 try:
46 model_cls = MODEL_REGISTRY[provider]
---> 47 model_obj = model_cls(config=config, cache=cache)
48 # The below ensures that users should based off of the BaseModel
49 # when creating/registering custom models.
50 assert isinstance(
51 model_obj, BaseModel
52 ), f"{model_obj} should inherit from autolabel.models.BaseModel"

File /opt/conda/lib/python3.10/site-packages/autolabel/models/palm.py:68, in PaLMLLM.__init__(self, config, cache)
66 self.llm = ChatVertexAI(model_name=self.model_name, **self.model_params)
67 else:
---> 68 self.llm = VertexAI(model_name=self.model_name, **self.model_params)

File /opt/conda/lib/python3.10/site-packages/langchain/load/serializable.py:74, in Serializable.__init__(self, **kwargs)
73 def __init__(self, **kwargs: Any) -> None:
---> 74 super().__init__(**kwargs)
75 self._lc_kwargs = kwargs

File /opt/conda/lib/python3.10/site-packages/pydantic/main.py:341, in pydantic.main.BaseModel.__init__()

ValidationError: 1 validation error for VertexAI
root
Unknown model publishers/google/models/gemini-1.0-pro; {'gs://google-cloud-aiplatform/schema/predict/instance/text_generation_1.0.0.yaml': <class 'vertexai.preview.language_models._PreviewTextGenerationModel'>} (type=value_error)
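Since the ValidationError comes from the Vertex SDK not recognizing the `gemini-1.0-pro` model name, one plausible cause is an outdated `google-cloud-aiplatform` package in the Workbench image; Gemini support only landed in relatively recent releases (the exact minimum version below is an assumption to verify against the SDK's release notes, not a fact from this thread). A minimal local check:

```python
from importlib.metadata import PackageNotFoundError, version

def parse_release(v: str) -> tuple:
    # Naive numeric-only parse; good enough for plain x.y.z release strings.
    return tuple(int(part) for part in v.split(".")[:3] if part.isdigit())

def needs_upgrade(installed: str, minimum: str) -> bool:
    """True if the installed release string is older than the minimum."""
    return parse_release(installed) < parse_release(minimum)

# Assumed first release with Gemini model support; confirm in the
# google-cloud-aiplatform changelog before relying on it.
MINIMUM = "1.38.0"
try:
    installed = version("google-cloud-aiplatform")
    if needs_upgrade(installed, MINIMUM):
        print(f"Consider: pip install -U google-cloud-aiplatform "
              f"(installed {installed} < {MINIMUM})")
    else:
        print(f"google-cloud-aiplatform {installed} looks recent enough")
except PackageNotFoundError:
    print("google-cloud-aiplatform is not installed")
```

If the SDK is current and the error persists, the pinned langchain version inside autolabel would be the next thing to check, since the failing constructor is langchain's `VertexAI` wrapper.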


LeoMai2024 added the bug label on Mar 31, 2024
@rishabh-bhargava
Contributor

cc: @Vaibhav2001
