feat: add groq cloud + llama3 model #60

Merged · 1 commit · May 25, 2024
12 changes: 7 additions & 5 deletions README.md
@@ -1,10 +1,11 @@
# roboquote

Generate random "inspirational" quotes images by using an AI text generation model through the Hugging Face Inference API.
Generate random "inspirational" quotes images by using an AI text generation model.

The following models can be used:
- [bigscience/bloom](https://huggingface.co/bigscience/bloom)
- [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1)
- [bigscience/bloom](https://huggingface.co/bigscience/bloom) through Hugging Face Inference API
- [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1) through Hugging Face Inference API
- [meta-llama/Meta-Llama-3-70B](https://huggingface.co/meta-llama/Meta-Llama-3-70B) through GroqCloud API

## Examples

@@ -23,10 +24,10 @@ The following models can be used:
### Installation
1. Install the project with [poetry](https://python-poetry.org/) by doing `poetry install`.
2. Set environment variables for configuration (the project uses [environs](https://github.com/sloria/environs) so you can also put the variable in a [env file](https://github.com/sloria/environs#reading-env-files)):
- `HUGGING_FACE_API_TOKEN` (**required**): your Hugging Face Inference API token.
- `HUGGING_FACE_ACCESS_TOKEN` (**required**): your Hugging Face access token for their Inference API
- `GROQ_CLOUD_API_KEY` (**required**): your GroqCloud API key
- `LOG_LEVEL` (optional, default is `WARNING`): log level of the application
- `WEB_DEBUG` (optional, default is `False`): if you want to run the web app in debug mode (should not be required)
- `HIDE_HUGGING

### CLI usage

@@ -45,6 +46,7 @@ It can be launched quickly locally with [uvicorn](https://pypi.org/project/uvico
- [atomicparade](https://github.com/atomicparade) for the [code used to do the text auto wrapping](https://github.com/atomicparade/pil_autowrap/blob/main/pil_autowrap/pil_autowrap.py)
- [BigScience Workshop](https://huggingface.co/bigscience/) for the BLOOM model used
- [Google Fonts](https://fonts.google.com/) for the fonts used in the pictures.
- [Groq](https://groq.com/) for their inference API
- [Hugging Face](https://huggingface.co/) for the inference API
- [Mistral AI](https://mistral.ai/) for the Mistral-7B-Instruct-v0.1 model used
- [Unsplash](https://unsplash.com) for the background images.
47 changes: 31 additions & 16 deletions main.py
@@ -1,15 +1,20 @@
"""Main CLI entrypoint."""

import asyncio
from typing import Annotated

import click
import typer

from roboquote.background_image import (
get_random_background_from_unsplash_by_theme,
get_random_background_search_query,
)
from roboquote.entities.generate_options import GenerateOptions
from roboquote.entities.text_model import TextModel
from roboquote.entities.large_language_model import (
AVAILABLE_LARGE_LANGUAGE_MODELS,
AVAILABLE_LARGE_LANGUAGE_MODELS_NAMES,
)
from roboquote.quote_text_generation import get_random_quote
from roboquote.result_image import generate_image

@@ -19,22 +24,32 @@
@app.command()
def generate(
filename: str,
blur: bool = typer.Option(True, help="Add a blur on the background."),
blur_intensity: int | None = typer.Option(
None, help="If blur is enabled,the blur intensity level."
),
background: str | None = typer.Option(
default=None,
help="If specified, use this string as the search query "
+ "for the background image instead of a random one. "
+ "Works best with simple queries like 'mountain', 'sea' etc.",
),
text_model: TextModel = typer.Option(
default=TextModel.MISTRAL_8X7B_INSTRUCT.value,
help="The text generation model to use.",
),
blur: Annotated[bool, typer.Option(help="Add a blur on the background.")] = True,
blur_intensity: Annotated[
        int | None, typer.Option(help="If blur is enabled, the blur intensity level.")
] = None,
background: Annotated[
str | None,
typer.Option(
help="If specified, use this string as the search query "
+ "for the background image instead of a random one. "
+ "Works best with simple queries like 'mountain', 'sea' etc.",
),
] = None,
model_name: Annotated[
str,
typer.Option(
click_type=click.Choice(AVAILABLE_LARGE_LANGUAGE_MODELS_NAMES),
help="The name of the LLM to use for the text generation.",
),
] = AVAILABLE_LARGE_LANGUAGE_MODELS_NAMES[0],
) -> None:
"""Generate a new image with the given filename."""
# Get model
large_language_model = next(
model for model in AVAILABLE_LARGE_LANGUAGE_MODELS if model.name == model_name
)

# Get a random background category if not specified
if background is None:
background = get_random_background_search_query()
@@ -55,7 +70,7 @@ def generate(
)

# Get text to use
text = asyncio.run(get_random_quote(background, text_model))
text = asyncio.run(get_random_quote(background, large_language_model))

# Generate and save image
generated_image = generate_image(
76 changes: 37 additions & 39 deletions poetry.lock

Some generated files are not rendered by default.

2 changes: 1 addition & 1 deletion pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "roboquote"
version = "0.3.0"
version = "0.4.0"
description = "A generator of inspirational quotes"
authors = ["corenting <corenting@gmail.com>"]
license = "MIT"
2 changes: 1 addition & 1 deletion roboquote/__init__.py
@@ -4,7 +4,7 @@

from roboquote import config

__version__ = "0.3.0"
__version__ = "0.4.0"

# Setup logger
logger.remove()
4 changes: 3 additions & 1 deletion roboquote/config.py
@@ -6,5 +6,7 @@
env.read_env()

WEB_DEBUG: bool = env.bool("WEB_DEBUG", False)
HUGGING_FACE_API_TOKEN: str = env.str("HUGGING_FACE_API_TOKEN")
LOG_LEVEL: str = env.str("LOG_LEVEL", "WARNING")

HUGGING_FACE_ACCESS_TOKEN: str = env.str("HUGGING_FACE_ACCESS_TOKEN")
GROQ_CLOUD_API_KEY: str = env.str("GROQ_CLOUD_API_KEY")
6 changes: 0 additions & 6 deletions roboquote/constants.py
@@ -1,9 +1,3 @@
"""Constants used in the code."""

from roboquote.entities.text_model import TextModel

FONTS_PATH = "fonts"

HUGGING_FACE_BASE_API_URL = "https://api-inference.huggingface.co/models/"

DEFAULT_WEB_TEXT_MODEL = TextModel.MISTRAL_8X7B_INSTRUCT
47 changes: 47 additions & 0 deletions roboquote/entities/large_language_model.py
@@ -0,0 +1,47 @@
from dataclasses import dataclass
from enum import Enum


class LargeLanguageModelPromptType(Enum):
CONTINUE = "continue" # where the model continues an existing sentence
INSTRUCT = "instruct" # where we instruct the model


class LargeLanguageModelAPI(Enum):
GROQ_CLOUD = "GroqCloud"
HUGGING_FACE = "Hugging Face"


@dataclass
class LargeLanguageModel:
name: str
prompt_type: LargeLanguageModelPromptType
api: LargeLanguageModelAPI
prompt_start: str | None = None
prompt_end: str | None = None


# Ordered by preferred models
AVAILABLE_LARGE_LANGUAGE_MODELS = [
LargeLanguageModel(
name="llama3-70b-8192",
prompt_type=LargeLanguageModelPromptType.INSTRUCT,
api=LargeLanguageModelAPI.GROQ_CLOUD,
),
LargeLanguageModel(
name="mistralai/Mixtral-8x7B-Instruct-v0.1",
prompt_type=LargeLanguageModelPromptType.INSTRUCT,
api=LargeLanguageModelAPI.HUGGING_FACE,
        prompt_start="<s>[INST]",
prompt_end="[/INST]",
),
LargeLanguageModel(
name="bigscience/bloom",
prompt_type=LargeLanguageModelPromptType.CONTINUE,
api=LargeLanguageModelAPI.HUGGING_FACE,
),
]

AVAILABLE_LARGE_LANGUAGE_MODELS_NAMES = [
model.name for model in AVAILABLE_LARGE_LANGUAGE_MODELS
]
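In `main.py`, the selected name is turned back into a `LargeLanguageModel` with a bare `next(...)` generator expression. A condensed sketch of that lookup (simplified dataclass and model list; a `ValueError` replaces the bare `StopIteration` the PR's version would raise on an unknown name, which `click.Choice` normally prevents from being reachable):

```python
from dataclasses import dataclass
from enum import Enum


class PromptType(Enum):
    CONTINUE = "continue"  # the model continues an existing sentence
    INSTRUCT = "instruct"  # the model is given an instruction


@dataclass
class Model:
    name: str
    prompt_type: PromptType


# Simplified stand-in for AVAILABLE_LARGE_LANGUAGE_MODELS.
MODELS = [
    Model("llama3-70b-8192", PromptType.INSTRUCT),
    Model("bigscience/bloom", PromptType.CONTINUE),
]


def get_model_by_name(name: str) -> Model:
    """Look up a model by name, as main.py does with next()."""
    try:
        return next(m for m in MODELS if m.name == name)
    except StopIteration:
        raise ValueError(f"unknown model: {name!r}") from None
```

For a list this small a linear scan is fine; a `dict` keyed by name would be the natural next step if the registry grew.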
6 changes: 0 additions & 6 deletions roboquote/entities/text_model.py

This file was deleted.
