
avoid expensive initialization #18

Open
mimoo opened this issue Jun 20, 2023 · 5 comments
Labels
help wanted Extra attention is needed

Comments

@mimoo

mimoo commented Jun 20, 2023

Hello,

I'm using the following:

import { encode, isWithinTokenLimit } from 'gpt-tokenizer/model/text-davinci-003';

which seems to slow down the initialization, enough that I can't deploy to cloudflare workers with this library. Is there a way to lazily initialize things?

@airhorns

We're experiencing this as well -- requiring this package takes ~600ms on my M1 MBP:

❯ time node -r gpt-tokenizer -e "1"

________________________________________________________
Executed in  548.82 millis    fish           external
   usr time  616.81 millis    4.71 millis  612.10 millis
   sys time   99.25 millis    9.21 millis   90.04 millis

Would it be hard to lazily require the encodings only once the first encode call is made?
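A minimal sketch of that lazy pattern, assuming a memoized module-level cache (this is not gpt-tokenizer's actual internals; `buildRanks` is a hypothetical stand-in for the expensive encoding-file parsing):

```typescript
// Hypothetical sketch: defer the expensive setup until the first encode()
// call instead of doing it at require time.
let ranks: Map<string, number> | undefined;

function buildRanks(): Map<string, number> {
  // Expensive work would happen here (in the real library: loading and
  // base64-parsing the large encodings file).
  const m = new Map<string, number>();
  for (let i = 0; i < 256; i++) m.set(String.fromCharCode(i), i);
  return m;
}

function encode(text: string): number[] {
  ranks ??= buildRanks(); // first call pays the cost; later calls reuse it
  return [...text].map((ch) => ranks!.get(ch) ?? -1);
}
```

Requiring the module then costs nothing; the parse happens on the first `encode()` call only.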

@zakariamehbi

Same issue on my end.

@thdoan

thdoan commented Oct 17, 2023

For Cloudflare Workers I suggest you look at this:
https://github.com/dqbd/tiktoken#cloudflare-workers

@luizzappa

luizzappa commented Feb 26, 2024

To get around Cloudflare Workers' 400ms startup-time limit, I just import the library inside the fetch handler.

```ts
export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    const { encode } = await import('gpt-tokenizer');
    // ....
  }
}
```

Regarding the library @thdoan suggested: I couldn't get tiktoken or js-tiktoken to work within the limits of Cloudflare Workers.
js-tiktoken bundles all the encoders, which pushes the bundle over Cloudflare Workers' 1 MB size limit (see here).
And tiktoken/lite, which lets you import only the encoder you need (keeping the bundle under 1 MB), has a bug that has not yet been fixed.

@niieani
Owner

niieani commented Jul 18, 2024

When designing the library, the decision was made to keep the tokenizer loadable synchronously.
The long startup time is most likely caused by the large file containing the encodings and the base64 parsing that has to happen after it loads.

You could try enabling V8's code cache, introduced in Node 22.1.0; with it enabled, startup should be much faster. Here's more info about this.
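For example (assuming Node >= 22.1.0; the cache directory path is arbitrary), pointing the NODE_COMPILE_CACHE environment variable at a directory makes Node persist V8's compiled bytecode between runs:

```shell
export NODE_COMPILE_CACHE=/tmp/node-compile-cache
time node -r gpt-tokenizer -e "1"   # first run: compiles and writes the cache
time node -r gpt-tokenizer -e "1"   # later runs: reuse it, should start faster
```

Note this caches compilation, not the base64 decoding of the encodings, so it can only recover part of the startup time.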

We could also experiment with an alternative way of storing the encodings so that parsing is much simpler/easier on the resources. Would need to profile and see what is causing the bulk of the startup time right now.
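One hypothetical direction (not a profiled or endorsed design; `decodeRanks` and the blob layout are invented for illustration): ship the encodings as a length-prefixed binary blob, so loading is mostly byte slicing rather than base64 decoding plus string parsing:

```typescript
// Hypothetical format: [len byte][len token bytes] repeated; the token's
// rank is simply its position in the blob.
function decodeRanks(blob: Uint8Array): Map<string, number> {
  const ranks = new Map<string, number>();
  let offset = 0;
  let rank = 0;
  while (offset < blob.length) {
    const len = blob[offset++];                       // 1-byte token length
    const token = blob.subarray(offset, offset + len); // token bytes, no copy
    ranks.set(Buffer.from(token).toString("latin1"), rank++);
    offset += len;
  }
  return ranks;
}
```

Whether this actually beats the current base64 path would need profiling, as suggested above.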

Suggestions and PRs welcome, as I'm constrained on time right now.

@niieani niieani added the help wanted Extra attention is needed label Jul 18, 2024