This model's maximum context length is 4097 tokens. However, you requested 4307 tokens (2259 in the messages, 2048 in the completion). Please reduce the length of the messages or completion. at com.theokanning.openai.service.OpenAiService.execute(OpenAiService.java:227) #42
Labels: bug (Something isn't working)
This model's maximum context length is 4097 tokens. However, you requested 4307 tokens (2259 in the messages, 2048 in the completion). Please reduce the length of the messages or completion.
at com.theokanning.openai.service.OpenAiService.execute(OpenAiService.java:227)
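The error means the prompt tokens plus the requested completion budget exceed the model's 4097-token context window: 2259 (messages) + 2048 (`maxTokens`) = 4307 > 4097. One workaround is to derive the completion budget from the prompt size instead of hardcoding 2048, and pass that value to the request builder (e.g. `CompletionRequest.builder().maxTokens(...)` in openai-java). A minimal sketch of the arithmetic, with the context limit and prompt size taken from the error message above:

```java
public class TokenBudget {
    // Model context window reported in the error message.
    static final int CONTEXT_LIMIT = 4097;

    // Largest completion budget that still fits alongside the prompt.
    static int maxCompletionTokens(int promptTokens) {
        return Math.max(0, CONTEXT_LIMIT - promptTokens);
    }

    public static void main(String[] args) {
        // From the error: 2259 prompt tokens were sent with maxTokens = 2048.
        int promptTokens = 2259;
        int budget = maxCompletionTokens(promptTokens);
        System.out.println(budget); // prints 1838, which fits: 2259 + 1838 <= 4097
    }
}
```

Note that the prompt token count must be estimated client-side (for example with a tokenizer such as jtokkit) before building the request; the exact count is only reported by the API after the call fails.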