
OpenAI-Java

⚠️ Please switch to using the new 'service' library if you need to use OpenAiService. The old 'client' OpenAiService is deprecated as of 0.10.0.
⚠️ OpenAI has deprecated all Engine-based APIs. See Deprecated Endpoints below for more info.

Java libraries for using OpenAI's GPT APIs. Supports GPT-3, ChatGPT, and GPT-4.

Includes the following artifacts:

  • api: request/response POJOs for the GPT APIs
  • client: a basic Retrofit client for the GPT endpoints; includes the api module
  • service: a basic service class that creates and calls the client. This is the easiest way to get started.

as well as an example project using the service.

Supported APIs

Deprecated by OpenAI

Importing

Gradle

implementation 'com.theokanning.openai-gpt3-java:<api|client|service>:<version>'

Maven

<dependency>
    <groupId>com.theokanning.openai-gpt3-java</groupId>
    <artifactId>{api|client|service}</artifactId>
    <version>{version}</version>
</dependency>

Usage

Data classes only

If you want to make your own client, just import the POJOs from the api module. Your client will need to use snake case to work with the OpenAI API.
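For example, a minimal sketch of a Jackson ObjectMapper configured that way (the choice of Jackson here is an assumption; any JSON library works as long as it writes snake_case and skips null fields):

ObjectMapper mapper = new ObjectMapper();
// OpenAI expects snake_case field names on the wire
mapper.setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE);
// skip null fields so optional request parameters are omitted entirely
mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
// don't fail if OpenAI adds response fields these POJOs don't know about yet
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);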

Retrofit client

If you're using Retrofit, you can import the client module and use the OpenAiApi.
You'll have to add your auth token as a header (see AuthenticationInterceptor) and set your converter factory to use snake case and only include non-null fields.
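A hedged sketch of that wiring, assuming the client module's RxJava-based OpenAiApi and the standard Retrofit Jackson converter (token stands for your API key; the header and base URL are OpenAI's documented values):

ObjectMapper mapper = new ObjectMapper()
        .setPropertyNamingStrategy(PropertyNamingStrategy.SNAKE_CASE)
        .setSerializationInclusion(JsonInclude.Include.NON_NULL);

OkHttpClient client = new OkHttpClient.Builder()
        // attach the Bearer token to every request, like AuthenticationInterceptor does
        .addInterceptor(chain -> chain.proceed(chain.request().newBuilder()
                .header("Authorization", "Bearer " + token)
                .build()))
        .build();

OpenAiApi api = new Retrofit.Builder()
        .baseUrl("https://api.openai.com/")
        .client(client)
        .addConverterFactory(JacksonConverterFactory.create(mapper))
        .addCallAdapterFactory(RxJava2CallAdapterFactory.create())
        .build()
        .create(OpenAiApi.class);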

OpenAiService

If you're looking for the fastest solution, import the service module and use OpenAiService.

⚠️ The OpenAiService in the client module is deprecated; please switch to the new version in the service module.

OpenAiService service = new OpenAiService("your_token");
CompletionRequest completionRequest = CompletionRequest.builder()
        .prompt("Somebody once told me the world is gonna roll me")
        .model("ada")
        .echo(true)
        .build();
service.createCompletion(completionRequest).getChoices().forEach(System.out::println);

Customizing OpenAiService

If you need to customize OpenAiService, create your own Retrofit client and pass it to the constructor. For example, do the following to add request logging (after adding the logging Gradle dependency):

ObjectMapper mapper = defaultObjectMapper();
OkHttpClient client = defaultClient(token, timeout)
        .newBuilder()
        .addInterceptor(new HttpLoggingInterceptor())
        .build();
Retrofit retrofit = defaultRetrofit(client, mapper);

OpenAiApi api = retrofit.create(OpenAiApi.class);
OpenAiService service = new OpenAiService(api);

Adding a Proxy

To use a proxy, modify the OkHttp client as shown below:

ObjectMapper mapper = defaultObjectMapper();
Proxy proxy = new Proxy(Proxy.Type.HTTP, new InetSocketAddress(host, port));
OkHttpClient client = defaultClient(token, timeout)
        .newBuilder()
        .proxy(proxy)
        .build();
Retrofit retrofit = defaultRetrofit(client, mapper);
OpenAiApi api = retrofit.create(OpenAiApi.class);
OpenAiService service = new OpenAiService(api);

Streaming thread shutdown

If you want to shut down your process immediately after streaming responses, call OpenAiService.shutdownExecutor().
This is not necessary for non-streaming calls.
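For example, a sketch of a streaming call followed by the shutdown (streamCompletion returning an RxJava Flowable is how recent versions of the service behave; adjust to whatever your version exposes):

OpenAiService service = new OpenAiService("your_token");
CompletionRequest request = CompletionRequest.builder()
        .prompt("Somebody once told me the world is gonna roll me")
        .model("ada")
        .build();

// print the tokens as they arrive
service.streamCompletion(request)
        .doOnError(Throwable::printStackTrace)
        .blockingForEach(System.out::println);

// release the streaming thread pool so the JVM can exit promptly
service.shutdownExecutor();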

Running the example project

All the example project requires is your OpenAI API token:

export OPENAI_TOKEN="sk-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX"
./gradlew example:run

FAQ

Does this support GPT-4?

Yes! GPT-4 uses the ChatCompletion API, and you can see the latest model options here.
GPT-4 is currently in a limited beta (as of 4/1/23), so make sure you have access before trying to use it.
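A minimal chat sketch along those lines (the model name is up to you; "gpt-3.5-turbo" works the same way if you don't have GPT-4 access yet):

OpenAiService service = new OpenAiService("your_token");
List<ChatMessage> messages = new ArrayList<>();
messages.add(new ChatMessage("user", "Write a haiku about Java."));

ChatCompletionRequest chatRequest = ChatCompletionRequest.builder()
        .model("gpt-4")
        .messages(messages)
        .build();

// print the assistant's reply for each returned choice
service.createChatCompletion(chatRequest)
        .getChoices()
        .forEach(choice -> System.out.println(choice.getMessage().getContent()));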

Why am I getting connection timeouts?

Make sure that OpenAI is available in your country.

Why doesn't OpenAiService support x configuration option?

Many projects use OpenAiService, and in order to support them best I've kept it extremely simple.
You can create your own OpenAiApi instance to customize headers, timeouts, base URLs, etc.
If you want features like retry logic and async calls, you'll have to make an OpenAiApi instance and call it directly instead of using OpenAiService.
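For example, a hedged sketch of calling the api directly for an async completion, assuming the RxJava Single return types the client module uses (retrofit here is an instance built as in "Customizing OpenAiService" above):

OpenAiApi api = retrofit.create(OpenAiApi.class);
// subscribe instead of blocking; the result is delivered asynchronously
api.createCompletion(completionRequest)
        .subscribe(
                result -> result.getChoices().forEach(System.out::println),
                Throwable::printStackTrace);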

Deprecated Endpoints

OpenAI has deprecated engine-based endpoints in favor of model-based endpoints. For example, instead of using v1/engines/{engine_id}/completions, switch to v1/completions and specify the model in the CompletionRequest. The code includes upgrade instructions for all deprecated endpoints.

I won't remove the old endpoints from this library until OpenAI shuts them down.

License

Published under the MIT License
