streamCompletion

@ApiStatus.Obsolete
@Contract(pure = true)
abstract fun streamCompletion(request: CompletionRequest): Iterable<CompletionResponseChunk>

Calls the completions API endpoint and streams the response back one token at a time for a faster perceived response time.

The method itself does not block, but each iteration over the returned iterable blocks until the next token has been generated.

// Each iteration of the loop blocks until the next token is streamed
for (chunk in openAI.streamCompletion(request)) {
    // Do something with the chunk
}
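Because each iteration blocks until the next chunk arrives, long generations are usually consumed off the caller's main thread. The sketch below shows one way to do that with a plain background thread; the OpenAI client type, the onChunk callback, and the thread name are illustrative assumptions, not part of this API.

import kotlin.concurrent.thread

// Minimal sketch: consume the stream on a background thread so the caller
// is not blocked while tokens are generated. `OpenAI` is assumed to be the
// client type exposing streamCompletion; `onChunk` is a hypothetical callback
// standing in for whatever you do with each streamed chunk.
fun streamInBackground(
    openAI: OpenAI,
    request: CompletionRequest,
    onChunk: (CompletionResponseChunk) -> Unit
) {
    thread(name = "openai-completion-stream", isDaemon = true) {
        // Each iteration blocks until the next chunk is streamed from the API
        for (chunk in openAI.streamCompletion(request)) {
            onChunk(chunk)
        }
    }
}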

Completions are considered legacy, and OpenAI officially recommends that all developers use the chat completions endpoint instead. See streamChatCompletion.

Return

An iterable that yields each CompletionResponseChunk as it is streamed from the API

Parameters

request

The request to send to the API