I'm trying to stream results from OpenAI in an AWS Lambda function using the OpenAI Python library. The function's invoke mode is set to RESPONSE_STREAM, and, following the streaming example provided, I can see the streamed results in the Function Logs (abbreviated below):
```
Response
null

Function Logs
START RequestId: 3e0148c3-1269-4e38-bd08-e29de5751f18 Version: $LATEST
{
  "choices": [
    {
      "finish_reason": null,
      "index": 0,
      "logprobs": null,
      "text": "\n"
    }
  ],
  "created": 1685755648,
  "id": "cmpl-7NALANaR7eLwIMrXTYJVxBpk6tiZb",
  "model": "text-davinci-003",
  "object": "text_completion"
}
{
  "choices": [
    {
      "finish_reason": null,
      "index": 0,
      "logprobs": null,
      "text": "\n"
    }
  ],
  ...
```
But the Response is null. I've tested this by entering the function URL in the browser and by performing a GET request via cURL: both return null. Below is the exact code (with the secret key changed) that I used; it can also be found at the link provided:
```python
import json
import openai
import boto3


def lambda_handler(event, context):
    model_to_use = "text-davinci-003"
    input_prompt = "Write a sentence in 4 words."

    openai.api_key = 'some-secret key'
    response = openai.Completion.create(
        model=model_to_use,
        prompt=input_prompt,
        temperature=0,
        max_tokens=100,
        top_p=1,
        frequency_penalty=0.0,
        presence_penalty=0.0,
        stream=True
    )
    for chunk in response:
        print(chunk)
```
You are having trouble because Python runtimes do not currently support Lambda response streaming. From the 4/7/2023 AWS announcement of response streaming:

> Response streaming currently supports the Node.js 14.x and subsequent managed runtimes.

As of 6/8/2023 this is still true.
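To actually stream a response to the caller, the handler has to run on a Node.js runtime and be wrapped with `awslambda.streamifyResponse`, a global that the Lambda Node.js runtime injects. A minimal sketch of the shape (the chunk strings are placeholders standing in for tokens from OpenAI, and the TypeScript declaration of the `awslambda` global exists only to satisfy the compiler):

```typescript
// Minimal sketch of a Lambda response-streaming handler on the Node.js runtime.
// `awslambda` is a global injected by the Lambda runtime, so it is declared
// here only so the TypeScript compiler accepts the reference.
declare const awslambda: {
  streamifyResponse(
    handler: (
      event: unknown,
      responseStream: NodeJS.WritableStream,
      context: unknown
    ) => Promise<void>
  ): unknown;
};

export const handler = awslambda.streamifyResponse(
  async (event, responseStream, context) => {
    // Instead of returning a value, write chunks as they become available.
    responseStream.write("first chunk\n");
    responseStream.write("second chunk\n");
    responseStream.end(); // closing the stream completes the invocation
  }
);
```

The key difference from the Python handler above is that nothing is returned: the function writes to `responseStream` as data arrives, and the invocation completes when the stream is ended, which is what RESPONSE_STREAM mode expects.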