Tags: docker, aws-lambda, dockerfile, boto3, amazon-bedrock

UnknownServiceError: Unknown service: 'bedrock-runtime'


I ran this after indexing documents into AOSS (Amazon OpenSearch Serverless), and I was able to query the prompt as well. But when I run the same code in a Docker container, it raises an unknown service error. I suspect the error comes from boto3; if that is not the cause, please let me know.

Please help me resolve the issues with the code and the Dockerfile below.

This is my code :

import boto3
from requests_aws4auth import AWS4Auth
from opensearchpy import OpenSearch, RequestsHttpConnection
import json
#%%
#client = boto3.client('opensearchserverless')
service = 'aoss'
region = "us-east-1"
credentials = boto3.Session(
    aws_access_key_id="*******", aws_secret_access_key="********"
).get_credentials()
awsauth = AWS4Auth("*****", "******", region, service, session_token=credentials.token)
text = "vamsee"
endpoint = "2z2fm2hju98va5k7pis9.us-east-1.aoss.amazonaws.com"
ops_client = OpenSearch(
        hosts=[{'host': endpoint, 'port': 443}],
        http_auth=awsauth,
        use_ssl=True,
        verify_certs=True,
        connection_class=RequestsHttpConnection,
        timeout=300
    )
#print("ops client successful")
bedrock_client = boto3.client(
    'bedrock-runtime',
    aws_access_key_id="*********",
    aws_secret_access_key="**********",
    region_name="us-east-1",
)
embed_model_id = "amazon.titan-embed-text-v1"

#%%
def prepare_prompt_template(model_id, prompt, query, prompt_history=None):
    prompt_template = {"inputText": f"""{prompt}\n
                            {query}
                            """}
    return prompt_template
def query_bedrock_models(model, prompt):
    print(f'Bedrock prompt {prompt}')
    response = bedrock_client.invoke_model(
        body=json.dumps(prompt),
        modelId=model,
        accept='application/json',
        contentType='application/json'
    )
    print('EventStream')
    print(dir(response['body']))

    return response
#%%
def handler(event,context):
    response = bedrock_client.invoke_model(
                body=json.dumps({"inputText": event["query"]}),
                modelId=event['embed_model_id'],
                accept='application/json',
                contentType='application/json'
            )
    result = json.loads(response['body'].read())
    embedded_search = result.get('embedding')
    vector_query = {
                "size": 5,
                "query": {"knn": {"embedding": {"vector": embedded_search, "k": 2}}},
                "_source": False,
                "fields": ["text", "doc_type"]
            }
    response = ops_client.search(body=vector_query, index=event["index"])
    context = None
    for data in response["hits"]["hits"]:
        if context is None:
            context = data['fields']['text'][0]
        else:
            context = context + ' ' + data['fields']['text'][0]
    if context is not None:
        context = f""" Data points: {context}
                               Question: {event["query"]}"""
    prompt = """answer this question """

    prompt_template = prepare_prompt_template(event["prompt_model_id"], prompt, context)
    response = query_bedrock_models(event["prompt_model_id"], prompt_template)
    result = json.loads(response['body'].read())
    answer = result['results'][0]['outputText']

    return {
        "status": 200,
        "response": answer
    }
#%%

json_data = {
    "embed_model_id":"amazon.titan-embed-text-v1",
    "prompt_model_id": "amazon.titan-text-express-v1",
    "index": "bedrock_test",
    "query": "whos birthday is it ?"
}

response = handler(json_data, context="hello")
print(response)

# %%

This code works locally and gives me accurate responses, but it raises the error above when I execute it inside the Docker container.

This is my Dockerfile:

FROM public.ecr.aws/lambda/python:3.10

# Copy requirements.txt
COPY requirements.txt ${LAMBDA_TASK_ROOT}

# Install the specified packages
RUN pip3 install llama-index && \
    pip3 install opensearch-py && \
    pip3 install safetensors && \
    pip3 install sentence-transformers && \
    pip3 install requests-aws4auth && \
    pip3 install boto3

# Copy function code
COPY prompt_lambda.py ${LAMBDA_TASK_ROOT}

# Set the CMD to your handler (could also be done as a parameter override outside of the Dockerfile)
CMD [ "prompt_lambda.handler" ]

I tried using multiple boto3 versions, but none of them worked.
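A quick sanity check that can help narrow this down is comparing the boto3 version the container actually resolves against 1.28.57, which (to the best of my knowledge) is the first boto3 release whose service model includes 'bedrock-runtime'. The helper below is hypothetical, not part of the original code:

```python
# Check the version the container resolves with something like:
#   docker run --rm --entrypoint python my-image -c "import boto3; print(boto3.__version__)"

# Assumption worth verifying: bedrock-runtime first shipped in boto3 1.28.57.
MINIMUM = (1, 28, 57)

def supports_bedrock_runtime(version: str) -> bool:
    """True if a boto3 version string is new enough to know 'bedrock-runtime'."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= MINIMUM

print(supports_bedrock_runtime("1.26.90"))  # → False (predates Bedrock)
print(supports_bedrock_runtime("1.34.0"))   # → True
```

If the version inside the container comes back older than the one you run locally, that would explain why the same code works on your machine but not in the image.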


Solution

  • You possibly selected a Lambda runtime (e.g. Python 3.10 or 3.11) whose bundled boto3 predates the bedrock-runtime service. If possible, switch to Python 3.12 as the Lambda runtime, which ships a boto3 recent enough to know about 'bedrock-runtime'.
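If you need to stay on the 3.10 base image, a variant worth trying (a sketch, not verified against your exact image) is pinning a recent boto3 and installing dependencies into the task root with `pip install -t`. In Lambda container images, `${LAMBDA_TASK_ROOT}` (`/var/task`) precedes `/var/runtime` on `sys.path`, so a boto3 installed there should shadow the older copy bundled with the base image; the `>=1.28.57` pin assumes that is the first release with bedrock-runtime support:

```dockerfile
FROM public.ecr.aws/lambda/python:3.10

# Install a boto3 new enough to include the bedrock-runtime service model
# into the task root, so it shadows the older boto3 in /var/runtime.
RUN pip3 install "boto3>=1.28.57" opensearch-py requests-aws4auth -t ${LAMBDA_TASK_ROOT}

# Copy function code
COPY prompt_lambda.py ${LAMBDA_TASK_ROOT}

CMD [ "prompt_lambda.handler" ]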