Tags: azure, openai-api, azure-openai, llama, llama-index

VectorStoreIndex API Key while using AzureOpenAI Service


I'm trying to use VectorStoreIndex from llama_index to solve the RAG problem for a chatbot, in the following way:

import openai
from llama_index import VectorStoreIndex

# docs is a list of llama_index Document objects loaded beforehand
index = VectorStoreIndex.from_documents(docs)
index.storage_context.persist()

The documentation recommends setting openai.api_key = os.getenv('OPENAI_API_KEY') to connect the model to OpenAI. But what should I do if I use AzureOpenAI (meaning I have an api_key, azure_endpoint and api_version, and no OpenAI API key)? Setting openai.api_key = my_azure_api_key also does not work, because the model apparently refers to https://platform.openai.com/account/api-keys and not to Azure AI Services.

Sorry if the question seems basic; I was unable to find the information on the web.

Thanks!


Solution

  • I have solved it. All you need is to pass a service_context parameter to VectorStoreIndex.from_documents(). In that service context you can specify AzureOpenAI and AzureOpenAIEmbedding, and use your api_key, azure_endpoint and api_version there.

    index = VectorStoreIndex.from_documents(documents, service_context=service_context)
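
    For completeness, a minimal sketch of how that service_context can be built with AzureOpenAI and AzureOpenAIEmbedding. The deployment names, environment variable names and api_version below are placeholders, and the imports follow the older (pre-0.10) llama_index layout used in the question; adjust them to your own Azure deployments and library version.

    import os

    from llama_index import VectorStoreIndex, ServiceContext, SimpleDirectoryReader
    from llama_index.llms import AzureOpenAI
    from llama_index.embeddings import AzureOpenAIEmbedding

    # LLM pointed at an Azure OpenAI chat deployment (names are placeholders)
    llm = AzureOpenAI(
        model="gpt-35-turbo",
        deployment_name="my-gpt35-deployment",
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_version="2023-07-01-preview",
    )

    # Embedding model pointed at an Azure OpenAI embedding deployment
    embed_model = AzureOpenAIEmbedding(
        model="text-embedding-ada-002",
        deployment_name="my-embedding-deployment",
        api_key=os.getenv("AZURE_OPENAI_API_KEY"),
        azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
        api_version="2023-07-01-preview",
    )

    # Bundle both into a service context and build the index with it
    service_context = ServiceContext.from_defaults(llm=llm, embed_model=embed_model)

    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents, service_context=service_context)
    index.storage_context.persist()

    With this setup, no OPENAI_API_KEY is needed; both the LLM and the embeddings are resolved against your Azure endpoint instead of platform.openai.com.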