Tags: node.js, llama-index

Change default LlamaIndex GPT base API to internal proxy GPT service


I'm using LlamaIndex.TS in my Node server and I'm trying to change the base URL to my Azure OpenAI proxy server, like this:

    process.env['OPENAI_API_BASE'] = 'http://openaiproxy.service.consul:8080/OpenAIProxy/handler';

The requests still seem to be routed to the default OpenAI endpoint.

Any thoughts? Thanks


Solution

  • I solved it by passing the proxy base URL through additionalSessionOptions on the OpenAIEmbedding and OpenAI classes:

    ...
      // Required imports come from the llamaindex (LlamaIndex.TS) package:
      // import { Document, OpenAI, OpenAIEmbedding, serviceContextFromDefaults, VectorStoreIndex } from "llamaindex";

      // Create a Document object with the essay text
      const document = new Document({text: essay});

      // additionalSessionOptions is forwarded to the underlying OpenAI client,
      // so baseURL and defaultHeaders apply to every request made through it
      const params = {
        timeout: 20000,
        maxRetries: 1,
        additionalSessionOptions: {
          baseURL: "http://openaiproxy.service.consul:8080/OpenAIProxy/handler",
          defaultHeaders: {
            'X-App-Caller': 'GeniusBotService',
          }
        }
      };

      // Use the proxied clients for both embeddings and the LLM
      const serviceContext = serviceContextFromDefaults({
        embedModel: new OpenAIEmbedding(params),
        llm: new OpenAI({model: "gpt-4", ...params})
      });

      // Split the text and create embeddings; store them in a VectorStoreIndex
      const index = await VectorStoreIndex.fromDocuments([document], {serviceContext});
    ...
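
    For completeness, here's a minimal sketch (not part of the original answer) of querying the index built above; with this setup, both the embedding and completion requests should go through the proxy baseURL, since additionalSessionOptions is passed on to the underlying OpenAI Node client, where baseURL and defaultHeaders are standard client options. The query string is illustrative, and the exact query() signature varies between llamaindex versions (older releases accept a plain string, newer ones an options object).

      // Sketch: run a query against the proxy-backed index
      const queryEngine = index.asQueryEngine();
      const response = await queryEngine.query("Summarize the essay in one sentence.");
      console.log(response.toString());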