Tags: ruby, large-language-model, langchainrb

Ruby langchainrb gem and custom configuration for the model setup


I am working on a prototype using the langchainrb gem. I am using the Assistant module to implement a basic RAG architecture.

Everything works, and now I would like to customize the model configuration.

In the documentation there is no clear way of setting up the model. In my case, I would like to use OpenAI and set:

  • temperature: 0.1
  • model: gpt-4o

In the README, there is a mention of using llm_options.
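My guess is that it would look something like this (only a guess at the shape; the docs do not confirm that request parameters such as temperature belong in llm_options):

llm =
  Langchain::LLM::OpenAI.new(
    api_key: ENV["OPENAI_API_KEY"],
    llm_options: {
      temperature: 0.1,
      model: "gpt-4o"
    }
  )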

If I go to the OpenAI module documentation, it says to check the ruby-openai gem's configuration options. But there is no mention of temperature there, for example. Also, the example in the Langchain::LLM::OpenAI class documentation shows totally different options:

# ruby-openai options:

CONFIG_KEYS = %i[
  api_type
  api_version
  access_token
  log_errors
  organization_id
  uri_base
  request_timeout
  extra_headers
].freeze
# Example from the Langchain::LLM::OpenAI class documentation:

{
  n: 1,
  temperature: 0.0,
  chat_completion_model_name: "gpt-3.5-turbo",
  embeddings_model_name: "text-embedding-3-small"
}.freeze
  • Langchain.rb version: 0.13.4

Solution

  • @engineersmnky is right in their comment.

    I had a conflict between llm_options and default_options; I thought they were the same options with different priorities. They are not: llm_options is passed through to the underlying ruby-openai client configuration (the CONFIG_KEYS above), while default_options holds langchainrb's per-request defaults such as the temperature and the model name.

    For the needs expressed in the question, I have to use default_options, as shown here:

    llm =
      Langchain::LLM::OpenAI.new(
        api_key: ENV["OPENAI_API_KEY"],
        default_options: {
          temperature: 0.0,
          chat_completion_model_name: "gpt-4o"
        }
      )
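
    For contrast, llm_options is where the client-level settings from ruby-openai's CONFIG_KEYS (request_timeout, uri_base, etc.) belong. A sketch combining both, assuming llm_options is forwarded to the ruby-openai client as @engineersmnky pointed out:

    llm =
      Langchain::LLM::OpenAI.new(
        api_key: ENV["OPENAI_API_KEY"],
        # Client-level configuration, forwarded to the ruby-openai client:
        llm_options: { request_timeout: 120 },
        # Per-request defaults used by langchainrb:
        default_options: {
          temperature: 0.0,
          chat_completion_model_name: "gpt-4o"
        }
      )

    The configured llm can then be passed to the Assistant via its llm: keyword, as in the original setup.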