I am trying to make a chatbot using langchain-openai; I have never done this before. I created a brand-new API key that has never been used, and copied the following code from the official langchain-openai docs:
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI

OPENAI_API_KEY = 'sk-proj-...'  # note: this variable is not actually used below

template = """Question: {question}
Answer: Let's think step by step."""

prompt = PromptTemplate.from_template(template)
llm = OpenAI(openai_api_key="sk-proj-...")  # key passed directly to the wrapper

llm_chain = prompt | llm

question = "What NFL team won the Super Bowl in the year Justin Beiber was born?"
llm_chain.invoke(question)
It is giving this very long error:
Traceback (most recent call last):
File "C:\Users\Acer\OneDrive\Documents\VS_Code\Python\ai\Langchain-Openai.py", line 25, in <module>
llm_chain.invoke(question)
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\runnables\base.py", line 2399, in invoke
input = step.invoke(
^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\language_models\llms.py", line 276, in invoke
self.generate_prompt(
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\language_models\llms.py", line 633, in generate_prompt
return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\language_models\llms.py", line 803, in generate
output = self._generate_helper(
^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\language_models\llms.py", line 670, in _generate_helper
raise e
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_core\language_models\llms.py", line 657, in _generate_helper
self._generate(
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\langchain_openai\llms\base.py", line 350, in _generate
response = self.client.create(prompt=_prompts, **params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_utils\_utils.py", line 277, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\resources\completions.py", line 528, in create
return self._post(
^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1240, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 921, in request
return self._request(
^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1005, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1053, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1005, in _request
return self._retry_request(
^^^^^^^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1053, in _retry_request
return self._request(
^^^^^^^^^^^^^^
File "C:\Users\Acer\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.12_qbz5n2kfra8p0\LocalCache\local-packages\Python312\site-packages\openai\_base_client.py", line 1020, in _request
raise self._make_status_error_from_response(err.response) from None
openai.RateLimitError: Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}}
I even checked the OpenAI API key usage page, and it shows no usage for this key.
All of the code is taken straight from the langchain-openai docs.
Am I doing something wrong?
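For reference, here is a minimal sketch of the same chain with the key read from the OPENAI_API_KEY environment variable instead of being hardcoded, and with the quota error caught explicitly. This is just how I would expect it to behave if billing is the only issue; the chain itself is unchanged:

import os
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI
from openai import RateLimitError

# Assumes OPENAI_API_KEY is exported in the environment;
# the langchain-openai wrapper picks it up automatically.
assert "OPENAI_API_KEY" in os.environ, "export OPENAI_API_KEY first"

prompt = PromptTemplate.from_template(
    "Question: {question}\nAnswer: Let's think step by step."
)
llm = OpenAI()
chain = prompt | llm

try:
    print(chain.invoke({"question": "What NFL team won the Super Bowl in the year Justin Beiber was born?"}))
except RateLimitError as e:
    # The 429 insufficient_quota error from the traceback above is raised here
    # when the account has no usable credit or billing attached.
    print("Quota error:", e)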
EDIT: As @trazoM pointed out, the code works just fine; apparently I just needed to make a new key and link a credit card. Thanks @trazoM! Writing this so the question appears as answered.
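In case it helps anyone else hitting the same insufficient_quota error: a quick sanity check is to call the bare OpenAI client directly, independent of LangChain. This is only a sketch, assuming the gpt-3.5-turbo-instruct completions model (which is what the langchain-openai OpenAI wrapper targets by default):

from openai import OpenAI as OpenAIClient

client = OpenAIClient()  # uses OPENAI_API_KEY from the environment

# If billing is set up, this returns a completion; with no quota it raises
# openai.RateLimitError with code 'insufficient_quota', exactly like the traceback above.
resp = client.completions.create(
    model="gpt-3.5-turbo-instruct",
    prompt="Say hi",
    max_tokens=5,
)
print(resp.choices[0].text)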