
ollama._types.ResponseError: Internal Server Error (ref: 3f380fc3-f1b8-4b9e-aa2c-5c4486b28c08) (status code: 500) #36602

@khteh

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Package (Required)

  • langchain
  • langchain-openai
  • langchain-anthropic
  • langchain-classic
  • langchain-core
  • langchain-model-profiles
  • langchain-tests
  • langchain-text-splitters
  • langchain-chroma
  • langchain-deepseek
  • langchain-exa
  • langchain-fireworks
  • langchain-groq
  • langchain-huggingface
  • langchain-mistralai
  • langchain-nomic
  • langchain-ollama
  • langchain-openrouter
  • langchain-perplexity
  • langchain-qdrant
  • langchain-xai
  • Other / not sure / general

Related Issues / PRs

No response

Reproduction Steps / Example Code (Python)

https://github.com/khteh/rag-agent/blob/master/src/rag_agent/EmailRAG.py
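The linked file is the full application. A stripped-down sketch of the failing call, inferred from the stack trace, might look like the following; the model name comes from the description, the prompt is the one shown in the traceback, and everything else is a placeholder. Running it requires a local Ollama server with the model available, and whether this minimal call alone triggers the same 500 is an assumption, not verified.

```python
import asyncio

async def main() -> None:
    # The real call goes through a LangGraph agent with middleware, but
    # ChatOllama.ainvoke alone exercises the same streaming code path
    # (_agenerate -> _achat_stream_with_aggregation) shown in the traceback.
    from langchain_ollama import ChatOllama  # requires langchain-ollama

    llm = ChatOllama(model="gpt-oss:120b-cloud")
    result = await llm.ainvoke(
        "There's an immediate risk of electrical, water, or fire damage"
    )
    print(result.content)

if __name__ == "__main__":
    asyncio.run(main())
```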

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "/usr/src/Python/rag-agent/src/rag_agent/EmailRAG.py", line 329, in <module>
    asyncio.run(main())
    ~~~~~~~~~~~^^^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 195, in run
    return runner.run(main)
           ~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
  File "/usr/lib/python3.13/asyncio/base_events.py", line 725, in run_until_complete
    return future.result()
           ~~~~~~~~~~~~~^^
  File "/usr/src/Python/rag-agent/src/rag_agent/EmailRAG.py", line 322, in main
    result = await rag.Chat("There's an immediate risk of electrical, water, or fire damage", email_state)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/src/rag_agent/EmailRAG.py", line 290, in Chat
    async for step in self._agent.with_config(config).astream(
    ...<5 lines>...
        step["messages"][-1].pretty_print()
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/main.py", line 3128, in astream
    async for _ in runner.atick(
    ...<16 lines>...
            yield o
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_runner.py", line 304, in atick
    await arun_with_retry(
    ...<15 lines>...
    )
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/pregel/_retry.py", line 242, in arun_with_retry
    return await task.proc.ainvoke(task.input, config)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 705, in ainvoke
    input = await asyncio.create_task(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^
        step.ainvoke(input, config, **kwargs), context=context
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langgraph/_internal/_runnable.py", line 473, in ainvoke
    ret = await self.afunc(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 1361, in amodel_node
    result = await awrap_model_call_handler(request, _execute_model_async)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 386, in composed
    outer_result = await outer(request, inner_handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/middleware/todo.py", line 271, in awrap_model_call
    return await handler(request.override(system_message=new_system_message))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 376, in inner_handler
    inner_result = await inner(req, handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 386, in composed
    outer_result = await outer(request, inner_handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/deepagents/middleware/filesystem.py", line 1126, in awrap_model_call
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 376, in inner_handler
    inner_result = await inner(req, handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 386, in composed
    outer_result = await outer(request, inner_handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/deepagents/middleware/subagents.py", line 691, in awrap_model_call
    return await handler(request.override(system_message=new_system_message))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 376, in inner_handler
    inner_result = await inner(req, handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 386, in composed
    outer_result = await outer(request, inner_handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/deepagents/middleware/summarization.py", line 1024, in awrap_model_call
    return await handler(request.override(messages=truncated_messages))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 376, in inner_handler
    inner_result = await inner(req, handler)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_anthropic/middleware/prompt_caching.py", line 180, in awrap_model_call
    return await handler(request)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain/agents/factory.py", line 1330, in _execute_model_async
    output = await model_.ainvoke(messages)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/runnables/base.py", line 5708, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<3 lines>...
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 477, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    ...<8 lines>...
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1196, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
        prompt_messages, stop=stop, callbacks=callbacks, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1154, in agenerate
    raise exceptions[0]
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_core/language_models/chat_models.py", line 1423, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
        messages, stop=stop, run_manager=run_manager, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 1208, in _agenerate
    final_chunk = await self._achat_stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        messages, stop, run_manager, verbose=self.verbose, **kwargs
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    )
    ^
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 991, in _achat_stream_with_aggregation
    async for chunk in self._aiterate_over_stream(messages, stop, **kwargs):
    ...<9 lines>...
            )
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 1131, in _aiterate_over_stream
    async for stream_resp in self._acreate_chat_stream(messages, stop, **kwargs):
    ...<52 lines>...
            yield chunk
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/langchain_ollama/chat_models.py", line 937, in _acreate_chat_stream
    async for part in await self._async_client.chat(**chat_params):
        yield part
  File "/usr/src/Python/rag-agent/.venv/lib/python3.13/site-packages/ollama/_client.py", line 757, in inner
    raise ResponseError(e.response.text, e.response.status_code) from None
ollama._types.ResponseError: Internal Server Error (ref: 3f380fc3-f1b8-4b9e-aa2c-5c4486b28c08) (status code: 500)

Description

Trying to use a larger Ollama model, gpt-oss:120b-cloud, hits the 500 error above.
LangSmith traces:
https://smith.langchain.com/public/28d84063-6b03-4b66-b004-c0ce5dc5aeb3/r
https://smith.langchain.com/public/80e0e9c4-5577-4ed9-8569-7405482cdbd2/r
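The 500 comes back from the Ollama server mid-stream, so as a stopgap while the server-side error is investigated, a thin retry wrapper around the model call can paper over transient failures. This is a sketch, not part of the original report: the retryable exception type would be `ollama.ResponseError` in the reporter's setup (passed via `retryable`), and the backoff values are arbitrary.

```python
import asyncio

async def ainvoke_with_retry(call, *, retries=3, base_delay=1.0,
                             retryable=(Exception,)):
    """Retry an async zero-argument callable on transient errors.

    `retryable` would be (ollama.ResponseError,) against a real Ollama
    server; a plain Exception tuple keeps this sketch self-contained.
    """
    for attempt in range(retries):
        try:
            return await call()
        except retryable:
            if attempt == retries - 1:
                raise  # out of attempts: surface the original error
            # Exponential backoff: base_delay, 2*base_delay, 4*base_delay, ...
            await asyncio.sleep(base_delay * 2 ** attempt)

# Hypothetical usage with the reporter's model (names are assumptions):
#   result = await ainvoke_with_retry(lambda: llm.ainvoke(messages),
#                                     retryable=(ResponseError,))
```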

System Info

System Information
------------------
> OS:  Linux
> OS Version:  #20-Ubuntu SMP PREEMPT_DYNAMIC Fri Mar 13 20:07:29 UTC 2026
> Python Version:  3.13.7 (main, Mar  3 2026, 12:19:54) [GCC 15.2.0]

Package Information
-------------------
> langchain_core: 1.2.26
> langchain: 1.2.15
> langchain_community: 0.4.1
> langsmith: 0.7.25
> deepagents: 0.4.12
> langchain_anthropic: 1.4.0
> langchain_classic: 1.0.3
> langchain_google_genai: 4.2.1
> langchain_neo4j: 0.9.0
> langchain_ollama: 1.0.1
> langchain_openai: 1.1.12
> langchain_postgres: 0.0.17
> langchain_text_splitters: 1.1.1
> langgraph_api: 0.7.27
> langgraph_cli: 0.4.19
> langgraph_runtime_inmem: 0.24.1
> langgraph_sdk: 0.3.12

Optional packages not installed
-------------------------------
> deepagents-cli

Other Dependencies
------------------
> aiohttp: 3.13.5
> anthropic: 0.89.0
> asyncpg: 0.31.0
> blockbuster: 1.5.26
> click: 8.3.2
> cloudpickle: 3.1.2
> croniter: 6.2.2
> cryptography: 46.0.6
> dataclasses-json: 0.6.7
> filetype: 1.2.0
> google-genai: 1.70.0
> grpcio: 1.80.0
> grpcio-health-checking: 1.80.0
> grpcio-tools: 1.75.1
> httpx: 0.28.1
> httpx-sse: 0.4.3
> jsonpatch: 1.33
> jsonschema-rs: 0.29.1
> langgraph: 1.1.6
> langgraph-checkpoint: 4.0.1
> neo4j: 6.1.0
> neo4j-graphrag: 1.14.1
> numpy: 2.4.4
> ollama: 0.6.1
> openai: 2.30.0
> opentelemetry-api: 1.40.0
> opentelemetry-exporter-otlp-proto-http: 1.40.0
> opentelemetry-sdk: 1.40.0
> orjson: 3.11.8
> packaging: 26.0
> pgvector: 0.3.6
> protobuf: 6.33.6
> psycopg: 3.3.3
> psycopg-pool: 3.3.0
> pydantic: 2.12.5
> pydantic-settings: 2.13.1
> pyjwt: 2.12.1
> pytest: 9.0.2
> python-dotenv: 1.2.2
> pyyaml: 6.0.3
> PyYAML: 6.0.3
> requests: 2.33.1
> requests-toolbelt: 1.0.0
> sqlalchemy: 2.0.49
> SQLAlchemy: 2.0.49
> sse-starlette: 2.1.3
> starlette: 1.0.0
> structlog: 25.5.0
> tenacity: 9.1.4
> tiktoken: 0.12.0
> truststore: 0.10.4
> typing-extensions: 4.15.0
> uuid-utils: 0.14.1
> uvicorn: 0.43.0
> watchfiles: 1.1.1
> wcmatch: 10.1
> websockets: 16.0
> xxhash: 3.6.0
> zstandard: 0.25.0


Labels

bug (Related to a bug, vulnerability, unexpected error with an existing feature), core (`langchain-core` package issues & PRs), external, langchain (`langchain` package issues & PRs), ollama (`langchain-ollama` package issues & PRs)
