
feat(models): Add OCI Generative AI provider for Google Gemini on OCI#5285

Open
fede-kamel wants to merge 2 commits into google:main from fede-kamel:feat/oci-generative-ai

Conversation


@fede-kamel fede-kamel commented Apr 11, 2026

Closes #5069

What this PR does

Adds OCIGenAILlm — a first-class ADK model provider for Google Gemini models hosted on Oracle Cloud Infrastructure (OCI) Generative AI. Gemini 2.5 Flash, Gemini 2.5 Pro, and Gemini 2.5 Flash Lite are available as first-party models through OCI's inference endpoints — this is a native Google × OCI model partnership, not a third-party wrapper.

from google.adk.models.oci_genai_llm import OCIGenAILlm
from google.adk.agents import LlmAgent

agent = LlmAgent(
    model=OCIGenAILlm(
        model="google.gemini-2.5-flash",
        compartment_id="ocid1.compartment.oc1...",
    ),
    instruction="You are a helpful assistant.",
)

Why this belongs in adk-python, not a community package

1. This is a model provider primitive, not a tool or connector

The community repo and integrations catalog host tools and connectors — BigQuery, Chroma, MongoDB, ElevenLabs, Slack, etc. All 60+ entries in that catalog are tool integrations. There is zero precedent for a model provider (BaseLlm subclass) living in the community repo — the community repo itself only contains memory/ and sessions/ modules, no model providers.

Every existing model provider in ADK — Gemini, Gemma, Claude, LiteLlm, ApigeeLlm — lives in src/google/adk/models/ in this repo. This PR follows that exact pattern.

2. Model providers cannot be external without degrading the SDK

LLMRegistry, BaseLlm, and the model dispatch layer are core SDK primitives. A model provider sitting in a standalone package creates:

  • Version drift — core changes to BaseLlm, LlmRequest, LlmResponse, or LLMRegistry can silently break an external provider with no CI coverage
  • Broken type inference — external packages lose Pydantic model validation and IDE type narrowing that works seamlessly for in-tree providers
  • Second-class developer experience: pip install google-adk[oci] (an optional extra, like [extensions] for Claude) is the established pattern; requiring a separate package for one cloud provider while others are built in is an inconsistency that confuses users

3. OCI is an official Google model distribution channel

OCI Generative AI hosts Google's own Gemini models through a direct partnership. This is the same class of integration as Vertex AI or Apigee AI Gateway — a Google-authorized inference endpoint, not a community hobby project. AWS Bedrock and Azure are absent from the SDK because no one has contributed them yet, not because multi-cloud model providers belong in a tools catalog.

4. The implementation follows established precedent exactly

| Aspect | AnthropicLlm (Claude) | OCIGenAILlm (this PR) |
| --- | --- | --- |
| Subclasses BaseLlm | Yes | Yes |
| Registered in LLMRegistry | Yes | Yes |
| Optional dependency guard | try/except in __init__.py | Same pattern |
| Install extra | pip install google-adk[extensions] | pip install google-adk[oci] |
| Uses native SDK | anthropic package | oci package |
| No LangChain dependency | Correct | Correct |
| Streaming + non-streaming | Yes | Yes |

Design

  • Subclasses BaseLlm, registered in LLMRegistry for google.gemini-* and other OCI model patterns
  • Uses the OCI Python SDK directly — no LangChain dependency
  • Optional install: pip install google-adk[oci] (safe to import without oci installed)
  • Non-streaming: _call_oci() runs the synchronous OCI SDK call in asyncio.to_thread
  • Streaming: _call_oci_stream() collects OCI's OpenAI-compatible SSE events in a thread, then yields partial=True chunks followed by a partial=False final response with aggregated content and usage metadata
  • Auth: API_KEY (default), INSTANCE_PRINCIPAL, RESOURCE_PRINCIPAL
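The partial/final streaming contract described above can be sketched as follows. `LlmResponse` and the chunk source here are simplified stand-ins for the real ADK types and the OCI SSE collection, not the actual implementation:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class LlmResponse:
    # simplified stand-in for the ADK LlmResponse type
    text: str
    partial: bool


def fetch_sse_chunks() -> list[str]:
    # stand-in for collecting OCI's OpenAI-compatible SSE events in a thread
    return ["Hel", "lo ", "world"]


async def stream_responses():
    # blocking SSE collection happens off the event loop
    chunks = await asyncio.to_thread(fetch_sse_chunks)
    for chunk in chunks:
        yield LlmResponse(text=chunk, partial=True)  # partial=True chunks
    # final partial=False response carries the aggregated content
    yield LlmResponse(text="".join(chunks), partial=False)


async def main():
    return [r async for r in stream_responses()]


responses = asyncio.run(main())
```

The usage-metadata aggregation works the same way: per-chunk fields are accumulated and attached to the final non-partial response.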

Files changed

| File | Description |
| --- | --- |
| src/google/adk/models/oci_genai_llm.py | New OCIGenAILlm implementation |
| src/google/adk/models/__init__.py | Optional registration in LLMRegistry |
| pyproject.toml | New [oci] optional dependency extra |
| tests/unittests/models/test_oci_genai_llm.py | 37 unit tests (fully mocked) |
| tests/integration/models/test_oci_genai_llm.py | 10 integration tests (skipped without OCI_COMPARTMENT_ID) |
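The skip behaviour for the integration tests can be expressed with a guard like this (stdlib `unittest` shown for illustration; the actual suite layout and test runner are assumptions):

```python
import os
import unittest


# Skip the whole live suite unless a compartment OCID is configured;
# mirrors the "skipped without OCI_COMPARTMENT_ID" behaviour described above.
@unittest.skipUnless(
    os.environ.get("OCI_COMPARTMENT_ID"),
    "OCI_COMPARTMENT_ID not set; skipping live OCI integration tests",
)
class LiveOCIGenAITests(unittest.TestCase):
    def test_text_generation(self):
        # placeholder: a real test would call the live OCI endpoint here
        self.skipTest("illustrative placeholder")
```

This keeps CI green for contributors without an OCI account while still letting anyone with credentials run the live checks.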

Test results (rebased on current main)

  • 37/37 OCI unit tests passed — fully mocked, no OCI account needed
  • 641/641 total model unit tests passed — zero regressions across the entire tests/unittests/models/ suite
  • Integration tests verified against live OCI endpoint (google.gemini-2.5-flash, us-chicago-1): text generation, streaming, tool calls, system instructions, multi-turn, concurrent calls
  • import google.adk succeeds without oci installed (optional dependency guard)

Request to maintainers

Issue #5069 is open, labeled models + needs review, and a maintainer (@sanketpatil06) confirmed it is "under review by our team." I am asking that this PR be reviewed on its technical merits and that the routing question (core vs. community) be evaluated against the evidence above — specifically the zero precedent for external model providers, the SDK degradation risks, and the partnership-level nature of OCI Gemini hosting.

Happy to address any code review feedback directly.

@adk-bot adk-bot added the models [Component] Issues related to model support label Apr 11, 2026
@fede-kamel fede-kamel force-pushed the feat/oci-generative-ai branch from de72195 to 3d70b80 Compare April 11, 2026 19:44
Adds first-class support for Google Gemini models hosted on Oracle Cloud
Infrastructure (OCI) Generative AI service — a native Google × OCI model
partnership that makes Gemini available directly through OCI's inference
endpoints.

Key design points:
- Subclasses BaseLlm following the anthropic_llm.py pattern
- Uses the OCI Python SDK directly (no LangChain dependency)
- Optional dependency: pip install google-adk[oci]
- Supports API_KEY, INSTANCE_PRINCIPAL, and RESOURCE_PRINCIPAL auth
- Both non-streaming (_call_oci) and streaming (_call_oci_stream) paths
  share setup code via _build_chat_details(); streaming collects OCI's
  OpenAI-compatible SSE events in a thread pool (asyncio.to_thread) and
  yields partial then final LlmResponse
- Registers google.gemini-* (and other OCI-hosted) model patterns in
  LLMRegistry via optional try/except in models/__init__.py
- 37 unit tests (fully mocked, no OCI account needed)
- 10 integration tests (skipped when OCI_COMPARTMENT_ID is unset)

Supported models: google.gemini-*, google.gemma-*, meta.llama-*,
  mistralai.*, xai.grok-*, nvidia.*
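The non-streaming path described above can be sketched like so; `chat_sync` is a placeholder for the blocking OCI SDK chat call, not the real client API:

```python
import asyncio


def chat_sync(prompt: str) -> str:
    # placeholder for the synchronous oci.generative_ai_inference chat call
    return f"echo: {prompt}"


async def call_oci(prompt: str) -> str:
    # run the blocking SDK call on a worker thread so the event loop stays free
    return await asyncio.to_thread(chat_sync, prompt)


result = asyncio.run(call_oci("hello"))
```

Running the synchronous client through `asyncio.to_thread` is what lets a single agent process issue concurrent OCI calls without blocking.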
@fede-kamel fede-kamel force-pushed the feat/oci-generative-ai branch from 3d70b80 to 8f40e5c Compare April 11, 2026 19:49
