Merged
2 changes: 1 addition & 1 deletion docs/docs/advanced/mellea-core-internals.md
@@ -277,5 +277,5 @@ for a worked example.

**See also:**
[Generative Programming](../concepts/generative-programming) |
[Working with Data](../guide/working-with-data) |
[Working with Data](../how-to/working-with-data) |
[Async and Streaming](../how-to/use-async-and-streaming)
6 changes: 3 additions & 3 deletions docs/docs/concepts/architecture-vs-agents.md
@@ -133,13 +133,13 @@ orchestrator:

- **ReACT loops** — implement thought/action/observation cycles using `m.chat()`
with [`ChatContext`](../guide/glossary#chatcontext) and the `@tool` decorator. See
[Tools and Agents](../guide/tools-and-agents).
[Tools and Agents](../how-to/tools-and-agents).
- **Guarded agents** — combine the ReACT pattern with `requirements` and
`GuardianCheck` to enforce safety constraints at every step. See
[Security and Taint Tracking](../advanced/security-and-taint-tracking).
- **Structured outputs** — use `@generative` with Pydantic models or `Literal` types
to enforce type-safe structured output at each step. See
[Generative Functions](../guide/generative-functions).
[Generative Functions](../how-to/generative-functions).

For programs where the control flow is fixed in Python — a pipeline, an extraction
workflow, a classification step — there is no need for a separate orchestrator.
@@ -211,5 +211,5 @@ tools or steps.

---

**See also:** [Tools and Agents](../guide/tools-and-agents) |
**See also:** [Tools and Agents](../how-to/tools-and-agents) |
[Security and Taint Tracking](../advanced/security-and-taint-tracking)
4 changes: 2 additions & 2 deletions docs/docs/concepts/generative-functions.md
@@ -9,7 +9,7 @@ In a generative program, a function can have the same interface but delegate its
to an LLM. Mellea calls these [**generative functions**](../guide/glossary#generative-function) and provides the [`@generative`](../guide/glossary#generative) decorator
to define them.

> **Looking to use this in code?** See [Generative Functions](../guide/generative-functions) for practical examples and API details.
> **Looking to use this in code?** See [Generative Functions](../how-to/generative-functions) for practical examples and API details.

## The @generative decorator

@@ -167,4 +167,4 @@ Use `@generative` when you want a named, typed, reusable LLM-backed operation. U

**See also:** [Instruct, Validate, Repair](./instruct-validate-repair) |
[The Requirements System](./requirements-system) |
[Tools and Agents](../guide/tools-and-agents)
[Tools and Agents](../how-to/tools-and-agents)
2 changes: 1 addition & 1 deletion docs/docs/concepts/generative-programming.md
@@ -143,4 +143,4 @@ These principles recur throughout Mellea:
**See also:**
[Instruct, Validate, Repair](./instruct-validate-repair) |
[Inference-Time Scaling](../advanced/inference-time-scaling) |
[Working with Data](../guide/working-with-data)
[Working with Data](../how-to/working-with-data)
2 changes: 1 addition & 1 deletion docs/docs/concepts/instruct-validate-repair.md
@@ -188,7 +188,7 @@ print(str(answer))
`grounding_context` maps string keys to document text. The keys are arbitrary
labels — they appear in the prompt as `[key] = value` so the model can reference
them by name, but there is no required naming convention (e.g. `"doc0"`, `"annual_report"`,
`"spec"` all work). See [Working with Data](../guide/working-with-data) for richer
`"spec"` all work). See [Working with Data](../how-to/working-with-data) for richer
document handling using MObjects and `RichDocument`.

## ICL examples
2 changes: 1 addition & 1 deletion docs/docs/concepts/plugins.mdx
@@ -989,4 +989,4 @@ from mellea.plugins import (

---

**See also:** [Glossary](../guide/glossary), [Tools and Agents](../guide/tools-and-agents), [Security and Taint Tracking](../advanced/security-and-taint-tracking), [OpenTelemetry Tracing](../evaluation-and-observability/opentelemetry-tracing)
**See also:** [Glossary](../guide/glossary), [Tools and Agents](../how-to/tools-and-agents), [Security and Taint Tracking](../advanced/security-and-taint-tracking), [OpenTelemetry Tracing](../evaluation-and-observability/opentelemetry-tracing)
46 changes: 35 additions & 11 deletions docs/docs/docs.json
@@ -54,12 +54,12 @@
{
"group": "How-To",
"pages": [
"guide/generative-functions",
"guide/tools-and-agents",
"guide/working-with-data",
"guide/backends-and-configuration",
"guide/act-and-aact",
"guide/m-decompose",
"how-to/generative-functions",
"how-to/tools-and-agents",
"how-to/working-with-data",
"how-to/backends-and-configuration",
"how-to/act-and-aact",
"how-to/m-decompose",
"how-to/use-async-and-streaming",
"how-to/use-context-and-sessions",
"how-to/enforce-structured-output",
@@ -443,7 +443,7 @@
},
{
"source": "/overview/architecture",
"destination": "/guide/backends-and-configuration"
"destination": "/how-to/backends-and-configuration"
},
{
"source": "/core-concept/instruct-validate-repair",
@@ -455,15 +455,15 @@
},
{
"source": "/core-concept/generative-slots",
"destination": "/guide/generative-functions"
"destination": "/how-to/generative-functions"
},
{
"source": "/core-concept/mobjects",
"destination": "/concepts/mobjects-and-mify"
},
{
"source": "/core-concept/agents",
"destination": "/guide/tools-and-agents"
"destination": "/how-to/tools-and-agents"
},
{
"source": "/core-concept/context-management",
@@ -491,7 +491,7 @@
},
{
"source": "/core-concept/adapters",
"destination": "/guide/tools-and-agents"
"destination": "/how-to/tools-and-agents"
},
{
"source": "/core-concept/contribution-guide",
@@ -547,7 +547,7 @@
},
{
"source": "/dev/tool-calling",
"destination": "/guide/tools-and-agents"
"destination": "/how-to/tools-and-agents"
},
{
"source": "/api/cli/m",
@@ -616,6 +616,30 @@
{
"source": "/api/cli/fix/genstub_fixer",
"destination": "/reference/cli"
},
{
"source": "/guide/generative-functions",
"destination": "/how-to/generative-functions"
},
{
"source": "/guide/tools-and-agents",
"destination": "/how-to/tools-and-agents"
},
{
"source": "/guide/working-with-data",
"destination": "/how-to/working-with-data"
},
{
"source": "/guide/backends-and-configuration",
"destination": "/how-to/backends-and-configuration"
},
{
"source": "/guide/act-and-aact",
"destination": "/how-to/act-and-aact"
},
{
"source": "/guide/m-decompose",
"destination": "/how-to/m-decompose"
}
]
}
2 changes: 1 addition & 1 deletion docs/docs/examples/data-extraction-pipeline.md
@@ -35,7 +35,7 @@ m = start_session()

`start_session()` with no arguments creates a session backed by the default
local model. The `model_ids` import is available if you want to switch to a
specific model later (see [Backends and configuration](../guide/backends-and-configuration)).
specific model later (see [Backends and configuration](../how-to/backends-and-configuration)).

### Declaring the extraction function

2 changes: 1 addition & 1 deletion docs/docs/getting-started/quickstart.md
@@ -94,7 +94,7 @@ chat.

**Backends** — Pluggable model providers. Ollama is the default. OpenAI, [LiteLLM](../guide/glossary#litellm--litellmbackend),
HuggingFace, and WatsonX are also supported. See
[Backends and Configuration](../guide/backends-and-configuration).
[Backends and Configuration](../how-to/backends-and-configuration).

## Troubleshooting

2 changes: 1 addition & 1 deletion docs/docs/guide/CONTRIBUTING.md
@@ -74,7 +74,7 @@ before the first H2, so readers can orient themselves quickly:
- On the **explanation** page:

```markdown
> **Looking to use this in code?** See [Generative Functions](../guide/generative-functions) for practical examples and API details.
> **Looking to use this in code?** See [Generative Functions](../how-to/generative-functions) for practical examples and API details.
```

- On the **how-to** page:
File renamed without changes.
2 changes: 1 addition & 1 deletion docs/docs/how-to/enforce-structured-output.md
@@ -260,5 +260,5 @@ Both patterns support the full IVR loop, requirements, sampling strategies, and

---

**See also:** [Generative Functions](../guide/generative-functions) |
**See also:** [Generative Functions](../how-to/generative-functions) |
[The Requirements System](../concepts/requirements-system)
@@ -123,4 +123,4 @@ For tasks that fit comfortably in a single prompt, use `m.instruct()` directly.

---

**See also:** [Tools and Agents](../guide/tools-and-agents) | [Refactor Prompts with CLI](../how-to/refactor-prompts-with-cli) | [CLI Reference](../reference/cli)
**See also:** [Tools and Agents](../how-to/tools-and-agents) | [Refactor Prompts with CLI](../how-to/refactor-prompts-with-cli) | [CLI Reference](../reference/cli)
2 changes: 1 addition & 1 deletion docs/docs/how-to/use-async-and-streaming.md
@@ -166,4 +166,4 @@ For parallel generation, use `SimpleContext`.

---

**See also:** [Tutorial 02: Streaming and Async](../tutorials/02-streaming-and-async) | [act() and aact()](../guide/act-and-aact)
**See also:** [Tutorial 02: Streaming and Async](../tutorials/02-streaming-and-async) | [act() and aact()](../how-to/act-and-aact)
2 changes: 1 addition & 1 deletion docs/docs/how-to/use-images-and-vision.md
@@ -121,5 +121,5 @@ To remove images from context on the next turn, pass `images=[]` explicitly.

---

**See also:** [Working with Data](../guide/working-with-data) |
**See also:** [Working with Data](../how-to/working-with-data) |
[The Instruction Model](../concepts/instruct-validate-repair)
@@ -250,4 +250,4 @@ tools during `transform()` calls automatically.

---

**See also:** [act() and aact()](../guide/act-and-aact) | [MObjects and mify](../concepts/mobjects-and-mify)
**See also:** [act() and aact()](../how-to/act-and-aact) | [MObjects and mify](../concepts/mobjects-and-mify)
4 changes: 2 additions & 2 deletions docs/docs/index.mdx
@@ -72,7 +72,7 @@ Mellea's design rests on three interlocking ideas.
<Card title="Inference-time scaling" icon="chart-line" href="/advanced/inference-time-scaling">
Best-of-n, SOFAI, majority voting — swap strategies in one line.
</Card>
<Card title="Tools and agents" icon="wrench" href="/guide/tools-and-agents">
<Card title="Tools and agents" icon="wrench" href="/how-to/tools-and-agents">
`@tool`, `MelleaTool`, and the ReACT loop for goal-driven multi-step agents.
</Card>
</CardGroup>
@@ -105,7 +105,7 @@ Mellea is backend-agnostic. The same program runs on any inference engine.
</Card>
</CardGroup>

See [Backends and configuration](/guide/backends-and-configuration) for the full list of supported backends and how to configure them.
See [Backends and configuration](/how-to/backends-and-configuration) for the full list of supported backends and how to configure them.

## How-to guides

2 changes: 1 addition & 1 deletion docs/docs/integrations/bedrock.md
@@ -145,4 +145,4 @@ so vision-capable models (e.g., `amazon.nova-pro-v1:0`) support image input via

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration)
**See also:** [Backends and Configuration](../how-to/backends-and-configuration)
2 changes: 1 addition & 1 deletion docs/docs/integrations/huggingface.md
@@ -115,5 +115,5 @@ curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration) |
**See also:** [Backends and Configuration](../how-to/backends-and-configuration) |
[LoRA and aLoRA Adapters](../advanced/lora-and-alora-adapters)
4 changes: 2 additions & 2 deletions docs/docs/integrations/langchain.md
@@ -105,11 +105,11 @@ OpenAI chat format — LlamaIndex, Haystack, Semantic Kernel — works with the
| -------- | --- |
| Your tool exists as a LangChain `BaseTool` | `MelleaTool.from_langchain(tool)` |
| Your tool exists as a smolagents `Tool` | [`MelleaTool.from_smolagents(tool)`](./smolagents) |
| You have a plain Python function to expose | [`@tool` decorator](../guide/tools-and-agents) |
| You have a plain Python function to expose | [`@tool` decorator](../how-to/tools-and-agents) |
| You have LangChain message history to continue | `convert_to_openai_messages` → `ChatContext` |
| You want Mellea as an OpenAI endpoint for another framework | [`m serve`](./m-serve) |

---

**See also:** [Tools and Agents](../guide/tools-and-agents) |
**See also:** [Tools and Agents](../how-to/tools-and-agents) |
[Context and Sessions](../concepts/context-and-sessions)
2 changes: 1 addition & 1 deletion docs/docs/integrations/m-serve.md
@@ -112,5 +112,5 @@ print(response.choices[0].message.content)
---

**See also:** [Context and Sessions](../concepts/context-and-sessions) |
[Backends and Configuration](../guide/backends-and-configuration) |
[Backends and Configuration](../how-to/backends-and-configuration) |
[CLI Reference](../reference/cli)
2 changes: 1 addition & 1 deletion docs/docs/integrations/mcp.md
@@ -115,4 +115,4 @@ uv run your_server.py

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration)
**See also:** [Backends and Configuration](../how-to/backends-and-configuration)
4 changes: 2 additions & 2 deletions docs/docs/integrations/ollama.md
@@ -205,7 +205,7 @@ m = MelleaSession(
)
```

See [Backends and Configuration](../guide/backends-and-configuration) for the
See [Backends and Configuration](../how-to/backends-and-configuration) for the
full `OpenAIBackend` reference.

## Troubleshooting
@@ -240,5 +240,5 @@ pip install mellea

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration) |
**See also:** [Backends and Configuration](../how-to/backends-and-configuration) |
[Getting Started](../getting-started/installation)
4 changes: 2 additions & 2 deletions docs/docs/integrations/openai.md
@@ -234,7 +234,7 @@ m = MelleaSession(
> **Note (review needed):** Direct Anthropic API compatibility via this path has not
> been verified against the current Mellea version. If you are using Anthropic,
> LiteLLM provides a verified integration — see
> [Backends and Configuration](../guide/backends-and-configuration).
> [Backends and Configuration](../how-to/backends-and-configuration).

## Troubleshooting

@@ -256,5 +256,5 @@ local servers, list available models from the server's API or UI.

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration) |
**See also:** [Backends and Configuration](../how-to/backends-and-configuration) |
[Enforce Structured Output](../how-to/enforce-structured-output)
4 changes: 2 additions & 2 deletions docs/docs/integrations/smolagents.md
@@ -55,11 +55,11 @@ description and parameter types are preserved exactly.
| -------- | --- |
| Your tool exists as a LangChain `BaseTool` | [`MelleaTool.from_langchain(tool)`](./langchain) |
| Your tool exists as a smolagents `Tool` | `MelleaTool.from_smolagents(tool)` |
| You have a plain Python function to expose | [`@tool` decorator](../guide/tools-and-agents) |
| You have a plain Python function to expose | [`@tool` decorator](../how-to/tools-and-agents) |
| You have LangChain message history to continue | [`convert_to_openai_messages` → `ChatContext`](./langchain.md#seeding-a-session-with-langchain-message-history) |
| You want Mellea as an OpenAI endpoint for another framework | [`m serve`](./m-serve) |

---

**See also:** [Tools and Agents](../guide/tools-and-agents) |
**See also:** [Tools and Agents](../how-to/tools-and-agents) |
[Context and Sessions](../concepts/context-and-sessions)
2 changes: 1 addition & 1 deletion docs/docs/integrations/vertex-ai.md
@@ -244,4 +244,4 @@ pip install google-cloud-aiplatform
---

**See also:** [OpenAI and OpenAI-Compatible APIs](../integrations/openai) |
[Backends and Configuration](../guide/backends-and-configuration)
[Backends and Configuration](../how-to/backends-and-configuration)
6 changes: 3 additions & 3 deletions docs/docs/integrations/watsonx.md
@@ -5,8 +5,8 @@ description: "Run Mellea with IBM WatsonX AI using the WatsonxAIBackend."
---

> **Deprecated:** The native WatsonX backend is deprecated since v0.4. Use the
> [LiteLLM](../guide/backends-and-configuration#litellm-backend) or
> [OpenAI](../guide/backends-and-configuration#openai-backend) backend with a
> [LiteLLM](../how-to/backends-and-configuration#litellm-backend) or
> [OpenAI](../how-to/backends-and-configuration#openai-backend) backend with a
> WatsonX-compatible endpoint instead.

The WatsonX backend connects to IBM's managed AI platform. It requires an API key,
@@ -111,4 +111,4 @@ pip install 'mellea[watsonx]'

---

**See also:** [Backends and Configuration](../guide/backends-and-configuration)
**See also:** [Backends and Configuration](../how-to/backends-and-configuration)
2 changes: 1 addition & 1 deletion docs/docs/tutorials/03-using-generative-stubs.md
@@ -258,4 +258,4 @@ context-steerable generative functions:

---

**See also:** [Generative Functions](../guide/generative-functions) | [The Requirements System](../concepts/requirements-system) | [Write Custom Verifiers](../how-to/write-custom-verifiers)
**See also:** [Generative Functions](../how-to/generative-functions) | [The Requirements System](../concepts/requirements-system) | [Write Custom Verifiers](../how-to/write-custom-verifiers)
2 changes: 1 addition & 1 deletion docs/docs/tutorials/04-making-agents-reliable.md
@@ -489,4 +489,4 @@ agentic system:

---

**See also:** [The Requirements System](../concepts/requirements-system) | [Security and Taint Tracking](../advanced/security-and-taint-tracking) | [Tools and Agents](../guide/tools-and-agents)
**See also:** [The Requirements System](../concepts/requirements-system) | [Security and Taint Tracking](../advanced/security-and-taint-tracking) | [Tools and Agents](../how-to/tools-and-agents)
2 changes: 1 addition & 1 deletion docs/docs/tutorials/05-mifying-legacy-code.md
@@ -194,5 +194,5 @@ modifying their class definitions:
| `mify(obj)` | You don't own the class |

**See also:** [MObjects and mify](../concepts/mobjects-and-mify) |
[Working with Data](../guide/working-with-data) |
[Working with Data](../how-to/working-with-data) |
[Tutorial 03: Using Generative Stubs](./03-using-generative-stubs)