diff --git a/.claude/skills/gitnexus/gitnexus-cli/SKILL.md b/.claude/skills/gitnexus/gitnexus-cli/SKILL.md new file mode 100644 index 0000000..c9e0af3 --- /dev/null +++ b/.claude/skills/gitnexus/gitnexus-cli/SKILL.md @@ -0,0 +1,82 @@ +--- +name: gitnexus-cli +description: "Use when the user needs to run GitNexus CLI commands like analyze/index a repo, check status, clean the index, generate a wiki, or list indexed repos. Examples: \"Index this repo\", \"Reanalyze the codebase\", \"Generate a wiki\"" +--- + +# GitNexus CLI Commands + +All commands work via `npx` — no global install required. + +## Commands + +### analyze — Build or refresh the index + +```bash +npx gitnexus analyze +``` + +Run from the project root. This parses all source files, builds the knowledge graph, writes it to `.gitnexus/`, and generates CLAUDE.md / AGENTS.md context files. + +| Flag | Effect | +| -------------- | ---------------------------------------------------------------- | +| `--force` | Force full re-index even if up to date | +| `--embeddings` | Enable embedding generation for semantic search (off by default) | + +**When to run:** First time in a project, after major code changes, or when `gitnexus://repo/{name}/context` reports the index is stale. In Claude Code, a PostToolUse hook runs `analyze` automatically after `git commit` and `git merge`, preserving embeddings if previously generated. + +### status — Check index freshness + +```bash +npx gitnexus status +``` + +Shows whether the current repo has a GitNexus index, when it was last updated, and symbol/relationship counts. Use this to check if re-indexing is needed. + +### clean — Delete the index + +```bash +npx gitnexus clean +``` + +Deletes the `.gitnexus/` directory and unregisters the repo from the global registry. Use before re-indexing if the index is corrupt or after removing GitNexus from a project. 
+
+| Flag      | Effect                                            |
+| --------- | ------------------------------------------------- |
+| `--force` | Skip confirmation prompt                          |
+| `--all`   | Clean all indexed repos, not just the current one |
+
+### wiki — Generate documentation from the graph
+
+```bash
+npx gitnexus wiki
+```
+
+Generates repository documentation from the knowledge graph using an LLM. Requires an API key (saved to `~/.gitnexus/config.json` on first use).
+
+| Flag                | Effect                                    |
+| ------------------- | ----------------------------------------- |
+| `--force`           | Force full regeneration                   |
+| `--model <model>`   | LLM model (default: minimax/minimax-m2.5) |
+| `--base-url <url>`  | LLM API base URL                          |
+| `--api-key <key>`   | LLM API key                               |
+| `--concurrency <n>` | Parallel LLM calls (default: 3)           |
+| `--gist`            | Publish wiki as a public GitHub Gist      |
+
+### list — Show all indexed repos
+
+```bash
+npx gitnexus list
+```
+
+Lists all repositories registered in `~/.gitnexus/registry.json`. The MCP `list_repos` tool provides the same information.
+
+## After Indexing
+
+1. **Read `gitnexus://repo/{name}/context`** to verify the index loaded
+2. Use the other GitNexus skills (`exploring`, `debugging`, `impact-analysis`, `refactoring`) for your task
+
+## Troubleshooting
+
+- **"Not inside a git repository"**: Run from a directory inside a git repo
+- **Index is stale after re-analyzing**: Restart Claude Code to reload the MCP server
+- **Embeddings slow**: Omit `--embeddings` (it's off by default) or set `OPENAI_API_KEY` for faster API-based embeddings
diff --git a/.claude/skills/gitnexus/gitnexus-debugging/SKILL.md b/.claude/skills/gitnexus/gitnexus-debugging/SKILL.md
new file mode 100644
index 0000000..9510b97
--- /dev/null
+++ b/.claude/skills/gitnexus/gitnexus-debugging/SKILL.md
@@ -0,0 +1,89 @@
+---
+name: gitnexus-debugging
+description: "Use when the user is debugging a bug, tracing an error, or asking why something fails. Examples: \"Why is X failing?\", \"Where does this error come from?\", \"Trace this bug\""
+---
+
+# Debugging with GitNexus
+
+## When to Use
+
+- "Why is this function failing?"
+- "Trace where this error comes from"
+- "Who calls this method?"
+- "This endpoint returns 500"
+- Investigating bugs, errors, or unexpected behavior
+
+## Workflow
+
+```
+1. gitnexus_query({query: "<concept>"}) → Find related execution flows
+2. gitnexus_context({name: "<symbol>"}) → See callers/callees/processes
+3. READ gitnexus://repo/{name}/process/{name} → Trace execution flow
+4. gitnexus_cypher({query: "MATCH path..."}) → Custom traces if needed
+```
+
+> If "Index is stale" → run `npx gitnexus analyze` in terminal.
+
+## Checklist
+
+```
+- [ ] Understand the symptom (error message, unexpected behavior)
+- [ ] gitnexus_query for error text or related code
+- [ ] Identify the suspect function from returned processes
+- [ ] gitnexus_context to see callers and callees
+- [ ] Trace execution flow via process resource if applicable
+- [ ] gitnexus_cypher for custom call chain traces if needed
+- [ ] Read source files to confirm root cause
+```
+
+## Debugging Patterns
+
+| Symptom              | GitNexus Approach                                          |
+| -------------------- | ---------------------------------------------------------- |
+| Error message        | `gitnexus_query` for error text → `context` on throw sites |
+| Wrong return value   | `context` on the function → trace callees for data flow    |
+| Intermittent failure | `context` → look for external calls, async deps            |
+| Performance issue    | `context` → find symbols with many callers (hot paths)     |
+| Recent regression    | `detect_changes` to see what your changes affect           |
+
+## Tools
+
+**gitnexus_query** — find code related to the error:
+
+```
+gitnexus_query({query: "payment validation error"})
+→ Processes: CheckoutFlow, ErrorHandling
+→ Symbols: validatePayment, handlePaymentError, PaymentException
+```
+
+**gitnexus_context** — full context for a suspect:
+
+```
+gitnexus_context({name: "validatePayment"})
+→ Incoming calls: processCheckout, webhookHandler
+→ Outgoing calls: verifyCard, fetchRates (external API!)
+→ Processes: CheckoutFlow (step 3/7)
+```
+
+**gitnexus_cypher** — custom call chain traces:
+
+```cypher
+MATCH path = (a)-[:CodeRelation {type: 'CALLS'}*1..2]->(b:Function {name: "validatePayment"})
+RETURN [n IN nodes(path) | n.name] AS chain
+```
+
+## Example: "Payment endpoint returns 500 intermittently"
+
+```
+1. gitnexus_query({query: "payment error handling"})
+   → Processes: CheckoutFlow, ErrorHandling
+   → Symbols: validatePayment, handlePaymentError
+
+2. gitnexus_context({name: "validatePayment"})
+   → Outgoing calls: verifyCard, fetchRates (external API!)
+
+3. READ gitnexus://repo/my-app/process/CheckoutFlow
+   → Step 3: validatePayment → calls fetchRates (external)
+
+4. Root cause: fetchRates calls external API without proper timeout
+```
diff --git a/.claude/skills/gitnexus/gitnexus-exploring/SKILL.md b/.claude/skills/gitnexus/gitnexus-exploring/SKILL.md
new file mode 100644
index 0000000..927a4e4
--- /dev/null
+++ b/.claude/skills/gitnexus/gitnexus-exploring/SKILL.md
@@ -0,0 +1,78 @@
+---
+name: gitnexus-exploring
+description: "Use when the user asks how code works, wants to understand architecture, trace execution flows, or explore unfamiliar parts of the codebase. Examples: \"How does X work?\", \"What calls this function?\", \"Show me the auth flow\""
+---
+
+# Exploring Codebases with GitNexus
+
+## When to Use
+
+- "How does authentication work?"
+- "What's the project structure?"
+- "Show me the main components"
+- "Where is the database logic?"
+- Understanding code you haven't seen before
+
+## Workflow
+
+```
+1. READ gitnexus://repos → Discover indexed repos
+2. READ gitnexus://repo/{name}/context → Codebase overview, check staleness
+3. gitnexus_query({query: "<concept>"}) → Find related execution flows
+4. gitnexus_context({name: "<symbol>"}) → Deep dive on specific symbol
+5.
READ gitnexus://repo/{name}/process/{name} → Trace full execution flow +``` + +> If step 2 says "Index is stale" → run `npx gitnexus analyze` in terminal. + +## Checklist + +``` +- [ ] READ gitnexus://repo/{name}/context +- [ ] gitnexus_query for the concept you want to understand +- [ ] Review returned processes (execution flows) +- [ ] gitnexus_context on key symbols for callers/callees +- [ ] READ process resource for full execution traces +- [ ] Read source files for implementation details +``` + +## Resources + +| Resource | What you get | +| --------------------------------------- | ------------------------------------------------------- | +| `gitnexus://repo/{name}/context` | Stats, staleness warning (~150 tokens) | +| `gitnexus://repo/{name}/clusters` | All functional areas with cohesion scores (~300 tokens) | +| `gitnexus://repo/{name}/cluster/{name}` | Area members with file paths (~500 tokens) | +| `gitnexus://repo/{name}/process/{name}` | Step-by-step execution trace (~200 tokens) | + +## Tools + +**gitnexus_query** — find execution flows related to a concept: + +``` +gitnexus_query({query: "payment processing"}) +→ Processes: CheckoutFlow, RefundFlow, WebhookHandler +→ Symbols grouped by flow with file locations +``` + +**gitnexus_context** — 360-degree view of a symbol: + +``` +gitnexus_context({name: "validateUser"}) +→ Incoming calls: loginHandler, apiMiddleware +→ Outgoing calls: checkToken, getUserById +→ Processes: LoginFlow (step 2/5), TokenRefresh (step 1/3) +``` + +## Example: "How does payment processing work?" + +``` +1. READ gitnexus://repo/my-app/context → 918 symbols, 45 processes +2. gitnexus_query({query: "payment processing"}) + → CheckoutFlow: processPayment → validateCard → chargeStripe + → RefundFlow: initiateRefund → calculateRefund → processRefund +3. gitnexus_context({name: "processPayment"}) + → Incoming: checkoutHandler, webhookHandler + → Outgoing: validateCard, chargeStripe, saveTransaction +4. 
Read src/payments/processor.ts for implementation details +``` diff --git a/.claude/skills/gitnexus/gitnexus-guide/SKILL.md b/.claude/skills/gitnexus/gitnexus-guide/SKILL.md new file mode 100644 index 0000000..937ac73 --- /dev/null +++ b/.claude/skills/gitnexus/gitnexus-guide/SKILL.md @@ -0,0 +1,64 @@ +--- +name: gitnexus-guide +description: "Use when the user asks about GitNexus itself — available tools, how to query the knowledge graph, MCP resources, graph schema, or workflow reference. Examples: \"What GitNexus tools are available?\", \"How do I use GitNexus?\"" +--- + +# GitNexus Guide + +Quick reference for all GitNexus MCP tools, resources, and the knowledge graph schema. + +## Always Start Here + +For any task involving code understanding, debugging, impact analysis, or refactoring: + +1. **Read `gitnexus://repo/{name}/context`** — codebase overview + check index freshness +2. **Match your task to a skill below** and **read that skill file** +3. **Follow the skill's workflow and checklist** + +> If step 1 warns the index is stale, run `npx gitnexus analyze` in the terminal first. + +## Skills + +| Task | Skill to read | +| -------------------------------------------- | ------------------- | +| Understand architecture / "How does X work?" | `gitnexus-exploring` | +| Blast radius / "What breaks if I change X?" | `gitnexus-impact-analysis` | +| Trace bugs / "Why is X failing?" 
| `gitnexus-debugging` | +| Rename / extract / split / refactor | `gitnexus-refactoring` | +| Tools, resources, schema reference | `gitnexus-guide` (this file) | +| Index, status, clean, wiki CLI commands | `gitnexus-cli` | + +## Tools Reference + +| Tool | What it gives you | +| ---------------- | ------------------------------------------------------------------------ | +| `query` | Process-grouped code intelligence — execution flows related to a concept | +| `context` | 360-degree symbol view — categorized refs, processes it participates in | +| `impact` | Symbol blast radius — what breaks at depth 1/2/3 with confidence | +| `detect_changes` | Git-diff impact — what do your current changes affect | +| `rename` | Multi-file coordinated rename with confidence-tagged edits | +| `cypher` | Raw graph queries (read `gitnexus://repo/{name}/schema` first) | +| `list_repos` | Discover indexed repos | + +## Resources Reference + +Lightweight reads (~100-500 tokens) for navigation: + +| Resource | Content | +| ---------------------------------------------- | ----------------------------------------- | +| `gitnexus://repo/{name}/context` | Stats, staleness check | +| `gitnexus://repo/{name}/clusters` | All functional areas with cohesion scores | +| `gitnexus://repo/{name}/cluster/{clusterName}` | Area members | +| `gitnexus://repo/{name}/processes` | All execution flows | +| `gitnexus://repo/{name}/process/{processName}` | Step-by-step trace | +| `gitnexus://repo/{name}/schema` | Graph schema for Cypher | + +## Graph Schema + +**Nodes:** File, Function, Class, Interface, Method, Community, Process +**Edges (via CodeRelation.type):** CALLS, IMPORTS, EXTENDS, IMPLEMENTS, DEFINES, MEMBER_OF, STEP_IN_PROCESS + +```cypher +MATCH (caller)-[:CodeRelation {type: 'CALLS'}]->(f:Function {name: "myFunc"}) +RETURN caller.name, caller.filePath +``` diff --git a/.claude/skills/gitnexus/gitnexus-impact-analysis/SKILL.md b/.claude/skills/gitnexus/gitnexus-impact-analysis/SKILL.md new file 
mode 100644 index 0000000..e19af28 --- /dev/null +++ b/.claude/skills/gitnexus/gitnexus-impact-analysis/SKILL.md @@ -0,0 +1,97 @@ +--- +name: gitnexus-impact-analysis +description: "Use when the user wants to know what will break if they change something, or needs safety analysis before editing code. Examples: \"Is it safe to change X?\", \"What depends on this?\", \"What will break?\"" +--- + +# Impact Analysis with GitNexus + +## When to Use + +- "Is it safe to change this function?" +- "What will break if I modify X?" +- "Show me the blast radius" +- "Who uses this code?" +- Before making non-trivial code changes +- Before committing — to understand what your changes affect + +## Workflow + +``` +1. gitnexus_impact({target: "X", direction: "upstream"}) → What depends on this +2. READ gitnexus://repo/{name}/processes → Check affected execution flows +3. gitnexus_detect_changes() → Map current git changes to affected flows +4. Assess risk and report to user +``` + +> If "Index is stale" → run `npx gitnexus analyze` in terminal. 
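+
+As a sketch of what the upstream query walks: assuming the graph schema documented in `gitnexus-guide` (`Function` nodes, edges via `CodeRelation.type`), the d=1 and d=2 dependents correspond roughly to this Cypher, runnable via `gitnexus_cypher` — `validateUser` is an illustrative target, not a symbol guaranteed to exist:
+
+```cypher
+// Walk incoming CALLS edges up to two hops from the target symbol
+MATCH path = (dep)-[:CodeRelation {type: 'CALLS'}*1..2]->(t:Function {name: "validateUser"})
+RETURN dep.name, dep.filePath, length(path) AS depth
+ORDER BY depth
+```
+
+Prefer `gitnexus_impact` over raw Cypher for routine checks — it layers confidence scoring and process mapping on top of this traversal.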
+ +## Checklist + +``` +- [ ] gitnexus_impact({target, direction: "upstream"}) to find dependents +- [ ] Review d=1 items first (these WILL BREAK) +- [ ] Check high-confidence (>0.8) dependencies +- [ ] READ processes to check affected execution flows +- [ ] gitnexus_detect_changes() for pre-commit check +- [ ] Assess risk level and report to user +``` + +## Understanding Output + +| Depth | Risk Level | Meaning | +| ----- | ---------------- | ------------------------ | +| d=1 | **WILL BREAK** | Direct callers/importers | +| d=2 | LIKELY AFFECTED | Indirect dependencies | +| d=3 | MAY NEED TESTING | Transitive effects | + +## Risk Assessment + +| Affected | Risk | +| ------------------------------ | -------- | +| <5 symbols, few processes | LOW | +| 5-15 symbols, 2-5 processes | MEDIUM | +| >15 symbols or many processes | HIGH | +| Critical path (auth, payments) | CRITICAL | + +## Tools + +**gitnexus_impact** — the primary tool for symbol blast radius: + +``` +gitnexus_impact({ + target: "validateUser", + direction: "upstream", + minConfidence: 0.8, + maxDepth: 3 +}) + +→ d=1 (WILL BREAK): + - loginHandler (src/auth/login.ts:42) [CALLS, 100%] + - apiMiddleware (src/api/middleware.ts:15) [CALLS, 100%] + +→ d=2 (LIKELY AFFECTED): + - authRouter (src/routes/auth.ts:22) [CALLS, 95%] +``` + +**gitnexus_detect_changes** — git-diff based impact analysis: + +``` +gitnexus_detect_changes({scope: "staged"}) + +→ Changed: 5 symbols in 3 files +→ Affected: LoginFlow, TokenRefresh, APIMiddlewarePipeline +→ Risk: MEDIUM +``` + +## Example: "What breaks if I change validateUser?" + +``` +1. gitnexus_impact({target: "validateUser", direction: "upstream"}) + → d=1: loginHandler, apiMiddleware (WILL BREAK) + → d=2: authRouter, sessionManager (LIKELY AFFECTED) + +2. READ gitnexus://repo/my-app/processes + → LoginFlow and TokenRefresh touch validateUser + +3. 
Risk: 2 direct callers, 2 processes = MEDIUM +``` diff --git a/.claude/skills/gitnexus/gitnexus-refactoring/SKILL.md b/.claude/skills/gitnexus/gitnexus-refactoring/SKILL.md new file mode 100644 index 0000000..f48cc01 --- /dev/null +++ b/.claude/skills/gitnexus/gitnexus-refactoring/SKILL.md @@ -0,0 +1,121 @@ +--- +name: gitnexus-refactoring +description: "Use when the user wants to rename, extract, split, move, or restructure code safely. Examples: \"Rename this function\", \"Extract this into a module\", \"Refactor this class\", \"Move this to a separate file\"" +--- + +# Refactoring with GitNexus + +## When to Use + +- "Rename this function safely" +- "Extract this into a module" +- "Split this service" +- "Move this to a new file" +- Any task involving renaming, extracting, splitting, or restructuring code + +## Workflow + +``` +1. gitnexus_impact({target: "X", direction: "upstream"}) → Map all dependents +2. gitnexus_query({query: "X"}) → Find execution flows involving X +3. gitnexus_context({name: "X"}) → See all incoming/outgoing refs +4. Plan update order: interfaces → implementations → callers → tests +``` + +> If "Index is stale" → run `npx gitnexus analyze` in terminal. 
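+
+Step 3's reference sweep can be broadened beyond calls when planning an extraction. Assuming the `CodeRelation.type` values listed in `gitnexus-guide`, a rough `gitnexus_cypher` sketch that also catches importers (again with `validateUser` as a stand-in target):
+
+```cypher
+// Find every symbol that calls or imports the extraction target
+MATCH (dep)-[r:CodeRelation]->(t {name: "validateUser"})
+WHERE r.type IN ['CALLS', 'IMPORTS']
+RETURN dep.name, dep.filePath, r.type
+```
+
+String and dynamic references won't appear in the graph — use `gitnexus_query` for those, as noted in the risk rules below.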
+ +## Checklists + +### Rename Symbol + +``` +- [ ] gitnexus_rename({symbol_name: "oldName", new_name: "newName", dry_run: true}) — preview all edits +- [ ] Review graph edits (high confidence) and ast_search edits (review carefully) +- [ ] If satisfied: gitnexus_rename({..., dry_run: false}) — apply edits +- [ ] gitnexus_detect_changes() — verify only expected files changed +- [ ] Run tests for affected processes +``` + +### Extract Module + +``` +- [ ] gitnexus_context({name: target}) — see all incoming/outgoing refs +- [ ] gitnexus_impact({target, direction: "upstream"}) — find all external callers +- [ ] Define new module interface +- [ ] Extract code, update imports +- [ ] gitnexus_detect_changes() — verify affected scope +- [ ] Run tests for affected processes +``` + +### Split Function/Service + +``` +- [ ] gitnexus_context({name: target}) — understand all callees +- [ ] Group callees by responsibility +- [ ] gitnexus_impact({target, direction: "upstream"}) — map callers to update +- [ ] Create new functions/services +- [ ] Update callers +- [ ] gitnexus_detect_changes() — verify affected scope +- [ ] Run tests for affected processes +``` + +## Tools + +**gitnexus_rename** — automated multi-file rename: + +``` +gitnexus_rename({symbol_name: "validateUser", new_name: "authenticateUser", dry_run: true}) +→ 12 edits across 8 files +→ 10 graph edits (high confidence), 2 ast_search edits (review) +→ Changes: [{file_path, edits: [{line, old_text, new_text, confidence}]}] +``` + +**gitnexus_impact** — map all dependents first: + +``` +gitnexus_impact({target: "validateUser", direction: "upstream"}) +→ d=1: loginHandler, apiMiddleware, testUtils +→ Affected Processes: LoginFlow, TokenRefresh +``` + +**gitnexus_detect_changes** — verify your changes after refactoring: + +``` +gitnexus_detect_changes({scope: "all"}) +→ Changed: 8 files, 12 symbols +→ Affected processes: LoginFlow, TokenRefresh +→ Risk: MEDIUM +``` + +**gitnexus_cypher** — custom reference queries: + 
+```cypher +MATCH (caller)-[:CodeRelation {type: 'CALLS'}]->(f:Function {name: "validateUser"}) +RETURN caller.name, caller.filePath ORDER BY caller.filePath +``` + +## Risk Rules + +| Risk Factor | Mitigation | +| ------------------- | ----------------------------------------- | +| Many callers (>5) | Use gitnexus_rename for automated updates | +| Cross-area refs | Use detect_changes after to verify scope | +| String/dynamic refs | gitnexus_query to find them | +| External/public API | Version and deprecate properly | + +## Example: Rename `validateUser` to `authenticateUser` + +``` +1. gitnexus_rename({symbol_name: "validateUser", new_name: "authenticateUser", dry_run: true}) + → 12 edits: 10 graph (safe), 2 ast_search (review) + → Files: validator.ts, login.ts, middleware.ts, config.json... + +2. Review ast_search edits (config.json: dynamic reference!) + +3. gitnexus_rename({symbol_name: "validateUser", new_name: "authenticateUser", dry_run: false}) + → Applied 12 edits across 8 files + +4. 
gitnexus_detect_changes({scope: "all"}) + → Affected: LoginFlow, TokenRefresh + → Risk: MEDIUM — run tests for these flows +``` diff --git a/.dockerignore b/.dockerignore new file mode 100644 index 0000000..2570269 --- /dev/null +++ b/.dockerignore @@ -0,0 +1,60 @@ +# Version control +.git +.gitignore +.gitattributes + +# Python +__pycache__ +*.pyc +*.pyo +.venv +.mypy_cache +.pytest_cache +.ruff_cache +*.egg-info + +# IDE +.vscode +.idea + +# Environment +.env +.env.* +!.env.example + +# Docker (no need to send these into the build context) +src/infra/Dockerfile +src/infra/compose.dev.yaml +src/infra/README.md +src/infra/.env +src/infra/.env.* + +# AI / tooling config +.claude +CLAUDE.md + +# CI +.github + +# Docs +docs + +# Build artifacts and data +dist +build +data +reports +coverage.xml +htmlcov +.coverage +.tox + +# Tests and demo (not needed at runtime) +test +src/demo + +# Project config (not needed at runtime) +sonar-project.properties +.pylintrc +.importlinter +tox.ini diff --git a/.gitattributes b/.gitattributes new file mode 100644 index 0000000..bfec021 --- /dev/null +++ b/.gitattributes @@ -0,0 +1,4 @@ +# Enforce Unix line endings +*.sh text eol=lf +Dockerfile text eol=lf +*.yaml text eol=lf diff --git a/.github/workflows/code-quality.yaml b/.github/workflows/ci.yaml similarity index 57% rename from .github/workflows/code-quality.yaml rename to .github/workflows/ci.yaml index 4369e8e..105b9ce 100644 --- a/.github/workflows/code-quality.yaml +++ b/.github/workflows/ci.yaml @@ -1,20 +1,18 @@ -# Quality Check workflow for Entity Resolution Engine (ERE) -# ========================================================= +# CI workflow for Entity Resolution Engine (ERE) +# =============================================== # Runs on push to develop and on PRs targeting develop. # -# Steps: -# 1. Install (Python, Poetry, project dependencies) -# 2. Lint, Test & Verify (tox: unit tests + architecture + clean-code checks) -# 3. 
SonarCloud analysis (coverage, quality gate) +# Jobs: +# 1. quality — Install, lint, test & verify (tox), SonarCloud +# 2. trigger-staging-deploy — on push to develop only, triggers the +# Deploy ERSys Staging workflow on enity-resolution-ops via the +# GitHub workflow_dispatch API # -# Optional repository secrets: -# - SONAR_TOKEN: SonarCloud authentication token (step skipped when absent) -# -# If the private ers-spec dependency fails to resolve with the default -# GITHUB_TOKEN, add a PAT as GH_TOKEN_PRIVATE_REPOS and uncomment the -# fallback section below. +# Required secrets: +# - SONAR_TOKEN: SonarCloud authentication token +# - CI_GH_TOKEN: org-level PAT for cross-repo workflow dispatch -name: Quality Check +name: CI on: push: @@ -29,8 +27,6 @@ jobs: quality: name: Lint, Test & Verify runs-on: ubuntu-latest - env: - SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} services: redis: @@ -56,7 +52,7 @@ jobs: # ------------------------------------------------------------------ - name: Read Python version from pyproject.toml id: python-version - run: echo "version=$(grep -m1 'python = ' pyproject.toml | grep -oP '\d+\.\d+' | head -1)" >> $GITHUB_OUTPUT + run: echo "version=$(grep -m1 'python = ' src/pyproject.toml | grep -oP '\d+\.\d+' | head -1)" >> $GITHUB_OUTPUT - name: Set up Python uses: actions/setup-python@v6 @@ -81,7 +77,7 @@ jobs: path: | ~/.cache/pypoetry .tox - key: poetry-${{ runner.os }}-${{ hashFiles('poetry.lock', 'tox.ini') }} + key: poetry-${{ runner.os }}-${{ hashFiles('src/poetry.lock', 'tox.ini') }} restore-keys: | poetry-${{ runner.os }}- @@ -89,21 +85,53 @@ jobs: # Install # ------------------------------------------------------------------ - name: Install dependencies - run: poetry install --with dev + run: cd src && poetry install --with dev # ------------------------------------------------------------------ # Lint, Test & Verify (tox) # ------------------------------------------------------------------ + - name: Prepare environment file + run: cp 
src/infra/.env.example src/infra/.env + - name: Run quality checks (unit tests + architecture + clean-code) - run: | - rm -f infra/.env.local - poetry run tox -e py312,architecture,clean-code + run: cd src && poetry run tox -e py312,architecture,clean-code + env: + REDIS_HOST: localhost + REDIS_PORT: 6379 + REDIS_PASSWORD: "" # ------------------------------------------------------------------ # SonarCloud # ------------------------------------------------------------------ + - name: Check for SonarCloud Token + id: sonar_check + run: | + if [ -z "${{ secrets.SONAR_TOKEN }}" ]; then + echo "has_token=false" >> $GITHUB_OUTPUT + else + echo "has_token=true" >> $GITHUB_OUTPUT + fi + - name: SonarCloud scan - if: always() && env.SONAR_TOKEN != '' + if: always() && steps.sonar_check.outputs.has_token == 'true' uses: SonarSource/sonarqube-scan-action@v6 env: SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} + + trigger-staging-deploy: + name: Trigger staging deploy + needs: quality + if: github.event_name == 'push' && github.repository_owner == 'meaningfy-ws' + runs-on: ubuntu-latest + env: + OPS_REPO: meaningfy-ws/enity-resolution-ops + DEPLOY_WORKFLOW: deploy-staging.yml + DEPLOY_REF: develop + steps: + - name: Trigger deploy workflow on ops repo + run: | + curl -sf -X POST \ + -H "Authorization: token ${{ secrets.CI_GH_TOKEN }}" \ + -H "Accept: application/vnd.github.v3+json" \ + "https://api.github.com/repos/${OPS_REPO}/actions/workflows/${DEPLOY_WORKFLOW}/dispatches" \ + -d '{"ref":"${{ env.DEPLOY_REF }}","inputs":{"repo":"${{ github.repository }}","sha":"${{ github.sha }}"}}' diff --git a/.gitignore b/.gitignore index a75bb70..6379a8e 100644 --- a/.gitignore +++ b/.gitignore @@ -136,6 +136,7 @@ celerybeat.pid # Environments .env +infra/.env .envrc .venv env/ @@ -215,4 +216,4 @@ poetry.toml .vscode .import_linter_cache .pycharm_plugin - +.idea diff --git a/.pylintrc b/.pylintrc index af64832..fbfd3f9 100644 --- a/.pylintrc +++ b/.pylintrc @@ -37,7 +37,7 @@ score=yes [BASIC] # 
Good names for short variables
 good-names=i,j,k,v,e,ex,f,fp,fd,x,y,z,id,pk,db,df,dt,ts,tz,io,ok,_,__,Run,log,url,uri,api,sql,xml,json,csv,ttl,rdf,ns,ctx,cfg,tmp,value
-bad-names=foo,bar,baz,toto,tutu,tata,temp,tmp2,tmp3,data,info,obj,item,thing,stuff,do_stuff,handle,process,manager,helper,util,utils,utility,common,misc,base,abstract,generic,value,result,output,input,flag,flag1,flag2,aux,auxiliary
+bad-names=foo,bar,baz,toto,tutu,tata,temp,tmp2,tmp3,data,info,obj,item,thing,stuff,do_stuff,handle,process,manager,helper,util,utility,common,misc,base,abstract,generic,value,result,output,input,flag,flag1,flag2,aux,auxiliary
 
 # Naming patterns for code elements
 name-group=
diff --git a/AGENTS.md b/AGENTS.md
new file mode 100644
index 0000000..7b9d592
--- /dev/null
+++ b/AGENTS.md
@@ -0,0 +1,219 @@
+# ERE — Agent Operating Instructions
+
+This file governs how AI agents operate in this repository.
+It complements `CLAUDE.md` (which governs Claude Code specifically) and `.claude/CLAUDE.md` (project instructions).
+
+---
+
+## Commits and PRs
+
+- **Never auto-commit** unless the user explicitly asks.
+- **Never force-push** to `main` or `develop`.
+- **Never add co-author lines**, tool names, or agent names to commit messages.
+- Commit format: `type(scope): concise description` — e.g. `feat(adapters): add splink resolver factory`.
+- Stage only files you modified: `git add <file>`, never `git add -A` blindly.
+- Before committing, run `make lint` and `make test-unit` to verify nothing is broken.
+- PRs target `develop` (not `main`) unless told otherwise.
+- When creating a PR, include a short summary and a test-plan checklist.
+
+---
+
+## Working Methodology
+
+### Before touching code
+
+1. Read `WORKING.md` — it points to the active task file.
+2. Read the referenced `docs/tasks/yyyy-mm-dd-*.md` fully.
+3. Understand the current branch state: `git log --oneline -10`.
+
+### Running the stack for integration tests
+
+Integration tests require Redis to be running.
Start it first: + +```bash +make infra-up # starts Redis + RedisInsight via Docker Compose +make test-integration # then run integration tests +make infra-down # tear down when done +``` + +Unit tests do **not** require any infrastructure: + +```bash +make test-unit # fast, self-contained, uses your venv +``` + +### Typical development loop + +```bash +make install # first time or after pyproject.toml changes +make test-unit # red → green → refactor +make lint # quick style check +make check-architecture # verify import-linter contracts +make all-quality-checks # before opening a PR +``` + +--- + +## Tooling Reference + +| Target | What it does | +|--------|-------------| +| `make install` | Install deps via Poetry | +| `make test-unit` | pytest unit suite + coverage report | +| `make test-integration` | integration tests (Redis must be up) | +| `make test-coverage` | HTML coverage report → `htmlcov/index.html` | +| `make lint` | pylint (fast, your venv) | +| `make format` | Ruff formatter | +| `make lint-fix` | Ruff auto-fix | +| `make check-clean-code` | pylint + radon + xenon (tox isolated) | +| `make check-architecture` | import-linter contracts (tox isolated) | +| `make all-quality-checks` | lint + clean-code + architecture | +| `make ci` | full tox pipeline (py312 + architecture + clean-code) | +| `make infra-up` | Start Redis stack (Docker Compose) | +| `make infra-down` | Stop Redis stack | +| `make infra-watch` | Live-reload mode (syncs `src/` and `src/config/`) | + +--- + +## Architecture Rules (enforced by import-linter) + +Dependency direction must never be violated: + +``` +entrypoints → services → models + ↘ + adapters → models +``` + +- `models/` — no I/O, no framework imports, no side effects. +- `adapters/` — infrastructure only; never calls `services/`. +- `services/` — orchestrates domain and adapters; never imports from `entrypoints/`. +- `entrypoints/` — parses input, calls services, formats output; no business logic. + +Violations block CI. 
Check with `make check-architecture` before opening a PR. + +--- + +## Memory Conventions + +Save to memory only what is non-obvious and persists across conversations: + +- Architectural decisions that aren't evident from the code (e.g. resolver factory registry pattern, DuckDB threading model). +- Design constraints explained by the user that aren't in comments or docs. +- User preferences about how to collaborate (e.g. "never suggest walrus operators", "prefer explicit factory injection"). + +Do **not** save to memory: +- Current task state (use the task file in `docs/tasks/`). +- Git history or recent changes (readable via `git log`). +- File paths or code structure (readable from the repo). + +--- + +## Gotchas + +- **`logging.basicConfig` is a no-op** when handlers already exist (conftest sets them up via `dictConfig`). Mock it with `patch("logging.basicConfig")` in logging tests. +- **DuckDB in tests**: use in-memory mode (`:memory:`) or a temp file via `tmp_path`; never a fixed path that leaks between tests. +- **Integration tests are marked** with `@pytest.mark.integration` — `make test-unit` skips them automatically. +- **`infra/.env`** is required for `make infra-*` targets. Copy from `infra/.env.example` on first use. +- **Config files** live in `src/config/` (moved from repo root in the 2026-04 restructure). Do not confuse with `infra/config/`. +- **erspec models** are LinkML-generated with snake_case fields (e.g. `legal_name`, not `legalName`). Do not edit generated files — update the schema and regenerate. +- **`ERE_LOG_LEVEL`** is the canonical env var for log level in this service (not `LOG_LEVEL`). + +--- + + +# GitNexus — Code Intelligence + +This project is indexed by GitNexus as **entity-resolution-engine-basic** (528 symbols, 1372 relationships, 36 execution flows). Use the GitNexus MCP tools to understand code, assess impact, and navigate safely. + +> If any GitNexus tool warns the index is stale, run `npx gitnexus analyze` in terminal first. 
+
+## Always Do
+
+- **MUST run impact analysis before editing any symbol.** Before modifying a function, class, or method, run `gitnexus_impact({target: "symbolName", direction: "upstream"})` and report the blast radius (direct callers, affected processes, risk level) to the user.
+- **MUST run `gitnexus_detect_changes()` before committing** to verify your changes only affect expected symbols and execution flows.
+- **MUST warn the user** if impact analysis returns HIGH or CRITICAL risk before proceeding with edits.
+- When exploring unfamiliar code, use `gitnexus_query({query: "concept"})` to find execution flows instead of grepping. It returns process-grouped results ranked by relevance.
+- When you need full context on a specific symbol — callers, callees, which execution flows it participates in — use `gitnexus_context({name: "symbolName"})`.
+
+## When Debugging
+
+1. `gitnexus_query({query: "<error message>"})` — find execution flows related to the issue
+2. `gitnexus_context({name: "<suspect symbol>"})` — see all callers, callees, and process participation
+3. `READ gitnexus://repo/entity-resolution-engine-basic/process/{processName}` — trace the full execution flow step by step
+4. For regressions: `gitnexus_detect_changes({scope: "compare", base_ref: "main"})` — see what your branch changed
+
+## When Refactoring
+
+- **Renaming**: MUST use `gitnexus_rename({symbol_name: "old", new_name: "new", dry_run: true})` first. Review the preview — graph edits are safe, text_search edits need manual review. Then run with `dry_run: false`.
+- **Extracting/Splitting**: MUST run `gitnexus_context({name: "target"})` to see all incoming/outgoing refs, then `gitnexus_impact({target: "target", direction: "upstream"})` to find all external callers before moving code.
+- After any refactor: run `gitnexus_detect_changes({scope: "all"})` to verify only expected files changed.
+
+## Never Do
+
+- NEVER edit a function, class, or method without first running `gitnexus_impact` on it.
+- NEVER ignore HIGH or CRITICAL risk warnings from impact analysis. +- NEVER rename symbols with find-and-replace — use `gitnexus_rename` which understands the call graph. +- NEVER commit changes without running `gitnexus_detect_changes()` to check affected scope. + +## Tools Quick Reference + +| Tool | When to use | Command | +|------|-------------|---------| +| `query` | Find code by concept | `gitnexus_query({query: "auth validation"})` | +| `context` | 360-degree view of one symbol | `gitnexus_context({name: "validateUser"})` | +| `impact` | Blast radius before editing | `gitnexus_impact({target: "X", direction: "upstream"})` | +| `detect_changes` | Pre-commit scope check | `gitnexus_detect_changes({scope: "staged"})` | +| `rename` | Safe multi-file rename | `gitnexus_rename({symbol_name: "old", new_name: "new", dry_run: true})` | +| `cypher` | Custom graph queries | `gitnexus_cypher({query: "MATCH ..."})` | + +## Impact Risk Levels + +| Depth | Meaning | Action | +|-------|---------|--------| +| d=1 | WILL BREAK — direct callers/importers | MUST update these | +| d=2 | LIKELY AFFECTED — indirect deps | Should test | +| d=3 | MAY NEED TESTING — transitive | Test if critical path | + +## Resources + +| Resource | Use for | +|----------|---------| +| `gitnexus://repo/entity-resolution-engine-basic/context` | Codebase overview, check index freshness | +| `gitnexus://repo/entity-resolution-engine-basic/clusters` | All functional areas | +| `gitnexus://repo/entity-resolution-engine-basic/processes` | All execution flows | +| `gitnexus://repo/entity-resolution-engine-basic/process/{name}` | Step-by-step execution trace | + +## Self-Check Before Finishing + +Before completing any code modification task, verify: +1. `gitnexus_impact` was run for all modified symbols +2. No HIGH/CRITICAL risk warnings were ignored +3. `gitnexus_detect_changes()` confirms changes match expected scope +4. 
All d=1 (WILL BREAK) dependents were updated + +## Keeping the Index Fresh + +After committing code changes, the GitNexus index becomes stale. Re-run analyze to update it: + +```bash +npx gitnexus analyze +``` + +If the index previously included embeddings, preserve them by adding `--embeddings`: + +```bash +npx gitnexus analyze --embeddings +``` + +To check whether embeddings exist, inspect `.gitnexus/meta.json` — the `stats.embeddings` field shows the count (0 means no embeddings). **Running analyze without `--embeddings` will delete any previously generated embeddings.** + +> Claude Code users: A PostToolUse hook handles this automatically after `git commit` and `git merge`. + +## CLI + +- Re-index: `npx gitnexus analyze` +- Check freshness: `npx gitnexus status` +- Generate docs: `npx gitnexus wiki` + + diff --git a/CHANGELOG.md b/CHANGELOG.md index 9980d99..2c6d560 100644 --- a/CHANGELOG.md +++ b/CHANGELOG.md @@ -7,6 +7,19 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 --- +## [Unreleased] + +## [1.0.0-rc.1] - 2026-04-21 + +### Added +- Unit test suite expanded to meet the 80% coverage threshold + +### Changed +- Repository layout restructured: `config/`, `demo/`, `pyproject.toml`, `poetry.lock`, and `infra/` consolidated under `src/`; all tooling, Makefile targets, and path references updated accordingly +- Docker: multi-stage wheel-based build with non-root user for improved security and build reproducibility; configuration decoupled from the image and mounted at runtime +- CI: SonarCloud scan made conditional on token availability; coverage report path mapping corrected; integration tests excluded from the tox pipeline to keep unit runs self-contained; staging deployment gated behind explicit dispatch +- Environment variables aligned with ERSys naming convention + ## [0.3.0] - 2026-03-04 ### Added @@ -65,7 +78,7 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0 **Docker & Deployment** - 
Multi-stage Dockerfile for production-ready containerization
-- `docker-compose.yml` for full-stack setup (Redis + ERE service)
+- `compose.dev.yaml` for full-stack setup (Redis + ERE service)
 - `.env.example` template for configuration
 
 **Documentation**
diff --git a/CLAUDE.md b/CLAUDE.md
new file mode 100644
index 0000000..7b9d592
--- /dev/null
+++ b/CLAUDE.md
@@ -0,0 +1,219 @@
+# ERE — Agent Operating Instructions
+
+This file governs how AI agents operate in this repository.
+It complements `CLAUDE.md` (which governs Claude Code specifically) and `.claude/CLAUDE.md` (project instructions).
+
+---
+
+## Commits and PRs
+
+- **Never auto-commit** unless the user explicitly asks.
+- **Never force-push** to `main` or `develop`.
+- **Never add co-author lines**, tool names, or agent names to commit messages.
+- Commit format: `type(scope): concise description` — e.g. `feat(adapters): add splink resolver factory`.
+- Stage only files you modified: `git add <file>`, never `git add -A` blindly.
+- Before committing, run `make lint` and `make test-unit` to verify nothing is broken.
+- PRs target `develop` (not `main`) unless told otherwise.
+- When creating a PR, include a short summary and a test-plan checklist.
+
+---
+
+## Working Methodology
+
+### Before touching code
+
+1. Read `WORKING.md` — it points to the active task file.
+2. Read the referenced `docs/tasks/yyyy-mm-dd-*.md` fully.
+3. Understand the current branch state: `git log --oneline -10`.
+
+### Running the stack for integration tests
+
+Integration tests require Redis to be running.
Start it first: + +```bash +make infra-up # starts Redis + RedisInsight via Docker Compose +make test-integration # then run integration tests +make infra-down # tear down when done +``` + +Unit tests do **not** require any infrastructure: + +```bash +make test-unit # fast, self-contained, uses your venv +``` + +### Typical development loop + +```bash +make install # first time or after pyproject.toml changes +make test-unit # red → green → refactor +make lint # quick style check +make check-architecture # verify import-linter contracts +make all-quality-checks # before opening a PR +``` + +--- + +## Tooling Reference + +| Target | What it does | +|--------|-------------| +| `make install` | Install deps via Poetry | +| `make test-unit` | pytest unit suite + coverage report | +| `make test-integration` | integration tests (Redis must be up) | +| `make test-coverage` | HTML coverage report → `htmlcov/index.html` | +| `make lint` | pylint (fast, your venv) | +| `make format` | Ruff formatter | +| `make lint-fix` | Ruff auto-fix | +| `make check-clean-code` | pylint + radon + xenon (tox isolated) | +| `make check-architecture` | import-linter contracts (tox isolated) | +| `make all-quality-checks` | lint + clean-code + architecture | +| `make ci` | full tox pipeline (py312 + architecture + clean-code) | +| `make infra-up` | Start Redis stack (Docker Compose) | +| `make infra-down` | Stop Redis stack | +| `make infra-watch` | Live-reload mode (syncs `src/` and `src/config/`) | + +--- + +## Architecture Rules (enforced by import-linter) + +Dependency direction must never be violated: + +``` +entrypoints → services → models + ↘ + adapters → models +``` + +- `models/` — no I/O, no framework imports, no side effects. +- `adapters/` — infrastructure only; never calls `services/`. +- `services/` — orchestrates domain and adapters; never imports from `entrypoints/`. +- `entrypoints/` — parses input, calls services, formats output; no business logic. + +Violations block CI. 
Check with `make check-architecture` before opening a PR.
+
+---
+
+## Memory Conventions
+
+Save to memory only what is non-obvious and persists across conversations:
+
+- Architectural decisions that aren't evident from the code (e.g. resolver factory registry pattern, DuckDB threading model).
+- Design constraints explained by the user that aren't in comments or docs.
+- User preferences about how to collaborate (e.g. "never suggest walrus operators", "prefer explicit factory injection").
+
+Do **not** save to memory:
+- Current task state (use the task file in `docs/tasks/`).
+- Git history or recent changes (readable via `git log`).
+- File paths or code structure (readable from the repo).
+
+---
+
+## Gotchas
+
+- **`logging.basicConfig` is a no-op** when handlers already exist (conftest sets them up via `dictConfig`). Mock it with `patch("logging.basicConfig")` in logging tests.
+- **DuckDB in tests**: use in-memory mode (`:memory:`) or a temp file via `tmp_path`; never a fixed path that leaks between tests.
+- **Integration tests are marked** with `@pytest.mark.integration` — `make test-unit` skips them automatically.
+- **`src/infra/.env`** is required for `make infra-*` targets. Copy from `src/infra/.env.example` on first use.
+- **Config files** live in `src/config/` (moved from repo root in the 2026-04 restructure). Do not confuse with `infra/config/`.
+- **erspec models** are LinkML-generated with snake_case fields (e.g. `legal_name`, not `legalName`). Do not edit generated files — update the schema and regenerate.
+- **`ERE_LOG_LEVEL`** is the canonical env var for log level in this service (not `LOG_LEVEL`).
+
+---
+
+
+# GitNexus — Code Intelligence
+
+This project is indexed by GitNexus as **entity-resolution-engine-basic** (528 symbols, 1372 relationships, 36 execution flows). Use the GitNexus MCP tools to understand code, assess impact, and navigate safely.
+
+> If any GitNexus tool warns the index is stale, run `npx gitnexus analyze` in terminal first.
+
+## Always Do
+
+- **MUST run impact analysis before editing any symbol.** Before modifying a function, class, or method, run `gitnexus_impact({target: "symbolName", direction: "upstream"})` and report the blast radius (direct callers, affected processes, risk level) to the user.
+- **MUST run `gitnexus_detect_changes()` before committing** to verify your changes only affect expected symbols and execution flows.
+- **MUST warn the user** if impact analysis returns HIGH or CRITICAL risk before proceeding with edits.
+- When exploring unfamiliar code, use `gitnexus_query({query: "concept"})` to find execution flows instead of grepping. It returns process-grouped results ranked by relevance.
+- When you need full context on a specific symbol — callers, callees, which execution flows it participates in — use `gitnexus_context({name: "symbolName"})`.
+
+## When Debugging
+
+1. `gitnexus_query({query: "<error message>"})` — find execution flows related to the issue
+2. `gitnexus_context({name: "<suspect symbol>"})` — see all callers, callees, and process participation
+3. `READ gitnexus://repo/entity-resolution-engine-basic/process/{processName}` — trace the full execution flow step by step
+4. For regressions: `gitnexus_detect_changes({scope: "compare", base_ref: "main"})` — see what your branch changed
+
+## When Refactoring
+
+- **Renaming**: MUST use `gitnexus_rename({symbol_name: "old", new_name: "new", dry_run: true})` first. Review the preview — graph edits are safe, text_search edits need manual review. Then run with `dry_run: false`.
+- **Extracting/Splitting**: MUST run `gitnexus_context({name: "target"})` to see all incoming/outgoing refs, then `gitnexus_impact({target: "target", direction: "upstream"})` to find all external callers before moving code.
+- After any refactor: run `gitnexus_detect_changes({scope: "all"})` to verify only expected files changed.
+
+## Never Do
+
+- NEVER edit a function, class, or method without first running `gitnexus_impact` on it.
+- NEVER ignore HIGH or CRITICAL risk warnings from impact analysis. +- NEVER rename symbols with find-and-replace — use `gitnexus_rename` which understands the call graph. +- NEVER commit changes without running `gitnexus_detect_changes()` to check affected scope. + +## Tools Quick Reference + +| Tool | When to use | Command | +|------|-------------|---------| +| `query` | Find code by concept | `gitnexus_query({query: "auth validation"})` | +| `context` | 360-degree view of one symbol | `gitnexus_context({name: "validateUser"})` | +| `impact` | Blast radius before editing | `gitnexus_impact({target: "X", direction: "upstream"})` | +| `detect_changes` | Pre-commit scope check | `gitnexus_detect_changes({scope: "staged"})` | +| `rename` | Safe multi-file rename | `gitnexus_rename({symbol_name: "old", new_name: "new", dry_run: true})` | +| `cypher` | Custom graph queries | `gitnexus_cypher({query: "MATCH ..."})` | + +## Impact Risk Levels + +| Depth | Meaning | Action | +|-------|---------|--------| +| d=1 | WILL BREAK — direct callers/importers | MUST update these | +| d=2 | LIKELY AFFECTED — indirect deps | Should test | +| d=3 | MAY NEED TESTING — transitive | Test if critical path | + +## Resources + +| Resource | Use for | +|----------|---------| +| `gitnexus://repo/entity-resolution-engine-basic/context` | Codebase overview, check index freshness | +| `gitnexus://repo/entity-resolution-engine-basic/clusters` | All functional areas | +| `gitnexus://repo/entity-resolution-engine-basic/processes` | All execution flows | +| `gitnexus://repo/entity-resolution-engine-basic/process/{name}` | Step-by-step execution trace | + +## Self-Check Before Finishing + +Before completing any code modification task, verify: +1. `gitnexus_impact` was run for all modified symbols +2. No HIGH/CRITICAL risk warnings were ignored +3. `gitnexus_detect_changes()` confirms changes match expected scope +4. 
All d=1 (WILL BREAK) dependents were updated + +## Keeping the Index Fresh + +After committing code changes, the GitNexus index becomes stale. Re-run analyze to update it: + +```bash +npx gitnexus analyze +``` + +If the index previously included embeddings, preserve them by adding `--embeddings`: + +```bash +npx gitnexus analyze --embeddings +``` + +To check whether embeddings exist, inspect `.gitnexus/meta.json` — the `stats.embeddings` field shows the count (0 means no embeddings). **Running analyze without `--embeddings` will delete any previously generated embeddings.** + +> Claude Code users: A PostToolUse hook handles this automatically after `git commit` and `git merge`. + +## CLI + +- Re-index: `npx gitnexus analyze` +- Check freshness: `npx gitnexus status` +- Generate docs: `npx gitnexus wiki` + + diff --git a/Makefile b/Makefile index 6a6cfa1..63ead09 100644 --- a/Makefile +++ b/Makefile @@ -27,7 +27,16 @@ PROJECT_PATH = $(shell pwd) SRC_PATH = ${PROJECT_PATH}/src TEST_PATH = ${PROJECT_PATH}/test BUILD_PATH = ${PROJECT_PATH}/dist -INFRA_PATH = ${PROJECT_PATH}/infra +INFRA_PATH = ${PROJECT_PATH}/src/infra +COMPOSE_FILE = ${INFRA_PATH}/compose.dev.yaml +ENV_FILE = ${INFRA_PATH}/.env + +# Auto-export all .env variables to every recipe shell (if the file exists) +ifneq ($(wildcard $(ENV_FILE)),) +include $(ENV_FILE) +export $(shell sed -n 's/^\([^#= ][^= ]*\)[ ]*=.*/\1/p' $(ENV_FILE)) +endif + PACKAGE_NAME = ere ICON_DONE = [✔] @@ -66,9 +75,13 @@ help: ## Display available targets @ echo "" @ echo -e " $(BUILD_PRINT)Infrastructure (Docker):$(END_BUILD_PRINT)" @ echo " infra-build - Build the ERE Docker image" - @ echo " infra-up - Start full stack (Redis + ERE) in detached mode" + @ echo " infra-up - Start services (docker compose up -d)" @ echo " infra-down - Stop and remove stack containers and networks" - @ echo " infra-logs - Tail ERE container logs" + @ echo " infra-down-volumes - Stop services and remove volumes (clean slate)" + @ echo " infra-rebuild 
- Rebuild images and start services" + @ echo " infra-rebuild-clean - Rebuild from scratch (no cache) and start" + @ echo " infra-logs - Follow service logs" + @ echo " infra-watch - Start services with file watching (sync src/ and src/config/)" @ echo "" @ echo -e " $(BUILD_PRINT)Utilities:$(END_BUILD_PRINT)" @ echo " clean - Remove build artifacts and caches" @@ -82,13 +95,13 @@ install-poetry: ## Install Poetry if not present install: install-poetry ## Install project dependencies @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Installing ERE requirements$(END_BUILD_PRINT)" - @ poetry lock - @ poetry install --with dev + @ cd src && poetry lock + @ cd src && poetry install --with dev @ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE requirements are installed$(END_BUILD_PRINT)" build: ## Build the package distribution @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Building package$(END_BUILD_PRINT)" - @ poetry build + @ cd src && poetry build @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Package built successfully$(END_BUILD_PRINT)" #----------------------------------------------------------------------------- @@ -97,24 +110,24 @@ build: ## Build the package distribution .PHONY: test test-unit test-integration test-coverage test: ## Run all tests @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running all tests$(END_BUILD_PRINT)" - @ poetry run pytest $(TEST_PATH) + @ cd src && poetry run pytest --rootdir=$(SRC_PATH) $(TEST_PATH) @ echo -e "$(BUILD_PRINT)$(ICON_DONE) All tests passed$(END_BUILD_PRINT)" test-unit: ## Run unit tests with coverage (fast, uses your venv) @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running unit tests with coverage$(END_BUILD_PRINT)" - @ poetry run pytest $(TEST_PATH) -m "not integration" \ - --cov=src --cov-report=term-missing --cov-report=html + @ cd src && poetry run pytest --rootdir=$(SRC_PATH) $(TEST_PATH) -m "not integration" \ + --cov=ere --cov-report=term-missing --cov-report=html:htmlcov @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Unit tests passed (coverage: 
htmlcov/index.html)$(END_BUILD_PRINT)" -test-integration: ## Run integration tests only +test-integration: check-env ## Run integration tests only (requires Redis — run make infra-up first) @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running integration tests$(END_BUILD_PRINT)" - @ poetry run pytest $(TEST_PATH) -m "integration" + @ cd src && poetry run pytest --rootdir=$(SRC_PATH) $(TEST_PATH) -m "integration" @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Integration tests passed$(END_BUILD_PRINT)" test-coverage: ## Generate detailed HTML coverage report @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Generating coverage report$(END_BUILD_PRINT)" - @ poetry run pytest $(TEST_PATH) -m "not integration" \ - --cov=src --cov-report=html --cov-report=term-missing + @ cd src && poetry run pytest --rootdir=$(SRC_PATH) $(TEST_PATH) -m "not integration" \ + --cov=ere --cov-report=html:htmlcov --cov-report=term-missing @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Coverage report: htmlcov/index.html$(END_BUILD_PRINT)" #----------------------------------------------------------------------------- @@ -124,27 +137,27 @@ test-coverage: ## Generate detailed HTML coverage report format: ## Format code with Ruff @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Formatting code$(END_BUILD_PRINT)" - @ poetry run ruff format $(SRC_PATH) $(TEST_PATH) + @ cd src && poetry run ruff format $(SRC_PATH) $(TEST_PATH) @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Format complete$(END_BUILD_PRINT)" lint: ## Run pylint checks (style, naming, SOLID principles) — uses your venv @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running pylint checks$(END_BUILD_PRINT)" - @ poetry run pylint --rcfile=.pylintrc ./src ./test + @ cd src && poetry run pylint --rcfile=$(PROJECT_PATH)/.pylintrc $(SRC_PATH)/ere $(TEST_PATH) @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Pylint checks passed$(END_BUILD_PRINT)" lint-fix: ## Auto-fix code style with Ruff @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Auto-fixing with Ruff$(END_BUILD_PRINT)" - @ poetry run ruff check 
--fix $(SRC_PATH) $(TEST_PATH) + @ cd src && poetry run ruff check --fix $(SRC_PATH) $(TEST_PATH) @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Auto-fix complete$(END_BUILD_PRINT)" check-clean-code: ## Clean-code checks: pylint + radon + xenon (isolated tox) @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running clean-code checks (tox isolated)$(END_BUILD_PRINT)" - @ tox -e clean-code + @ cd src && poetry run tox -e clean-code @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Clean-code checks passed$(END_BUILD_PRINT)" check-architecture: ## Validate architectural boundaries (isolated tox) @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Checking architecture contracts (tox isolated)$(END_BUILD_PRINT)" - @ tox -e architecture + @ cd src && poetry run tox -e architecture @ echo -e "$(BUILD_PRINT)$(ICON_DONE) Architecture checks passed$(END_BUILD_PRINT)" all-quality-checks: lint check-clean-code check-architecture ## Run all: lint + clean-code + architecture @@ -152,31 +165,55 @@ all-quality-checks: lint check-clean-code check-architecture ## Run all: lint + ci: ## Full CI pipeline for GitHub Actions (tox) @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Running full CI pipeline$(END_BUILD_PRINT)" - @ tox -e py312,architecture,clean-code + @ set -a && . $(ENV_FILE) && set +a && poetry -C ./src run tox -e py312,architecture,clean-code @ echo -e "$(BUILD_PRINT)$(ICON_DONE) CI pipeline complete$(END_BUILD_PRINT)" #----------------------------------------------------------------------------- # Infrastructure commands (Docker) #----------------------------------------------------------------------------- -.PHONY: infra-build infra-up infra-down infra-logs +.PHONY: check-env infra-build infra-up infra-down infra-down-volumes infra-rebuild infra-rebuild-clean infra-logs infra-watch -infra-build: ## Build the ERE Docker image +check-env: + @ test -f $(ENV_FILE) || (echo -e "$(BUILD_PRINT)$(ICON_ERROR) Missing $(ENV_FILE). 
Run: cp src/infra/.env.example src/infra/.env$(END_BUILD_PRINT)" && exit 1)
+
+infra-build: check-env ## Build the ERE Docker image
 	@ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Building ERE Docker image$(END_BUILD_PRINT)"
-	@ docker compose -f $(INFRA_PATH)/docker-compose.yml build
+	@ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) build
 	@ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE image built$(END_BUILD_PRINT)"
 
-infra-up: ## Start full stack: Redis + ERE (docker compose up --build)
+infra-up: check-env ## Start services (docker compose up -d)
+	@ docker network create ersys-local || true
 	@ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Starting ERE stack$(END_BUILD_PRINT)"
-	@ docker compose -f $(INFRA_PATH)/docker-compose.yml up --build -d
+	@ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) up -d
 	@ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE stack is running — use 'make infra-logs' to follow output$(END_BUILD_PRINT)"
 
-infra-down: ## Stop and remove ERE stack containers and networks
+infra-down: check-env ## Stop and remove ERE stack containers and networks
 	@ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Stopping ERE stack$(END_BUILD_PRINT)"
-	@ docker compose -f $(INFRA_PATH)/docker-compose.yml down
+	@ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) down
 	@ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE stack stopped$(END_BUILD_PRINT)"
 
-infra-logs: ## Tail logs from the ERE container
-	@ docker compose -f $(INFRA_PATH)/docker-compose.yml logs -f ere
+infra-down-volumes: check-env ## Stop services and remove volumes (clean slate)
+	@ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Stopping ERE stack and removing volumes$(END_BUILD_PRINT)"
+	@ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) down -v
+	@ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE stack stopped and volumes removed$(END_BUILD_PRINT)"
+
+infra-rebuild: check-env ## Rebuild images and start services
+	@ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Rebuilding ERE stack$(END_BUILD_PRINT)"
+	@ docker compose -f 
$(COMPOSE_FILE) --env-file $(ENV_FILE) up -d --build + @ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE stack rebuilt and started$(END_BUILD_PRINT)" + +infra-rebuild-clean: check-env ## Rebuild from scratch (no cache) and start + @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Rebuilding ERE stack (no cache)$(END_BUILD_PRINT)" + @ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) build --no-cache + @ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) up -d + @ echo -e "$(BUILD_PRINT)$(ICON_DONE) ERE stack rebuilt (clean) and started$(END_BUILD_PRINT)" + +infra-logs: check-env ## Follow service logs + @ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) logs -f + +infra-watch: check-env ## Start services with file watching (sync src/ and src/config/) + @ echo -e "$(BUILD_PRINT)$(ICON_PROGRESS) Starting ERE stack with watch$(END_BUILD_PRINT)" + @ docker compose -f $(COMPOSE_FILE) --env-file $(ENV_FILE) watch #----------------------------------------------------------------------------- # Utility commands @@ -188,8 +225,9 @@ clean: ## Remove build artifacts and caches @ rm -rf .pytest_cache @ rm -rf .tox @ rm -rf *.egg-info + @ rm -rf src/*.egg-info @ rm -rf htmlcov coverage.xml - @ poetry run ruff clean + @ cd src && poetry run ruff clean @ find . -type d -name __pycache__ -exec rm -rf {} + 2>/dev/null || true @ find . -type f -name "*.pyc" -delete 2>/dev/null || true @ find . -type f -name "*.pyo" -delete 2>/dev/null || true diff --git a/README.md b/README.md index 81e76ca..c3fdd94 100644 --- a/README.md +++ b/README.md @@ -35,7 +35,7 @@ Its primary purpose is to interact with the Entity Resolution System (ERSys). 
It For detailed documentation, see: - [Architecture](docs/architecture.md) - description of the applied architecture - [Algorithm](docs/algorithm.md) - incremental probabilistic entity linking -- [Configuration](infra/config/README.md) - field mapping, model tuning, Splink setup +- [Configuration](src/config/README.md) - field mapping, model tuning, Splink setup - [ERS–ERE Technical Contract v0.2](docs/ERS-ERE-System-Technical-Contract.pdf) @@ -47,56 +47,103 @@ ERE relies on **ers-spec** (from [entity-resolution-spec](https://github.com/OP- This ensures type-safe, versioned communication between ERE and other ERSys components. +#### External Infrastructure Dependencies +To function, the ERE service requires the following external infrastructure: +- **Redis**: Used as the message broker for the request/response queues (`ere_requests` and `ere_responses`). +- **Docker**: Required for containerized deployment and local development. +- **Python 3.12**: The runtime environment for the engine. -## Installation -### Requirements +## Getting Started -- **Python** 3.12+ -- **make** -- **Poetry** (dependency management) -- **Docker** +### Prerequisites -### Quickstart +- Python 3.12+ +- [Poetry](https://python-poetry.org/) 2.x +- Docker + Docker Compose + +### 1. Clone and install -In order to setup the project locally: ```bash -# Install all Python dependencies (Poetry is required) +git clone https://github.com/meaningfy-ws/entity-resolution-engine-basic.git +cd entity-resolution-engine-basic make install ``` -To build and launch Docker-based stack (ERE + Redis): -1. (optional) Adjust connection and logging config in [.env.local](infra/.env.local). -2. Run the following: -```bash -# Build the ERE Docker image -make infra-build +### 2. 
Configure the environment -# Start the full stack: Redis + ERE service -make infra-up +```bash +cp src/infra/.env.example src/infra/.env ``` -Launch a demo script and observe the end-to-end resolution flow; the demo script connects to the locally deployed Redis instance to which the ERE service is subscribed. -```bash -poetry run python demo/demo.py # run the demo script with the default data +The defaults work for local development. Notable variables in `src/infra/.env`: + +| Variable | Default | Description | +|----------|---------|-------------| +| `REDIS_HOST` | `ersys-redis` | Redis host (shared network `ersys-local`) | +| `REDIS_PORT` | `6379` | Redis port | +| `REDIS_PASSWORD` | `changeme` | Redis password — **must match ERS** | +| `REDIS_DB` | `0` | Redis database index | +| `ERE_REQUEST_QUEUE` | `ere_requests` | Inbound request queue name — **must match ERS** | +| `ERE_RESPONSE_QUEUE` | `ere_responses` | Outbound response queue name — **must match ERS** | +| `ERE_LOG_LEVEL` | `INFO` | Log level | + +### 3. Start the stack -# run the script with a custom request data file -poetry run python demo/demo.py --data demo/data/org-small.json -# logs from request submission and resolution outcomes will be printed to stdout +```bash +make infra-up # start ERE + Redis + RedisInsight +make infra-logs # follow service logs +make infra-down # stop all services -# inspect ere service logs -make infra-logs +Note: `make infra-up` creates a shared external network `ersys-local` used for cross-component communication. +To remove it manually: `docker network rm ersys-local` ``` -Terminate the service: +| Service | URL / Port | +|---------|-----------| +| Redis | `localhost:6379` | +| RedisInsight | `http://localhost:5540` | + +### What this stack does NOT include + +This repo starts ERE and its own Redis instance. It does **not** include the ERS backend or the web UI. + +ERE communicates exclusively through Redis queues — it has no HTTP API. 
Without ERS publishing requests to `ere_requests`, ERE will start and listen but process nothing. + +- To add ERS: follow the Getting Started section in [entity-resolution-service](https://github.com/meaningfy-ws/entity-resolution-service#getting-started). +- To add the web UI: follow the Getting Started section in [entity-resolution-service-webapp](https://github.com/meaningfy-ws/entity-resolution-service-webapp#getting-started). + +#### Running ERE alongside ERS (shared Redis) + +ERS starts its own Redis on port 6379. ERE also starts Redis on port 6379 by default — running both simultaneously causes a port conflict. + +**Solution**: let ERS own Redis, point ERE at it: + +1. In `src/infra/.env`, set `REDIS_HOST=ersys-redis` +2. Comment out the `ersys-redis` service block in `src/infra/compose.dev.yaml` +3. Start ERS first (`make up` in the ERS repo), then ERE (`make infra-up`) + +Queue names and `REDIS_PASSWORD` must match between both `.env` files (defaults already align). + +### 4. Run the demo + +With ERE running (`make infra-up`), launch the demo script to observe end-to-end resolution: + ```bash -make infra-down +cd src && poetry run python demo/demo.py # 8 mentions, 2 clusters (default) +cd src && poetry run python demo/demo.py --data demo/data/org-small.json # 100 mentions ``` -Note: In order for the demo to work, you need to either set `REDIS_HOST=localhost` in the [.env.local](infra/.env.local) file or pass it to the script as an environment variable. +> The demo connects directly to Redis (`localhost:6379`). Set `REDIS_HOST=localhost` in `src/infra/.env` before running. + +```bash +make infra-logs # inspect ERE service logs +make infra-down # stop when done +``` +See [`src/demo/README.md`](src/demo/README.md) for datasets, configuration, and example output. -For detailed setup instructions, see `Make targets`. 
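Before launching the demo, it can help to confirm Redis is actually accepting connections. A minimal, dependency-free sketch using bash's `/dev/tcp` (the host, port, and the `make infra-up` hint mirror the defaults documented above; `redis-cli ping` is the more thorough check when it is installed):

```shell
# Return success iff a TCP connection to host:port can be opened.
# Uses bash's /dev/tcp; degrades to "not reachable" on shells without it.
port_open() { (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null; }

if port_open localhost 6379; then
  status="reachable"
else
  status="not reachable"
fi
echo "Redis on localhost:6379 is ${status} (run 'make infra-up' if not)"
```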
+--- ## Usage @@ -133,9 +180,13 @@ Available targets (`make help`): Infrastructure (Docker): infra-build - Build the ERE Docker image - infra-up - Start full stack (Redis + ERE) in detached mode + infra-up - Start services (docker compose up -d) infra-down - Stop and remove stack containers and networks - infra-logs - Tail ERE container logs + infra-down-volumes - Stop services and remove volumes (clean slate) + infra-rebuild - Rebuild images and start services + infra-rebuild-clean - Rebuild from scratch (no cache) and start + infra-logs - Follow service logs + infra-watch - Start services with file watching (sync src/ and src/config/) Utilities: clean - Remove build artifacts and caches @@ -145,10 +196,10 @@ Available targets (`make help`): ### Configuration (Resolver and Mapper) Entity resolution behaviour is configured via two YAML files: -- **Resolver configuration** ([resolver.yaml](./infra/config/resolver.yaml)): Splink comparisons, cold-start parameters, similarity thresholds -- **RDF mapping** ([rdf_mapping.yaml](./infra/config/rdf_mapping.yaml)): RDF namespace bindings, field extraction rules, entity type definitions +- **Resolver configuration** ([resolver.yaml](./src/config/resolver.yaml)): Splink comparisons, cold-start parameters, similarity thresholds +- **RDF mapping** ([rdf_mapping.yaml](./src/config/rdf_mapping.yaml)): RDF namespace bindings, field extraction rules, entity type definitions -For detailed configuration options and tuning, see the [configuration page](./infra/config/README.md). +For detailed configuration options and tuning, see the [configuration page](./src/config/README.md). 
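As a concrete illustration of how these two files combine, the shipped variant configs (e.g. `src/config/resolver_compound.yaml`) wire the resolver options together roughly as follows — a trimmed excerpt for orientation, not a recommended tuning:

```yaml
cache_strategy: tf_incremental

threshold: 0.5              # minimum similarity for cluster assignment
top_n: 100                  # max candidate clusters returned per mention
match_weight_threshold: -10

splink:
  probability_two_random_records_match: 0.3
  comparisons:
    - type: jaro_winkler
      field: legal_name
      thresholds: [0.9, 0.8]
  # Compound blocking: a pair is compared only if BOTH fields match
  blocking_rules:
    - [country_code, city]
```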
### Examples @@ -156,12 +207,12 @@ A working demo is available that demonstrates ERE as a black-box service communi ```bash # Prerequisites: Redis must be running, ERE service must be listening -python demo/demo.py # Uses org-tiny.json (8 mentions, 2 clusters) -python demo/demo.py --data demo/data/org-small.json # 100 mentions, realistic clustering +python src/demo/demo.py # Uses org-tiny.json (8 mentions, 2 clusters) +python src/demo/demo.py --data src/demo/data/org-small.json # 100 mentions, realistic clustering ``` The demo: -- Loads entity mentions from JSON datasets stored in `demo/data/` +- Loads entity mentions from JSON datasets stored in `src/demo/data/` - Sends mentions to the request queue via RDF Turtle messages - Listens for resolution responses with cluster assignments - Logs all interactions with timestamps and outputs a clustering summary @@ -173,21 +224,40 @@ The demo: Note: For practical reasons (Turtle syntax is more verbose and less popular than JSON), the `demo.py` script accepts JSON files of a fixed structure and constructs RDF payloads from them on the fly. -See [`demo/README.md`](demo/README.md) for datasets, configuration, logging, prerequisites, troubleshooting, and example output. +See [`src/demo/README.md`](src/demo/README.md) for datasets, configuration, logging, prerequisites, troubleshooting, and example output. ## Project +### Repository Layout + +This repository places the self-contained Python project (source code, dependencies, and tooling config) under `src/`. The canonical `Makefile` lives at the repo root and owns all build logic. Recipes invoke `cd src &&` internally so that Poetry, Ruff, and pytest all resolve correctly against the `src/` project. All `make` targets are run from the repo root — no need to `cd src` first. + ### Structure -ERE follows a **Cosmic Python layered architecture** that enforces clear separation of concerns and testability. 
The `src/ere/` directory contains four layers: domain models (pure business logic), services (use-case orchestration), adapters (infrastructure integrations), and entrypoints (external drivers). Test suites mirror this structure with unit, integration, and BDD scenarios, while documentation covers architecture decisions and implementation tasks. The `demo/` directory provides working examples with sample datasets, and `infra/` contains containerisation and configuration for local development. +ERE follows a **Cosmic Python layered architecture** that enforces clear separation of concerns and testability. The `src/ere/` directory contains four layers: domain models (pure business logic), services (use-case orchestration), adapters (infrastructure integrations), and entrypoints (external drivers). Test suites mirror this structure with unit, integration, and BDD scenarios, while documentation covers architecture decisions and implementation tasks. `src/demo/` provides working examples with sample datasets, and `src/infra/` contains containerisation and configuration for local development. 
``` -src/ere/ -├── adapters/ # Redis client, cluster store, resolver implementations -├── entrypoints/ # Redis pub/sub consumer -├── models/ # Domain models (entities, value objects, exceptions) -└── services/ # Resolution use-case orchestration +src/ +├── ere/ # Python package +│ ├── adapters/ # Redis client, cluster store, resolver implementations +│ ├── entrypoints/ # Redis pub/sub consumer +│ ├── models/ # Domain models (entities, value objects, exceptions) +│ └── services/ # Resolution use-case orchestration +├── config/ +│ ├── resolver.yaml # Splink comparisons, blocking rules, thresholds +│ ├── rdf_mapping.yaml # RDF namespace bindings, field extraction rules +│ └── README.md # Configuration documentation +├── demo/ +│ ├── demo.py # Entity resolution demonstration script +│ ├── data/ # Sample datasets (derived from TED procurement data) +│ └── README.md # Demo usage and configuration guide +├── infra/ +│ ├── Dockerfile # ERE service image definition +│ ├── compose.dev.yaml # Docker Compose for local development +│ └── .env.example # Environment variable template +├── pyproject.toml # Project metadata and dependencies +└── poetry.lock test/ ├── features/ # Gherkin BDD feature files @@ -198,21 +268,8 @@ test/ └── conftest.py # Shared fixtures and test configuration docs/ -├── architecture/ # ERE architecture, sequence diagrams, ADRs -├── tasks/ # Implementation task logs ├── ERS-ERE-System-Technical-Contract.pdf -└── *.md # Topic documentation - -infra/ -├── Dockerfile # ERE service image definition -├── docker-compose.yml # Full stack (Redis + ERE) -├── config # ERE Configuration -└── .env.local # Local runtime config (git-ignored) - -demo/ -├── demo.py # Entity resolution demonstration script -├── data/ # Sample datasets (derived from TED procurement data) -└── README.md # Demo usage and configuration guide +└── *.md # Architecture, algorithm, glossary ``` ### Tooling @@ -266,7 +323,7 @@ make test-integration # Code formatting and linting make format # 
Auto-format with Ruff -make lint-check # Lint without modifying files +make lint # Lint without modifying files make lint-fix # Lint with auto-fix ``` diff --git a/docs/algorithm.md b/docs/algorithm.md index 7ad5bb7..201dbbd 100644 --- a/docs/algorithm.md +++ b/docs/algorithm.md @@ -100,7 +100,7 @@ The algorithm processes mentions one at a time, making immediate clustering deci | **top_n** | Maximum candidate clusters returned per mention | | **blocking_rules** | Pre-filters to reduce similarity computation | -The complete list of configuration parameters together with comprehensive description is available in [Configuration](../infra/config/README.md). +The complete list of configuration parameters together with comprehensive description is available in [Configuration](../config/README.md). ## Outputs diff --git a/infra/.env.local b/infra/.env.local deleted file mode 100644 index e89187b..0000000 --- a/infra/.env.local +++ /dev/null @@ -1,28 +0,0 @@ -# Copy this file to .env.local and customize as needed -# This file is a template for Docker Compose configuration - -# ── Redis Configuration ────────────────────────────────────────────────────── -# Inside Docker Compose, use 'redis' as hostname. 
For local testing, use 'localhost' -REDIS_HOST=redis -REDIS_PORT=6379 -REDIS_DB=0 - -# Redis authentication (recommended for security) -REDIS_PASSWORD=changeme - -# ── Redis Queue Names ──────────────────────────────────────────────────────── -# Queue names for entity resolution requests and responses -REQUEST_QUEUE=ere_requests -RESPONSE_QUEUE=ere_responses - -# ── DuckDB Persistent Storage ──────────────────────────────────────────────── -# Path to DuckDB file inside container (volume-mounted from ere-data volume) -DUCKDB_PATH=/data/app.duckdb - -# ── ERE Service Port ───────────────────────────────────────────────────────── -# Port exposed to host machine for the ERE service -APP_PORT=8000 - -# ── Logging ────────────────────────────────────────────────────────────────── -# Python logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL) -LOG_LEVEL=INFO diff --git a/infra/Dockerfile b/infra/Dockerfile deleted file mode 100644 index 5eb0407..0000000 --- a/infra/Dockerfile +++ /dev/null @@ -1,38 +0,0 @@ -# ── ERE application image ────────────────────────────────────────────────── -# Builds the Entity Resolution Engine service for local development. -# Requires only Docker — no local Python, Redis, or DuckDB installation. 
-# -# Build context: repository root (one level above /infra) -# Usage: docker compose -f infra/docker-compose.yml up --build -# ─────────────────────────────────────────────────────────────────────────── - -FROM python:3.12-slim - -# git is required to fetch the ers-spec dependency from GitHub -RUN apt-get update \ - && apt-get install -y --no-install-recommends git \ - && rm -rf /var/lib/apt/lists/* - -# Install Poetry (locked to major version 2) -RUN pip install --no-cache-dir "poetry>=2.0.0,<3.0.0" - -WORKDIR /app - -# ── Dependency layer (cached unless pyproject.toml / poetry.lock change) ─── -COPY pyproject.toml poetry.lock* ./ - -# Install into system Python (no virtualenv needed inside the container) -RUN poetry config virtualenvs.create false \ - && poetry install --without dev --no-root --no-interaction - -# ── Application source ────────────────────────────────────────────────────── -COPY README.md ./ -COPY src/ ./src/ -COPY infra/config/ ./config/ - -# Install the ere package itself -RUN poetry install --without dev --no-interaction - -# ── Runtime ───────────────────────────────────────────────────────────────── -# Fail fast: Python will exit immediately if the module cannot be imported. 
-CMD ["python", "-m", "ere.entrypoints.app"] diff --git a/infra/docker-compose.yml b/infra/docker-compose.yml deleted file mode 100644 index ef5b8df..0000000 --- a/infra/docker-compose.yml +++ /dev/null @@ -1,66 +0,0 @@ -name: ere-local - -services: - - # ── Redis ────────────────────────────────────────────────────────────────── - redis: - image: redis:7-alpine - restart: unless-stopped - command: redis-server --requirepass ${REDIS_PASSWORD:-changeme} - ports: - - "6379:6379" - networks: - - ere-net - healthcheck: - test: ["CMD", "sh", "-c", "redis-cli --no-auth-warning -a $REDIS_PASSWORD ping"] - interval: 5s - timeout: 3s - retries: 5 - environment: - - REDIS_PASSWORD=${REDIS_PASSWORD:-changeme} - - - # ── Redis Insight (GUI for Redis) ────────────────────────────────────────── - redisinsight: - image: redis/redisinsight:latest - restart: unless-stopped - ports: - - "5540:5540" - networks: - - ere-net - environment: - # Optional: set analytics to false if you prefer no telemetry - - REDISINSIGHT_ANALYTICS=true - - - # ── Entity Resolution Engine ─────────────────────────────────────────────── - ere: - build: - context: .. 
- dockerfile: infra/Dockerfile - env_file: .env.local - restart: unless-stopped - ports: - - "${APP_PORT:-8000}:8000" - environment: - # DuckDB embedded file location (volume-mounted at /data) - - DUCKDB_PATH=${DUCKDB_PATH:-/data/app.duckdb} - # Config file paths in the container - - RDF_MAPPING_PATH=/app/config/rdf_mapping.yaml - - RESOLVER_CONFIG_PATH=/app/config/resolver.yaml - # Inherit REQUEST_QUEUE, RESPONSE_QUEUE, REDIS_* from .env.local - depends_on: - redis: - condition: service_healthy - volumes: - - ere-data:/data # DuckDB embedded file and other persistent state - networks: - - ere-net - -# ── Shared state ─────────────────────────────────────────────────────────── -volumes: - ere-data: - -# ── Internal network (not exposed to host) ───────────────────────────────── -networks: - ere-net: diff --git a/sonar-project.properties b/sonar-project.properties index 17cbff0..0a21d94 100644 --- a/sonar-project.properties +++ b/sonar-project.properties @@ -13,14 +13,14 @@ sonar.organization=meaningfy-ws # Display name and version sonar.projectName=Entity Resolution Engine (ERE) -sonar.projectVersion=0.1.0 +sonar.projectVersion=1.0.0 #----------------------------------------------------------------------------- # Code Analysis Paths #----------------------------------------------------------------------------- # Source code location (relative to sonar-project.properties) -sonar.sources=src +sonar.sources=src/ere sonar.tests=test # Source encoding @@ -53,7 +53,7 @@ sonar.coverage.exclusions=test/**/*,setup.py,**/__init__.py sonar.cpd.exclusions=test/**/* # Exclude documentation and config files from analysis -sonar.exclusions=docs/**/*,*.md,infra/**/* +sonar.exclusions=docs/**/*,*.md,src/infra/**/*,src/demo/**/* #----------------------------------------------------------------------------- # SOLID Principles & Clean Code Quality Gates diff --git a/src/VERSION b/src/VERSION new file mode 100644 index 0000000..3eefcb9 --- /dev/null +++ b/src/VERSION @@ -0,0 +1 @@ 
+1.0.0 diff --git a/infra/config/README.md b/src/config/README.md similarity index 96% rename from infra/config/README.md rename to src/config/README.md index 6707380..284a922 100644 --- a/infra/config/README.md +++ b/src/config/README.md @@ -150,7 +150,7 @@ To disable: Set `auto_train_threshold: 0` - **Fellegi-Sunter model**: [The Fellegi-Sunter model in Splink](https://moj-analytical-services.github.io/splink/theory/fellegi_sunter.html) -- **ERE algorithm**: See `docs/algorithm.md` for detailed explanation of the online greedy clustering approach. +- **ERE algorithm**: See `../../docs/algorithm.md` for detailed explanation of the online greedy clustering approach. --- diff --git a/infra/config/rdf_mapping.yaml b/src/config/rdf_mapping.yaml similarity index 97% rename from infra/config/rdf_mapping.yaml rename to src/config/rdf_mapping.yaml index 8dd02e2..4b856ed 100644 --- a/infra/config/rdf_mapping.yaml +++ b/src/config/rdf_mapping.yaml @@ -1,20 +1,20 @@ -# Namespace prefix registry - used by rdf_mapper.py to resolve prefixed names in field paths -namespaces: - epo: "http://data.europa.eu/a4g/ontology#" - org: "http://www.w3.org/ns/org#" - locn: "http://www.w3.org/ns/locn#" - cccev: "http://data.europa.eu/m8g/" - -# Entity type mappings: entity_type_string -> rdf_type + field property paths -# Property paths use / as separator for multi-hop traversal. -# Field names must match entity_fields in resolver.yaml (legal_name, country_code). 
-entity_types: - ORGANISATION: - rdf_type: "org:Organization" - fields: - legal_name: "epo:hasLegalName" - country_code: "cccev:registeredAddress/epo:hasCountryCode" - nuts_code: "cccev:registeredAddress/epo:hasNutsCode" - post_code: "cccev:registeredAddress/locn:postCode" - post_name: "cccev:registeredAddress/locn:postName" - thoroughfare: "cccev:registeredAddress/locn:thoroughfare" +# Namespace prefix registry - used by rdf_mapper.py to resolve prefixed names in field paths +namespaces: + epo: "http://data.europa.eu/a4g/ontology#" + org: "http://www.w3.org/ns/org#" + locn: "http://www.w3.org/ns/locn#" + cccev: "http://data.europa.eu/m8g/" + +# Entity type mappings: entity_type_string -> rdf_type + field property paths +# Property paths use / as separator for multi-hop traversal. +# Field names must match entity_fields in resolver.yaml (legal_name, country_code). +entity_types: + ORGANISATION: + rdf_type: "org:Organization" + fields: + legal_name: "epo:hasLegalName" + country_code: "cccev:registeredAddress/epo:hasCountryCode" + nuts_code: "cccev:registeredAddress/epo:hasNutsCode" + post_code: "cccev:registeredAddress/locn:postCode" + post_name: "cccev:registeredAddress/locn:postName" + thoroughfare: "cccev:registeredAddress/locn:thoroughfare" diff --git a/infra/config/resolver.yaml b/src/config/resolver.yaml similarity index 100% rename from infra/config/resolver.yaml rename to src/config/resolver.yaml diff --git a/infra/config/resolver_compound.yaml b/src/config/resolver_compound.yaml similarity index 96% rename from infra/config/resolver_compound.yaml rename to src/config/resolver_compound.yaml index 9cac682..47ff9d9 100644 --- a/infra/config/resolver_compound.yaml +++ b/src/config/resolver_compound.yaml @@ -1,25 +1,25 @@ -# Entity Resolver configuration — Compound blocking (country_code AND city) -# Blocks pairs unless both country_code AND city match. -# Creates tight, city-level blocks within countries. 
-# Trade-off: fewer comparisons (faster) but may miss cross-city variants. - -cache_strategy: tf_incremental - -threshold: 0.5 - -top_n: 100 - -match_weight_threshold: -10 - -splink: - probability_two_random_records_match: 0.3 - - comparisons: - - type: jaro_winkler - field: legal_name - thresholds: [0.9, 0.8] - - # Compound blocking rule: a pair is compared only if both country_code AND city match. - # This is expressed as a list with two fields. - blocking_rules: - - [country_code, city] +# Entity Resolver configuration — Compound blocking (country_code AND city) +# Blocks pairs unless both country_code AND city match. +# Creates tight, city-level blocks within countries. +# Trade-off: fewer comparisons (faster) but may miss cross-city variants. + +cache_strategy: tf_incremental + +threshold: 0.5 + +top_n: 100 + +match_weight_threshold: -10 + +splink: + probability_two_random_records_match: 0.3 + + comparisons: + - type: jaro_winkler + field: legal_name + thresholds: [0.9, 0.8] + + # Compound blocking rule: a pair is compared only if both country_code AND city match. + # This is expressed as a list with two fields. + blocking_rules: + - [country_code, city] diff --git a/infra/config/resolver_multirule.yaml b/src/config/resolver_multirule.yaml similarity index 96% rename from infra/config/resolver_multirule.yaml rename to src/config/resolver_multirule.yaml index 6e76a8c..c8395c9 100644 --- a/infra/config/resolver_multirule.yaml +++ b/src/config/resolver_multirule.yaml @@ -1,28 +1,28 @@ -# Entity Resolver configuration — Multi-rule blocking (country OR city OR name) -# Three independent blocking rules evaluated as OR (union). -# A pair is included if any rule fires: same country, OR same city, OR exact name match. -# Trade-off: more comparisons (slower) but higher recall for diverse datasets. 
- -cache_strategy: tf_incremental - -threshold: 0.5 - -top_n: 100 - -match_weight_threshold: -10 - -splink: - probability_two_random_records_match: 0.3 - - comparisons: - - type: jaro_winkler - field: legal_name - thresholds: [0.9, 0.8] - - # Multi-rule blocking: three independent rules, evaluated as UNION ALL. - # A pair is included if any rule fires (country_code match, OR city match, OR exact legal_name match). - # Splink deduplicates the results internally. - blocking_rules: - - country_code - - city - - legal_name +# Entity Resolver configuration — Multi-rule blocking (country OR city OR name) +# Three independent blocking rules evaluated as OR (union). +# A pair is included if any rule fires: same country, OR same city, OR exact name match. +# Trade-off: more comparisons (slower) but higher recall for diverse datasets. + +cache_strategy: tf_incremental + +threshold: 0.5 + +top_n: 100 + +match_weight_threshold: -10 + +splink: + probability_two_random_records_match: 0.3 + + comparisons: + - type: jaro_winkler + field: legal_name + thresholds: [0.9, 0.8] + + # Multi-rule blocking: three independent rules, evaluated as UNION ALL. + # A pair is included if any rule fires (country_code match, OR city match, OR exact legal_name match). + # Splink deduplicates the results internally. 
+ blocking_rules: + - country_code + - city + - legal_name diff --git a/demo/README.md b/src/demo/README.md similarity index 85% rename from demo/README.md rename to src/demo/README.md index aa45f79..e83c762 100644 --- a/demo/README.md +++ b/src/demo/README.md @@ -18,7 +18,7 @@ The demo treats ERE as a black box service accessible only through Redis message ## Configuration -Configuration is loaded from `.env.local` (or environment variables): +Configuration is loaded from `infra/.env` (or environment variables): | Variable | Default | Purpose | |----------|---------|---------| @@ -44,14 +44,13 @@ The script tries the configured host first, then falls back to `localhost` if th Start the full stack including Redis and ERE: ```bash -cd /home/greg/PROJECTS/ERS/ere-basic -docker-compose -f infra/docker-compose.yml up -d +make infra-rebuild ``` Wait for services to be ready (check logs): ```bash -docker-compose -f infra/docker-compose.yml logs -f +make infra-logs ``` ### 2. Locally (development) @@ -64,13 +63,13 @@ redis-cli ping # should return "PONG" # Run the demo cd /home/greg/PROJECTS/ERS/ere-basic -python3 demo/demo.py +python3 src/demo/demo.py ``` Or with Poetry: ```bash -poetry run python3 demo/demo.py +poetry run python3 src/demo/demo.py ``` **Runtime**: Approximately 5-35 seconds (5s sending + up to 30s waiting for responses). @@ -78,17 +77,17 @@ The demo sends messages with 1-second delays between them, then waits for respon ### Using Different Datasets -By default, the demo loads `demo/data/org-tiny.json`. Specify a different dataset with the `--data` parameter: +By default, the demo loads `src/demo/data/org-tiny.json`. 
Specify a different dataset with the `--data` parameter: ```bash # Use mentions dataset -poetry run python3 demo/demo.py --data demo/data/mentions_100b.json +poetry run python3 src/demo/demo.py --data src/demo/data/mentions_100b.json # Use larger dataset -poetry run python3 demo/demo.py --data demo/data/org-mid.json +poetry run python3 src/demo/demo.py --data src/demo/data/org-mid.json ``` -Available datasets in `demo/data/`: +Available datasets in `src/demo/data/`: - `org-tiny.json` (default) — 8 organization mentions, 2 clusters - `org-small.json` — Small (100 mentions) organization dataset - `org-mid.json` — Mid-size (1000 mentions) organization dataset @@ -138,12 +137,12 @@ CLUSTERING SUMMARY The demo logs: - **Request tracking**: Each sent mention with descriptive details - **Response logging**: Received cluster candidates with confidence/similarity scores -- **Clustering summary**: Final cluster assignments with member organizations (by default, saved to `demo/log/`) +- **Clustering summary**: Final cluster assignments with member organizations (by default, saved to `src/demo/log/`) - **Extended logging**: Trace-level logging for detailed resolution diagnostics ## Demo Data -Datasets are stored in `demo/data/` (JSON format with RDF Turtle content). +Datasets are stored in `src/demo/data/` (JSON format with RDF Turtle content). ### Dataset Correspondence to Stress Tests @@ -205,7 +204,7 @@ If it returns `PONG`, Redis is running. If not: - **Docker**: `docker run -d -p 6379:6379 redis:latest` - **Local Redis**: `brew install redis && brew services start redis` (macOS) -- **Docker Compose**: Ensure the service is running: `docker-compose -f infra/docker-compose.yml up redis` +- **Docker Compose**: Ensure the service is running: `make infra-up` ### Timeout waiting for responses @@ -216,14 +215,14 @@ If it returns `PONG`, Redis is running. 
If not: **Check ERE logs:** ```bash -docker-compose -f infra/docker-compose.yml logs ere +make infra-logs ``` ### Password authentication fails **Edit Redis connection parameters:** -Option 1: Modify `.env.local`: +Option 1: Modify `infra/.env`: ```bash REDIS_PASSWORD=your_password ``` @@ -231,7 +230,7 @@ REDIS_PASSWORD=your_password Option 2: Set environment variable: ```bash export REDIS_PASSWORD=your_password -python3 demo/demo.py +python3 src/demo/demo.py ``` ## Design Notes @@ -247,12 +246,12 @@ python3 demo/demo.py The demo logs all activity to: - **Console**: INFO-level messages (requests, responses, clustering summary) -- **Log file**: `demo/log/demo_YYYYMMDD-HHMM--DATASETNAME.log` with TRACE-level diagnostics +- **Log file**: `src/demo/log/demo_YYYYMMDD-HHMM--DATASETNAME.log` with TRACE-level diagnostics - Trace logs include detailed resolution diagnostics (field extraction, similarity scoring, etc.) - Clustering summary included at the end of each log file Configure logging via environment variable: ```bash export LOG_LEVEL=TRACE # TRACE, DEBUG, INFO, WARNING, ERROR -python3 demo/demo.py +python3 src/demo/demo.py ``` diff --git a/demo/__init__.py b/src/demo/__init__.py similarity index 100% rename from demo/__init__.py rename to src/demo/__init__.py diff --git a/demo/data/org-mid.json b/src/demo/data/org-mid.json similarity index 100% rename from demo/data/org-mid.json rename to src/demo/data/org-mid.json diff --git a/demo/data/org-small.json b/src/demo/data/org-small.json similarity index 100% rename from demo/data/org-small.json rename to src/demo/data/org-small.json diff --git a/demo/data/org-tiny.json b/src/demo/data/org-tiny.json similarity index 100% rename from demo/data/org-tiny.json rename to src/demo/data/org-tiny.json diff --git a/demo/demo.py b/src/demo/demo.py similarity index 88% rename from demo/demo.py rename to src/demo/demo.py index 711178d..6bf2570 100755 --- a/demo/demo.py +++ b/src/demo/demo.py @@ -1,531 +1,563 @@ -#!/usr/bin/env 
python3 -""" -Demo: Indirect Redis client for ERE (Entity Resolution Engine). - -This demo connects to ERE through the Redis queue infrastructure (no direct Python API). -It demonstrates: -1. Checking Redis connectivity -2. Sending EntityMentionResolutionRequest messages to the queue -3. Listening for EntityMentionResolutionResponse messages -4. Logging all interactions - -The example uses 6 synthetic mentions from ALGORITHM.md that cluster into 2 groups: - - Cluster 1: {1, 2, 5} (organizations with high similarity) - - Cluster 2: {3, 4, 6} (different organizations, also highly similar) - -⚠️ IMPORTANT: The ERE resolver persists state in a DuckDB database volume. - Before running a fresh demo with different data, clear the old database: - - docker volume rm ere-local_ere-data - docker-compose -f infra/docker-compose.yml up -d - - Failure to do so will mix old mentions with new ones, corrupting demo results. -""" - -import json -import logging -import os -import sys -import time -from datetime import datetime, timezone -from pathlib import Path - -import redis - -# Default data file path -DEFAULT_DATA_FILE = Path(__file__).parent / "data" / "org-tiny.json" - -DELAY_BETWEEN_MESSAGES = 0 # seconds to wait between sending messages (set to >0 for sequential processing) -GLOBAL_TIMEOUT = 0 # seconds to wait for responses before giving up (0 = no timeout) - - -# =============================================================================== -# Configuration -# =============================================================================== - -def load_env_file(env_path: str = None) -> dict: - """Load configuration from .env.local or environment variables.""" - config = {} - - # Try to load from .env.local if it exists - if env_path is None: - env_path = Path(__file__).parent.parent / "infra" / ".env.local" - - if Path(env_path).exists(): - with open(env_path) as f: - for line in f: - line = line.strip() - if line and not line.startswith("#"): - if "=" in line: - key, value 
= line.split("=", 1) - config[key.strip()] = value.strip() - - # Environment variables override .env.local - config["REDIS_HOST"] = os.environ.get("REDIS_HOST", config.get("REDIS_HOST", "localhost")) - config["REDIS_PORT"] = int(os.environ.get("REDIS_PORT", config.get("REDIS_PORT", "6379"))) - config["REDIS_DB"] = int(os.environ.get("REDIS_DB", config.get("REDIS_DB", "0"))) - config["REDIS_PASSWORD"] = os.environ.get("REDIS_PASSWORD", config.get("REDIS_PASSWORD")) - config["REQUEST_QUEUE"] = os.environ.get("REQUEST_QUEUE", config.get("REQUEST_QUEUE", "ere_requests")) - config["RESPONSE_QUEUE"] = os.environ.get("RESPONSE_QUEUE", config.get("RESPONSE_QUEUE", "ere_responses")) - - return config - - -# =============================================================================== -# Logging Setup -# =============================================================================== - -TRACE = 5 - -def setup_logging(): - """Configure logging with timestamps.""" - log_level_name = os.environ.get("LOG_LEVEL", "INFO").upper() - - # Handle custom TRACE level - if log_level_name == "TRACE": - log_level = TRACE - logging.addLevelName(TRACE, "TRACE") - else: - log_level = getattr(logging, log_level_name, logging.INFO) - - logging.basicConfig( - level=log_level, - format="%(asctime)s [%(levelname)s] %(message)s", - datefmt="%Y-%m-%d %H:%M:%S", - ) - - logger = logging.getLogger(__name__) - logger.setLevel(log_level) - logger.info(f"Logging configured at level {log_level_name}") - - return logger - - -# =============================================================================== -# Redis Connection -# =============================================================================== - -def check_redis_connectivity(host: str, port: int, db: int, password: str) -> redis.Redis: - """ - Check Redis connectivity and return client. - - Attempts connection to specified host first, then fallback to localhost - if configured host is "redis" (Docker). 
-
-    Raises:
-        RuntimeError: If Redis is not accessible.
-    """
-    hosts_to_try = [host]
-
-    # Fallback: if configured host is "redis" (Docker), also try localhost
-    if host == "redis":
-        hosts_to_try.append("localhost")
-
-    last_error = None
-    for try_host in hosts_to_try:
-        try:
-            logging.getLogger(__name__).info(f"Attempting Redis connection to {try_host}:{port}...")
-            client = redis.Redis(
-                host=try_host,
-                port=port,
-                db=db,
-                password=password,
-                decode_responses=False,
-            )
-            client.ping()
-            return client
-        except Exception as e:
-            last_error = e
-            continue
-
-    raise RuntimeError(
-        f"Redis unavailable. Tried hosts: {hosts_to_try}, port: {port}, db: {db}"
-    ) from last_error
-
-
-# ===============================================================================
-# Request/Response Handling
-# ===============================================================================
-
-def escape_turtle_string(value: str) -> str:
-    """
-    Escape a string for safe inclusion in Turtle RDF format.
-
-    Handles special characters: backslash, double quotes, newlines, carriage returns, tabs.
-
-    Args:
-        value: String to escape
-
-    Returns:
-        Escaped string safe for use in Turtle string literals
-    """
-    if not value:
-        return value
-
-    # Escape backslash first (must be done before other escapes)
-    value = value.replace("\\", "\\\\")
-    # Escape double quotes
-    value = value.replace('"', '\\"')
-    # Escape newlines
-    value = value.replace("\n", "\\n")
-    # Escape carriage returns
-    value = value.replace("\r", "\\r")
-    # Escape tabs
-    value = value.replace("\t", "\\t")
-
-    return value
-
-
-def create_entity_mention_request(
-    request_id: str,
-    source_id: str,
-    entity_type: str,
-    legal_name: str,
-    country_code: str,
-    nuts_code: str | None = None,
-    post_code: str | None = None,
-    post_name: str | None = None,
-    thoroughfare: str | None = None,
-) -> dict:
-    """
-    Create an EntityMentionResolutionRequest payload.
-
-    Uses RDF/Turtle format with entity metadata including extended address fields.
-    All string values are properly escaped for Turtle compatibility.
-
-    Args:
-        request_id: Unique request identifier
-        source_id: Source system identifier
-        entity_type: Entity type (e.g., ORGANISATION)
-        legal_name: Legal name of the entity
-        country_code: ISO 2-letter country code
-        nuts_code: Optional NUTS regional code
-        post_code: Optional postal code
-        post_name: Optional city/locality name
-        thoroughfare: Optional street address
-    """
-    # Escape all string values for Turtle safety
-    legal_name_safe = escape_turtle_string(legal_name or "")
-    country_code_safe = escape_turtle_string(country_code or "")
-
-    # Build address properties dynamically
-    address_props = [f'epo:hasCountryCode "{country_code_safe}"']
-    if nuts_code:
-        nuts_code_safe = escape_turtle_string(nuts_code)
-        address_props.append(f'epo:hasNutsCode "{nuts_code_safe}"')
-    if post_code:
-        post_code_safe = escape_turtle_string(post_code)
-        address_props.append(f'locn:postCode "{post_code_safe}"')
-    if post_name:
-        post_name_safe = escape_turtle_string(post_name)
-        address_props.append(f'locn:postName "{post_name_safe}"')
-    if thoroughfare:
-        thoroughfare_safe = escape_turtle_string(thoroughfare)
-        address_props.append(f'locn:thoroughfare "{thoroughfare_safe}"')
-
-    address_content = ' ;\n        '.join(address_props)
-
-    content = f"""@prefix org: .
-@prefix cccev: .
-@prefix epo: .
-@prefix locn: .
-@prefix epd: .
-
-epd:ent{request_id} a org:Organization ;
-    epo:hasLegalName "{legal_name_safe}" ;
-    cccev:registeredAddress [
-        {address_content}
-    ] .
-"""
-
-    return {
-        "type": "EntityMentionResolutionRequest",
-        "entity_mention": {
-            "identifiedBy": {
-                "request_id": request_id,
-                "source_id": source_id,
-                "entity_type": entity_type,
-            },
-            "content": content.strip(),
-            "content_type": "text/turtle",
-        },
-        "timestamp": datetime.now(timezone.utc).isoformat(),
-        "ere_request_id": f"{request_id}:01",
-    }
-
-
-def parse_response(response_bytes: bytes) -> dict:
-    """Parse JSON response from Redis."""
-    return json.loads(response_bytes.decode("utf-8"))
-
-
-# ===============================================================================
-# Demo Data Loading
-# ===============================================================================
-
-def load_demo_mentions(data_file: str | None = None) -> list[dict]:
-    """
-    Load demo mentions from a JSON file.
-
-    Args:
-        data_file: Path to JSON file containing mentions. If None, uses default.
-
-    Returns:
-        List of mention dicts with keys: request_id, source_id, entity_type,
-        legal_name, country_code, description.
-
-    Raises:
-        FileNotFoundError: If data file does not exist.
-        ValueError: If JSON is invalid or missing 'mentions' key.
-    """
-    if data_file is None:
-        data_file = DEFAULT_DATA_FILE
-
-    data_path = Path(data_file)
-    if not data_path.exists():
-        raise FileNotFoundError(f"Data file not found: {data_path}")
-
-    with open(data_path) as f:
-        data = json.load(f)
-
-    if "mentions" not in data:
-        raise ValueError(f"JSON must contain 'mentions' key")
-
-    return data["mentions"]
-
-
-# ===============================================================================
-# Main Demo
-# ===============================================================================
-
-def main(data_file: str | None = None):
-    """
-    Run the Redis-based ERE demo.
-
-    Args:
-        data_file: Path to JSON file containing demo mentions.
-            If None, uses default (mentions_mixed_countries.json).
-    """
-    logger = setup_logging()
-
-    # Load configuration
-    logger.info("Loading configuration...")
-    config = load_env_file()
-    logger.info(
-        f"Redis config: host={config['REDIS_HOST']}, "
-        f"port={config['REDIS_PORT']}, db={config['REDIS_DB']}"
-    )
-    logger.info(
-        f"Queue names: request={config['REQUEST_QUEUE']}, "
-        f"response={config['RESPONSE_QUEUE']}"
-    )
-
-    # Load demo mentions from JSON
-    try:
-        demo_mentions = load_demo_mentions(data_file)
-        logger.info(f"Loaded {len(demo_mentions)} mentions from {data_file or DEFAULT_DATA_FILE}")
-    except (FileNotFoundError, ValueError) as e:
-        logger.error(f"Failed to load demo mentions: {e}")
-        return 1
-
-    # Check Redis connectivity
-    logger.info("Checking Redis connectivity...")
-    try:
-        redis_client = check_redis_connectivity(
-            host=config["REDIS_HOST"],
-            port=config["REDIS_PORT"],
-            db=config["REDIS_DB"],
-            password=config["REDIS_PASSWORD"],
-        )
-        logger.info("✓ Redis is available")
-    except RuntimeError as e:
-        logger.error(f"✗ Redis check failed: {e}")
-        return 1
-
-    # Clear queues
-    logger.info("Clearing request and response queues...")
-    redis_client.delete(config["REQUEST_QUEUE"], config["RESPONSE_QUEUE"])
-
-    # ⚠️ Check if DuckDB database is non-empty (stale from prior runs)
-    # This guards against corrupting demo results by mixing old and new mentions
-    duckdb_path = Path(os.environ.get("DUCKDB_PATH", "/data/app.duckdb"))
-    if duckdb_path.exists() and duckdb_path.stat().st_size > 0:
-        logger.warning(
-            f"⚠️ WARNING: DuckDB database file exists and is non-empty!\n"
-            f"   This may contain mentions from a prior run.\n"
-            f"   This will CORRUPT demo results by mixing old and new data.\n"
-            f"   \n"
-            f"   To reset the database:\n"
-            f"   1. docker volume rm ere-local_ere-data\n"
-            f"   2. docker-compose -f infra/docker-compose.yml up -d\n"
-        )
-
-    # Send demo requests
-    logger.info(f"Sending {len(demo_mentions)} entity mentions...")
-    request_ids = []
-
-    for mention in demo_mentions:
-        request = create_entity_mention_request(
-            request_id=mention["request_id"],
-            source_id=mention["source_id"],
-            entity_type=mention["entity_type"],
-            legal_name=mention["legal_name"],
-            country_code=mention["country_code"],
-            nuts_code=mention.get("nuts_code"),
-            post_code=mention.get("post_code"),
-            post_name=mention.get("post_name"),
-            thoroughfare=mention.get("thoroughfare"),
-        )
-
-        message_json = json.dumps(request)
-        if logger.isEnabledFor(TRACE):
-            logger.log(TRACE, f"Full request message:\n{json.dumps(request, indent=2)}")
-
-        message_bytes = message_json.encode("utf-8")
-        redis_client.rpush(config["REQUEST_QUEUE"], message_bytes)
-        request_ids.append(mention["request_id"])
-
-        logger.info(
-            f"  → Sent request {mention['request_id']}: "
-            f"{mention['legal_name']} ({mention['country_code']}) "
-            f"[{mention.get('description', '')}]"
-        )
-
-        # Wait 1 second between messages to ensure sequential processing
-        if DELAY_BETWEEN_MESSAGES:
-            time.sleep(1)
-
-    logger.info("")
-    logger.info("Listening for responses...")
-    logger.info("-" * 80)
-
-    # Track mentions for summary: map request_id → (legal_name, cluster_id)
-    mention_tracking = {}
-    for mention in demo_mentions:
-        mention_tracking[mention["request_id"]] = {
-            "legal_name": mention["legal_name"],
-            "cluster_id": None,  # Will be filled in from response
-        }
-
-    # Listen for responses
-    responses_received = {}
-    start_time = time.time()
-
-    while len(responses_received) < len(request_ids):
-        elapsed = time.time() - start_time
-        if GLOBAL_TIMEOUT > 0 and elapsed > GLOBAL_TIMEOUT:
-            logger.warning(f"Timeout after {GLOBAL_TIMEOUT}s. Received {len(responses_received)}/{len(request_ids)} responses.")
-            break
-
-        # Try to get a response with short timeout
-        result = redis_client.brpop(config["RESPONSE_QUEUE"], timeout=1)
-
-        if result is not None:
-            _, response_bytes = result
-            response = parse_response(response_bytes)
-
-            if logger.isEnabledFor(TRACE):
-                logger.log(TRACE, f"Full response message:\n{json.dumps(response, indent=2)}")
-
-            req_id = response["entity_mention_id"]["request_id"]
-            responses_received[req_id] = response
-
-            logger.info(f"\n✓ Response received for {req_id}:")
-            logger.info(f"  Type: {response['type']}")
-            logger.info(f"  Timestamp: {response['timestamp']}")
-
-            source_id = response["entity_mention_id"]["source_id"]
-            entity_type = response["entity_mention_id"]["entity_type"]
-            logger.info(f"  Mention: ({source_id}, {req_id}, {entity_type})")
-
-            logger.info(f"  Candidates:")
-
-            # Track the top cluster assignment (first candidate is the assignment)
-            if response.get("candidates"):
-                top_candidate = response["candidates"][0]
-                assigned_cluster = top_candidate["cluster_id"]
-                mention_tracking[req_id]["cluster_id"] = assigned_cluster
-                logger.info(f"    → Assigned to cluster: {assigned_cluster}")
-
-            for i, candidate in enumerate(response.get("candidates", []), 1):
-                logger.info(
-                    f"    {i}. Cluster {candidate['cluster_id']}: "
-                    f"confidence={candidate['confidence_score']:.4f}, "
-                    f"similarity={candidate['similarity_score']:.4f}"
-                )
-
-    logger.info("-" * 80)
-    logger.info(f"\nDemo complete. Received {len(responses_received)}/{len(request_ids)} responses.")
-
-    # Build clustering summary as single block
-    summary_lines = []
-    summary_lines.append("=" * 80)
-    summary_lines.append("CLUSTERING SUMMARY")
-    summary_lines.append("=" * 80)
-
-    # Group mentions by assigned cluster
-    clusters = {}
-    unassigned = []
-
-    for req_id in request_ids:
-        tracking = mention_tracking.get(req_id)
-        if tracking:
-            cluster_id = tracking["cluster_id"]
-            legal_name = tracking["legal_name"]
-
-            if cluster_id is None:
-                unassigned.append((req_id, legal_name))
-            else:
-                if cluster_id not in clusters:
-                    clusters[cluster_id] = []
-                clusters[cluster_id].append((req_id, legal_name))
-
-    # Build cluster output
-    if clusters:
-        for cluster_id in sorted(clusters.keys()):
-            members = clusters[cluster_id]
-            summary_lines.append("")
-            summary_lines.append(f"{cluster_id} ({len(members)} members):")
-            for req_id, legal_name in members:
-                summary_lines.append(f"  {req_id:4s} | {legal_name}")
-    else:
-        summary_lines.append("")
-        summary_lines.append("(No clusters formed)")
-
-    # Add unassigned mentions
-    if unassigned:
-        summary_lines.append("")
-        summary_lines.append(f"Unassigned ({len(unassigned)} mentions):")
-        for req_id, legal_name in unassigned:
-            summary_lines.append(f"  {req_id:4s} | {legal_name}")
-
-    summary_lines.append("=" * 80)
-
-    # Print entire summary in one log call
-    summary_block = "\n".join(summary_lines)
-    logger.info(f"\n{summary_block}")
-
-    # Summary
-    if len(responses_received) == len(request_ids):
-        logger.info("✓ All responses received successfully!")
-        return 0
-    else:
-        logger.warning(f"✗ Missing {len(request_ids) - len(responses_received)} response(s).")
-        return 1
-
-
-if __name__ == "__main__":
-    import argparse
-
-    parser = argparse.ArgumentParser(
-        description="Redis-based ERE demo with parametrized mentions data."
-    )
-    parser.add_argument(
-        "--data",
-        type=str,
-        default=None,
-        help=f"Path to JSON file with demo mentions (default: {DEFAULT_DATA_FILE})",
-    )
-    args = parser.parse_args()
-
-    sys.exit(main(data_file=args.data))
+#!/usr/bin/env python3
+"""
+Demo: Indirect Redis client for ERE (Entity Resolution Engine).
+
+This demo connects to ERE through the Redis queue infrastructure (no direct Python API).
+It demonstrates:
+1. Checking Redis connectivity
+2. Sending EntityMentionResolutionRequest messages to the queue
+3. Listening for EntityMentionResolutionResponse messages
+4. Logging all interactions
+
+The example uses 6 synthetic mentions from ALGORITHM.md that cluster into 2 groups:
+  - Cluster 1: {1, 2, 5} (organizations with high similarity)
+  - Cluster 2: {3, 4, 6} (different organizations, also highly similar)
+
+⚠️ IMPORTANT: The ERE resolver persists state in a DuckDB database volume.
+   Before running a fresh demo with different data, clear the old database:
+
+       docker volume rm ere-local_ere-data
+       make infra-rebuild
+
+   Failure to do so will mix old mentions with new ones, corrupting demo results.
+"""
+
+import json
+import logging
+import os
+import sys
+import time
+from datetime import datetime, timezone
+from pathlib import Path
+
+import redis
+
+# Default data file path
+DEFAULT_DATA_FILE = Path(__file__).parent / "data" / "org-tiny.json"
+
+DELAY_BETWEEN_MESSAGES = (
+    0  # seconds to wait between sending messages (set to >0 for sequential processing)
+)
+GLOBAL_TIMEOUT = 0  # seconds to wait for responses before giving up (0 = no timeout)
+
+
+# ===============================================================================
+# Configuration
+# ===============================================================================
+
+
+def load_env_file(env_path: str = None) -> dict:
+    """Load configuration from .env or environment variables."""
+    config = {}
+
+    # Try to load from .env if it exists
+    if env_path is None:
+        env_path = Path(__file__).parent.parent / "infra" / ".env"
+
+    if Path(env_path).exists():
+        with open(env_path) as f:
+            for line in f:
+                line = line.strip()
+                if line and not line.startswith("#"):
+                    if "=" in line:
+                        key, value = line.split("=", 1)
+                        config[key.strip()] = value.strip()
+
+    # Environment variables override .env
+    config["REDIS_HOST"] = os.environ.get(
+        "REDIS_HOST", config.get("REDIS_HOST", "localhost")
+    )
+    config["REDIS_PORT"] = int(
+        os.environ.get("REDIS_PORT", config.get("REDIS_PORT", "6379"))
+    )
+    config["REDIS_DB"] = int(os.environ.get("REDIS_DB", config.get("REDIS_DB", "0")))
+    config["REDIS_PASSWORD"] = os.environ.get(
+        "REDIS_PASSWORD", config.get("REDIS_PASSWORD")
+    )
+    config["REQUEST_QUEUE"] = os.environ.get(
+        "REQUEST_QUEUE", config.get("REQUEST_QUEUE", "ere_requests")
+    )
+    config["RESPONSE_QUEUE"] = os.environ.get(
+        "RESPONSE_QUEUE", config.get("RESPONSE_QUEUE", "ere_responses")
+    )
+
+    return config
+
+
+# ===============================================================================
+# Logging Setup
+# ===============================================================================
+
+TRACE = 5
+
+
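As an aside from the diff itself: the precedence that `load_env_file` above implements (a `.env` file supplies defaults, real environment variables win, and a hard-coded fallback applies last) can be sketched in isolation. The `resolve` helper and the sample keys below are illustrative stand-ins, not code from this repository.

```python
import os

def resolve(key: str, file_config: dict, default: str) -> str:
    # Environment variable beats the .env value, which beats the default —
    # the same lookup order load_env_file applies per key.
    return os.environ.get(key, file_config.get(key, default))

file_config = {"REDIS_HOST": "redis"}   # as if parsed from infra/.env
os.environ.pop("REDIS_HOST", None)      # no override set
print(resolve("REDIS_HOST", file_config, "localhost"))  # → redis

os.environ["REDIS_HOST"] = "127.0.0.1"  # environment override wins
print(resolve("REDIS_HOST", file_config, "localhost"))  # → 127.0.0.1
```

Note that `REDIS_PASSWORD` in the real function deliberately has no string default, so an absent key resolves to `None`.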
+def setup_logging():
+    """Configure logging with timestamps."""
+    log_level_name = os.environ.get("ERE_LOG_LEVEL", "INFO").upper()
+
+    # Handle custom TRACE level
+    if log_level_name == "TRACE":
+        log_level = TRACE
+        logging.addLevelName(TRACE, "TRACE")
+    else:
+        log_level = getattr(logging, log_level_name, logging.INFO)
+
+    logging.basicConfig(
+        level=log_level,
+        format="%(asctime)s [%(levelname)s] %(message)s",
+        datefmt="%Y-%m-%d %H:%M:%S",
+    )
+
+    logger = logging.getLogger(__name__)
+    logger.setLevel(log_level)
+    logger.info(f"Logging configured at level {log_level_name}")
+
+    return logger
+
+
+# ===============================================================================
+# Redis Connection
+# ===============================================================================
+
+
+def check_redis_connectivity(
+    host: str, port: int, db: int, password: str
+) -> redis.Redis:
+    """
+    Check Redis connectivity and return client.
+
+    Attempts connection to specified host first, then fallback to localhost
+    if configured host is "redis" (Docker).
+
+    Raises:
+        RuntimeError: If Redis is not accessible.
+    """
+    hosts_to_try = [host]
+
+    # Fallback: if configured host is "redis" (Docker), also try localhost
+    if host == "redis":
+        hosts_to_try.append("localhost")
+
+    last_error = None
+    for try_host in hosts_to_try:
+        try:
+            logging.getLogger(__name__).info(
+                f"Attempting Redis connection to {try_host}:{port}..."
+            )
+            client = redis.Redis(
+                host=try_host,
+                port=port,
+                db=db,
+                password=password,
+                decode_responses=False,
+            )
+            client.ping()
+            return client
+        except Exception as e:
+            last_error = e
+            continue
+
+    raise RuntimeError(
+        f"Redis unavailable. Tried hosts: {hosts_to_try}, port: {port}, db: {db}"
+    ) from last_error
+
+
+# ===============================================================================
+# Request/Response Handling
+# ===============================================================================
+
+
+def escape_turtle_string(value: str) -> str:
+    """
+    Escape a string for safe inclusion in Turtle RDF format.
+
+    Handles special characters: backslash, double quotes, newlines, carriage returns, tabs.
+
+    Args:
+        value: String to escape
+
+    Returns:
+        Escaped string safe for use in Turtle string literals
+    """
+    if not value:
+        return value
+
+    # Escape backslash first (must be done before other escapes)
+    value = value.replace("\\", "\\\\")
+    # Escape double quotes
+    value = value.replace('"', '\\"')
+    # Escape newlines
+    value = value.replace("\n", "\\n")
+    # Escape carriage returns
+    value = value.replace("\r", "\\r")
+    # Escape tabs
+    value = value.replace("\t", "\\t")
+
+    return value
+
+
+def create_entity_mention_request(
+    request_id: str,
+    source_id: str,
+    entity_type: str,
+    legal_name: str,
+    country_code: str,
+    nuts_code: str | None = None,
+    post_code: str | None = None,
+    post_name: str | None = None,
+    thoroughfare: str | None = None,
+) -> dict:
+    """
+    Create an EntityMentionResolutionRequest payload.
+
+    Uses RDF/Turtle format with entity metadata including extended address fields.
+    All string values are properly escaped for Turtle compatibility.
+
+    Args:
+        request_id: Unique request identifier
+        source_id: Source system identifier
+        entity_type: Entity type (e.g., ORGANISATION)
+        legal_name: Legal name of the entity
+        country_code: ISO 2-letter country code
+        nuts_code: Optional NUTS regional code
+        post_code: Optional postal code
+        post_name: Optional city/locality name
+        thoroughfare: Optional street address
+    """
+    # Escape all string values for Turtle safety
+    legal_name_safe = escape_turtle_string(legal_name or "")
+    country_code_safe = escape_turtle_string(country_code or "")
+
+    # Build address properties dynamically
+    address_props = [f'epo:hasCountryCode "{country_code_safe}"']
+    if nuts_code:
+        nuts_code_safe = escape_turtle_string(nuts_code)
+        address_props.append(f'epo:hasNutsCode "{nuts_code_safe}"')
+    if post_code:
+        post_code_safe = escape_turtle_string(post_code)
+        address_props.append(f'locn:postCode "{post_code_safe}"')
+    if post_name:
+        post_name_safe = escape_turtle_string(post_name)
+        address_props.append(f'locn:postName "{post_name_safe}"')
+    if thoroughfare:
+        thoroughfare_safe = escape_turtle_string(thoroughfare)
+        address_props.append(f'locn:thoroughfare "{thoroughfare_safe}"')
+
+    address_content = " ;\n        ".join(address_props)
+
+    content = f"""@prefix org: .
+@prefix cccev: .
+@prefix epo: .
+@prefix locn: .
+@prefix epd: .
+
+epd:ent{request_id} a org:Organization ;
+    epo:hasLegalName "{legal_name_safe}" ;
+    cccev:registeredAddress [
+        {address_content}
+    ] .
+"""
+
+    return {
+        "type": "EntityMentionResolutionRequest",
+        "entity_mention": {
+            "identifiedBy": {
+                "request_id": request_id,
+                "source_id": source_id,
+                "entity_type": entity_type,
+            },
+            "content": content.strip(),
+            "content_type": "text/turtle",
+        },
+        "timestamp": datetime.now(timezone.utc).isoformat(),
+        "ere_request_id": f"{request_id}:01",
+    }
+
+
+def parse_response(response_bytes: bytes) -> dict:
+    """Parse JSON response from Redis."""
+    return json.loads(response_bytes.decode("utf-8"))
+
+
+# ===============================================================================
+# Demo Data Loading
+# ===============================================================================
+
+
+def load_demo_mentions(data_file: str | None = None) -> list[dict]:
+    """
+    Load demo mentions from a JSON file.
+
+    Args:
+        data_file: Path to JSON file containing mentions. If None, uses default.
+
+    Returns:
+        List of mention dicts with keys: request_id, source_id, entity_type,
+        legal_name, country_code, description.
+
+    Raises:
+        FileNotFoundError: If data file does not exist.
+        ValueError: If JSON is invalid or missing 'mentions' key.
+    """
+    if data_file is None:
+        data_file = DEFAULT_DATA_FILE
+
+    data_path = Path(data_file)
+    if not data_path.exists():
+        raise FileNotFoundError(f"Data file not found: {data_path}")
+
+    with open(data_path) as f:
+        data = json.load(f)
+
+    if "mentions" not in data:
+        raise ValueError("JSON must contain 'mentions' key")
+
+    return data["mentions"]
+
+
+# ===============================================================================
+# Main Demo
+# ===============================================================================
+
+
+def main(data_file: str | None = None):
+    """
+    Run the Redis-based ERE demo.
+
+    Args:
+        data_file: Path to JSON file containing demo mentions.
+            If None, uses default (org-tiny.json).
+    """
+    logger = setup_logging()
+
+    # Load configuration
+    logger.info("Loading configuration...")
+    config = load_env_file()
+    logger.info(
+        f"Redis config: host={config['REDIS_HOST']}, "
+        f"port={config['REDIS_PORT']}, db={config['REDIS_DB']}"
+    )
+    logger.info(
+        f"Queue names: request={config['REQUEST_QUEUE']}, "
+        f"response={config['RESPONSE_QUEUE']}"
+    )
+
+    # Load demo mentions from JSON
+    try:
+        demo_mentions = load_demo_mentions(data_file)
+        logger.info(
+            f"Loaded {len(demo_mentions)} mentions from {data_file or DEFAULT_DATA_FILE}"
+        )
+    except (FileNotFoundError, ValueError) as e:
+        logger.error(f"Failed to load demo mentions: {e}")
+        return 1
+
+    # Check Redis connectivity
+    logger.info("Checking Redis connectivity...")
+    try:
+        redis_client = check_redis_connectivity(
+            host=config["REDIS_HOST"],
+            port=config["REDIS_PORT"],
+            db=config["REDIS_DB"],
+            password=config["REDIS_PASSWORD"],
+        )
+        logger.info("✓ Redis is available")
+    except RuntimeError as e:
+        logger.error(f"✗ Redis check failed: {e}")
+        return 1
+
+    # Clear queues
+    logger.info("Clearing request and response queues...")
+    redis_client.delete(config["REQUEST_QUEUE"], config["RESPONSE_QUEUE"])
+
+    # ⚠️ Check if DuckDB database is non-empty (stale from prior runs)
+    # This guards against corrupting demo results by mixing old and new mentions
+    duckdb_path = Path(os.environ.get("DUCKDB_PATH", "/data/app.duckdb"))
+    if duckdb_path.exists() and duckdb_path.stat().st_size > 0:
+        logger.warning(
+            f"⚠️ WARNING: DuckDB database file exists and is non-empty!\n"
+            f"   This may contain mentions from a prior run.\n"
+            f"   This will CORRUPT demo results by mixing old and new data.\n"
+            f"   \n"
+            f"   To reset the database:\n"
+            f"   1. docker volume rm ere-local_ere-data\n"
+            f"   2. make infra-rebuild\n"
+        )
+
+    # Send demo requests
+    logger.info(f"Sending {len(demo_mentions)} entity mentions...")
+    request_ids = []
+
+    for mention in demo_mentions:
+        request = create_entity_mention_request(
+            request_id=mention["request_id"],
+            source_id=mention["source_id"],
+            entity_type=mention["entity_type"],
+            legal_name=mention["legal_name"],
+            country_code=mention["country_code"],
+            nuts_code=mention.get("nuts_code"),
+            post_code=mention.get("post_code"),
+            post_name=mention.get("post_name"),
+            thoroughfare=mention.get("thoroughfare"),
+        )
+
+        message_json = json.dumps(request)
+        if logger.isEnabledFor(TRACE):
+            logger.log(TRACE, f"Full request message:\n{json.dumps(request, indent=2)}")
+
+        message_bytes = message_json.encode("utf-8")
+        redis_client.rpush(config["REQUEST_QUEUE"], message_bytes)
+        request_ids.append(mention["request_id"])
+
+        logger.info(
+            f"  → Sent request {mention['request_id']}: "
+            f"{mention['legal_name']} ({mention['country_code']}) "
+            f"[{mention.get('description', '')}]"
+        )
+
+        # Wait DELAY_BETWEEN_MESSAGES seconds between messages for sequential processing
+        if DELAY_BETWEEN_MESSAGES:
+            time.sleep(DELAY_BETWEEN_MESSAGES)
+
+    logger.info("")
+    logger.info("Listening for responses...")
+    logger.info("-" * 80)
+
+    # Track mentions for summary: map request_id → (legal_name, cluster_id)
+    mention_tracking = {}
+    for mention in demo_mentions:
+        mention_tracking[mention["request_id"]] = {
+            "legal_name": mention["legal_name"],
+            "cluster_id": None,  # Will be filled in from response
+        }
+
+    # Listen for responses
+    responses_received = {}
+    start_time = time.time()
+
+    while len(responses_received) < len(request_ids):
+        elapsed = time.time() - start_time
+        if GLOBAL_TIMEOUT > 0 and elapsed > GLOBAL_TIMEOUT:
+            logger.warning(
+                f"Timeout after {GLOBAL_TIMEOUT}s. Received {len(responses_received)}/{len(request_ids)} responses."
+            )
+            break
+
+        # Try to get a response with short timeout
+        result = redis_client.brpop(config["RESPONSE_QUEUE"], timeout=1)
+
+        if result is not None:
+            _, response_bytes = result
+            response = parse_response(response_bytes)
+
+            if logger.isEnabledFor(TRACE):
+                logger.log(
+                    TRACE, f"Full response message:\n{json.dumps(response, indent=2)}"
+                )
+
+            req_id = response["entity_mention_id"]["request_id"]
+            responses_received[req_id] = response
+
+            logger.info(f"\n✓ Response received for {req_id}:")
+            logger.info(f"  Type: {response['type']}")
+            logger.info(f"  Timestamp: {response['timestamp']}")
+
+            source_id = response["entity_mention_id"]["source_id"]
+            entity_type = response["entity_mention_id"]["entity_type"]
+            logger.info(f"  Mention: ({source_id}, {req_id}, {entity_type})")
+
+            logger.info("  Candidates:")
+
+            # Track the top cluster assignment (first candidate is the assignment)
+            if response.get("candidates"):
+                top_candidate = response["candidates"][0]
+                assigned_cluster = top_candidate["cluster_id"]
+                mention_tracking[req_id]["cluster_id"] = assigned_cluster
+                logger.info(f"    → Assigned to cluster: {assigned_cluster}")
+
+            for i, candidate in enumerate(response.get("candidates", []), 1):
+                logger.info(
+                    f"    {i}. Cluster {candidate['cluster_id']}: "
+                    f"confidence={candidate['confidence_score']:.4f}, "
+                    f"similarity={candidate['similarity_score']:.4f}"
+                )
+
+    logger.info("-" * 80)
+    logger.info(
+        f"\nDemo complete. Received {len(responses_received)}/{len(request_ids)} responses."
+    )
+
+    # Build clustering summary as single block
+    summary_lines = []
+    summary_lines.append("=" * 80)
+    summary_lines.append("CLUSTERING SUMMARY")
+    summary_lines.append("=" * 80)
+
+    # Group mentions by assigned cluster
+    clusters = {}
+    unassigned = []
+
+    for req_id in request_ids:
+        tracking = mention_tracking.get(req_id)
+        if tracking:
+            cluster_id = tracking["cluster_id"]
+            legal_name = tracking["legal_name"]
+
+            if cluster_id is None:
+                unassigned.append((req_id, legal_name))
+            else:
+                if cluster_id not in clusters:
+                    clusters[cluster_id] = []
+                clusters[cluster_id].append((req_id, legal_name))
+
+    # Build cluster output
+    if clusters:
+        for cluster_id in sorted(clusters.keys()):
+            members = clusters[cluster_id]
+            summary_lines.append("")
+            summary_lines.append(f"{cluster_id} ({len(members)} members):")
+            for req_id, legal_name in members:
+                summary_lines.append(f"  {req_id:4s} | {legal_name}")
+    else:
+        summary_lines.append("")
+        summary_lines.append("(No clusters formed)")
+
+    # Add unassigned mentions
+    if unassigned:
+        summary_lines.append("")
+        summary_lines.append(f"Unassigned ({len(unassigned)} mentions):")
+        for req_id, legal_name in unassigned:
+            summary_lines.append(f"  {req_id:4s} | {legal_name}")
+
+    summary_lines.append("=" * 80)
+
+    # Print entire summary in one log call
+    summary_block = "\n".join(summary_lines)
+    logger.info(f"\n{summary_block}")
+
+    # Summary
+    if len(responses_received) == len(request_ids):
+        logger.info("✓ All responses received successfully!")
+        return 0
+    else:
+        logger.warning(
+            f"✗ Missing {len(request_ids) - len(responses_received)} response(s)."
+        )
+        return 1
+
+
+if __name__ == "__main__":
+    import argparse
+
+    parser = argparse.ArgumentParser(
+        description="Redis-based ERE demo with parametrized mentions data."
+    )
+    parser.add_argument(
+        "--data",
+        type=str,
+        default=None,
+        help=f"Path to JSON file with demo mentions (default: {DEFAULT_DATA_FILE})",
+    )
+    args = parser.parse_args()
+
+    sys.exit(main(data_file=args.data))
diff --git a/src/ere/adapters/duckdb_repositories.py b/src/ere/adapters/duckdb_repositories.py
index 7b58ad9..c65caa8 100644
--- a/src/ere/adapters/duckdb_repositories.py
+++ b/src/ere/adapters/duckdb_repositories.py
@@ -3,7 +3,13 @@
 import duckdb
 import pandas as pd
 
-from ere.models.resolver import ClusterId, ClusterMembership, Mention, MentionId, MentionLink
+from ere.models.resolver import (
+    ClusterId,
+    ClusterMembership,
+    Mention,
+    MentionId,
+    MentionLink,
+)
 from ere.adapters.repositories import (
     ClusterRepository,
     MentionRepository,
diff --git a/src/ere/adapters/rdf_mapper.py b/src/ere/adapters/rdf_mapper.py
index 37bf6bb..1f45fc1 100644
--- a/src/ere/adapters/rdf_mapper.py
+++ b/src/ere/adapters/rdf_mapper.py
@@ -87,9 +87,7 @@ def extract_mention_attributes(
     entity_subject = graph.value(predicate=RDF.type, object=rdf_type)
 
     if entity_subject is None:
-        raise ValueError(
-            f"No entity of type {rdf_type} found in RDF content"
-        )
+        raise ValueError(f"No entity of type {rdf_type} found in RDF content")
 
     # Extract attributes per config
     attributes = {}
diff --git a/src/ere/adapters/rdf_mapper_impl.py b/src/ere/adapters/rdf_mapper_impl.py
index 243f6b1..09bd35c 100644
--- a/src/ere/adapters/rdf_mapper_impl.py
+++ b/src/ere/adapters/rdf_mapper_impl.py
@@ -42,7 +42,11 @@ def _load_mappings(rdf_mapping_path: str | Path = None) -> dict:
         dict: Entity type mappings from config.
""" if rdf_mapping_path is None: - rdf_mapping_path = Path(__file__).parent.parent.parent.parent / "infra" / "config" / "rdf_mapping.yaml" + rdf_mapping_path = ( + Path(__file__).parent.parent.parent + / "config" + / "rdf_mapping.yaml" + ) else: rdf_mapping_path = Path(rdf_mapping_path) return load_entity_mappings(rdf_mapping_path) @@ -70,9 +74,13 @@ def map_entity_mention_to_domain(self, entity_mention: EntityMention) -> Mention ) mention_id = MentionId( - value=self._derive_mention_id(eid.source_id, eid.request_id, eid.entity_type) + value=self._derive_mention_id( + eid.source_id, eid.request_id, eid.entity_type + ) + ) + attributes = extract_mention_attributes( + entity_mention.content, entity_type_config ) - attributes = extract_mention_attributes(entity_mention.content, entity_type_config) return Mention(id=mention_id, attributes=attributes) @staticmethod diff --git a/src/ere/adapters/repositories.py b/src/ere/adapters/repositories.py index 6ac6dc2..2a99e6d 100644 --- a/src/ere/adapters/repositories.py +++ b/src/ere/adapters/repositories.py @@ -9,7 +9,13 @@ from abc import ABC, abstractmethod -from ere.models.resolver import ClusterId, ClusterMembership, Mention, MentionId, MentionLink +from ere.models.resolver import ( + ClusterId, + ClusterMembership, + Mention, + MentionId, + MentionLink, +) class MentionRepository(ABC): diff --git a/src/ere/adapters/splink_linker_impl.py b/src/ere/adapters/splink_linker_impl.py index 8172614..283432a 100644 --- a/src/ere/adapters/splink_linker_impl.py +++ b/src/ere/adapters/splink_linker_impl.py @@ -45,7 +45,9 @@ def build_tf_df(mentions: list[Mention], entity_fields: list[str]) -> pd.DataFra flat_dict = mention.to_flat_dict() row = { "mention_id": flat_dict["mention_id"], - **{f: flat_dict.get(f) or "" for f in entity_fields}, # Convert None to empty string + **{ + f: flat_dict.get(f) or "" for f in entity_fields + }, # Convert None to empty string "__splink_salt": 0.5, } rows.append(row) @@ -246,11 +248,15 @@ def 
register_mention(self, mention: Mention) -> None: ) # Build new row with same schema as _tf_df - new_row = pd.DataFrame([{ - "mention_id": flat_dict["mention_id"], - **{f: flat_dict.get(f) for f in self._entity_fields}, - "__splink_salt": 0.5, - }]) + new_row = pd.DataFrame( + [ + { + "mention_id": flat_dict["mention_id"], + **{f: flat_dict.get(f) for f in self._entity_fields}, + "__splink_salt": 0.5, + } + ] + ) # Cast string columns to pd.StringDtype() to prevent type drift on None values for col in self._entity_fields: @@ -324,7 +330,9 @@ def _build_settings(self) -> SettingsCreator: comp["field"], thresholds, ) - comparisons.append(cl.JaroWinklerAtThresholds(comp["field"], thresholds)) + comparisons.append( + cl.JaroWinklerAtThresholds(comp["field"], thresholds) + ) elif comp["type"] == "exact_match": log.trace( "_build_settings: Adding ExactMatch comparison on field '%s'", @@ -406,7 +414,9 @@ def _train_safe(self) -> None: log.info("EM training: estimating u-probabilities via random sampling") linker_new.training.estimate_u_using_random_sampling(max_pairs=1e6) - log.info("EM training: estimating m-probabilities and lambda via EM algorithm") + log.info( + "EM training: estimating m-probabilities and lambda via EM algorithm" + ) linker_new.training.estimate_parameters_using_expectation_maximisation( self._get_em_training_rule(), estimate_without_term_frequencies=True ) @@ -455,12 +465,16 @@ def _apply_cold_start_params(self) -> None: # Check if cold_start config exists cold_start_cfg = self._config.get("splink", {}).get("cold_start", {}) if not cold_start_cfg: - log.info("Linker initializing: No cold_start config found, using Splink defaults") + log.info( + "Linker initializing: No cold_start config found, using Splink defaults" + ) return comparisons_cfg = cold_start_cfg.get("comparisons", {}) if not comparisons_cfg: - log.info("Linker initializing: No comparisons config in cold_start, using Splink defaults") + log.info( + "Linker initializing: No comparisons 
config in cold_start, using Splink defaults" + ) return log.info( @@ -475,11 +489,11 @@ def _apply_cold_start_params(self) -> None: for _, comparison in enumerate(self._linker._settings_obj.comparisons): # Get the field name from the comparison field_name = None - if hasattr(comparison, 'output_column_name'): + if hasattr(comparison, "output_column_name"): field_name = comparison.output_column_name - elif hasattr(comparison, '_field_names') and comparison._field_names: + elif hasattr(comparison, "_field_names") and comparison._field_names: field_name = comparison._field_names[0] - # pylint: enable=protected-access + # pylint: enable=protected-access if field_name not in comparisons_cfg: continue @@ -494,8 +508,9 @@ def _apply_cold_start_params(self) -> None: # Collect non-null levels to properly map cold-start probabilities non_null_levels = [ - (i, level) for i, level in enumerate(comparison.comparison_levels) - if not (hasattr(level, 'is_null_level') and level.is_null_level) + (i, level) + for i, level in enumerate(comparison.comparison_levels) + if not (hasattr(level, "is_null_level") and level.is_null_level) ] log.trace( "_apply_cold_start_params: Field '%s' has %d non-null levels: %s", @@ -505,8 +520,8 @@ def _apply_cold_start_params(self) -> None: ) # Apply m-probabilities to non-null levels in order - if 'm_probabilities' in field_cfg: - m_probs = field_cfg['m_probabilities'] + if "m_probabilities" in field_cfg: + m_probs = field_cfg["m_probabilities"] for config_idx, m_prob in enumerate(m_probs): if config_idx < len(non_null_levels): actual_level_idx, level = non_null_levels[config_idx] @@ -528,8 +543,8 @@ def _apply_cold_start_params(self) -> None: ) # Apply u-probabilities to non-null levels in order - if 'u_probabilities' in field_cfg: - u_probs = field_cfg['u_probabilities'] + if "u_probabilities" in field_cfg: + u_probs = field_cfg["u_probabilities"] for config_idx, u_prob in enumerate(u_probs): if config_idx < len(non_null_levels): actual_level_idx, 
level = non_null_levels[config_idx] @@ -566,7 +581,7 @@ def _log_trained_parameters(self, linker: Linker) -> None: # Get the Fellegi-Sunter prior (lambda) prior = None # pylint: disable=protected-access # Splink exposes no public API for settings introspection - if hasattr(linker._settings_obj, 'probability_two_random_records_match'): + if hasattr(linker._settings_obj, "probability_two_random_records_match"): prior = linker._settings_obj.probability_two_random_records_match log.info( "EM trained parameter: lambda (P(match)) = %.6f", @@ -577,11 +592,11 @@ def _log_trained_parameters(self, linker: Linker) -> None: for comparison in linker._settings_obj.comparisons: # Get field name field_name = None - if hasattr(comparison, 'output_column_name'): + if hasattr(comparison, "output_column_name"): field_name = comparison.output_column_name - elif hasattr(comparison, '_field_names') and comparison._field_names: + elif hasattr(comparison, "_field_names") and comparison._field_names: field_name = comparison._field_names[0] - # pylint: enable=protected-access + # pylint: enable=protected-access if not field_name: continue @@ -593,8 +608,9 @@ def _log_trained_parameters(self, linker: Linker) -> None: # Collect non-null levels non_null_levels = [ - (i, level) for i, level in enumerate(comparison.comparison_levels) - if not (hasattr(level, 'is_null_level') and level.is_null_level) + (i, level) + for i, level in enumerate(comparison.comparison_levels) + if not (hasattr(level, "is_null_level") and level.is_null_level) ] # Log m and u probabilities for each level @@ -605,19 +621,25 @@ def _log_trained_parameters(self, linker: Linker) -> None: trained_u = False # Extract m-probability - if hasattr(level, 'm_probability') and level.m_probability is not None: + if ( + hasattr(level, "m_probability") + and level.m_probability is not None + ): m_prob = level.m_probability # Check if it was trained (non-cold-start values have specific patterns) # Cold-start values are typically set 
exactly; trained values may vary trained_m = True # Extract u-probability - if hasattr(level, 'u_probability') and level.u_probability is not None: + if ( + hasattr(level, "u_probability") + and level.u_probability is not None + ): u_prob = level.u_probability trained_u = True # Log level details - level_desc = getattr(level, 'label', f"Level {config_idx}") + level_desc = getattr(level, "label", f"Level {config_idx}") m_status = "✓ trained" if trained_m else "✗ cold-start" u_status = "✓ trained" if trained_u else "✗ cold-start" diff --git a/src/ere/adapters/utils.py b/src/ere/adapters/utils.py index 63ad5f9..c1535ae 100644 --- a/src/ere/adapters/utils.py +++ b/src/ere/adapters/utils.py @@ -21,7 +21,10 @@ ) SUPPORTED_REQUEST_CLASSES = { - cls.__name__: cls for cls in [EntityMentionResolutionRequest] # , FullRebuildRequest] # TODO: Add when available + cls.__name__: cls + for cls in [ + EntityMentionResolutionRequest + ] } """ Explicit list of supported Request classes, used in utilities like :meth:`get_request_from_message`. @@ -34,7 +37,10 @@ SUPPORTED_RESPONSE_CLASSES = { cls.__name__: cls - for cls in [EntityMentionResolutionResponse, EREErrorResponse] # , FullRebuildResponse] # TODO: Add when available + for cls in [ + EntityMentionResolutionResponse, + EREErrorResponse, + ] } """ Explicit list of supported Response classes, used in utilities like :meth:`get_response_from_message`. 
diff --git a/src/ere/entrypoints/app.py b/src/ere/entrypoints/app.py index e2bfd35..6a6fbd7 100644 --- a/src/ere/entrypoints/app.py +++ b/src/ere/entrypoints/app.py @@ -12,7 +12,7 @@ REDIS_HOST Redis hostname (default: localhost) REDIS_PORT Redis port (default: 6379) REDIS_DB Redis DB index (default: 0) - LOG_LEVEL Python log level name (default: INFO) — supports TRACE + ERE_LOG_LEVEL Python log level name (default: INFO) — supports TRACE RDF_MAPPING_PATH Path to rdf_mapping.yaml config file RESOLVER_CONFIG_PATH Path to resolver.yaml config file DUCKDB_PATH Path to persistent DuckDB file (overrides resolver.yaml) @@ -78,7 +78,9 @@ def main() -> None: # Config file paths: CLI takes precedence over environment rdf_mapping_path = args.rdf_mapping_path or os.environ.get("RDF_MAPPING_PATH") - resolver_config_path = args.resolver_config_path or os.environ.get("RESOLVER_CONFIG_PATH") + resolver_config_path = args.resolver_config_path or os.environ.get( + "RESOLVER_CONFIG_PATH" + ) duckdb_path = os.environ.get("DUCKDB_PATH") log.info( diff --git a/src/ere/entrypoints/queue_worker.py b/src/ere/entrypoints/queue_worker.py index 020f18c..e3d435b 100644 --- a/src/ere/entrypoints/queue_worker.py +++ b/src/ere/entrypoints/queue_worker.py @@ -47,7 +47,9 @@ def process_single_message(self) -> bool: Exception: Propagates connection errors. 
""" # Wait for a request - queue_message = self.redis_client.brpop(self.request_queue, timeout=self.queue_timeout) + queue_message = self.redis_client.brpop( + self.request_queue, timeout=self.queue_timeout + ) if not queue_message: return False # Timeout @@ -88,7 +90,9 @@ def _send_response(self, response: EREResponse) -> None: log.error("Failed to send response: %s", e) @staticmethod - def _build_error_response(error_detail: str, ere_request_id: str = "unknown") -> EREErrorResponse: + def _build_error_response( + error_detail: str, ere_request_id: str = "unknown" + ) -> EREErrorResponse: """Build error response for request processing failures.""" log.error("Building error response: %s", error_detail) return EREErrorResponse( diff --git a/src/ere/models/exceptions.py b/src/ere/models/exceptions.py index 889a82c..2d648d4 100644 --- a/src/ere/models/exceptions.py +++ b/src/ere/models/exceptions.py @@ -4,7 +4,9 @@ class ConflictError(Exception): """Raised when the same mention_id is submitted with different content.""" - def __init__(self, mention_id: str, existing_attributes: dict, incoming_attributes: dict): + def __init__( + self, mention_id: str, existing_attributes: dict, incoming_attributes: dict + ): super().__init__( f"Mention '{mention_id}' was already resolved with different content. " f"Existing: {existing_attributes!r}, Incoming: {incoming_attributes!r}" diff --git a/src/ere/models/resolver/mention.py b/src/ere/models/resolver/mention.py index 71c09ac..4e5cd05 100644 --- a/src/ere/models/resolver/mention.py +++ b/src/ere/models/resolver/mention.py @@ -28,7 +28,11 @@ def _from_flat_dict(cls, raw_input: object) -> object: {"mention_id": "m1", "legal_name": "Acme", "country_code": "US"} and convert to the structured form expected by the model. 
""" - if isinstance(raw_input, dict) and "mention_id" in raw_input and "id" not in raw_input: + if ( + isinstance(raw_input, dict) + and "mention_id" in raw_input + and "id" not in raw_input + ): return { "id": MentionId(value=raw_input["mention_id"]), "attributes": {k: v for k, v in raw_input.items() if k != "mention_id"}, diff --git a/src/ere/services/entity_resolution_service.py b/src/ere/services/entity_resolution_service.py index dc27376..2bbec9b 100644 --- a/src/ere/services/entity_resolution_service.py +++ b/src/ere/services/entity_resolution_service.py @@ -128,7 +128,9 @@ def resolve(self, mention: Mention) -> ResolutionResult: cluster_id = ClusterId(value=mention.id.value) log.trace("New cluster generated for mention with id=%s", mention.id.value) - self._cluster_repo.save(ClusterMembership(mention_id=mention.id, cluster_id=cluster_id)) + self._cluster_repo.save( + ClusterMembership(mention_id=mention.id, cluster_id=cluster_id) + ) # Log cluster contents after assignment all_memberships = self._cluster_repo.get_all_memberships() @@ -147,7 +149,10 @@ def resolve(self, mention: Mention) -> ResolutionResult: # Trigger auto-training if threshold is reached (non-blocking background thread). count = self._mention_repo.count() - if self._config.auto_train_threshold > 0 and count == self._config.auto_train_threshold: + if ( + self._config.auto_train_threshold > 0 + and count == self._config.auto_train_threshold + ): log.info( "Auto-training triggered: %d mentions reached (threshold=%d). " "Starting background EM training thread. Scoring continues with current parameters.", @@ -155,9 +160,7 @@ def resolve(self, mention: Mention) -> ResolutionResult: self._config.auto_train_threshold, ) threading.Thread( - target=self._linker.train, - daemon=True, - name="linker-training" + target=self._linker.train, daemon=True, name="linker-training" ).start() # Step 5: Return cluster references (non-empty, always top-N). 
@@ -351,7 +354,9 @@ def resolve_to_result( def resolve_entity_mention( - entity_mention: EntityMention, resolver: EntityResolver = None, mapper: RDFMapper = None + entity_mention: EntityMention, + resolver: EntityResolver = None, + mapper: RDFMapper = None, ) -> ClusterReference: """ Resolve an entity mention to a Cluster (public API - returns top candidate). @@ -454,7 +459,9 @@ def process_request(self, request: ERERequest) -> EREResponse: entity_mention.identifiedBy.request_id, ) - resolution_outcome = resolve_to_result(entity_mention, self._resolver, self._mapper) + resolution_outcome = resolve_to_result( + entity_mention, self._resolver, self._mapper + ) # Log resolution result with candidates candidate_info = [ @@ -482,7 +489,12 @@ def process_request(self, request: ERERequest) -> EREResponse: timestamp=now, ) except Exception as exc: # pylint: disable=broad-exception-caught - log.error("Resolution error for mention %s: %s", request.ere_request_id, exc, exc_info=True) + log.error( + "Resolution error for mention %s: %s", + request.ere_request_id, + exc, + exc_info=True, + ) return EREErrorResponse( ere_request_id=request.ere_request_id, error_type=type(exc).__name__, diff --git a/src/ere/services/factories.py b/src/ere/services/factories.py index 6442ae8..4550209 100644 --- a/src/ere/services/factories.py +++ b/src/ere/services/factories.py @@ -19,7 +19,10 @@ from ere.adapters.duckdb_schema import init_schema from ere.adapters.rdf_mapper_port import RDFMapper from ere.adapters.splink_linker_impl import SpLinkSimilarityLinker -from ere.services.entity_resolution_service import EntityResolver, EntityResolutionService +from ere.services.entity_resolution_service import ( + EntityResolver, + EntityResolutionService, +) from ere.services.resolver_config import ResolverConfig @@ -47,7 +50,11 @@ def build_entity_resolver( Fully-constructed EntityResolver with DuckDB backend and Splink linker. 
""" if resolver_config_path is None: - config_path = Path(__file__).parent.parent.parent.parent / "infra" / "config" / "resolver.yaml" + config_path = ( + Path(__file__).parent.parent.parent + / "config" + / "resolver.yaml" + ) else: config_path = Path(resolver_config_path) diff --git a/src/ere/services/resolver_config.py b/src/ere/services/resolver_config.py index e3c839f..50b49bb 100644 --- a/src/ere/services/resolver_config.py +++ b/src/ere/services/resolver_config.py @@ -7,7 +7,9 @@ class DuckDBConfig(BaseModel): """DuckDB database configuration.""" type: str = "in-memory" # "in-memory" or "persistent" - path: str = ":memory:" # Database path: ":memory:" for in-memory, file path for persistent + path: str = ( + ":memory:" # Database path: ":memory:" for in-memory, file path for persistent + ) class ResolverConfig(BaseModel): diff --git a/src/ere/utils/logging.py b/src/ere/utils/logging.py index 9100a1b..70e36a3 100644 --- a/src/ere/utils/logging.py +++ b/src/ere/utils/logging.py @@ -26,10 +26,10 @@ def configure_logging(log_level: str = None) -> None: Args: log_level: Log level name (e.g., 'DEBUG', 'INFO', 'TRACE'). - If None, reads from LOG_LEVEL environment variable (default: INFO). + If None, reads from ERE_LOG_LEVEL environment variable (default: INFO). """ if log_level is None: - log_level = os.environ.get("LOG_LEVEL", "INFO").upper() + log_level = os.environ.get("ERE_LOG_LEVEL", "INFO").upper() else: log_level = log_level.upper() diff --git a/src/infra/.env.example b/src/infra/.env.example new file mode 100644 index 0000000..e80b817 --- /dev/null +++ b/src/infra/.env.example @@ -0,0 +1,23 @@ +# ERE local development environment +# Copy to infra/.env and customise: cp infra/.env.example infra/.env +# +# Compatible with the ERSys unified environment (infra/.env.example). +# When running ERE standalone, use this file with infra/compose.dev.yaml. +# When running inside the full ERSys stack, the parent project's .env covers these. 
+ +# --- Redis --- +REDIS_HOST=ersys-redis +REDIS_PORT=6379 +REDIS_DB=0 +REDIS_PASSWORD=changeme + +# --- Queues --- +REQUEST_QUEUE=ere_requests +RESPONSE_QUEUE=ere_responses + +# --- Storage --- +DUCKDB_PATH=/data/app.duckdb + +# --- Logging --- +# ERE_LOG_LEVEL is read directly by the application (src/ere/utils/logging.py). +ERE_LOG_LEVEL=INFO diff --git a/src/infra/Dockerfile b/src/infra/Dockerfile new file mode 100644 index 0000000..99affbf --- /dev/null +++ b/src/infra/Dockerfile @@ -0,0 +1,81 @@ +# Multi-stage build for the Entity Resolution Engine. +# Build context: repository root (two levels above src/infra) + +# ============================================================================= +# Fetcher stage: resolve dependencies and build wheels +# ============================================================================= +FROM python:3.12-slim AS fetcher + +ARG POETRY_VERSION=">=2.0.0,<3.0.0" + +ENV POETRY_NO_INTERACTION=1 \ PYTHONDONTWRITEBYTECODE=1 + +RUN apt-get update \ && apt-get install -y --no-install-recommends git \ && rm -rf /var/lib/apt/lists/* + +RUN pip install --no-cache-dir "poetry${POETRY_VERSION}" poetry-plugin-export + +WORKDIR /app +COPY src/pyproject.toml src/poetry.lock ./ + +# Export dependencies, build wheels (including git repos), then clean the requirements file +RUN poetry export -f requirements.txt --output requirements.txt --without-hashes \ && pip wheel -r requirements.txt -w /wheels \ && sed -i 's/ @ git.*//' requirements.txt + +# ============================================================================= +# Builder stage: install wheels into a virtual environment +# ============================================================================= +FROM python:3.12-slim AS builder + +ENV PYTHONDONTWRITEBYTECODE=1 \ PYTHONUNBUFFERED=1 + +WORKDIR /app + +# Create a virtual environment +RUN python -m venv /app/.venv +ENV PATH="/app/.venv/bin:${PATH}" + +# Copy wheels and requirements from fetcher +COPY --from=fetcher
/wheels /wheels +COPY --from=fetcher /app/requirements.txt /requirements.txt + +# Install dependencies without using the network +RUN pip install --no-cache-dir --no-index --find-links=/wheels -r /requirements.txt + +COPY README.md /README.md +COPY src/pyproject.toml src/poetry.lock ./ +COPY src/ere ere/ +COPY src/config config/ + +# Install the application itself without re-triggering dependency resolution +RUN pip install --no-deps . + + +# ============================================================================= +# Runtime stage: minimal production image +# ============================================================================= +FROM python:3.12-slim AS runtime + +ENV PYTHONDONTWRITEBYTECODE=1 \ + PYTHONUNBUFFERED=1 \ + PATH="/app/.venv/bin:${PATH}" + +RUN groupadd --gid 1000 appuser && \ + useradd --uid 1000 --gid appuser --shell /bin/bash --create-home appuser + +WORKDIR /app + +# Copy only the virtual environment and necessary app files +COPY --from=builder /app/.venv /app/.venv +COPY --from=builder /app/ere /app/ere +COPY --from=builder /app/config /app/config +# Volume mount point for DuckDB persistent storage +RUN mkdir -p /data && chown appuser:appuser /data + +USER appuser + +CMD ["python", "-m", "ere.entrypoints.app"] diff --git a/src/infra/README.md b/src/infra/README.md new file mode 100644 index 0000000..db68b06 --- /dev/null +++ b/src/infra/README.md @@ -0,0 +1,71 @@ +# Infrastructure + +Deployment and infrastructure files for the Entity Resolution Engine. 
+ +## Structure + +``` +src/infra/ +├── .env.example # Environment variable template +├── compose.dev.yaml # Docker Compose for local development +├── Dockerfile # Multi-stage build (fetcher + builder + runtime) +└── README.md +``` + +## Services + +| Service | Purpose | Port | +|---|---|---| +| `ere` | Entity Resolution Engine (Redis queue worker) | — (no HTTP API) | +| `redis` | Message queue for ERE requests/responses | 6379 | +| `redisinsight` | Redis GUI (development tool) | 5540 | + +## Usage + +All commands run from the repo root via `make`: + +```bash +make infra-build # Build the ERE Docker image +make infra-up # Start services (docker compose up -d) +make infra-down # Stop and remove containers and networks +make infra-down-volumes # Stop services and remove volumes (clean slate) +make infra-rebuild # Rebuild images and start services +make infra-rebuild-clean # Rebuild from scratch (no cache) +make infra-logs # Follow service logs +make infra-watch # Start services with file watching (sync src/ and config/) +``` + +### File watching (development) + +`make infra-watch` uses Docker Compose's `watch` feature to sync source code and +configuration changes into the running container without a full rebuild: + +- **Source changes** (`src/ere/`) are synced live into the container +- **Config changes** (`src/config/`) are synced live into the container +- **Dependency changes** (`pyproject.toml`, `poetry.lock`) trigger a full rebuild + +> **Note:** ERE is a long-running queue worker, not an HTTP server with hot-reload. +> After syncing, restart the container to pick up changes: `docker compose -f src/infra/compose.dev.yaml restart ere` + +### Manual build + +```bash +docker build -f src/infra/Dockerfile -t ere:latest . +``` + +## Configuration + +Environment variables are loaded from `src/infra/.env`. See `src/infra/.env.example` for available options.
To set up: + +```bash +cp src/infra/.env.example src/infra/.env +``` + +### Resolver configuration + +Entity resolution behaviour is configured via YAML files in the `src/config/` directory: + +- **[resolver.yaml](../config/resolver.yaml)** — Splink comparisons, cold-start parameters, blocking rules, thresholds +- **[rdf_mapping.yaml](../config/rdf_mapping.yaml)** — RDF namespace bindings, field extraction rules, entity type definitions + +See the [configuration README](../config/README.md) for detailed tuning guidance. diff --git a/src/infra/compose.dev.yaml b/src/infra/compose.dev.yaml new file mode 100644 index 0000000..94e909f --- /dev/null +++ b/src/infra/compose.dev.yaml @@ -0,0 +1,77 @@ +# Docker Compose configuration for local development + +name: ere-local + +services: + ersys-redis: + image: redis:7-alpine + container_name: "ersys-redis" + restart: unless-stopped + command: redis-server --requirepass ${REDIS_PASSWORD:-changeme} + ports: + - "6379:6379" + healthcheck: + test: ["CMD", "sh", "-c", "redis-cli --no-auth-warning -a $REDIS_PASSWORD ping"] + interval: 5s + timeout: 3s + retries: 5 + environment: + - REDIS_PASSWORD=${REDIS_PASSWORD:-changeme} + networks: + - ersys-local + + redisinsight: + image: redis/redisinsight:3.2.0 + container_name: "redisinsight" + restart: unless-stopped + ports: + - "5540:5540" + healthcheck: + test: ["CMD", "wget", "--spider", "-q", "http://127.0.0.1:5540/api/health"] + interval: 5s + timeout: 3s + retries: 5 + networks: + - ersys-local + + ere: + build: + context: ../..
+ dockerfile: src/infra/Dockerfile + container_name: "ere" + env_file: .env + restart: unless-stopped + environment: + - DUCKDB_PATH=${DUCKDB_PATH:-/data/app.duckdb} + - RDF_MAPPING_PATH=/app/config/rdf_mapping.yaml + - RESOLVER_CONFIG_PATH=/app/config/resolver.yaml + # Remaining REDIS_* and queue vars inherited from env_file + healthcheck: + test: ["CMD", "sh", "-c", "test -f /proc/1/cmdline"] + interval: 10s + timeout: 3s + retries: 3 + volumes: + - ere-data:/data + - ../../src/config:/app/config + develop: + watch: + - action: sync + path: ../../src/ere + target: /app/ere + - action: sync + path: ../../src/config + target: /app/config + - action: rebuild + path: ../../src/pyproject.toml + - action: rebuild + path: ../../src/poetry.lock + networks: + - ersys-local + +volumes: + ere-data: + +networks: + ersys-local: + external: true diff --git a/poetry.lock b/src/poetry.lock similarity index 55% rename from poetry.lock rename to src/poetry.lock index 9d536e1..b09f103 100644 --- a/poetry.lock +++ b/src/poetry.lock @@ -2,40 +2,28 @@ [[package]] name = "altair" -version = "6.0.0" +version = "6.1.0" description = "Vega-Altair: A declarative statistical visualization library for Python." 
optional = false -python-versions = ">=3.9" +python-versions = ">=3.10" groups = ["main"] files = [ - {file = "altair-6.0.0-py3-none-any.whl", hash = "sha256:09ae95b53d5fe5b16987dccc785a7af8588f2dca50de1e7a156efa8a461515f8"}, - {file = "altair-6.0.0.tar.gz", hash = "sha256:614bf5ecbe2337347b590afb111929aa9c16c9527c4887d96c9bc7f6640756b4"}, + {file = "altair-6.1.0-py3-none-any.whl", hash = "sha256:fdf5fd939512e5b2fc4441c82dfd2635e706defbd037db0ac429ef5ddce66c3b"}, + {file = "altair-6.1.0.tar.gz", hash = "sha256:dda699216cf85b040d968ae5a569ad45957616811e38760a85e5118269daca67"}, ] [package.dependencies] jinja2 = "*" jsonschema = ">=3.0" -narwhals = ">=1.27.1" +narwhals = ">=2.4.0" packaging = "*" typing-extensions = {version = ">=4.12.0", markers = "python_version < \"3.15\""} [package.extras] -all = ["altair-tiles (>=0.3.0)", "anywidget (>=0.9.0)", "numpy", "pandas (>=1.1.3)", "pyarrow (>=11)", "vegafusion (>=2.0.3)", "vl-convert-python (>=1.8.0)"] -dev = ["duckdb (>=1.0) ; python_version < \"3.14\"", "geopandas (>=0.14.3) ; python_version < \"3.14\"", "hatch (>=1.13.0)", "ipykernel", "ipython", "mistune", "mypy", "pandas (>=1.1.3)", "pandas-stubs", "polars (>=0.20.3)", "pyarrow-stubs", "pytest", "pytest-cov", "pytest-xdist[psutil] (>=3.5,<4.0)", "ruff (>=0.9.5)", "taskipy (>=1.14.1)", "tomli (>=2.2.1)", "types-jsonschema", "types-setuptools"] -doc = ["docutils", "jinja2", "myst-parser", "numpydoc", "pillow", "pydata-sphinx-theme (>=0.14.1)", "scipy", "scipy-stubs ; python_version >= \"3.10\"", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinxext-altair"] -save = ["vl-convert-python (>=1.8.0)"] - -[[package]] -name = "annotated-doc" -version = "0.0.4" -description = "Document parameters, class attributes, return types, and variables inline, with Annotated." 
-optional = false -python-versions = ">=3.8" -groups = ["dev"] -files = [ - {file = "annotated_doc-0.0.4-py3-none-any.whl", hash = "sha256:571ac1dc6991c450b25a9c2d84a3705e2ae7a53467b5d111c24fa8baabbed320"}, - {file = "annotated_doc-0.0.4.tar.gz", hash = "sha256:fbcda96e87e9c92ad167c2e53839e57503ecfda18804ea28102353485033faa4"}, -] +all = ["altair-tiles (>=0.3.0)", "anywidget (>=0.9.0)", "numpy", "pandas (>=1.1.3)", "pyarrow (>=11)", "vegafusion (>=2.0.3)", "vl-convert-python (>=1.9.0)"] +dev = ["duckdb (>=1.0)", "geopandas (>=0.14.3)", "hatch (>=1.13.0)", "ipykernel", "ipython", "mistune", "mypy", "pandas (>=1.1.3)", "pandas-stubs (<2.3.3)", "polars (>=0.20.3)", "pyarrow-stubs", "pytest", "pytest-cov", "pytest-xdist[psutil] (>=3.5,<4.0)", "ruff (>=0.9.5)", "taskipy (>=1.14.1)", "tomli (>=2.2.1)", "types-jsonschema", "types-setuptools"] +doc = ["docutils", "jinja2", "myst-parser", "numpydoc", "pillow", "pydata-sphinx-theme (>=0.14.1)", "scipy", "scipy-stubs ; python_version >= \"3.10\"", "sphinx", "sphinx-autobuild", "sphinx-copybutton", "sphinx-design", "sphinxext-altair"] +save = ["vl-convert-python (>=1.9.0)"] [[package]] name = "annotated-types" @@ -43,31 +31,12 @@ version = "0.7.0" description = "Reusable constraint types to use with typing.Annotated" optional = false python-versions = ">=3.8" -groups = ["main", "dev"] +groups = ["main"] files = [ {file = "annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53"}, {file = "annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89"}, ] -[[package]] -name = "anyio" -version = "4.12.1" -description = "High-level concurrency and networking framework on top of asyncio or Trio" -optional = false -python-versions = ">=3.9" -groups = ["dev"] -files = [ - {file = "anyio-4.12.1-py3-none-any.whl", hash = "sha256:d405828884fc140aa80a3c667b8beed277f1dfedec42ba031bd6ac3db606ab6c"}, - {file = 
"anyio-4.12.1.tar.gz", hash = "sha256:41cfcc3a4c85d3f05c932da7c26d0201ac36f72abd4435ba90d0464a3ffed703"}, -] - -[package.dependencies] -idna = ">=2.8" -typing_extensions = {version = ">=4.5", markers = "python_version < \"3.13\""} - -[package.extras] -trio = ["trio (>=0.31.0) ; python_version < \"3.10\"", "trio (>=0.32.0) ; python_version >= \"3.10\""] - [[package]] name = "assertpy" version = "1.1" @@ -93,26 +62,26 @@ files = [ [[package]] name = "attrs" -version = "25.4.0" +version = "26.1.0" description = "Classes Without Boilerplate" optional = false python-versions = ">=3.9" groups = ["main"] files = [ - {file = "attrs-25.4.0-py3-none-any.whl", hash = "sha256:adcf7e2a1fb3b36ac48d97835bb6d8ade15b8dcce26aba8bf1d14847b57a3373"}, - {file = "attrs-25.4.0.tar.gz", hash = "sha256:16d5969b87f0859ef33a48b35d55ac1be6e42ae49d5e853b597db70c35c57e11"}, + {file = "attrs-26.1.0-py3-none-any.whl", hash = "sha256:c647aa4a12dfbad9333ca4e71fe62ddc36f4e63b2d260a37a8b83d2f043ac309"}, + {file = "attrs-26.1.0.tar.gz", hash = "sha256:d03ceb89cb322a8fd706d4fb91940737b6642aa36998fe130a9bc96c985eff32"}, ] [[package]] name = "cachetools" -version = "7.0.2" +version = "7.0.6" description = "Extensible memoizing collections and decorators" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "cachetools-7.0.2-py3-none-any.whl", hash = "sha256:938dcad184827c5e94928c4fd5526e2b46692b7fb1ae94472da9131d0299343c"}, - {file = "cachetools-7.0.2.tar.gz", hash = "sha256:7e7f09a4ca8b791d8bb4864afc71e9c17e607a28e6839ca1a644253c97dbeae0"}, + {file = "cachetools-7.0.6-py3-none-any.whl", hash = "sha256:4e94956cfdd3086f12042cdd29318f5ced3893014f7d0d059bf3ead3f85b7f8b"}, + {file = "cachetools-7.0.6.tar.gz", hash = "sha256:e5d524d36d65703a87243a26ff08ad84f73352adbeafb1cde81e207b456aaf24"}, ] [[package]] @@ -141,137 +110,153 @@ files = [ [[package]] name = "charset-normalizer" -version = "3.4.4" +version = "3.4.7" description = "The Real First Universal Charset Detector. 
Open, modern and actively maintained alternative to Chardet." optional = false python-versions = ">=3.7" groups = ["main", "dev"] files = [ - {file = "charset_normalizer-3.4.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e824f1492727fa856dd6eda4f7cee25f8518a12f3c4a56a74e8095695089cf6d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4bd5d4137d500351a30687c2d3971758aac9a19208fc110ccb9d7188fbe709e8"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:027f6de494925c0ab2a55eab46ae5129951638a49a34d87f4c3eda90f696b4ad"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f820802628d2694cb7e56db99213f930856014862f3fd943d290ea8438d07ca8"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:798d75d81754988d2565bff1b97ba5a44411867c0cf32b77a7e8f8d84796b10d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d1bb833febdff5c8927f922386db610b49db6e0d4f4ee29601d71e7c2694313"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:9cd98cdc06614a2f768d2b7286d66805f94c48cde050acdbbb7db2600ab3197e"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:077fbb858e903c73f6c9db43374fd213b0b6a778106bc7032446a8e8b5b38b93"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:244bfb999c71b35de57821b8ea746b24e863398194a4014e4c76adc2bbdfeff0"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:64b55f9dce520635f018f907ff1b0df1fdc31f2795a922fb49dd14fbcdf48c84"}, - {file = 
"charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:faa3a41b2b66b6e50f84ae4a68c64fcd0c44355741c6374813a800cd6695db9e"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6515f3182dbe4ea06ced2d9e8666d97b46ef4c75e326b79bb624110f122551db"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cc00f04ed596e9dc0da42ed17ac5e596c6ccba999ba6bd92b0e0aef2f170f2d6"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win32.whl", hash = "sha256:f34be2938726fc13801220747472850852fe6b1ea75869a048d6f896838c896f"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win_amd64.whl", hash = "sha256:a61900df84c667873b292c3de315a786dd8dac506704dea57bc957bd31e22c7d"}, - {file = "charset_normalizer-3.4.4-cp310-cp310-win_arm64.whl", hash = "sha256:cead0978fc57397645f12578bfd2d5ea9138ea0fac82b2f63f7f7c6877986a69"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:6e1fcf0720908f200cd21aa4e6750a48ff6ce4afe7ff5a79a90d5ed8a08296f8"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5f819d5fe9234f9f82d75bdfa9aef3a3d72c4d24a6e57aeaebba32a704553aa0"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a59cb51917aa591b1c4e6a43c132f0cdc3c76dbad6155df4e28ee626cc77a0a3"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:8ef3c867360f88ac904fd3f5e1f902f13307af9052646963ee08ff4f131adafc"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d9e45d7faa48ee908174d8fe84854479ef838fc6a705c9315372eacbc2f02897"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:840c25fb618a231545cbab0564a799f101b63b9901f2569faecd6b222ac72381"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ca5862d5b3928c4940729dacc329aa9102900382fea192fc5e52eb69d6093815"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d9c7f57c3d666a53421049053eaacdd14bbd0a528e2186fcb2e672effd053bb0"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_armv7l.whl", hash = "sha256:277e970e750505ed74c832b4bf75dac7476262ee2a013f5574dd49075879e161"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:31fd66405eaf47bb62e8cd575dc621c56c668f27d46a61d975a249930dd5e2a4"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:0d3d8f15c07f86e9ff82319b3d9ef6f4bf907608f53fe9d92b28ea9ae3d1fd89"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:9f7fcd74d410a36883701fafa2482a6af2ff5ba96b9a620e9e0721e28ead5569"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ebf3e58c7ec8a8bed6d66a75d7fb37b55e5015b03ceae72a8e7c74495551e224"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win32.whl", hash = "sha256:eecbc200c7fd5ddb9a7f16c7decb07b566c29fa2161a16cf67b8d068bd21690a"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:5ae497466c7901d54b639cf42d5b8c1b6a4fead55215500d2f486d34db48d016"}, - {file = "charset_normalizer-3.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:65e2befcd84bc6f37095f5961e68a6f077bf44946771354a28ad434c2cce0ae1"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0a98e6759f854bd25a58a73fa88833fba3b7c491169f86ce1180c948ab3fd394"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:b5b290ccc2a263e8d185130284f8501e3e36c5e02750fc6b6bdeb2e9e96f1e25"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74bb723680f9f7a6234dcf67aea57e708ec1fbdf5699fb91dfd6f511b0a320ef"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f1e34719c6ed0b92f418c7c780480b26b5d9c50349e9a9af7d76bf757530350d"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:2437418e20515acec67d86e12bf70056a33abdacb5cb1655042f6538d6b085a8"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:11d694519d7f29d6cd09f6ac70028dba10f92f6cdd059096db198c283794ac86"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:ac1c4a689edcc530fc9d9aa11f5774b9e2f33f9a0c6a57864e90908f5208d30a"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:21d142cc6c0ec30d2efee5068ca36c128a30b0f2c53c1c07bd78cb6bc1d3be5f"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:5dbe56a36425d26d6cfb40ce79c314a2e4dd6211d51d6d2191c00bed34f354cc"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:5bfbb1b9acf3334612667b61bd3002196fe2a1eb4dd74d247e0f2a4d50ec9bbf"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:d055ec1e26e441f6187acf818b73564e6e6282709e9bcb5b63f5b23068356a15"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:af2d8c67d8e573d6de5bc30cdb27e9b95e49115cd9baad5ddbd1a6207aaa82a9"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:780236ac706e66881f3b7f2f32dfe90507a09e67d1d454c762cf642e6e1586e0"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win32.whl", hash = "sha256:5833d2c39d8896e4e19b689ffc198f08ea58116bee26dea51e362ecc7cd3ed26"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:a79cfe37875f822425b89a82333404539ae63dbdddf97f84dcbc3d339aae9525"}, - {file = "charset_normalizer-3.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:376bec83a63b8021bb5c8ea75e21c4ccb86e7e45ca4eb81146091b56599b80c3"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:e1f185f86a6f3403aa2420e815904c67b2f9ebc443f045edd0de921108345794"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6b39f987ae8ccdf0d2642338faf2abb1862340facc796048b604ef14919e55ed"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3162d5d8ce1bb98dd51af660f2121c55d0fa541b46dff7bb9b9f86ea1d87de72"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:81d5eb2a312700f4ecaa977a8235b634ce853200e828fbadf3a9c50bab278328"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5bd2293095d766545ec1a8f612559f6b40abc0eb18bb2f5d1171872d34036ede"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a8a8b89589086a25749f471e6a900d3f662d1d3b6e2e59dcecf787b1cc3a1894"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:bc7637e2f80d8530ee4a78e878bce464f70087ce73cf7c1caf142416923b98f1"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = 
"sha256:f8bf04158c6b607d747e93949aa60618b61312fe647a6369f88ce2ff16043490"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:554af85e960429cf30784dd47447d5125aaa3b99a6f0683589dbd27e2f45da44"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:74018750915ee7ad843a774364e13a3db91682f26142baddf775342c3f5b1133"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c0463276121fdee9c49b98908b3a89c39be45d86d1dbaa22957e38f6321d4ce3"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:362d61fd13843997c1c446760ef36f240cf81d3ebf74ac62652aebaf7838561e"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a26f18905b8dd5d685d6d07b0cdf98a79f3c7a918906af7cc143ea2e164c8bc"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win32.whl", hash = "sha256:9b35f4c90079ff2e2edc5b26c0c77925e5d2d255c42c74fdb70fb49b172726ac"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:b435cba5f4f750aa6c0a0d92c541fb79f69a387c91e61f1795227e4ed9cece14"}, - {file = "charset_normalizer-3.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:542d2cee80be6f80247095cc36c418f7bddd14f4a6de45af91dfad36d817bba2"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:da3326d9e65ef63a817ecbcc0df6e94463713b754fe293eaa03da99befb9a5bd"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8af65f14dc14a79b924524b1e7fffe304517b2bff5a58bf64f30b98bbc5079eb"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:74664978bb272435107de04e36db5a9735e78232b85b77d45cfb38f758efd33e"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash 
= "sha256:752944c7ffbfdd10c074dc58ec2d5a8a4cd9493b314d367c14d24c17684ddd14"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d1f13550535ad8cff21b8d757a3257963e951d96e20ec82ab44bc64aeb62a191"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ecaae4149d99b1c9e7b88bb03e3221956f68fd6d50be2ef061b2381b61d20838"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cb6254dc36b47a990e59e1068afacdcd02958bdcce30bb50cc1700a8b9d624a6"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c8ae8a0f02f57a6e61203a31428fa1d677cbe50c93622b4149d5c0f319c1d19e"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:47cc91b2f4dd2833fddaedd2893006b0106129d4b94fdb6af1f4ce5a9965577c"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:82004af6c302b5d3ab2cfc4cc5f29db16123b1a8417f2e25f9066f91d4411090"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b7d8f6c26245217bd2ad053761201e9f9680f8ce52f0fcd8d0755aeae5b2152"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:799a7a5e4fb2d5898c60b640fd4981d6a25f1c11790935a44ce38c54e985f828"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:99ae2cffebb06e6c22bdc25801d7b30f503cc87dbd283479e7b606f70aff57ec"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win32.whl", hash = "sha256:f9d332f8c2a2fcbffe1378594431458ddbef721c1769d78e2cbc06280d8155f9"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:8a6562c3700cce886c5be75ade4a5db4214fda19fede41d9792d100288d8f94c"}, - {file = "charset_normalizer-3.4.4-cp314-cp314-win_arm64.whl", hash = 
"sha256:de00632ca48df9daf77a2c65a484531649261ec9f25489917f09e455cb09ddb2"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ce8a0633f41a967713a59c4139d29110c07e826d131a316b50ce11b1d79b4f84"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaabd426fe94daf8fd157c32e571c85cb12e66692f15516a83a03264b08d06c3"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c4ef880e27901b6cc782f1b95f82da9313c0eb95c3af699103088fa0ac3ce9ac"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:2aaba3b0819274cc41757a1da876f810a3e4d7b6eb25699253a4effef9e8e4af"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:778d2e08eda00f4256d7f672ca9fef386071c9202f5e4607920b86d7803387f2"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f155a433c2ec037d4e8df17d18922c3a0d9b3232a396690f17175d2946f0218d"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a8bf8d0f749c5757af2142fe7903a9df1d2e8aa3841559b2bad34b08d0e2bcf3"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:194f08cbb32dc406d6e1aea671a68be0823673db2832b38405deba2fb0d88f63"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:6aee717dcfead04c6eb1ce3bd29ac1e22663cdea57f943c87d1eab9a025438d7"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:cd4b7ca9984e5e7985c12bc60a6f173f3c958eae74f3ef6624bb6b26e2abbae4"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_riscv64.whl", hash = 
"sha256:b7cf1017d601aa35e6bb650b6ad28652c9cd78ee6caff19f3c28d03e1c80acbf"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:e912091979546adf63357d7e2ccff9b44f026c075aeaf25a52d0e95ad2281074"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:5cb4d72eea50c8868f5288b7f7f33ed276118325c1dfd3957089f6b519e1382a"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-win32.whl", hash = "sha256:837c2ce8c5a65a2035be9b3569c684358dfbf109fd3b6969630a87535495ceaa"}, - {file = "charset_normalizer-3.4.4-cp38-cp38-win_amd64.whl", hash = "sha256:44c2a8734b333e0578090c4cd6b16f275e07aa6614ca8715e6c038e865e70576"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:a9768c477b9d7bd54bc0c86dbaebdec6f03306675526c9927c0e8a04e8f94af9"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1bee1e43c28aa63cb16e5c14e582580546b08e535299b8b6158a7c9c768a1f3d"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:fd44c878ea55ba351104cb93cc85e74916eb8fa440ca7903e57575e97394f608"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0f04b14ffe5fdc8c4933862d8306109a2c51e0704acfa35d51598eb45a1e89fc"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:cd09d08005f958f370f539f186d10aec3377d55b9eeb0d796025d4886119d76e"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4fe7859a4e3e8457458e2ff592f15ccb02f3da787fcd31e0183879c3ad4692a1"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:fa09f53c465e532f4d3db095e0c55b615f010ad81803d383195b6b5ca6cbf5f3"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:7fa17817dc5625de8a027cb8b26d9fefa3ea28c8253929b8d6649e705d2835b6"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:5947809c8a2417be3267efc979c47d76a079758166f7d43ef5ae8e9f92751f88"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:4902828217069c3c5c71094537a8e623f5d097858ac6ca8252f7b4d10b7560f1"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:7c308f7e26e4363d79df40ca5b2be1c6ba9f02bdbccfed5abddb7859a6ce72cf"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:2c9d3c380143a1fedbff95a312aa798578371eb29da42106a29019368a475318"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:cb01158d8b88ee68f15949894ccc6712278243d95f344770fa7593fa2d94410c"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win32.whl", hash = "sha256:2677acec1a2f8ef614c6888b5b4ae4060cc184174a938ed4e8ef690e15d3e505"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win_amd64.whl", hash = "sha256:f8e160feb2aed042cd657a72acc0b481212ed28b1b9a95c0cee1621b524e1966"}, - {file = "charset_normalizer-3.4.4-cp39-cp39-win_arm64.whl", hash = "sha256:b5d84d37db046c5ca74ee7bb47dd6cbc13f80665fdde3e8040bdd3fb015ecb50"}, - {file = "charset_normalizer-3.4.4-py3-none-any.whl", hash = "sha256:7a32c560861a02ff789ad905a2fe94e3f840803362c84fecf1851cb4cf3dc37f"}, - {file = "charset_normalizer-3.4.4.tar.gz", hash = "sha256:94537985111c35f28720e43603b8e7b43a6ecfb2ce1d3058bbe955b73404e21a"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:cdd68a1fb318e290a2077696b7eb7a21a49163c455979c639bf5a5dcdc46617d"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:e17b8d5d6a8c47c85e68ca8379def1303fd360c3e22093a807cd34a71cd082b8"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:511ef87c8aec0783e08ac18565a16d435372bc1ac25a91e6ac7f5ef2b0bff790"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:007d05ec7321d12a40227aae9e2bc6dca73f3cb21058999a1df9e193555a9dcc"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:cf29836da5119f3c8a8a70667b0ef5fdca3bb12f80fd06487cfa575b3909b393"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux_2_31_armv7l.whl", hash = "sha256:12d8baf840cc7889b37c7c770f478adea7adce3dcb3944d02ec87508e2dcf153"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:d560742f3c0d62afaccf9f41fe485ed69bd7661a241f86a3ef0f0fb8b1a397af"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:b14b2d9dac08e28bb8046a1a0434b1750eb221c8f5b87a68f4fa11a6f97b5e34"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_armv7l.whl", hash = "sha256:bc17a677b21b3502a21f66a8cc64f5bfad4df8a0b8434d661666f8ce90ac3af1"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:750e02e074872a3fad7f233b47734166440af3cdea0add3e95163110816d6752"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:4e5163c14bffd570ef2affbfdd77bba66383890797df43dc8b4cc7d6f500bf53"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:6ed74185b2db44f41ef35fd1617c5888e59792da9bbc9190d6c7300617182616"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:94e1885b270625a9a828c9793b4d52a64445299baa1fea5a173bf1d3dd9a1a5a"}, + {file = 
"charset_normalizer-3.4.7-cp310-cp310-win32.whl", hash = "sha256:6785f414ae0f3c733c437e0f3929197934f526d19dfaa75e18fdb4f94c6fb374"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-win_amd64.whl", hash = "sha256:6696b7688f54f5af4462118f0bfa7c1621eeb87154f77fa04b9295ce7a8f2943"}, + {file = "charset_normalizer-3.4.7-cp310-cp310-win_arm64.whl", hash = "sha256:66671f93accb62ed07da56613636f3641f1a12c13046ce91ffc923721f23c008"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7641bb8895e77f921102f72833904dcd9901df5d6d72a2ab8f31d04b7e51e4e7"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:202389074300232baeb53ae2569a60901f7efadd4245cf3a3bf0617d60b439d7"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:30b8d1d8c52a48c2c5690e152c169b673487a2a58de1ec7393196753063fcd5e"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:532bc9bf33a68613fd7d65e4b1c71a6a38d7d42604ecf239c77392e9b4e8998c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2fe249cb4651fd12605b7288b24751d8bfd46d35f12a20b1ba33dea122e690df"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_armv7l.whl", hash = "sha256:65bcd23054beab4d166035cabbc868a09c1a49d1efe458fe8e4361215df40265"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:08e721811161356f97b4059a9ba7bafb23ea5ee2255402c42881c214e173c6b4"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e060d01aec0a910bdccb8be71faf34e7799ce36950f8294c8bf612cba65a2c9e"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_armv7l.whl", hash = 
"sha256:38c0109396c4cfc574d502df99742a45c72c08eff0a36158b6f04000043dbf38"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:1c2a768fdd44ee4a9339a9b0b130049139b8ce3c01d2ce09f67f5a68048d477c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:1a87ca9d5df6fe460483d9a5bbf2b18f620cbed41b432e2bddb686228282d10b"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:d635aab80466bc95771bb78d5370e74d36d1fe31467b6b29b8b57b2a3cd7d22c"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ae196f021b5e7c78e918242d217db021ed2a6ace2bc6ae94c0fc596221c7f58d"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win32.whl", hash = "sha256:adb2597b428735679446b46c8badf467b4ca5f5056aae4d51a19f9570301b1ad"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win_amd64.whl", hash = "sha256:8e385e4267ab76874ae30db04c627faaaf0b509e1ccc11a95b3fc3e83f855c00"}, + {file = "charset_normalizer-3.4.7-cp311-cp311-win_arm64.whl", hash = "sha256:d4a48e5b3c2a489fae013b7589308a40146ee081f6f509e047e0e096084ceca1"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a"}, + {file = 
"charset_normalizer-3.4.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win32.whl", hash = "sha256:2257141f39fe65a3fdf38aeccae4b953e5f3b3324f4ff0daf9f15b8518666a2c"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win_amd64.whl", hash = "sha256:5ed6ab538499c8644b8a3e18debabcd7ce684f3fa91cf867521a7a0279cab2d6"}, + {file = "charset_normalizer-3.4.7-cp312-cp312-win_arm64.whl", hash = "sha256:56be790f86bfb2c98fb742ce566dfb4816e5a83384616ab59c49e0604d49c51d"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-macosx_10_13_universal2.whl", 
hash = "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_s390x.whl", hash = 
"sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win32.whl", hash = "sha256:4042d5c8f957e15221d423ba781e85d553722fc4113f523f2feb7b188cc34c5e"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win_amd64.whl", hash = "sha256:3946fa46a0cf3e4c8cb1cc52f56bb536310d34f25f01ca9b6c16afa767dab110"}, + {file = "charset_normalizer-3.4.7-cp313-cp313-win_arm64.whl", hash = "sha256:80d04837f55fc81da168b98de4f4b797ef007fc8a79ab71c6ec9bc4dd662b15b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18"}, + {file = 
"charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win32.whl", hash = "sha256:5b77459df20e08151cd6f8b9ef8ef1f961ef73d85c21a555c7eed5b79410ec10"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win_amd64.whl", hash = "sha256:92a0a01ead5e668468e952e4238cccd7c537364eb7d851ab144ab6627dbbe12f"}, + {file = "charset_normalizer-3.4.7-cp314-cp314-win_arm64.whl", hash = "sha256:67f6279d125ca0046a7fd386d01b311c6363844deac3e5b069b514ba3e63c246"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960"}, + {file = 
"charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-win32.whl", hash = "sha256:c03a41a8784091e67a39648f70c5f97b5b6a37f216896d44d2cdcb82615339a0"}, + {file = "charset_normalizer-3.4.7-cp314-cp314t-win_amd64.whl", hash = "sha256:03853ed82eeebbce3c2abfdbc98c96dc205f32a79627688ac9a27370ea61a49c"}, + {file = 
"charset_normalizer-3.4.7-cp314-cp314t-win_arm64.whl", hash = "sha256:c35abb8bfff0185efac5878da64c45dafd2b37fb0383add1be155a763c1f083d"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:e5f4d355f0a2b1a31bc3edec6795b46324349c9cb25eed068049e4f472fb4259"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:16d971e29578a5e97d7117866d15889a4a07befe0e87e703ed63cd90cb348c01"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:dca4bbc466a95ba9c0234ef56d7dd9509f63da22274589ebd4ed7f1f4d4c54e3"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e80c8378d8f3d83cd3164da1ad2df9e37a666cdde7b1cb2298ed0b558064be30"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:36836d6ff945a00b88ba1e4572d721e60b5b8c98c155d465f56ad19d68f23734"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux_2_31_armv7l.whl", hash = "sha256:bd9b23791fe793e4968dba0c447e12f78e425c59fc0e3b97f6450f4781f3ee60"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:aef65cd602a6d0e0ff6f9930fcb1c8fec60dd2cfcb6facaf4bdb0e5873042db0"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:82b271f5137d07749f7bf32f70b17ab6eaabedd297e75dce75081a24f76eb545"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_armv7l.whl", hash = "sha256:1efde3cae86c8c273f1eb3b287be7d8499420cf2fe7585c41d370d3e790054a5"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:c593052c465475e64bbfe5dbd81680f64a67fdc752c56d7a0ae205dc8aeefe0f"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_riscv64.whl", hash = 
"sha256:af21eb4409a119e365397b2adbaca4c9ccab56543a65d5dbd9f920d6ac29f686"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:84c018e49c3bf790f9c2771c45e9313a08c2c2a6342b162cd650258b57817706"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:dd915403e231e6b1809fe9b6d9fc55cf8fb5e02765ac625d9cd623342a7905d7"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-win32.whl", hash = "sha256:320ade88cfb846b8cd6b4ddf5ee9e80ee0c1f52401f2456b84ae1ae6a1a5f207"}, + {file = "charset_normalizer-3.4.7-cp38-cp38-win_amd64.whl", hash = "sha256:1dc8b0ea451d6e69735094606991f32867807881400f808a106ee1d963c46a83"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:177a0ba5f0211d488e295aaf82707237e331c24788d8d76c96c5a41594723217"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6e0d51f618228538a3e8f46bd246f87a6cd030565e015803691603f55e12afb5"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:14265bfe1f09498b9d8ec91e9ec9fa52775edf90fcbde092b25f4a33d444fea9"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:87fad7d9ba98c86bcb41b2dc8dbb326619be2562af1f8ff50776a39e55721c5a"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f22dec1690b584cea26fade98b2435c132c1b5f68e39f5a0b7627cd7ae31f1dc"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux_2_31_armv7l.whl", hash = "sha256:d61f00a0869d77422d9b2aba989e2d24afa6ffd552af442e0e58de4f35ea6d00"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6370e8686f662e6a3941ee48ed4742317cafbe5707e36406e9df792cdb535776"}, + {file = 
"charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:a6c5863edfbe888d9eff9c8b8087354e27618d9da76425c119293f11712a6319"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_armv7l.whl", hash = "sha256:ed065083d0898c9d5b4bbec7b026fd755ff7454e6e8b73a67f8c744b13986e24"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:2cd4a60d0e2fb04537162c62bbbb4182f53541fe0ede35cdf270a1c1e723cc42"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:813c0e0132266c08eb87469a642cb30aaff57c5f426255419572aaeceeaa7bf4"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:07d9e39b01743c3717745f4c530a6349eadbfa043c7577eef86c502c15df2c67"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:c0f081d69a6e58272819b70288d3221a6ee64b98df852631c80f293514d3b274"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win32.whl", hash = "sha256:8751d2787c9131302398b11e6c8068053dcb55d5a8964e114b6e196cf16cb366"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win_amd64.whl", hash = "sha256:12a6fff75f6bc66711b73a2f0addfc4c8c15a20e805146a02d147a318962c444"}, + {file = "charset_normalizer-3.4.7-cp39-cp39-win_arm64.whl", hash = "sha256:bb8cc7534f51d9a017b93e3e85b260924f909601c3df002bcdb58ddb4dc41a5c"}, + {file = "charset_normalizer-3.4.7-py3-none-any.whl", hash = "sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d"}, + {file = "charset_normalizer-3.4.7.tar.gz", hash = "sha256:ae89db9e5f98a11a4bf50407d4363e7b09b31e55bc117b4f7d80aab97ba009e5"}, ] [[package]] name = "click" -version = "8.3.1" +version = "8.3.2" description = "Composable command line interface toolkit" optional = false python-versions = ">=3.10" groups = ["main", "dev"] files = [ - {file = "click-8.3.1-py3-none-any.whl", hash = "sha256:981153a64e25f12d547d3426c367a4857371575ee7ad18df2a6183ab0545b2a6"}, - {file = "click-8.3.1.tar.gz", hash = 
"sha256:12ff4785d337a1bb490bb7e9c2b1ee5da3112e94a8622f26a6c77f5d2fc6842a"}, + {file = "click-8.3.2-py3-none-any.whl", hash = "sha256:1924d2c27c5653561cd2cae4548d1406039cb79b858b747cfea24924bbc1616d"}, + {file = "click-8.3.2.tar.gz", hash = "sha256:14162b8b3b3550a7d479eafa77dfd3c38d9dc8951f6f69c78913a8f9a7540fd5"}, ] [package.dependencies] @@ -292,118 +277,118 @@ markers = {main = "platform_system == \"Windows\" or sys_platform == \"win32\""} [[package]] name = "coverage" -version = "7.13.4" +version = "7.13.5" description = "Code coverage measurement for Python" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "coverage-7.13.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0fc31c787a84f8cd6027eba44010517020e0d18487064cd3d8968941856d1415"}, - {file = "coverage-7.13.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a32ebc02a1805adf637fc8dec324b5cdacd2e493515424f70ee33799573d661b"}, - {file = "coverage-7.13.4-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:e24f9156097ff9dc286f2f913df3a7f63c0e333dcafa3c196f2c18b4175ca09a"}, - {file = "coverage-7.13.4-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:8041b6c5bfdc03257666e9881d33b1abc88daccaf73f7b6340fb7946655cd10f"}, - {file = "coverage-7.13.4-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2a09cfa6a5862bc2fc6ca7c3def5b2926194a56b8ab78ffcf617d28911123012"}, - {file = "coverage-7.13.4-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:296f8b0af861d3970c2a4d8c91d48eb4dd4771bcef9baedec6a9b515d7de3def"}, - {file = "coverage-7.13.4-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e101609bcbbfb04605ea1027b10dc3735c094d12d40826a60f897b98b1c30256"}, - {file = "coverage-7.13.4-cp310-cp310-musllinux_1_2_aarch64.whl", hash = 
"sha256:aa3feb8db2e87ff5e6d00d7e1480ae241876286691265657b500886c98f38bda"}, - {file = "coverage-7.13.4-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:4fc7fa81bbaf5a02801b65346c8b3e657f1d93763e58c0abdf7c992addd81a92"}, - {file = "coverage-7.13.4-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:33901f604424145c6e9c2398684b92e176c0b12df77d52db81c20abd48c3794c"}, - {file = "coverage-7.13.4-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:bb28c0f2cf2782508a40cec377935829d5fcc3ad9a3681375af4e84eb34b6b58"}, - {file = "coverage-7.13.4-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:9d107aff57a83222ddbd8d9ee705ede2af2cc926608b57abed8ef96b50b7e8f9"}, - {file = "coverage-7.13.4-cp310-cp310-win32.whl", hash = "sha256:a6f94a7d00eb18f1b6d403c91a88fd58cfc92d4b16080dfdb774afc8294469bf"}, - {file = "coverage-7.13.4-cp310-cp310-win_amd64.whl", hash = "sha256:2cb0f1e000ebc419632bbe04366a8990b6e32c4e0b51543a6484ffe15eaeda95"}, - {file = "coverage-7.13.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:d490ba50c3f35dd7c17953c68f3270e7ccd1c6642e2d2afe2d8e720b98f5a053"}, - {file = "coverage-7.13.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:19bc3c88078789f8ef36acb014d7241961dbf883fd2533d18cb1e7a5b4e28b11"}, - {file = "coverage-7.13.4-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:3998e5a32e62fdf410c0dbd3115df86297995d6e3429af80b8798aad894ca7aa"}, - {file = "coverage-7.13.4-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:8e264226ec98e01a8e1054314af91ee6cde0eacac4f465cc93b03dbe0bce2fd7"}, - {file = "coverage-7.13.4-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a3aa4e7b9e416774b21797365b358a6e827ffadaaca81b69ee02946852449f00"}, - {file = "coverage-7.13.4-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:71ca20079dd8f27fcf808817e281e90220475cd75115162218d0e27549f95fef"}, - {file = "coverage-7.13.4-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e2f25215f1a359ab17320b47bcdaca3e6e6356652e8256f2441e4ef972052903"}, - {file = "coverage-7.13.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d65b2d373032411e86960604dc4edac91fdfb5dca539461cf2cbe78327d1e64f"}, - {file = "coverage-7.13.4-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:94eb63f9b363180aff17de3e7c8760c3ba94664ea2695c52f10111244d16a299"}, - {file = "coverage-7.13.4-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:e856bf6616714c3a9fbc270ab54103f4e685ba236fa98c054e8f87f266c93505"}, - {file = "coverage-7.13.4-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:65dfcbe305c3dfe658492df2d85259e0d79ead4177f9ae724b6fb245198f55d6"}, - {file = "coverage-7.13.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:b507778ae8a4c915436ed5c2e05b4a6cecfa70f734e19c22a005152a11c7b6a9"}, - {file = "coverage-7.13.4-cp311-cp311-win32.whl", hash = "sha256:784fc3cf8be001197b652d51d3fd259b1e2262888693a4636e18879f613a62a9"}, - {file = "coverage-7.13.4-cp311-cp311-win_amd64.whl", hash = "sha256:2421d591f8ca05b308cf0092807308b2facbefe54af7c02ac22548b88b95c98f"}, - {file = "coverage-7.13.4-cp311-cp311-win_arm64.whl", hash = "sha256:79e73a76b854d9c6088fe5d8b2ebe745f8681c55f7397c3c0a016192d681045f"}, - {file = "coverage-7.13.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:02231499b08dabbe2b96612993e5fc34217cdae907a51b906ac7fca8027a4459"}, - {file = "coverage-7.13.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40aa8808140e55dc022b15d8aa7f651b6b3d68b365ea0398f1441e0b04d859c3"}, - {file = "coverage-7.13.4-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:5b856a8ccf749480024ff3bd7310adaef57bf31fd17e1bfc404b7940b6986634"}, - {file = "coverage-7.13.4-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = 
"sha256:2c048ea43875fbf8b45d476ad79f179809c590ec7b79e2035c662e7afa3192e3"}, - {file = "coverage-7.13.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b7b38448866e83176e28086674fe7368ab8590e4610fb662b44e345b86d63ffa"}, - {file = "coverage-7.13.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:de6defc1c9badbf8b9e67ae90fd00519186d6ab64e5cc5f3d21359c2a9b2c1d3"}, - {file = "coverage-7.13.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:7eda778067ad7ffccd23ecffce537dface96212576a07924cbf0d8799d2ded5a"}, - {file = "coverage-7.13.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e87f6c587c3f34356c3759f0420693e35e7eb0e2e41e4c011cb6ec6ecbbf1db7"}, - {file = "coverage-7.13.4-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:8248977c2e33aecb2ced42fef99f2d319e9904a36e55a8a68b69207fb7e43edc"}, - {file = "coverage-7.13.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:25381386e80ae727608e662474db537d4df1ecd42379b5ba33c84633a2b36d47"}, - {file = "coverage-7.13.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:ee756f00726693e5ba94d6df2bdfd64d4852d23b09bb0bc700e3b30e6f333985"}, - {file = "coverage-7.13.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fdfc1e28e7c7cdce44985b3043bc13bbd9c747520f94a4d7164af8260b3d91f0"}, - {file = "coverage-7.13.4-cp312-cp312-win32.whl", hash = "sha256:01d4cbc3c283a17fc1e42d614a119f7f438eabb593391283adca8dc86eff1246"}, - {file = "coverage-7.13.4-cp312-cp312-win_amd64.whl", hash = "sha256:9401ebc7ef522f01d01d45532c68c5ac40fb27113019b6b7d8b208f6e9baa126"}, - {file = "coverage-7.13.4-cp312-cp312-win_arm64.whl", hash = "sha256:b1ec7b6b6e93255f952e27ab58fbc68dcc468844b16ecbee881aeb29b6ab4d8d"}, - {file = "coverage-7.13.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b66a2da594b6068b48b2692f043f35d4d3693fb639d5ea8b39533c2ad9ac3ab9"}, - {file = 
"coverage-7.13.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3599eb3992d814d23b35c536c28df1a882caa950f8f507cef23d1cbf334995ac"}, - {file = "coverage-7.13.4-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:93550784d9281e374fb5a12bf1324cc8a963fd63b2d2f223503ef0fd4aa339ea"}, - {file = "coverage-7.13.4-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b720ce6a88a2755f7c697c23268ddc47a571b88052e6b155224347389fdf6a3b"}, - {file = "coverage-7.13.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7b322db1284a2ed3aa28ffd8ebe3db91c929b7a333c0820abec3d838ef5b3525"}, - {file = "coverage-7.13.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f4594c67d8a7c89cf922d9df0438c7c7bb022ad506eddb0fdb2863359ff78242"}, - {file = "coverage-7.13.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:53d133df809c743eb8bce33b24bcababb371f4441340578cd406e084d94a6148"}, - {file = "coverage-7.13.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:76451d1978b95ba6507a039090ba076105c87cc76fc3efd5d35d72093964d49a"}, - {file = "coverage-7.13.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:7f57b33491e281e962021de110b451ab8a24182589be17e12a22c79047935e23"}, - {file = "coverage-7.13.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:1731dc33dc276dafc410a885cbf5992f1ff171393e48a21453b78727d090de80"}, - {file = "coverage-7.13.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:bd60d4fe2f6fa7dff9223ca1bbc9f05d2b6697bc5961072e5d3b952d46e1b1ea"}, - {file = "coverage-7.13.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9181a3ccead280b828fae232df12b16652702b49d41e99d657f46cc7b1f6ec7a"}, - {file = "coverage-7.13.4-cp313-cp313-win32.whl", hash = "sha256:f53d492307962561ac7de4cd1de3e363589b000ab69617c6156a16ba7237998d"}, - {file = 
"coverage-7.13.4-cp313-cp313-win_amd64.whl", hash = "sha256:e6f70dec1cc557e52df5306d051ef56003f74d56e9c4dd7ddb07e07ef32a84dd"}, - {file = "coverage-7.13.4-cp313-cp313-win_arm64.whl", hash = "sha256:fb07dc5da7e849e2ad31a5d74e9bece81f30ecf5a42909d0a695f8bd1874d6af"}, - {file = "coverage-7.13.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:40d74da8e6c4b9ac18b15331c4b5ebc35a17069410cad462ad4f40dcd2d50c0d"}, - {file = "coverage-7.13.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4223b4230a376138939a9173f1bdd6521994f2aff8047fae100d6d94d50c5a12"}, - {file = "coverage-7.13.4-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:1d4be36a5114c499f9f1f9195e95ebf979460dbe2d88e6816ea202010ba1c34b"}, - {file = "coverage-7.13.4-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:200dea7d1e8095cc6e98cdabe3fd1d21ab17d3cee6dab00cadbb2fe35d9c15b9"}, - {file = "coverage-7.13.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b8eb931ee8e6d8243e253e5ed7336deea6904369d2fd8ae6e43f68abbf167092"}, - {file = "coverage-7.13.4-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:75eab1ebe4f2f64d9509b984f9314d4aa788540368218b858dad56dc8f3e5eb9"}, - {file = "coverage-7.13.4-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c35eb28c1d085eb7d8c9b3296567a1bebe03ce72962e932431b9a61f28facf26"}, - {file = "coverage-7.13.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb88b316ec33760714a4720feb2816a3a59180fd58c1985012054fa7aebee4c2"}, - {file = "coverage-7.13.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:7d41eead3cc673cbd38a4417deb7fd0b4ca26954ff7dc6078e33f6ff97bed940"}, - {file = "coverage-7.13.4-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:fb26a934946a6afe0e326aebe0730cdff393a8bc0bbb65a2f41e30feddca399c"}, - {file = 
"coverage-7.13.4-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:dae88bc0fc77edaa65c14be099bd57ee140cf507e6bfdeea7938457ab387efb0"}, - {file = "coverage-7.13.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:845f352911777a8e722bfce168958214951e07e47e5d5d9744109fa5fe77f79b"}, - {file = "coverage-7.13.4-cp313-cp313t-win32.whl", hash = "sha256:2fa8d5f8de70688a28240de9e139fa16b153cc3cbb01c5f16d88d6505ebdadf9"}, - {file = "coverage-7.13.4-cp313-cp313t-win_amd64.whl", hash = "sha256:9351229c8c8407645840edcc277f4a2d44814d1bc34a2128c11c2a031d45a5dd"}, - {file = "coverage-7.13.4-cp313-cp313t-win_arm64.whl", hash = "sha256:30b8d0512f2dc8c8747557e8fb459d6176a2c9e5731e2b74d311c03b78451997"}, - {file = "coverage-7.13.4-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:300deaee342f90696ed186e3a00c71b5b3d27bffe9e827677954f4ee56969601"}, - {file = "coverage-7.13.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:29e3220258d682b6226a9b0925bc563ed9a1ebcff3cad30f043eceea7eaf2689"}, - {file = "coverage-7.13.4-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:391ee8f19bef69210978363ca930f7328081c6a0152f1166c91f0b5fdd2a773c"}, - {file = "coverage-7.13.4-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0dd7ab8278f0d58a0128ba2fca25824321f05d059c1441800e934ff2efa52129"}, - {file = "coverage-7.13.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:78cdf0d578b15148b009ccf18c686aa4f719d887e76e6b40c38ffb61d264a552"}, - {file = "coverage-7.13.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:48685fee12c2eb3b27c62f2658e7ea21e9c3239cba5a8a242801a0a3f6a8c62a"}, - {file = "coverage-7.13.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:4e83efc079eb39480e6346a15a1bcb3e9b04759c5202d157e1dd4303cd619356"}, - {file = 
"coverage-7.13.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ecae9737b72408d6a950f7e525f30aca12d4bd8dd95e37342e5beb3a2a8c4f71"}, - {file = "coverage-7.13.4-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:ae4578f8528569d3cf303fef2ea569c7f4c4059a38c8667ccef15c6e1f118aa5"}, - {file = "coverage-7.13.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:6fdef321fdfbb30a197efa02d48fcd9981f0d8ad2ae8903ac318adc653f5df98"}, - {file = "coverage-7.13.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b0f6ccf3dbe577170bebfce1318707d0e8c3650003cb4b3a9dd744575daa8b5"}, - {file = "coverage-7.13.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:75fcd519f2a5765db3f0e391eb3b7d150cce1a771bf4c9f861aeab86c767a3c0"}, - {file = "coverage-7.13.4-cp314-cp314-win32.whl", hash = "sha256:8e798c266c378da2bd819b0677df41ab46d78065fb2a399558f3f6cae78b2fbb"}, - {file = "coverage-7.13.4-cp314-cp314-win_amd64.whl", hash = "sha256:245e37f664d89861cf2329c9afa2c1fe9e6d4e1a09d872c947e70718aeeac505"}, - {file = "coverage-7.13.4-cp314-cp314-win_arm64.whl", hash = "sha256:ad27098a189e5838900ce4c2a99f2fe42a0bf0c2093c17c69b45a71579e8d4a2"}, - {file = "coverage-7.13.4-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:85480adfb35ffc32d40918aad81b89c69c9cc5661a9b8a81476d3e645321a056"}, - {file = "coverage-7.13.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:79be69cf7f3bf9b0deeeb062eab7ac7f36cd4cc4c4dd694bd28921ba4d8596cc"}, - {file = "coverage-7.13.4-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:caa421e2684e382c5d8973ac55e4f36bed6821a9bad5c953494de960c74595c9"}, - {file = "coverage-7.13.4-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:14375934243ee05f56c45393fe2ce81fe5cc503c07cee2bdf1725fb8bef3ffaf"}, - {file = "coverage-7.13.4-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:25a41c3104d08edb094d9db0d905ca54d0cd41c928bb6be3c4c799a54753af55"}, - {file = "coverage-7.13.4-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6f01afcff62bf9a08fb32b2c1d6e924236c0383c02c790732b6537269e466a72"}, - {file = "coverage-7.13.4-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:eb9078108fbf0bcdde37c3f4779303673c2fa1fe8f7956e68d447d0dd426d38a"}, - {file = "coverage-7.13.4-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0e086334e8537ddd17e5f16a344777c1ab8194986ec533711cbe6c41cde841b6"}, - {file = "coverage-7.13.4-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:725d985c5ab621268b2edb8e50dfe57633dc69bda071abc470fed55a14935fd3"}, - {file = "coverage-7.13.4-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3c06f0f1337c667b971ca2f975523347e63ec5e500b9aa5882d91931cd3ef750"}, - {file = "coverage-7.13.4-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:590c0ed4bf8e85f745e6b805b2e1c457b2e33d5255dd9729743165253bc9ad39"}, - {file = "coverage-7.13.4-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:eb30bf180de3f632cd043322dad5751390e5385108b2807368997d1a92a509d0"}, - {file = "coverage-7.13.4-cp314-cp314t-win32.whl", hash = "sha256:c4240e7eded42d131a2d2c4dec70374b781b043ddc79a9de4d55ca71f8e98aea"}, - {file = "coverage-7.13.4-cp314-cp314t-win_amd64.whl", hash = "sha256:4c7d3cc01e7350f2f0f6f7036caaf5673fb56b6998889ccfe9e1c1fe75a9c932"}, - {file = "coverage-7.13.4-cp314-cp314t-win_arm64.whl", hash = "sha256:23e3f687cf945070d1c90f85db66d11e3025665d8dafa831301a0e0038f3db9b"}, - {file = "coverage-7.13.4-py3-none-any.whl", hash = "sha256:1af1641e57cf7ba1bd67d677c9abdbcd6cc2ab7da3bca7fa1e2b7e50e65f2ad0"}, - {file = "coverage-7.13.4.tar.gz", hash = "sha256:e5c8f6ed1e61a8b2dcdf31eb0b9bbf0130750ca79c1c49eb898e2ad86f5ccc91"}, + {file = "coverage-7.13.5-cp310-cp310-macosx_10_9_x86_64.whl", hash = 
"sha256:e0723d2c96324561b9aa76fb982406e11d93cdb388a7a7da2b16e04719cf7ca5"}, + {file = "coverage-7.13.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:52f444e86475992506b32d4e5ca55c24fc88d73bcbda0e9745095b28ef4dc0cf"}, + {file = "coverage-7.13.5-cp310-cp310-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:704de6328e3d612a8f6c07000a878ff38181ec3263d5a11da1db294fa6a9bdf8"}, + {file = "coverage-7.13.5-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:a1a6d79a14e1ec1832cabc833898636ad5f3754a678ef8bb4908515208bf84f4"}, + {file = "coverage-7.13.5-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:79060214983769c7ba3f0cee10b54c97609dca4d478fa1aa32b914480fd5738d"}, + {file = "coverage-7.13.5-cp310-cp310-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:356e76b46783a98c2a2fe81ec79df4883a1e62895ea952968fb253c114e7f930"}, + {file = "coverage-7.13.5-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0cef0cdec915d11254a7f549c1170afecce708d30610c6abdded1f74e581666d"}, + {file = "coverage-7.13.5-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:dc022073d063b25a402454e5712ef9e007113e3a676b96c5f29b2bda29352f40"}, + {file = "coverage-7.13.5-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:9b74db26dfea4f4e50d48a4602207cd1e78be33182bc9cbf22da94f332f99878"}, + {file = "coverage-7.13.5-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:ad146744ca4fd09b50c482650e3c1b1f4dfa1d4792e0a04a369c7f23336f0400"}, + {file = "coverage-7.13.5-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:c555b48be1853fe3997c11c4bd521cdd9a9612352de01fa4508f16ec341e6fe0"}, + {file = "coverage-7.13.5-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:7034b5c56a58ae5e85f23949d52c14aca2cfc6848a31764995b7de88f13a1ea0"}, + {file = "coverage-7.13.5-cp310-cp310-win32.whl", hash = 
"sha256:eb7fdf1ef130660e7415e0253a01a7d5a88c9c4d158bcf75cbbd922fd65a5b58"}, + {file = "coverage-7.13.5-cp310-cp310-win_amd64.whl", hash = "sha256:3e1bb5f6c78feeb1be3475789b14a0f0a5b47d505bfc7267126ccbd50289999e"}, + {file = "coverage-7.13.5-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:66a80c616f80181f4d643b0f9e709d97bcea413ecd9631e1dedc7401c8e6695d"}, + {file = "coverage-7.13.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:145ede53ccbafb297c1c9287f788d1bc3efd6c900da23bf6931b09eafc931587"}, + {file = "coverage-7.13.5-cp311-cp311-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:0672854dc733c342fa3e957e0605256d2bf5934feeac328da9e0b5449634a642"}, + {file = "coverage-7.13.5-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:ec10e2a42b41c923c2209b846126c6582db5e43a33157e9870ba9fb70dc7854b"}, + {file = "coverage-7.13.5-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:be3d4bbad9d4b037791794ddeedd7d64a56f5933a2c1373e18e9e568b9141686"}, + {file = "coverage-7.13.5-cp311-cp311-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4d2afbc5cc54d286bfb54541aa50b64cdb07a718227168c87b9e2fb8f25e1743"}, + {file = "coverage-7.13.5-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3ad050321264c49c2fa67bb599100456fc51d004b82534f379d16445da40fb75"}, + {file = "coverage-7.13.5-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:7300c8a6d13335b29bb76d7651c66af6bd8658517c43499f110ddc6717bfc209"}, + {file = "coverage-7.13.5-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:eb07647a5738b89baab047f14edd18ded523de60f3b30e75c2acc826f79c839a"}, + {file = "coverage-7.13.5-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:9adb6688e3b53adffefd4a52d72cbd8b02602bfb8f74dcd862337182fd4d1a4e"}, + {file = "coverage-7.13.5-cp311-cp311-musllinux_1_2_riscv64.whl", hash = 
"sha256:7c8d4bc913dd70b93488d6c496c77f3aff5ea99a07e36a18f865bca55adef8bd"}, + {file = "coverage-7.13.5-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0e3c426ffc4cd952f54ee9ffbdd10345709ecc78a3ecfd796a57236bfad0b9b8"}, + {file = "coverage-7.13.5-cp311-cp311-win32.whl", hash = "sha256:259b69bb83ad9894c4b25be2528139eecba9a82646ebdda2d9db1ba28424a6bf"}, + {file = "coverage-7.13.5-cp311-cp311-win_amd64.whl", hash = "sha256:258354455f4e86e3e9d0d17571d522e13b4e1e19bf0f8596bcf9476d61e7d8a9"}, + {file = "coverage-7.13.5-cp311-cp311-win_arm64.whl", hash = "sha256:bff95879c33ec8da99fc9b6fe345ddb5be6414b41d6d1ad1c8f188d26f36e028"}, + {file = "coverage-7.13.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:460cf0114c5016fa841214ff5564aa4864f11948da9440bc97e21ad1f4ba1e01"}, + {file = "coverage-7.13.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:0e223ce4b4ed47f065bfb123687686512e37629be25cc63728557ae7db261422"}, + {file = "coverage-7.13.5-cp312-cp312-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:6e3370441f4513c6252bf042b9c36d22491142385049243253c7e48398a15a9f"}, + {file = "coverage-7.13.5-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:03ccc709a17a1de074fb1d11f217342fb0d2b1582ed544f554fc9fc3f07e95f5"}, + {file = "coverage-7.13.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3f4818d065964db3c1c66dc0fbdac5ac692ecbc875555e13374fdbe7eedb4376"}, + {file = "coverage-7.13.5-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:012d5319e66e9d5a218834642d6c35d265515a62f01157a45bcc036ecf947256"}, + {file = "coverage-7.13.5-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:8dd02af98971bdb956363e4827d34425cb3df19ee550ef92855b0acb9c7ce51c"}, + {file = "coverage-7.13.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:f08fd75c50a760c7eb068ae823777268daaf16a80b918fa58eea888f8e3919f5"}, + {file = "coverage-7.13.5-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:843ea8643cf967d1ac7e8ecd4bb00c99135adf4816c0c0593fdcc47b597fcf09"}, + {file = "coverage-7.13.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:9d44d7aa963820b1b971dbecd90bfe5fe8f81cff79787eb6cca15750bd2f79b9"}, + {file = "coverage-7.13.5-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:7132bed4bd7b836200c591410ae7d97bf7ae8be6fc87d160b2bd881df929e7bf"}, + {file = "coverage-7.13.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a698e363641b98843c517817db75373c83254781426e94ada3197cabbc2c919c"}, + {file = "coverage-7.13.5-cp312-cp312-win32.whl", hash = "sha256:bdba0a6b8812e8c7df002d908a9a2ea3c36e92611b5708633c50869e6d922fdf"}, + {file = "coverage-7.13.5-cp312-cp312-win_amd64.whl", hash = "sha256:d2c87e0c473a10bffe991502eac389220533024c8082ec1ce849f4218dded810"}, + {file = "coverage-7.13.5-cp312-cp312-win_arm64.whl", hash = "sha256:bf69236a9a81bdca3bff53796237aab096cdbf8d78a66ad61e992d9dac7eb2de"}, + {file = "coverage-7.13.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5ec4af212df513e399cf11610cc27063f1586419e814755ab362e50a85ea69c1"}, + {file = "coverage-7.13.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:941617e518602e2d64942c88ec8499f7fbd49d3f6c4327d3a71d43a1973032f3"}, + {file = "coverage-7.13.5-cp313-cp313-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:da305e9937617ee95c2e39d8ff9f040e0487cbf1ac174f777ed5eddd7a7c1f26"}, + {file = "coverage-7.13.5-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:78e696e1cc714e57e8b25760b33a8b1026b7048d270140d25dafe1b0a1ee05a3"}, + {file = "coverage-7.13.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:02ca0eed225b2ff301c474aeeeae27d26e2537942aa0f87491d3e147e784a82b"}, + {file = 
"coverage-7.13.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:04690832cbea4e4663d9149e05dba142546ca05cb1848816760e7f58285c970a"}, + {file = "coverage-7.13.5-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0590e44dd2745c696a778f7bab6aa95256de2cbc8b8cff4f7db8ff09813d6969"}, + {file = "coverage-7.13.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d7cfad2d6d81dd298ab6b89fe72c3b7b05ec7544bdda3b707ddaecff8d25c161"}, + {file = "coverage-7.13.5-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:e092b9499de38ae0fbfbc603a74660eb6ff3e869e507b50d85a13b6db9863e15"}, + {file = "coverage-7.13.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:48c39bc4a04d983a54a705a6389512883d4a3b9862991b3617d547940e9f52b1"}, + {file = "coverage-7.13.5-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2d3807015f138ffea1ed9afeeb8624fd781703f2858b62a8dd8da5a0994c57b6"}, + {file = "coverage-7.13.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ee2aa19e03161671ec964004fb74b2257805d9710bf14a5c704558b9d8dbaf17"}, + {file = "coverage-7.13.5-cp313-cp313-win32.whl", hash = "sha256:ce1998c0483007608c8382f4ff50164bfc5bd07a2246dd272aa4043b75e61e85"}, + {file = "coverage-7.13.5-cp313-cp313-win_amd64.whl", hash = "sha256:631efb83f01569670a5e866ceb80fe483e7c159fac6f167e6571522636104a0b"}, + {file = "coverage-7.13.5-cp313-cp313-win_arm64.whl", hash = "sha256:f4cd16206ad171cbc2470dbea9103cf9a7607d5fe8c242fdf1edf36174020664"}, + {file = "coverage-7.13.5-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0428cbef5783ad91fe240f673cc1f76b25e74bbfe1a13115e4aa30d3f538162d"}, + {file = "coverage-7.13.5-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e0b216a19534b2427cc201a26c25da4a48633f29a487c61258643e89d28200c0"}, + {file = "coverage-7.13.5-cp313-cp313t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:972a9cd27894afe4bc2b1480107054e062df08e671df7c2f18c205e805ccd806"}, + 
{file = "coverage-7.13.5-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:4b59148601efcd2bac8c4dbf1f0ad6391693ccf7a74b8205781751637076aee3"}, + {file = "coverage-7.13.5-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:505d7083c8b0c87a8fa8c07370c285847c1f77739b22e299ad75a6af6c32c5c9"}, + {file = "coverage-7.13.5-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:60365289c3741e4db327e7baff2a4aaacf22f788e80fa4683393891b70a89fbd"}, + {file = "coverage-7.13.5-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:1b88c69c8ef5d4b6fe7dea66d6636056a0f6a7527c440e890cf9259011f5e606"}, + {file = "coverage-7.13.5-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:5b13955d31d1633cf9376908089b7cebe7d15ddad7aeaabcbe969a595a97e95e"}, + {file = "coverage-7.13.5-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:f70c9ab2595c56f81a89620e22899eea8b212a4041bd728ac6f4a28bf5d3ddd0"}, + {file = "coverage-7.13.5-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:084b84a8c63e8d6fc7e3931b316a9bcafca1458d753c539db82d31ed20091a87"}, + {file = "coverage-7.13.5-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ad14385487393e386e2ea988b09d62dd42c397662ac2dabc3832d71253eee479"}, + {file = "coverage-7.13.5-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:7f2c47b36fe7709a6e83bfadf4eefb90bd25fbe4014d715224c4316f808e59a2"}, + {file = "coverage-7.13.5-cp313-cp313t-win32.whl", hash = "sha256:67e9bc5449801fad0e5dff329499fb090ba4c5800b86805c80617b4e29809b2a"}, + {file = "coverage-7.13.5-cp313-cp313t-win_amd64.whl", hash = "sha256:da86cdcf10d2519e10cabb8ac2de03da1bcb6e4853790b7fbd48523332e3a819"}, + {file = "coverage-7.13.5-cp313-cp313t-win_arm64.whl", hash = "sha256:0ecf12ecb326fe2c339d93fc131816f3a7367d223db37817208905c89bded911"}, + {file = "coverage-7.13.5-cp314-cp314-macosx_10_15_x86_64.whl", hash = 
"sha256:fbabfaceaeb587e16f7008f7795cd80d20ec548dc7f94fbb0d4ec2e038ce563f"}, + {file = "coverage-7.13.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9bb2a28101a443669a423b665939381084412b81c3f8c0fcfbac57f4e30b5b8e"}, + {file = "coverage-7.13.5-cp314-cp314-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:bd3a2fbc1c6cccb3c5106140d87cc6a8715110373ef42b63cf5aea29df8c217a"}, + {file = "coverage-7.13.5-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:6c36ddb64ed9d7e496028d1d00dfec3e428e0aabf4006583bb1839958d280510"}, + {file = "coverage-7.13.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:380e8e9084d8eb38db3a9176a1a4f3c0082c3806fa0dc882d1d87abc3c789247"}, + {file = "coverage-7.13.5-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e808af52a0513762df4d945ea164a24b37f2f518cbe97e03deaa0ee66139b4d6"}, + {file = "coverage-7.13.5-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e301d30dd7e95ae068671d746ba8c34e945a82682e62918e41b2679acd2051a0"}, + {file = "coverage-7.13.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:800bc829053c80d240a687ceeb927a94fd108bbdc68dfbe505d0d75ab578a882"}, + {file = "coverage-7.13.5-cp314-cp314-musllinux_1_2_i686.whl", hash = "sha256:0b67af5492adb31940ee418a5a655c28e48165da5afab8c7fa6fd72a142f8740"}, + {file = "coverage-7.13.5-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c9136ff29c3a91e25b1d1552b5308e53a1e0653a23e53b6366d7c2dcbbaf8a16"}, + {file = "coverage-7.13.5-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:cff784eef7f0b8f6cb28804fbddcfa99f89efe4cc35fb5627e3ac58f91ed3ac0"}, + {file = "coverage-7.13.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:68a4953be99b17ac3c23b6efbc8a38330d99680c9458927491d18700ef23ded0"}, + {file = "coverage-7.13.5-cp314-cp314-win32.whl", hash = 
"sha256:35a31f2b1578185fbe6aa2e74cea1b1d0bbf4c552774247d9160d29b80ed56cc"}, + {file = "coverage-7.13.5-cp314-cp314-win_amd64.whl", hash = "sha256:2aa055ae1857258f9e0045be26a6d62bdb47a72448b62d7b55f4820f361a2633"}, + {file = "coverage-7.13.5-cp314-cp314-win_arm64.whl", hash = "sha256:1b11eef33edeae9d142f9b4358edb76273b3bfd30bc3df9a4f95d0e49caf94e8"}, + {file = "coverage-7.13.5-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:10a0c37f0b646eaff7cce1874c31d1f1ccb297688d4c747291f4f4c70741cc8b"}, + {file = "coverage-7.13.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b5db73ba3c41c7008037fa731ad5459fc3944cb7452fc0aa9f822ad3533c583c"}, + {file = "coverage-7.13.5-cp314-cp314t-manylinux1_i686.manylinux_2_28_i686.manylinux_2_5_i686.whl", hash = "sha256:750db93a81e3e5a9831b534be7b1229df848b2e125a604fe6651e48aa070e5f9"}, + {file = "coverage-7.13.5-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:9ddb4f4a5479f2539644be484da179b653273bca1a323947d48ab107b3ed1f29"}, + {file = "coverage-7.13.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d8a7a2049c14f413163e2bdabd37e41179b1d1ccb10ffc6ccc4b7a718429c607"}, + {file = "coverage-7.13.5-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1c85e0b6c05c592ea6d8768a66a254bfb3874b53774b12d4c89c481eb78cb90"}, + {file = "coverage-7.13.5-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:777c4d1eff1b67876139d24288aaf1817f6c03d6bae9c5cc8d27b83bcfe38fe3"}, + {file = "coverage-7.13.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:6697e29b93707167687543480a40f0db8f356e86d9f67ddf2e37e2dfd91a9dab"}, + {file = "coverage-7.13.5-cp314-cp314t-musllinux_1_2_i686.whl", hash = "sha256:8fdf453a942c3e4d99bd80088141c4c6960bb232c409d9c3558e2dbaa3998562"}, + {file = "coverage-7.13.5-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = 
"sha256:32ca0c0114c9834a43f045a87dcebd69d108d8ffb666957ea65aa132f50332e2"}, + {file = "coverage-7.13.5-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:8769751c10f339021e2638cd354e13adeac54004d1941119b2c96fe5276d45ea"}, + {file = "coverage-7.13.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:cec2d83125531bd153175354055cdb7a09987af08a9430bd173c937c6d0fba2a"}, + {file = "coverage-7.13.5-cp314-cp314t-win32.whl", hash = "sha256:0cd9ed7a8b181775459296e402ca4fb27db1279740a24e93b3b41942ebe4b215"}, + {file = "coverage-7.13.5-cp314-cp314t-win_amd64.whl", hash = "sha256:301e3b7dfefecaca37c9f1aa6f0049b7d4ab8dd933742b607765d757aca77d43"}, + {file = "coverage-7.13.5-cp314-cp314t-win_arm64.whl", hash = "sha256:9dacc2ad679b292709e0f5fc1ac74a6d4d5562e424058962c7bb0c658ad25e45"}, + {file = "coverage-7.13.5-py3-none-any.whl", hash = "sha256:34b02417cf070e173989b3db962f7ed56d2f644307b2cf9d5a0f258e13084a61"}, + {file = "coverage-7.13.5.tar.gz", hash = "sha256:c81f6515c4c40141f83f502b07bbfa5c240ba25bbe73da7b33f1e5b6120ff179"}, ] [package.extras] @@ -411,29 +396,28 @@ toml = ["tomli ; python_full_version <= \"3.11.0a6\""] [[package]] name = "curies" -version = "0.12.9" +version = "0.13.6" description = "Idiomatic conversion between URIs and compact URIs (CURIEs)" optional = false python-versions = ">=3.10" groups = ["main"] files = [ - {file = "curies-0.12.9-py3-none-any.whl", hash = "sha256:0f5cc8f5c72d3099dd7cf2a70a56c10664f82b52eda8072d45b7586caf3a5745"}, - {file = "curies-0.12.9.tar.gz", hash = "sha256:bd6826550bd21f0c7508ac9c9869b8dfa4b3376b0bdf4d68fbc461d9bb4af037"}, + {file = "curies-0.13.6-py3-none-any.whl", hash = "sha256:fb9b86198a3f25cf20f9bb63b6a8367ec7f33b35b7bd82c61cc05df262d0fa46"}, + {file = "curies-0.13.6.tar.gz", hash = "sha256:90ade24612054c404469610132260ae2aa161670ec685f96ea1ed28765be2f55"}, ] [package.dependencies] pydantic = ">=2.0" +pystow = ">=0.8.0" typing-extensions = "*" [package.extras] -docs = ["sphinx (>=8)", "sphinx-automodapi", 
"sphinx-rtd-theme (>=3.0)"] fastapi = ["defusedxml", "fastapi", "httpx", "python-multipart", "uvicorn"] flask = ["defusedxml", "flask"] pandas = ["pandas"] rdflib = ["rdflib"] sqlalchemy = ["sqlalchemy"] sqlmodel = ["sqlmodel"] -tests = ["coverage[toml]", "pytest", "requests"] [[package]] name = "deprecated" @@ -506,53 +490,47 @@ websockets = ["websocket-client (>=1.3.0)"] [[package]] name = "duckdb" -version = "1.4.4" +version = "1.5.2" description = "DuckDB in-process database" optional = false -python-versions = ">=3.9.0" +python-versions = ">=3.10.0" groups = ["main"] files = [ - {file = "duckdb-1.4.4-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:e870a441cb1c41d556205deb665749f26347ed13b3a247b53714f5d589596977"}, - {file = "duckdb-1.4.4-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:49123b579e4a6323e65139210cd72dddc593a72d840211556b60f9703bda8526"}, - {file = "duckdb-1.4.4-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5e1933fac5293fea5926b0ee75a55b8cfe7f516d867310a5b251831ab61fe62b"}, - {file = "duckdb-1.4.4-cp310-cp310-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:707530f6637e91dc4b8125260595299ec9dd157c09f5d16c4186c5988bfbd09a"}, - {file = "duckdb-1.4.4-cp310-cp310-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:453b115f4777467f35103d8081770ac2f223fb5799178db5b06186e3ab51d1f2"}, - {file = "duckdb-1.4.4-cp310-cp310-win_amd64.whl", hash = "sha256:a3c8542db7ffb128aceb7f3b35502ebaddcd4f73f1227569306cc34bad06680c"}, - {file = "duckdb-1.4.4-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:5ba684f498d4e924c7e8f30dd157da8da34c8479746c5011b6c0e037e9c60ad2"}, - {file = "duckdb-1.4.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:5536eb952a8aa6ae56469362e344d4e6403cc945a80bc8c5c2ebdd85d85eb64b"}, - {file = "duckdb-1.4.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:47dd4162da6a2be59a0aef640eb08d6360df1cf83c317dcc127836daaf3b7f7c"}, - {file = 
"duckdb-1.4.4-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6cb357cfa3403910e79e2eb46c8e445bb1ee2fd62e9e9588c6b999df4256abc1"}, - {file = "duckdb-1.4.4-cp311-cp311-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4c25d5b0febda02b7944e94fdae95aecf952797afc8cb920f677b46a7c251955"}, - {file = "duckdb-1.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:6703dd1bb650025b3771552333d305d62ddd7ff182de121483d4e042ea6e2e00"}, - {file = "duckdb-1.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:bf138201f56e5d6fc276a25138341b3523e2f84733613fc43f02c54465619a95"}, - {file = "duckdb-1.4.4-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:ddcfd9c6ff234da603a1edd5fd8ae6107f4d042f74951b65f91bc5e2643856b3"}, - {file = "duckdb-1.4.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6792ca647216bd5c4ff16396e4591cfa9b4a72e5ad7cdd312cec6d67e8431a7c"}, - {file = "duckdb-1.4.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1f8d55843cc940e36261689054f7dfb6ce35b1f5b0953b0d355b6adb654b0d52"}, - {file = "duckdb-1.4.4-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c65d15c440c31e06baaebfd2c06d71ce877e132779d309f1edf0a85d23c07e92"}, - {file = "duckdb-1.4.4-cp312-cp312-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b297eff642503fd435a9de5a9cb7db4eccb6f61d61a55b30d2636023f149855f"}, - {file = "duckdb-1.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:d525de5f282b03aa8be6db86b1abffdceae5f1055113a03d5b50cd2fb8cf2ef8"}, - {file = "duckdb-1.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:50f2eb173c573811b44aba51176da7a4e5c487113982be6a6a1c37337ec5fa57"}, - {file = "duckdb-1.4.4-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:337f8b24e89bc2e12dadcfe87b4eb1c00fd920f68ab07bc9b70960d6523b8bc3"}, - {file = "duckdb-1.4.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0509b39ea7af8cff0198a99d206dca753c62844adab54e545984c2e2c1381616"}, - {file = 
"duckdb-1.4.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:fb94de6d023de9d79b7edc1ae07ee1d0b4f5fa8a9dcec799650b5befdf7aafec"}, - {file = "duckdb-1.4.4-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0d636ceda422e7babd5e2f7275f6a0d1a3405e6a01873f00d38b72118d30c10b"}, - {file = "duckdb-1.4.4-cp313-cp313-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7df7351328ffb812a4a289732f500d621e7de9942a3a2c9b6d4afcf4c0e72526"}, - {file = "duckdb-1.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:6fb1225a9ea5877421481d59a6c556a9532c32c16c7ae6ca8d127e2b878c9389"}, - {file = "duckdb-1.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:f28a18cc790217e5b347bb91b2cab27aafc557c58d3d8382e04b4fe55d0c3f66"}, - {file = "duckdb-1.4.4-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:25874f8b1355e96178079e37312c3ba6d61a2354f51319dae860cf21335c3a20"}, - {file = "duckdb-1.4.4-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:452c5b5d6c349dc5d1154eb2062ee547296fcbd0c20e9df1ed00b5e1809089da"}, - {file = "duckdb-1.4.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:8e5c2d8a0452df55e092959c0bfc8ab8897ac3ea0f754cb3b0ab3e165cd79aff"}, - {file = "duckdb-1.4.4-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1af6e76fe8bd24875dc56dd8e38300d64dc708cd2e772f67b9fbc635cc3066a3"}, - {file = "duckdb-1.4.4-cp314-cp314-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d0440f59e0cd9936a9ebfcf7a13312eda480c79214ffed3878d75947fc3b7d6d"}, - {file = "duckdb-1.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:59c8d76016dde854beab844935b1ec31de358d4053e792988108e995b18c08e7"}, - {file = "duckdb-1.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:53cd6423136ab44383ec9955aefe7599b3fb3dd1fe006161e6396d8167e0e0d4"}, - {file = "duckdb-1.4.4-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:8097201bc5fd0779d7fcc2f3f4736c349197235f4cb7171622936343a1aa8dbf"}, - {file = "duckdb-1.4.4-cp39-cp39-macosx_10_9_x86_64.whl", hash 
= "sha256:cd1be3d48577f5b40eb9706c6b2ae10edfe18e78eb28e31a3b922dcff1183597"}, - {file = "duckdb-1.4.4-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:e041f2fbd6888da090eca96ac167a7eb62d02f778385dd9155ed859f1c6b6dc8"}, - {file = "duckdb-1.4.4-cp39-cp39-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7eec0bf271ac622e57b7f6554a27a6e7d1dd2f43d1871f7962c74bcbbede15ba"}, - {file = "duckdb-1.4.4-cp39-cp39-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5cdc4126ec925edf3112bc656ac9ed23745294b854935fa7a643a216e4455af6"}, - {file = "duckdb-1.4.4-cp39-cp39-win_amd64.whl", hash = "sha256:c9566a4ed834ec7999db5849f53da0a7ee83d86830c33f471bf0211a1148ca12"}, - {file = "duckdb-1.4.4.tar.gz", hash = "sha256:8bba52fd2acb67668a4615ee17ee51814124223de836d9e2fdcbc4c9021b3d3c"}, + {file = "duckdb-1.5.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:63bf8687feefeed51adf45fa3b062ab8b1b1c350492b7518491b86bae68b1da1"}, + {file = "duckdb-1.5.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:84b193aca20565dedb3172de15f843c659c3a6c773bf14843a9bd781c850e7db"}, + {file = "duckdb-1.5.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5596bbfc31b1b259db69c8d847b42d036ce2c4804f9ccb28f9fc46a16de7bc53"}, + {file = "duckdb-1.5.2-cp310-cp310-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8dbd7e31e5dc157bfe8803fa7d2652336265c6c19926c5a4a9b40f8222868d08"}, + {file = "duckdb-1.5.2-cp310-cp310-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a9cd5e71702d446613750405cde03f66ed268f4c321da071b0472759dad19536"}, + {file = "duckdb-1.5.2-cp310-cp310-win_amd64.whl", hash = "sha256:ce17670bb392ea1b3650537db02bd720908776b5b95f6d2472d31a7de59d1dc1"}, + {file = "duckdb-1.5.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:7f69164b048e498b9e9140a24343108a5ae5f17bfb3485185f55fdf9b1aa924d"}, + {file = "duckdb-1.5.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = 
"sha256:81fc4fbf0b5e25840b39ba2a10b78c6953c0314d5d0434191e7898f34ab1bba3"}, + {file = "duckdb-1.5.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:56d38b3c4e0ef2abb58898d0fd423933999ed535c45e75e9d9f72e1d5fed69b8"}, + {file = "duckdb-1.5.2-cp311-cp311-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:376856066c65ccd55fcb3a380bbe33a71ce089fc4623d229ffc6e82251afdb6d"}, + {file = "duckdb-1.5.2-cp311-cp311-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c69907354ffee94ba8cf782daf0480dab7557f21ce27fffa6c0ea8f74ed4b8e2"}, + {file = "duckdb-1.5.2-cp311-cp311-win_amd64.whl", hash = "sha256:d9b4f5430bf4f05d4c0dc4c55c75def3a5af4be0343be20fa2bfc577343fbfc9"}, + {file = "duckdb-1.5.2-cp311-cp311-win_arm64.whl", hash = "sha256:2323c1195c10fb2bb982fc0218c730b43d1b92a355d61e68e3c5f3ac9d44c34f"}, + {file = "duckdb-1.5.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:e6495b00cad16888384119842797c49316a96ae1cb132bb03856d980d95afee1"}, + {file = "duckdb-1.5.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:d72b8856b1839d35648f38301b058f6232f4d36b463fe4dc8f4d3fdff2df1a2e"}, + {file = "duckdb-1.5.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:2a1de4f4d454b8c97aec546c82003fc834d3422ce4bc6a19902f3462ef293bed"}, + {file = "duckdb-1.5.2-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ce0b8141a10d37ecef729c45bc41d334854013f4389f1488bd6035c5579aaac1"}, + {file = "duckdb-1.5.2-cp312-cp312-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c99ef73a277c8921bc0a1f16dee38d924484251d9cfd20951748c20fcd5ed855"}, + {file = "duckdb-1.5.2-cp312-cp312-win_amd64.whl", hash = "sha256:8d599758b4e48bf12e18c9b960cf491d219f0c4972d19a45489c05cc5ab36f83"}, + {file = "duckdb-1.5.2-cp312-cp312-win_arm64.whl", hash = "sha256:fc85a5dbcbe6eccac1113c72370d1d3aacfdd49198d63950bdf7d8638a307f00"}, + {file = "duckdb-1.5.2-cp313-cp313-macosx_10_13_universal2.whl", hash = 
"sha256:4420b3f47027a7849d0e1815532007f377fa95ee5810b47ea717d35525c12f79"}, + {file = "duckdb-1.5.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:bb42e6ed543902e14eae647850da24103a89f0bc2587dec5601b1c1f213bd2ed"}, + {file = "duckdb-1.5.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:98c0535cd6d901f61a5ea3c2e26a1fd28482953d794deb183daf568e3aa5dda6"}, + {file = "duckdb-1.5.2-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:486c862bf7f163c0110b6d85b3e5c031d224a671cca468f12ebb1d3a348f6b39"}, + {file = "duckdb-1.5.2-cp313-cp313-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:70631c847ca918ee710ec874241b00cf9d2e5be90762cbb2a0389f17823c08f7"}, + {file = "duckdb-1.5.2-cp313-cp313-win_amd64.whl", hash = "sha256:52a21823f3fbb52f0f0e5425e20b07391ad882464b955879499b5ff0b45a376b"}, + {file = "duckdb-1.5.2-cp313-cp313-win_arm64.whl", hash = "sha256:411ad438bd4140f189a10e7f515781335962c5d18bd07837dc6d202e3985253d"}, + {file = "duckdb-1.5.2-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:6b0fe75c148000f060aa1a27b293cacc0ea08cc1cad724fbf2143d56070a3785"}, + {file = "duckdb-1.5.2-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:35579b8e3a064b5eaf15b0eafc558056a13f79a0a62e34cc4baf57119daecfec"}, + {file = "duckdb-1.5.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ea58ff5b0880593a280cf5511734b17711b32ee1f58b47d726e8600848358160"}, + {file = "duckdb-1.5.2-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ef461bca07313412dc09961c4a4757a851f56b95ac01c58fac6007632b7b94f2"}, + {file = "duckdb-1.5.2-cp314-cp314-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:be37680ddb380015cb37318e378c53511c45c4f0d8fac5599d22b7d092b9217a"}, + {file = "duckdb-1.5.2-cp314-cp314-win_amd64.whl", hash = "sha256:0b291786014df1133f8f18b9df4d004484613146e858d71a21791e0fcca16cf4"}, + {file = "duckdb-1.5.2-cp314-cp314-win_arm64.whl", hash = 
"sha256:c9f3e0b71b8a50fccfb42794899285d9d318ce2503782b9dd54868e5ecd0ad31"}, + {file = "duckdb-1.5.2.tar.gz", hash = "sha256:638da0d5102b6cb6f7d47f83d0600708ac1d3cb46c5e9aaabc845f9ba4d69246"}, ] [package.extras] @@ -560,7 +538,7 @@ all = ["adbc-driver-manager", "fsspec", "ipython", "numpy", "pandas", "pyarrow"] [[package]] name = "ers-spec" -version = "0.3.0" +version = "1.0.0" description = " The core components for the Entity Resolution System (ERS) components.\n\n The ERS is a pluggable entity resolution system for data transformation pipelines.\n" optional = false python-versions = ">=3.12,<4.0" @@ -574,43 +552,20 @@ pydantic = ">=2.10.6,<3.0.0" [package.source] type = "git" url = "https://github.com/OP-TED/entity-resolution-spec.git" -reference = "0.3.0-rc.1" -resolved_reference = "67702bf64f5afdfab15cb378afffb4a394516c07" - -[[package]] -name = "fastapi" -version = "0.135.1" -description = "FastAPI framework, high performance, easy to learn, fast to code, ready for production" -optional = false -python-versions = ">=3.10" -groups = ["dev"] -files = [ - {file = "fastapi-0.135.1-py3-none-any.whl", hash = "sha256:46e2fc5745924b7c840f71ddd277382af29ce1cdb7d5eab5bf697e3fb9999c9e"}, - {file = "fastapi-0.135.1.tar.gz", hash = "sha256:d04115b508d936d254cea545b7312ecaa58a7b3a0f84952535b4c9afae7668cd"}, -] - -[package.dependencies] -annotated-doc = ">=0.0.2" -pydantic = ">=2.7.0" -starlette = ">=0.46.0" -typing-extensions = ">=4.8.0" -typing-inspection = ">=0.4.2" - -[package.extras] -all = ["email-validator (>=2.0.0)", "fastapi-cli[standard] (>=0.0.8)", "httpx (>=0.23.0,<1.0.0)", "itsdangerous (>=1.1.0)", "jinja2 (>=3.1.5)", "pydantic-extra-types (>=2.0.0)", "pydantic-settings (>=2.0.0)", "python-multipart (>=0.0.18)", "pyyaml (>=5.3.1)", "uvicorn[standard] (>=0.12.0)"] -standard = ["email-validator (>=2.0.0)", "fastapi-cli[standard] (>=0.0.8)", "httpx (>=0.23.0,<1.0.0)", "jinja2 (>=3.1.5)", "pydantic-extra-types (>=2.0.0)", "pydantic-settings (>=2.0.0)", 
"python-multipart (>=0.0.18)", "uvicorn[standard] (>=0.12.0)"] -standard-no-fastapi-cloud-cli = ["email-validator (>=2.0.0)", "fastapi-cli[standard-no-fastapi-cloud-cli] (>=0.0.8)", "httpx (>=0.23.0,<1.0.0)", "jinja2 (>=3.1.5)", "pydantic-extra-types (>=2.0.0)", "pydantic-settings (>=2.0.0)", "python-multipart (>=0.0.18)", "uvicorn[standard] (>=0.12.0)"] +reference = "release/1.0.0" +resolved_reference = "457ae516cb1894a3f0ea3786ae05b355785c2f12" +subdirectory = "src" [[package]] name = "filelock" -version = "3.25.0" +version = "3.29.0" description = "A platform independent file lock." optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "filelock-3.25.0-py3-none-any.whl", hash = "sha256:5ccf8069f7948f494968fc0713c10e5c182a9c9d9eef3a636307a20c2490f047"}, - {file = "filelock-3.25.0.tar.gz", hash = "sha256:8f00faf3abf9dc730a1ffe9c354ae5c04e079ab7d3a683b7c32da5dd05f26af3"}, + {file = "filelock-3.29.0-py3-none-any.whl", hash = "sha256:96f5f6344709aa1572bbf631c640e4ebeeb519e08da902c39a001882f30ac258"}, + {file = "filelock-3.29.0.tar.gz", hash = "sha256:69974355e960702e789734cb4871f884ea6fe50bd8404051a3530bc07809cf90"}, ] [[package]] @@ -743,18 +698,6 @@ files = [ [package.dependencies] typing-extensions = ">=3.10.0.0" -[[package]] -name = "h11" -version = "0.16.0" -description = "A pure-Python, bring-your-own-I/O implementation of HTTP/1.1" -optional = false -python-versions = ">=3.8" -groups = ["dev"] -files = [ - {file = "h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86"}, - {file = "h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1"}, -] - [[package]] name = "hbreader" version = "0.9.1" @@ -769,18 +712,18 @@ files = [ [[package]] name = "idna" -version = "3.11" +version = "3.12" description = "Internationalized Domain Names in Applications (IDNA)" optional = false python-versions = ">=3.8" groups = ["main", "dev"] files = [ - 
{file = "idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea"}, - {file = "idna-3.11.tar.gz", hash = "sha256:795dafcc9c04ed0c1fb032c2aa73654d8e8c5023a7df64a53f39190ada629902"}, + {file = "idna-3.12-py3-none-any.whl", hash = "sha256:60ffaa1858fac94c9c124728c24fcde8160f3fb4a7f79aa8cdd33a9d1af60a67"}, + {file = "idna-3.12.tar.gz", hash = "sha256:724e9952cc9e2bd7550ea784adb098d837ab5267ef67a1ab9cf7846bdbdd8254"}, ] [package.extras] -all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] +all = ["mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] [[package]] name = "igraph" @@ -822,23 +765,24 @@ test-win-arm64 = ["cairocffi (>=1.2.0)", "networkx (>=2.5)", "pytest (>=7.0.1)", [[package]] name = "import-linter" -version = "2.10" +version = "2.11" description = "Lint your Python architecture" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "import_linter-2.10-py3-none-any.whl", hash = "sha256:cc2ddd7ec0145cbf83f3b25391d2a5dbbf138382aaf80708612497fa6ebc8f60"}, - {file = "import_linter-2.10.tar.gz", hash = "sha256:c6a5057d2dbd32e1854c4d6b60e90dfad459b7ab5356230486d8521f25872963"}, + {file = "import_linter-2.11-py3-none-any.whl", hash = "sha256:3dc54cae933bae3430358c30989762b721c77aa99d424f56a08265be0eeaa465"}, + {file = "import_linter-2.11.tar.gz", hash = "sha256:5abc3394797a54f9bae315e7242dc98715ba485f840ac38c6d3192c370d0085e"}, ] [package.dependencies] click = ">=6" -fastapi = "*" grimp = ">=3.14" rich = ">=14.2.0" typing-extensions = ">=3.10.0.0" -uvicorn = "*" + +[package.extras] +ui = ["fastapi (>=0.113)", "uvicorn (>=0.17.1)"] [[package]] name = "iniconfig" @@ -986,14 +930,14 @@ dev = ["coverage", "requests-cache"] [[package]] name = "mako" -version = "1.3.10" +version = "1.3.11" description = "A super-fast templating language that borrows the best ideas from the existing templating languages." 
optional = false python-versions = ">=3.8" groups = ["dev"] files = [ - {file = "mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59"}, - {file = "mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28"}, + {file = "mako-1.3.11-py3-none-any.whl", hash = "sha256:e372c6e333cf004aa736a15f425087ec977e1fcbd2966aae7f17c8dc1da27a77"}, + {file = "mako-1.3.11.tar.gz", hash = "sha256:071eb4ab4c5010443152255d77db7faa6ce5916f35226eb02dc34479b6858069"}, ] [package.dependencies] @@ -1171,14 +1115,14 @@ files = [ [[package]] name = "narwhals" -version = "2.17.0" +version = "2.20.0" description = "Extremely lightweight compatibility layer between dataframe libraries" optional = false python-versions = ">=3.9" groups = ["main"] files = [ - {file = "narwhals-2.17.0-py3-none-any.whl", hash = "sha256:2ac5307b7c2b275a7d66eeda906b8605e3d7a760951e188dcfff86e8ebe083dd"}, - {file = "narwhals-2.17.0.tar.gz", hash = "sha256:ebd5bc95bcfa2f8e89a8ac09e2765a63055162837208e67b42d6eeb6651d5e67"}, + {file = "narwhals-2.20.0-py3-none-any.whl", hash = "sha256:16e750ea5507d4ba6e8d03455b5f93a535e0405976561baea235bca5dc9f475d"}, + {file = "narwhals-2.20.0.tar.gz", hash = "sha256:c10994975fa7dc5a68c2cffcddbd5908fc8ebb2d463c5bab085309c0ee1f551e"}, ] [package.extras] @@ -1197,96 +1141,96 @@ sqlframe = ["sqlframe (>=3.22.0,!=3.39.3)"] [[package]] name = "numpy" -version = "2.4.2" +version = "2.4.4" description = "Fundamental package for array computing in Python" optional = false python-versions = ">=3.11" groups = ["main"] files = [ - {file = "numpy-2.4.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:e7e88598032542bd49af7c4747541422884219056c268823ef6e5e89851c8825"}, - {file = "numpy-2.4.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7edc794af8b36ca37ef5fcb5e0d128c7e0595c7b96a2318d1badb6fcd8ee86b1"}, - {file = "numpy-2.4.2-cp311-cp311-macosx_14_0_arm64.whl", hash = 
"sha256:6e9f61981ace1360e42737e2bae58b27bf28a1b27e781721047d84bd754d32e7"}, - {file = "numpy-2.4.2-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:cb7bbb88aa74908950d979eeaa24dbdf1a865e3c7e45ff0121d8f70387b55f73"}, - {file = "numpy-2.4.2-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4f069069931240b3fc703f1e23df63443dbd6390614c8c44a87d96cd0ec81eb1"}, - {file = "numpy-2.4.2-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c02ef4401a506fb60b411467ad501e1429a3487abca4664871d9ae0b46c8ba32"}, - {file = "numpy-2.4.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:2653de5c24910e49c2b106499803124dde62a5a1fe0eedeaecf4309a5f639390"}, - {file = "numpy-2.4.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1ae241bbfc6ae276f94a170b14785e561cb5e7f626b6688cf076af4110887413"}, - {file = "numpy-2.4.2-cp311-cp311-win32.whl", hash = "sha256:df1b10187212b198dd45fa943d8985a3c8cf854aed4923796e0e019e113a1bda"}, - {file = "numpy-2.4.2-cp311-cp311-win_amd64.whl", hash = "sha256:b9c618d56a29c9cb1c4da979e9899be7578d2e0b3c24d52079c166324c9e8695"}, - {file = "numpy-2.4.2-cp311-cp311-win_arm64.whl", hash = "sha256:47c5a6ed21d9452b10227e5e8a0e1c22979811cad7dcc19d8e3e2fb8fa03f1a3"}, - {file = "numpy-2.4.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:21982668592194c609de53ba4933a7471880ccbaadcc52352694a59ecc860b3a"}, - {file = "numpy-2.4.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40397bda92382fcec844066efb11f13e1c9a3e2a8e8f318fb72ed8b6db9f60f1"}, - {file = "numpy-2.4.2-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:b3a24467af63c67829bfaa61eecf18d5432d4f11992688537be59ecd6ad32f5e"}, - {file = "numpy-2.4.2-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:805cc8de9fd6e7a22da5aed858e0ab16be5a4db6c873dde1d7451c541553aa27"}, - {file = "numpy-2.4.2-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6d82351358ffbcdcd7b686b90742a9b86632d6c1c051016484fa0b326a0a1548"}, - {file = 
"numpy-2.4.2-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e35d3e0144137d9fdae62912e869136164534d64a169f86438bc9561b6ad49f"}, - {file = "numpy-2.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:adb6ed2ad29b9e15321d167d152ee909ec73395901b70936f029c3bc6d7f4460"}, - {file = "numpy-2.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:8906e71fd8afcb76580404e2a950caef2685df3d2a57fe82a86ac8d33cc007ba"}, - {file = "numpy-2.4.2-cp312-cp312-win32.whl", hash = "sha256:ec055f6dae239a6299cace477b479cca2fc125c5675482daf1dd886933a1076f"}, - {file = "numpy-2.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:209fae046e62d0ce6435fcfe3b1a10537e858249b3d9b05829e2a05218296a85"}, - {file = "numpy-2.4.2-cp312-cp312-win_arm64.whl", hash = "sha256:fbde1b0c6e81d56f5dccd95dd4a711d9b95df1ae4009a60887e56b27e8d903fa"}, - {file = "numpy-2.4.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:25f2059807faea4b077a2b6837391b5d830864b3543627f381821c646f31a63c"}, - {file = "numpy-2.4.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:bd3a7a9f5847d2fb8c2c6d1c862fa109c31a9abeca1a3c2bd5a64572955b2979"}, - {file = "numpy-2.4.2-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:8e4549f8a3c6d13d55041925e912bfd834285ef1dd64d6bc7d542583355e2e98"}, - {file = "numpy-2.4.2-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:aea4f66ff44dfddf8c2cffd66ba6538c5ec67d389285292fe428cb2c738c8aef"}, - {file = "numpy-2.4.2-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c3cd545784805de05aafe1dde61752ea49a359ccba9760c1e5d1c88a93bbf2b7"}, - {file = "numpy-2.4.2-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d0d9b7c93578baafcbc5f0b83eaf17b79d345c6f36917ba0c67f45226911d499"}, - {file = "numpy-2.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f74f0f7779cc7ae07d1810aab8ac6b1464c3eafb9e283a40da7309d5e6e48fbb"}, - {file = "numpy-2.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = 
"sha256:c7ac672d699bf36275c035e16b65539931347d68b70667d28984c9fb34e07fa7"}, - {file = "numpy-2.4.2-cp313-cp313-win32.whl", hash = "sha256:8e9afaeb0beff068b4d9cd20d322ba0ee1cecfb0b08db145e4ab4dd44a6b5110"}, - {file = "numpy-2.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:7df2de1e4fba69a51c06c28f5a3de36731eb9639feb8e1cf7e4a7b0daf4cf622"}, - {file = "numpy-2.4.2-cp313-cp313-win_arm64.whl", hash = "sha256:0fece1d1f0a89c16b03442eae5c56dc0be0c7883b5d388e0c03f53019a4bfd71"}, - {file = "numpy-2.4.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:5633c0da313330fd20c484c78cdd3f9b175b55e1a766c4a174230c6b70ad8262"}, - {file = "numpy-2.4.2-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:d9f64d786b3b1dd742c946c42d15b07497ed14af1a1f3ce840cce27daa0ce913"}, - {file = "numpy-2.4.2-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:b21041e8cb6a1eb5312dd1d2f80a94d91efffb7a06b70597d44f1bd2dfc315ab"}, - {file = "numpy-2.4.2-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:00ab83c56211a1d7c07c25e3217ea6695e50a3e2f255053686b081dc0b091a82"}, - {file = "numpy-2.4.2-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:2fb882da679409066b4603579619341c6d6898fc83a8995199d5249f986e8e8f"}, - {file = "numpy-2.4.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:66cb9422236317f9d44b67b4d18f44efe6e9c7f8794ac0462978513359461554"}, - {file = "numpy-2.4.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:0f01dcf33e73d80bd8dc0f20a71303abbafa26a19e23f6b68d1aa9990af90257"}, - {file = "numpy-2.4.2-cp313-cp313t-win32.whl", hash = "sha256:52b913ec40ff7ae845687b0b34d8d93b60cb66dcee06996dd5c99f2fc9328657"}, - {file = "numpy-2.4.2-cp313-cp313t-win_amd64.whl", hash = "sha256:5eea80d908b2c1f91486eb95b3fb6fab187e569ec9752ab7d9333d2e66bf2d6b"}, - {file = "numpy-2.4.2-cp313-cp313t-win_arm64.whl", hash = "sha256:fd49860271d52127d61197bb50b64f58454e9f578cb4b2c001a6de8b1f50b0b1"}, - {file = "numpy-2.4.2-cp314-cp314-macosx_10_15_x86_64.whl", 
hash = "sha256:444be170853f1f9d528428eceb55f12918e4fda5d8805480f36a002f1415e09b"}, - {file = "numpy-2.4.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:d1240d50adff70c2a88217698ca844723068533f3f5c5fa6ee2e3220e3bdb000"}, - {file = "numpy-2.4.2-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:7cdde6de52fb6664b00b056341265441192d1291c130e99183ec0d4b110ff8b1"}, - {file = "numpy-2.4.2-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:cda077c2e5b780200b6b3e09d0b42205a3d1c68f30c6dceb90401c13bff8fe74"}, - {file = "numpy-2.4.2-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d30291931c915b2ab5717c2974bb95ee891a1cf22ebc16a8006bd59cd210d40a"}, - {file = "numpy-2.4.2-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bba37bc29d4d85761deed3954a1bc62be7cf462b9510b51d367b769a8c8df325"}, - {file = "numpy-2.4.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:b2f0073ed0868db1dcd86e052d37279eef185b9c8db5bf61f30f46adac63c909"}, - {file = "numpy-2.4.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7f54844851cdb630ceb623dcec4db3240d1ac13d4990532446761baede94996a"}, - {file = "numpy-2.4.2-cp314-cp314-win32.whl", hash = "sha256:12e26134a0331d8dbd9351620f037ec470b7c75929cb8a1537f6bfe411152a1a"}, - {file = "numpy-2.4.2-cp314-cp314-win_amd64.whl", hash = "sha256:068cdb2d0d644cdb45670810894f6a0600797a69c05f1ac478e8d31670b8ee75"}, - {file = "numpy-2.4.2-cp314-cp314-win_arm64.whl", hash = "sha256:6ed0be1ee58eef41231a5c943d7d1375f093142702d5723ca2eb07db9b934b05"}, - {file = "numpy-2.4.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:98f16a80e917003a12c0580f97b5f875853ebc33e2eaa4bccfc8201ac6869308"}, - {file = "numpy-2.4.2-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:20abd069b9cda45874498b245c8015b18ace6de8546bf50dfa8cea1696ed06ef"}, - {file = "numpy-2.4.2-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:e98c97502435b53741540a5717a6749ac2ada901056c7db951d33e11c885cc7d"}, - {file = 
"numpy-2.4.2-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:da6cad4e82cb893db4b69105c604d805e0c3ce11501a55b5e9f9083b47d2ffe8"}, - {file = "numpy-2.4.2-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9e4424677ce4b47fe73c8b5556d876571f7c6945d264201180db2dc34f676ab5"}, - {file = "numpy-2.4.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2b8f157c8a6f20eb657e240f8985cc135598b2b46985c5bccbde7616dc9c6b1e"}, - {file = "numpy-2.4.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5daf6f3914a733336dab21a05cdec343144600e964d2fcdabaac0c0269874b2a"}, - {file = "numpy-2.4.2-cp314-cp314t-win32.whl", hash = "sha256:8c50dd1fc8826f5b26a5ee4d77ca55d88a895f4e4819c7ecc2a9f5905047a443"}, - {file = "numpy-2.4.2-cp314-cp314t-win_amd64.whl", hash = "sha256:fcf92bee92742edd401ba41135185866f7026c502617f422eb432cfeca4fe236"}, - {file = "numpy-2.4.2-cp314-cp314t-win_arm64.whl", hash = "sha256:1f92f53998a17265194018d1cc321b2e96e900ca52d54c7c77837b71b9465181"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:89f7268c009bc492f506abd6f5265defa7cb3f7487dc21d357c3d290add45082"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:e6dee3bb76aa4009d5a912180bf5b2de012532998d094acee25d9cb8dee3e44a"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-macosx_14_0_arm64.whl", hash = "sha256:cd2bd2bbed13e213d6b55dc1d035a4f91748a7d3edc9480c13898b0353708920"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-macosx_14_0_x86_64.whl", hash = "sha256:cf28c0c1d4c4bf00f509fa7eb02c58d7caf221b50b467bcb0d9bbf1584d5c821"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e04ae107ac591763a47398bb45b568fc38f02dbc4aa44c063f67a131f99346cb"}, - {file = "numpy-2.4.2-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:602f65afdef699cda27ec0b9224ae5dc43e328f4c24c689deaf77133dbee74d0"}, - {file = 
"numpy-2.4.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:be71bf1edb48ebbbf7f6337b5bfd2f895d1902f6335a5830b20141fc126ffba0"}, - {file = "numpy-2.4.2.tar.gz", hash = "sha256:659a6107e31a83c4e33f763942275fd278b21d095094044eb35569e86a21ddae"}, + {file = "numpy-2.4.4-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:f983334aea213c99992053ede6168500e5f086ce74fbc4acc3f2b00f5762e9db"}, + {file = "numpy-2.4.4-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:72944b19f2324114e9dc86a159787333b77874143efcf89a5167ef83cfee8af0"}, + {file = "numpy-2.4.4-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:86b6f55f5a352b48d7fbfd2dbc3d5b780b2d79f4d3c121f33eb6efb22e9a2015"}, + {file = "numpy-2.4.4-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:ba1f4fc670ed79f876f70082eff4f9583c15fb9a4b89d6188412de4d18ae2f40"}, + {file = "numpy-2.4.4-cp311-cp311-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8a87ec22c87be071b6bdbd27920b129b94f2fc964358ce38f3822635a3e2e03d"}, + {file = "numpy-2.4.4-cp311-cp311-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:df3775294accfdd75f32c74ae39fcba920c9a378a2fc18a12b6820aa8c1fb502"}, + {file = "numpy-2.4.4-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:0d4e437e295f18ec29bc79daf55e8a47a9113df44d66f702f02a293d93a2d6dd"}, + {file = "numpy-2.4.4-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:6aa3236c78803afbcb255045fbef97a9e25a1f6c9888357d205ddc42f4d6eba5"}, + {file = "numpy-2.4.4-cp311-cp311-win32.whl", hash = "sha256:30caa73029a225b2d40d9fae193e008e24b2026b7ee1a867b7ee8d96ca1a448e"}, + {file = "numpy-2.4.4-cp311-cp311-win_amd64.whl", hash = "sha256:6bbe4eb67390b0a0265a2c25458f6b90a409d5d069f1041e6aff1e27e3d9a79e"}, + {file = "numpy-2.4.4-cp311-cp311-win_arm64.whl", hash = "sha256:fcfe2045fd2e8f3cb0ce9d4ba6dba6333b8fa05bb8a4939c908cd43322d14c7e"}, + {file = "numpy-2.4.4-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:15716cfef24d3a9762e3acdf87e27f58dc823d1348f765bbea6bef8c639bfa1b"}, + {file = 
"numpy-2.4.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:23cbfd4c17357c81021f21540da84ee282b9c8fba38a03b7b9d09ba6b951421e"}, + {file = "numpy-2.4.4-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:8b3b60bb7cba2c8c81837661c488637eee696f59a877788a396d33150c35d842"}, + {file = "numpy-2.4.4-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:e4a010c27ff6f210ff4c6ef34394cd61470d01014439b192ec22552ee867f2a8"}, + {file = "numpy-2.4.4-cp312-cp312-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f9e75681b59ddaa5e659898085ae0eaea229d054f2ac0c7e563a62205a700121"}, + {file = "numpy-2.4.4-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:81f4a14bee47aec54f883e0cad2d73986640c1590eb9bfaaba7ad17394481e6e"}, + {file = "numpy-2.4.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:62d6b0f03b694173f9fcb1fb317f7222fd0b0b103e784c6549f5e53a27718c44"}, + {file = "numpy-2.4.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fbc356aae7adf9e6336d336b9c8111d390a05df88f1805573ebb0807bd06fd1d"}, + {file = "numpy-2.4.4-cp312-cp312-win32.whl", hash = "sha256:0d35aea54ad1d420c812bfa0385c71cd7cc5bcf7c65fed95fc2cd02fe8c79827"}, + {file = "numpy-2.4.4-cp312-cp312-win_amd64.whl", hash = "sha256:b5f0362dc928a6ecd9db58868fca5e48485205e3855957bdedea308f8672ea4a"}, + {file = "numpy-2.4.4-cp312-cp312-win_arm64.whl", hash = "sha256:846300f379b5b12cc769334464656bc882e0735d27d9726568bc932fdc49d5ec"}, + {file = "numpy-2.4.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:08f2e31ed5e6f04b118e49821397f12767934cfdd12a1ce86a058f91e004ee50"}, + {file = "numpy-2.4.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:e823b8b6edc81e747526f70f71a9c0a07ac4e7ad13020aa736bb7c9d67196115"}, + {file = "numpy-2.4.4-cp313-cp313-macosx_14_0_arm64.whl", hash = "sha256:4a19d9dba1a76618dd86b164d608566f393f8ec6ac7c44f0cc879011c45e65af"}, + {file = "numpy-2.4.4-cp313-cp313-macosx_14_0_x86_64.whl", hash = 
"sha256:d2a8490669bfe99a233298348acc2d824d496dee0e66e31b66a6022c2ad74a5c"}, + {file = "numpy-2.4.4-cp313-cp313-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:45dbed2ab436a9e826e302fcdcbe9133f9b0006e5af7168afb8963a6520da103"}, + {file = "numpy-2.4.4-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c901b15172510173f5cb310eae652908340f8dede90fff9e3bf6c0d8dfd92f83"}, + {file = "numpy-2.4.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:99d838547ace2c4aace6c4f76e879ddfe02bb58a80c1549928477862b7a6d6ed"}, + {file = "numpy-2.4.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:0aec54fd785890ecca25a6003fd9a5aed47ad607bbac5cd64f836ad8666f4959"}, + {file = "numpy-2.4.4-cp313-cp313-win32.whl", hash = "sha256:07077278157d02f65c43b1b26a3886bce886f95d20aabd11f87932750dfb14ed"}, + {file = "numpy-2.4.4-cp313-cp313-win_amd64.whl", hash = "sha256:5c70f1cc1c4efbe316a572e2d8b9b9cc44e89b95f79ca3331553fbb63716e2bf"}, + {file = "numpy-2.4.4-cp313-cp313-win_arm64.whl", hash = "sha256:ef4059d6e5152fa1a39f888e344c73fdc926e1b2dd58c771d67b0acfbf2aa67d"}, + {file = "numpy-2.4.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4bbc7f303d125971f60ec0aaad5e12c62d0d2c925f0ab1273debd0e4ba37aba5"}, + {file = "numpy-2.4.4-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:4d6d57903571f86180eb98f8f0c839fa9ebbfb031356d87f1361be91e433f5b7"}, + {file = "numpy-2.4.4-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:4636de7fd195197b7535f231b5de9e4b36d2c440b6e566d2e4e4746e6af0ca93"}, + {file = "numpy-2.4.4-cp313-cp313t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ad2e2ef14e0b04e544ea2fa0a36463f847f113d314aa02e5b402fdf910ef309e"}, + {file = "numpy-2.4.4-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5a285b3b96f951841799528cd1f4f01cd70e7e0204b4abebac9463eecfcf2a40"}, + {file = "numpy-2.4.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = 
"sha256:f8474c4241bc18b750be2abea9d7a9ec84f46ef861dbacf86a4f6e043401f79e"}, + {file = "numpy-2.4.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:4e874c976154687c1f71715b034739b45c7711bec81db01914770373d125e392"}, + {file = "numpy-2.4.4-cp313-cp313t-win32.whl", hash = "sha256:9c585a1790d5436a5374bac930dad6ed244c046ed91b2b2a3634eb2971d21008"}, + {file = "numpy-2.4.4-cp313-cp313t-win_amd64.whl", hash = "sha256:93e15038125dc1e5345d9b5b68aa7f996ec33b98118d18c6ca0d0b7d6198b7e8"}, + {file = "numpy-2.4.4-cp313-cp313t-win_arm64.whl", hash = "sha256:0dfd3f9d3adbe2920b68b5cd3d51444e13a10792ec7154cd0a2f6e74d4ab3233"}, + {file = "numpy-2.4.4-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:f169b9a863d34f5d11b8698ead99febeaa17a13ca044961aa8e2662a6c7766a0"}, + {file = "numpy-2.4.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:2483e4584a1cb3092da4470b38866634bafb223cbcd551ee047633fd2584599a"}, + {file = "numpy-2.4.4-cp314-cp314-macosx_14_0_arm64.whl", hash = "sha256:2d19e6e2095506d1736b7d80595e0f252d76b89f5e715c35e06e937679ea7d7a"}, + {file = "numpy-2.4.4-cp314-cp314-macosx_14_0_x86_64.whl", hash = "sha256:6a246d5914aa1c820c9443ddcee9c02bec3e203b0c080349533fae17727dfd1b"}, + {file = "numpy-2.4.4-cp314-cp314-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:989824e9faf85f96ec9c7761cd8d29c531ad857bfa1daa930cba85baaecf1a9a"}, + {file = "numpy-2.4.4-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:27a8d92cd10f1382a67d7cf4db7ce18341b66438bdd9f691d7b0e48d104c2a9d"}, + {file = "numpy-2.4.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e44319a2953c738205bf3354537979eaa3998ed673395b964c1176083dd46252"}, + {file = "numpy-2.4.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:e892aff75639bbef0d2a2cfd55535510df26ff92f63c92cd84ef8d4ba5a5557f"}, + {file = "numpy-2.4.4-cp314-cp314-win32.whl", hash = "sha256:1378871da56ca8943c2ba674530924bb8ca40cd228358a3b5f302ad60cf875fc"}, + {file = 
"numpy-2.4.4-cp314-cp314-win_amd64.whl", hash = "sha256:715d1c092715954784bc79e1174fc2a90093dc4dc84ea15eb14dad8abdcdeb74"}, + {file = "numpy-2.4.4-cp314-cp314-win_arm64.whl", hash = "sha256:2c194dd721e54ecad9ad387c1d35e63dce5c4450c6dc7dd5611283dda239aabb"}, + {file = "numpy-2.4.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2aa0613a5177c264ff5921051a5719d20095ea586ca88cc802c5c218d1c67d3e"}, + {file = "numpy-2.4.4-cp314-cp314t-macosx_14_0_arm64.whl", hash = "sha256:42c16925aa5a02362f986765f9ebabf20de75cdefdca827d14315c568dcab113"}, + {file = "numpy-2.4.4-cp314-cp314t-macosx_14_0_x86_64.whl", hash = "sha256:874f200b2a981c647340f841730fc3a2b54c9d940566a3c4149099591e2c4c3d"}, + {file = "numpy-2.4.4-cp314-cp314t-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c9b39d38a9bd2ae1becd7eac1303d031c5c110ad31f2b319c6e7d98b135c934d"}, + {file = "numpy-2.4.4-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b268594bccac7d7cf5844c7732e3f20c50921d94e36d7ec9b79e9857694b1b2f"}, + {file = "numpy-2.4.4-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:ac6b31e35612a26483e20750126d30d0941f949426974cace8e6b5c58a3657b0"}, + {file = "numpy-2.4.4-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:8e3ed142f2728df44263aaf5fb1f5b0b99f4070c553a0d7f033be65338329150"}, + {file = "numpy-2.4.4-cp314-cp314t-win32.whl", hash = "sha256:dddbbd259598d7240b18c9d87c56a9d2fb3b02fe266f49a7c101532e78c1d871"}, + {file = "numpy-2.4.4-cp314-cp314t-win_amd64.whl", hash = "sha256:a7164afb23be6e37ad90b2f10426149fd75aee07ca55653d2aa41e66c4ef697e"}, + {file = "numpy-2.4.4-cp314-cp314t-win_arm64.whl", hash = "sha256:ba203255017337d39f89bdd58417f03c4426f12beed0440cfd933cb15f8669c7"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-macosx_10_15_x86_64.whl", hash = "sha256:58c8b5929fcb8287cbd6f0a3fae19c6e03a5c48402ae792962ac465224a629a4"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = 
"sha256:eea7ac5d2dce4189771cedb559c738a71512768210dc4e4753b107a2048b3d0e"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-macosx_14_0_arm64.whl", hash = "sha256:51fc224f7ca4d92656d5a5eb315f12eb5fe2c97a66249aa7b5f562528a3be38c"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-macosx_14_0_x86_64.whl", hash = "sha256:28a650663f7314afc3e6ec620f44f333c386aad9f6fc472030865dc0ebb26ee3"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-manylinux_2_27_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:19710a9ca9992d7174e9c52f643d4272dcd1558c5f7af7f6f8190f633bd651a7"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9b2aec6af35c113b05695ebb5749a787acd63cafc83086a05771d1e1cd1e555f"}, + {file = "numpy-2.4.4-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:f2cf083b324a467e1ab358c105f6cad5ea950f50524668a80c486ff1db24e119"}, + {file = "numpy-2.4.4.tar.gz", hash = "sha256:2d390634c5182175533585cc89f3608a4682ccb173cc9bb940b2881c8d6f8fa0"}, ] [[package]] name = "packaging" -version = "26.0" +version = "26.1" description = "Core utilities for Python packages" optional = false python-versions = ">=3.8" groups = ["main", "dev"] files = [ - {file = "packaging-26.0-py3-none-any.whl", hash = "sha256:b36f1fef9334a5588b4166f8bcd26a14e521f2b55e6b9de3aaa80d3ff7a37529"}, - {file = "packaging-26.0.tar.gz", hash = "sha256:00243ae351a257117b6a241061796684b084ed1c516a08c48a3f7e147a9d80b4"}, + {file = "packaging-26.1-py3-none-any.whl", hash = "sha256:5d9c0669c6285e491e0ced2eee587eaf67b670d94a19e94e3984a481aba6802f"}, + {file = "packaging-26.1.tar.gz", hash = "sha256:f042152b681c4bfac5cae2742a55e103d27ab2ec0f3d88037136b6bfe7c9c5de"}, ] [[package]] @@ -1420,14 +1364,14 @@ testing = ["pytest (<5.0) ; python_version < \"3.0\"", "pytest (>=5.0) ; python_ [[package]] name = "platformdirs" -version = "4.9.2" +version = "4.9.6" description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." 
optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "platformdirs-4.9.2-py3-none-any.whl", hash = "sha256:9170634f126f8efdae22fb58ae8a0eaa86f38365bc57897a6c4f781d1f5875bd"}, - {file = "platformdirs-4.9.2.tar.gz", hash = "sha256:9a33809944b9db043ad67ca0db94b14bf452cc6aeaac46a88ea55b26e2e9d291"}, + {file = "platformdirs-4.9.6-py3-none-any.whl", hash = "sha256:e61adb1d5e5cb3441b4b7710bea7e4c12250ca49439228cc1021c00dcfac0917"}, + {file = "platformdirs-4.9.6.tar.gz", hash = "sha256:3bfa75b0ad0db84096ae777218481852c0ebc6c727b3168c1b9e0118e458cf0a"}, ] [[package]] @@ -1482,19 +1426,19 @@ pyyaml = ">=5.3.1" [[package]] name = "pydantic" -version = "2.12.5" +version = "2.13.3" description = "Data validation using Python type hints" optional = false python-versions = ">=3.9" -groups = ["main", "dev"] +groups = ["main"] files = [ - {file = "pydantic-2.12.5-py3-none-any.whl", hash = "sha256:e561593fccf61e8a20fc46dfc2dfe075b8be7d0188df33f221ad1f0139180f9d"}, - {file = "pydantic-2.12.5.tar.gz", hash = "sha256:4d351024c75c0f085a9febbb665ce8c0c6ec5d30e903bdb6394b7ede26aebb49"}, + {file = "pydantic-2.13.3-py3-none-any.whl", hash = "sha256:6db14ac8dfc9a1e57f87ea2c0de670c251240f43cb0c30a5130e9720dc612927"}, + {file = "pydantic-2.13.3.tar.gz", hash = "sha256:af09e9d1d09f4e7fe37145c1f577e1d61ceb9a41924bf0094a36506285d0a84d"}, ] [package.dependencies] annotated-types = ">=0.6.0" -pydantic-core = "2.41.5" +pydantic-core = "2.46.3" typing-extensions = ">=4.14.1" typing-inspection = ">=0.4.2" @@ -1504,133 +1448,132 @@ timezone = ["tzdata ; python_version >= \"3.9\" and platform_system == \"Windows [[package]] name = "pydantic-core" -version = "2.41.5" +version = "2.46.3" description = "Core functionality for Pydantic validation and serialization" optional = false python-versions = ">=3.9" -groups = ["main", "dev"] +groups = ["main"] files = [ - {file = "pydantic_core-2.41.5-cp310-cp310-macosx_10_12_x86_64.whl", hash = 
"sha256:77b63866ca88d804225eaa4af3e664c5faf3568cea95360d21f4725ab6e07146"}, - {file = "pydantic_core-2.41.5-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:dfa8a0c812ac681395907e71e1274819dec685fec28273a28905df579ef137e2"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5921a4d3ca3aee735d9fd163808f5e8dd6c6972101e4adbda9a4667908849b97"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e25c479382d26a2a41b7ebea1043564a937db462816ea07afa8a44c0866d52f9"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f547144f2966e1e16ae626d8ce72b4cfa0caedc7fa28052001c94fb2fcaa1c52"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f52298fbd394f9ed112d56f3d11aabd0d5bd27beb3084cc3d8ad069483b8941"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:100baa204bb412b74fe285fb0f3a385256dad1d1879f0a5cb1499ed2e83d132a"}, - {file = "pydantic_core-2.41.5-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:05a2c8852530ad2812cb7914dc61a1125dc4e06252ee98e5638a12da6cc6fb6c"}, - {file = "pydantic_core-2.41.5-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:29452c56df2ed968d18d7e21f4ab0ac55e71dc59524872f6fc57dcf4a3249ed2"}, - {file = "pydantic_core-2.41.5-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:d5160812ea7a8a2ffbe233d8da666880cad0cbaf5d4de74ae15c313213d62556"}, - {file = "pydantic_core-2.41.5-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:df3959765b553b9440adfd3c795617c352154e497a4eaf3752555cfb5da8fc49"}, - {file = "pydantic_core-2.41.5-cp310-cp310-win32.whl", hash = "sha256:1f8d33a7f4d5a7889e60dc39856d76d09333d8a6ed0f5f1190635cbec70ec4ba"}, - {file = "pydantic_core-2.41.5-cp310-cp310-win_amd64.whl", hash = 
"sha256:62de39db01b8d593e45871af2af9e497295db8d73b085f6bfd0b18c83c70a8f9"}, - {file = "pydantic_core-2.41.5-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:a3a52f6156e73e7ccb0f8cced536adccb7042be67cb45f9562e12b319c119da6"}, - {file = "pydantic_core-2.41.5-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:7f3bf998340c6d4b0c9a2f02d6a400e51f123b59565d74dc60d252ce888c260b"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:378bec5c66998815d224c9ca994f1e14c0c21cb95d2f52b6021cc0b2a58f2a5a"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:e7b576130c69225432866fe2f4a469a85a54ade141d96fd396dffcf607b558f8"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:6cb58b9c66f7e4179a2d5e0f849c48eff5c1fca560994d6eb6543abf955a149e"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:88942d3a3dff3afc8288c21e565e476fc278902ae4d6d134f1eeda118cc830b1"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f31d95a179f8d64d90f6831d71fa93290893a33148d890ba15de25642c5d075b"}, - {file = "pydantic_core-2.41.5-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c1df3d34aced70add6f867a8cf413e299177e0c22660cc767218373d0779487b"}, - {file = "pydantic_core-2.41.5-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:4009935984bd36bd2c774e13f9a09563ce8de4abaa7226f5108262fa3e637284"}, - {file = "pydantic_core-2.41.5-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:34a64bc3441dc1213096a20fe27e8e128bd3ff89921706e83c0b1ac971276594"}, - {file = "pydantic_core-2.41.5-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:c9e19dd6e28fdcaa5a1de679aec4141f691023916427ef9bae8584f9c2fb3b0e"}, - {file = "pydantic_core-2.41.5-cp311-cp311-win32.whl", hash = 
"sha256:2c010c6ded393148374c0f6f0bf89d206bf3217f201faa0635dcd56bd1520f6b"}, - {file = "pydantic_core-2.41.5-cp311-cp311-win_amd64.whl", hash = "sha256:76ee27c6e9c7f16f47db7a94157112a2f3a00e958bc626e2f4ee8bec5c328fbe"}, - {file = "pydantic_core-2.41.5-cp311-cp311-win_arm64.whl", hash = "sha256:4bc36bbc0b7584de96561184ad7f012478987882ebf9f9c389b23f432ea3d90f"}, - {file = "pydantic_core-2.41.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f41a7489d32336dbf2199c8c0a215390a751c5b014c2c1c5366e817202e9cdf7"}, - {file = "pydantic_core-2.41.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:070259a8818988b9a84a449a2a7337c7f430a22acc0859c6b110aa7212a6d9c0"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e96cea19e34778f8d59fe40775a7a574d95816eb150850a85a7a4c8f4b94ac69"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed2e99c456e3fadd05c991f8f437ef902e00eedf34320ba2b0842bd1c3ca3a75"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:65840751b72fbfd82c3c640cff9284545342a4f1eb1586ad0636955b261b0b05"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e536c98a7626a98feb2d3eaf75944ef6f3dbee447e1f841eae16f2f0a72d8ddc"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eceb81a8d74f9267ef4081e246ffd6d129da5d87e37a77c9bde550cb04870c1c"}, - {file = "pydantic_core-2.41.5-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d38548150c39b74aeeb0ce8ee1d8e82696f4a4e16ddc6de7b1d8823f7de4b9b5"}, - {file = "pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c23e27686783f60290e36827f9c626e63154b82b116d7fe9adba1fda36da706c"}, - {file = "pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_armv7l.whl", hash = 
"sha256:482c982f814460eabe1d3bb0adfdc583387bd4691ef00b90575ca0d2b6fe2294"}, - {file = "pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:bfea2a5f0b4d8d43adf9d7b8bf019fb46fdd10a2e5cde477fbcb9d1fa08c68e1"}, - {file = "pydantic_core-2.41.5-cp312-cp312-win32.whl", hash = "sha256:b74557b16e390ec12dca509bce9264c3bbd128f8a2c376eaa68003d7f327276d"}, - {file = "pydantic_core-2.41.5-cp312-cp312-win_amd64.whl", hash = "sha256:1962293292865bca8e54702b08a4f26da73adc83dd1fcf26fbc875b35d81c815"}, - {file = "pydantic_core-2.41.5-cp312-cp312-win_arm64.whl", hash = "sha256:1746d4a3d9a794cacae06a5eaaccb4b8643a131d45fbc9af23e353dc0a5ba5c3"}, - {file = "pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9"}, - {file = "pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586"}, - {file = "pydantic_core-2.41.5-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:b93590ae81f7010dbe380cdeab6f515902ebcbefe0b9327cc4804d74e93ae69d"}, - {file = "pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740"}, - {file = "pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e"}, - {file = "pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858"}, - {file = "pydantic_core-2.41.5-cp313-cp313-win32.whl", hash = "sha256:650ae77860b45cfa6e2cdafc42618ceafab3a2d9a3811fcfbd3bbf8ac3c40d36"}, - {file = "pydantic_core-2.41.5-cp313-cp313-win_amd64.whl", hash = "sha256:79ec52ec461e99e13791ec6508c722742ad745571f234ea6255bed38c6480f11"}, - {file = "pydantic_core-2.41.5-cp313-cp313-win_arm64.whl", hash = "sha256:3f84d5c1b4ab906093bdc1ff10484838aca54ef08de4afa9de0f5f14d69639cd"}, - {file = "pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a"}, - {file = "pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14"}, - {file = "pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1"}, - {file = "pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66"}, - {file = "pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869"}, - {file = "pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2"}, - {file = 
"pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375"}, - {file = "pydantic_core-2.41.5-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2782c870e99878c634505236d81e5443092fba820f0373997ff75f90f68cd553"}, - {file = "pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90"}, - {file = "pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07"}, - {file = "pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb"}, - {file = "pydantic_core-2.41.5-cp314-cp314-win32.whl", hash = "sha256:aec5cf2fd867b4ff45b9959f8b20ea3993fc93e63c7363fe6851424c8a7e7c23"}, - {file = "pydantic_core-2.41.5-cp314-cp314-win_amd64.whl", hash = "sha256:8e7c86f27c585ef37c35e56a96363ab8de4e549a95512445b85c96d3e2f7c1bf"}, - {file = "pydantic_core-2.41.5-cp314-cp314-win_arm64.whl", hash = "sha256:e672ba74fbc2dc8eea59fb6d4aed6845e6905fc2a8afe93175d94a83ba2a01a0"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0d2568a8c11bf8225044aa94409e21da0cb09dcdafe9ecd10250b2baad531a9"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-win32.whl", hash = "sha256:b4ececa40ac28afa90871c2cc2b9ffd2ff0bf749380fbdf57d165fd23da353aa"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-win_amd64.whl", hash = "sha256:80aa89cad80b32a912a65332f64a4450ed00966111b6615ca6816153d3585a8c"}, - {file = "pydantic_core-2.41.5-cp314-cp314t-win_arm64.whl", hash = "sha256:35b44f37a3199f771c3eaa53051bc8a70cd7b54f333531c59e29fd4db5d15008"}, - {file = "pydantic_core-2.41.5-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:8bfeaf8735be79f225f3fefab7f941c712aaca36f1128c9d7e2352ee1aa87bdf"}, - {file = "pydantic_core-2.41.5-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:346285d28e4c8017da95144c7f3acd42740d637ff41946af5ce6e5e420502dd5"}, - {file = "pydantic_core-2.41.5-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a75dafbf87d6276ddc5b2bf6fae5254e3d0876b626eb24969a574fff9149ee5d"}, - {file = 
"pydantic_core-2.41.5-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:7b93a4d08587e2b7e7882de461e82b6ed76d9026ce91ca7915e740ecc7855f60"}, - {file = "pydantic_core-2.41.5-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e8465ab91a4bd96d36dde3263f06caa6a8a6019e4113f24dc753d79a8b3a3f82"}, - {file = "pydantic_core-2.41.5-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:299e0a22e7ae2b85c1a57f104538b2656e8ab1873511fd718a1c1c6f149b77b5"}, - {file = "pydantic_core-2.41.5-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:707625ef0983fcfb461acfaf14de2067c5942c6bb0f3b4c99158bed6fedd3cf3"}, - {file = "pydantic_core-2.41.5-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f41eb9797986d6ebac5e8edff36d5cef9de40def462311b3eb3eeded1431e425"}, - {file = "pydantic_core-2.41.5-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0384e2e1021894b1ff5a786dbf94771e2986ebe2869533874d7e43bc79c6f504"}, - {file = "pydantic_core-2.41.5-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:f0cd744688278965817fd0839c4a4116add48d23890d468bc436f78beb28abf5"}, - {file = "pydantic_core-2.41.5-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:753e230374206729bf0a807954bcc6c150d3743928a73faffee51ac6557a03c3"}, - {file = "pydantic_core-2.41.5-cp39-cp39-win32.whl", hash = "sha256:873e0d5b4fb9b89ef7c2d2a963ea7d02879d9da0da8d9d4933dee8ee86a8b460"}, - {file = "pydantic_core-2.41.5-cp39-cp39-win_amd64.whl", hash = "sha256:e4f4a984405e91527a0d62649ee21138f8e3d0ef103be488c1dc11a80d7f184b"}, - {file = "pydantic_core-2.41.5-graalpy311-graalpy242_311_native-macosx_10_12_x86_64.whl", hash = "sha256:b96d5f26b05d03cc60f11a7761a5ded1741da411e7fe0909e27a5e6a0cb7b034"}, - {file = "pydantic_core-2.41.5-graalpy311-graalpy242_311_native-macosx_11_0_arm64.whl", hash = "sha256:634e8609e89ceecea15e2d61bc9ac3718caaaa71963717bf3c8f38bfde64242c"}, - {file = 
"pydantic_core-2.41.5-graalpy311-graalpy242_311_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:93e8740d7503eb008aa2df04d3b9735f845d43ae845e6dcd2be0b55a2da43cd2"}, - {file = "pydantic_core-2.41.5-graalpy311-graalpy242_311_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f15489ba13d61f670dcc96772e733aad1a6f9c429cc27574c6cdaed82d0146ad"}, - {file = "pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:7da7087d756b19037bc2c06edc6c170eeef3c3bafcb8f532ff17d64dc427adfd"}, - {file = "pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:aabf5777b5c8ca26f7824cb4a120a740c9588ed58df9b2d196ce92fba42ff8dc"}, - {file = "pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c007fe8a43d43b3969e8469004e9845944f1a80e6acd47c150856bb87f230c56"}, - {file = "pydantic_core-2.41.5-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:76d0819de158cd855d1cbb8fcafdf6f5cf1eb8e470abe056d5d161106e38062b"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:b5819cd790dbf0c5eb9f82c73c16b39a65dd6dd4d1439dcdea7816ec9adddab8"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:5a4e67afbc95fa5c34cf27d9089bca7fcab4e51e57278d710320a70b956d1b9a"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ece5c59f0ce7d001e017643d8d24da587ea1f74f6993467d85ae8a5ef9d4f42b"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:16f80f7abe3351f8ea6858914ddc8c77e02578544a0ebc15b4c2e1a0e813b0b2"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:33cb885e759a705b426baada1fe68cbb0a2e68e34c5d0d0289a364cf01709093"}, - {file = 
"pydantic_core-2.41.5-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:c8d8b4eb992936023be7dee581270af5c6e0697a8559895f527f5b7105ecd36a"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:242a206cd0318f95cd21bdacff3fcc3aab23e79bba5cac3db5a841c9ef9c6963"}, - {file = "pydantic_core-2.41.5-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:d3a978c4f57a597908b7e697229d996d77a6d3c94901e9edee593adada95ce1a"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:b2379fa7ed44ddecb5bfe4e48577d752db9fc10be00a6b7446e9663ba143de26"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:266fb4cbf5e3cbd0b53669a6d1b039c45e3ce651fd5442eff4d07c2cc8d66808"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:58133647260ea01e4d0500089a8c4f07bd7aa6ce109682b1426394988d8aaacc"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:287dad91cfb551c363dc62899a80e9e14da1f0e2b6ebde82c806612ca2a13ef1"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:03b77d184b9eb40240ae9fd676ca364ce1085f203e1b1256f8ab9984dca80a84"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:a668ce24de96165bb239160b3d854943128f4334822900534f2fe947930e5770"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f14f8f046c14563f8eb3f45f499cc658ab8d10072961e07225e507adb700e93f"}, - {file = "pydantic_core-2.41.5-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:56121965f7a4dc965bff783d70b907ddf3d57f6eba29b6d2e5dabfaf07799c51"}, - {file = "pydantic_core-2.41.5.tar.gz", hash = "sha256:08daa51ea16ad373ffd5e7606252cc32f07bc72b28284b6bc9c6df804816476e"}, + {file = "pydantic_core-2.46.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = 
"sha256:1da3786b8018e60349680720158cc19161cc3b4bdd815beb0a321cd5ce1ad5b1"}, + {file = "pydantic_core-2.46.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:cc0988cb29d21bf4a9d5cf2ef970b5c0e38d8d8e107a493278c05dc6c1dda69f"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27f9067c3bfadd04c55484b89c0d267981b2f3512850f6f66e1e74204a4e4ce3"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a642ac886ecf6402d9882d10c405dcf4b902abeb2972cd5fb4a48c83cd59279a"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:79f561438481f28681584b89e2effb22855e2179880314bcddbf5968e935e807"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:57a973eae4665352a47cf1a99b4ee864620f2fe663a217d7a8da68a1f3a5bfda"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:83d002b97072a53ea150d63e0a3adfae5670cef5aa8a6e490240e482d3b22e57"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_31_riscv64.whl", hash = "sha256:b40ddd51e7c44b28cfaef746c9d3c506d658885e0a46f9eeef2ee815cbf8e045"}, + {file = "pydantic_core-2.46.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ac5ec7fb9b87f04ee839af2d53bcadea57ded7d229719f56c0ed895bff987943"}, + {file = "pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:a3b11c812f61b3129c4905781a2601dfdfdea5fe1e6c1cfb696b55d14e9c054f"}, + {file = "pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:1108da631e602e5b3c38d6d04fe5bb3bfa54349e6918e3ca6cf570b2e2b2f9d4"}, + {file = "pydantic_core-2.46.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:de885175515bcfa98ae618c1df7a072f13d179f81376c8007112af20567fd08a"}, + {file = "pydantic_core-2.46.3-cp310-cp310-win32.whl", hash = 
"sha256:d11058e3201527d41bc6b545c79187c9e4bf85e15a236a6007f0e991518882b7"}, + {file = "pydantic_core-2.46.3-cp310-cp310-win_amd64.whl", hash = "sha256:3612edf65c8ea67ac13616c4d23af12faef1ae435a8a93e5934c2a0cbbdd1fd6"}, + {file = "pydantic_core-2.46.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:ab124d49d0459b2373ecf54118a45c28a1e6d4192a533fbc915e70f556feb8e5"}, + {file = "pydantic_core-2.46.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:cca67d52a5c7a16aed2b3999e719c4bcf644074eac304a5d3d62dd70ae7d4b2c"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5c024e08c0ba23e6fd68c771a521e9d6a792f2ebb0fa734296b36394dc30390e"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6645ce7eec4928e29a1e3b3d5c946621d105d3e79f0c9cddf07c2a9770949287"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:a712c7118e6c5ea96562f7b488435172abb94a3c53c22c9efc1412264a45cbbe"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:69a868ef3ff206343579021c40faf3b1edc64b1cc508ff243a28b0a514ccb050"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc7e8c32db809aa0f6ea1d6869ebc8518a65d5150fdfad8bcae6a49ae32a22e2"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_31_riscv64.whl", hash = "sha256:3481bd1341dc85779ee506bc8e1196a277ace359d89d28588a9468c3ecbe63fa"}, + {file = "pydantic_core-2.46.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8690eba565c6d68ffd3a8655525cbdd5246510b44a637ee2c6c03a7ebfe64d3c"}, + {file = "pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:4de88889d7e88d50d40ee5b39d5dac0bcaef9ba91f7e536ac064e6b2834ecccf"}, + {file = "pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_armv7l.whl", hash = 
"sha256:e480080975c1ef7f780b8f99ed72337e7cc5efea2e518a20a692e8e7b278eb8b"}, + {file = "pydantic_core-2.46.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:de3a5c376f8cd94da9a1b8fd3dd1c16c7a7b216ed31dc8ce9fd7a22bf13b836e"}, + {file = "pydantic_core-2.46.3-cp311-cp311-win32.whl", hash = "sha256:fc331a5314ffddd5385b9ee9d0d2fee0b13c27e0e02dad71b1ae5d6561f51eeb"}, + {file = "pydantic_core-2.46.3-cp311-cp311-win_amd64.whl", hash = "sha256:b5b9c6cf08a8a5e502698f5e153056d12c34b8fb30317e0c5fd06f45162a6346"}, + {file = "pydantic_core-2.46.3-cp311-cp311-win_arm64.whl", hash = "sha256:5dfd51cf457482f04ec49491811a2b8fd5b843b64b11eecd2d7a1ee596ea78a6"}, + {file = "pydantic_core-2.46.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:b11b59b3eee90a80a36701ddb4576d9ae31f93f05cb9e277ceaa09e6bf074a67"}, + {file = "pydantic_core-2.46.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:af8653713055ea18a3abc1537fe2ebc42f5b0bbb768d1eb79fd74eb47c0ac089"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:75a519dab6d63c514f3a81053e5266c549679e4aa88f6ec57f2b7b854aceb1b0"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a6cd87cb1575b1ad05ba98894c5b5c96411ef678fa2f6ed2576607095b8d9789"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f80a55484b8d843c8ada81ebf70a682f3f00a3d40e378c06cf17ecb44d280d7d"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3861f1731b90c50a3266316b9044f5c9b405eecb8e299b0a7120596334e4fe9c"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb528e295ed31570ac3dcc9bfdd6e0150bc11ce6168ac87a8082055cf1a67395"}, + {file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_31_riscv64.whl", hash = "sha256:367508faa4973b992b271ba1494acaab36eb7e8739d1e47be5035fb1ea225396"}, + 
{file = "pydantic_core-2.46.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:5ad3c826fe523e4becf4fe39baa44286cff85ef137c729a2c5e269afbfd0905d"}, + {file = "pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:ec638c5d194ef8af27db69f16c954a09797c0dc25015ad6123eb2c73a4d271ca"}, + {file = "pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:28ed528c45446062ee66edb1d33df5d88828ae167de76e773a3c7f64bd14e976"}, + {file = "pydantic_core-2.46.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:aed19d0c783886d5bd86d80ae5030006b45e28464218747dcf83dabfdd092c7b"}, + {file = "pydantic_core-2.46.3-cp312-cp312-win32.whl", hash = "sha256:06d5d8820cbbdb4147578c1fe7ffcd5b83f34508cb9f9ab76e807be7db6ff0a4"}, + {file = "pydantic_core-2.46.3-cp312-cp312-win_amd64.whl", hash = "sha256:c3212fda0ee959c1dd04c60b601ec31097aaa893573a3a1abd0a47bcac2968c1"}, + {file = "pydantic_core-2.46.3-cp312-cp312-win_arm64.whl", hash = "sha256:f1f8338dd7a7f31761f1f1a3c47503a9a3b34eea3c8b01fa6ee96408affb5e72"}, + {file = "pydantic_core-2.46.3-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:12bc98de041458b80c86c56b24df1d23832f3e166cbaff011f25d187f5c62c37"}, + {file = "pydantic_core-2.46.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:85348b8f89d2c3508b65b16c3c33a4da22b8215138d8b996912bb1532868885f"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1105677a6df914b1fb71a81b96c8cce7726857e1717d86001f29be06a25ee6f8"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:87082cd65669a33adeba5470769e9704c7cf026cc30afb9cc77fd865578ebaad"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:60e5f66e12c4f5212d08522963380eaaeac5ebd795826cfd19b2dfb0c7a52b9c"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = 
"sha256:b6cdf19bf84128d5e7c37e8a73a0c5c10d51103a650ac585d42dd6ae233f2b7f"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:031bb17f4885a43773c8c763089499f242aee2ea85cf17154168775dccdecf35"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_31_riscv64.whl", hash = "sha256:bcf2a8b2982a6673693eae7348ef3d8cf3979c1d63b54fca7c397a635cc68687"}, + {file = "pydantic_core-2.46.3-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:28e8cf2f52d72ced402a137145923a762cbb5081e48b34312f7a0c8f55928ec3"}, + {file = "pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:17eaface65d9fc5abb940003020309c1bf7a211f5f608d7870297c367e6f9022"}, + {file = "pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:93fd339f23408a07e98950a89644f92c54d8729719a40b30c0a30bb9ebc55d23"}, + {file = "pydantic_core-2.46.3-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:23cbdb3aaa74dfe0837975dbf69b469753bbde8eacace524519ffdb6b6e89eb7"}, + {file = "pydantic_core-2.46.3-cp313-cp313-win32.whl", hash = "sha256:610eda2e3838f401105e6326ca304f5da1e15393ae25dacae5c5c63f2c275b13"}, + {file = "pydantic_core-2.46.3-cp313-cp313-win_amd64.whl", hash = "sha256:68cc7866ed863db34351294187f9b729964c371ba33e31c26f478471c52e1ed0"}, + {file = "pydantic_core-2.46.3-cp313-cp313-win_arm64.whl", hash = "sha256:f64b5537ac62b231572879cd08ec05600308636a5d63bcbdb15063a466977bec"}, + {file = "pydantic_core-2.46.3-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:afa3aa644f74e290cdede48a7b0bee37d1c35e71b05105f6b340d484af536d9b"}, + {file = "pydantic_core-2.46.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:ced3310e51aa425f7f77da8bbbb5212616655bedbe82c70944320bc1dbe5e018"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e29908922ce9da1a30b4da490bd1d3d82c01dcfdf864d2a74aacee674d0bfa34"}, + {file = 
"pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0c9ff69140423eea8ed2d5477df3ba037f671f5e897d206d921bc9fdc39613e7"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b675ab0a0d5b1c8fdb81195dc5bcefea3f3c240871cdd7ff9a2de8aa50772eb2"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0087084960f209a9a4af50ecd1fb063d9ad3658c07bb81a7a53f452dacbfb2ba"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ed42e6cc8e1b0e2b9b96e2276bad70ae625d10d6d524aed0c93de974ae029f9f"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_31_riscv64.whl", hash = "sha256:f1771ce258afb3e4201e67d154edbbae712a76a6081079fe247c2f53c6322c22"}, + {file = "pydantic_core-2.46.3-cp314-cp314-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:a7610b6a5242a6c736d8ad47fd5fff87fcfe8f833b281b1c409c3d6835d9227f"}, + {file = "pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:ff5e7783bcc5476e1db448bf268f11cb257b1c276d3e89f00b5727be86dd0127"}, + {file = "pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:9d2e32edcc143bc01e95300671915d9ca052d4f745aa0a49c48d4803f8a85f2c"}, + {file = "pydantic_core-2.46.3-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:6e42d83d1c6b87fa56b521479cff237e626a292f3b31b6345c15a99121b454c1"}, + {file = "pydantic_core-2.46.3-cp314-cp314-win32.whl", hash = "sha256:07bc6d2a28c3adb4f7c6ae46aa4f2d2929af127f587ed44057af50bf1ce0f505"}, + {file = "pydantic_core-2.46.3-cp314-cp314-win_amd64.whl", hash = "sha256:8940562319bc621da30714617e6a7eaa6b98c84e8c685bcdc02d7ed5e7c7c44e"}, + {file = "pydantic_core-2.46.3-cp314-cp314-win_arm64.whl", hash = "sha256:5dcbbcf4d22210ced8f837c96db941bdb078f419543472aca5d9a0bb7cddc7df"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-macosx_10_12_x86_64.whl", hash = 
"sha256:d0fe3dce1e836e418f912c1ad91c73357d03e556a4d286f441bf34fed2dbeecf"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:9ce92e58abc722dac1bf835a6798a60b294e48eb0e625ec9fd994b932ac5feee"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a03e6467f0f5ab796a486146d1b887b2dc5e5f9b3288898c1b1c3ad974e53e4a"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2798b6ba041b9d70acfb9071a2ea13c8456dd1e6a5555798e41ba7b0790e329c"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:9be3e221bdc6d69abf294dcf7aff6af19c31a5cdcc8f0aa3b14be29df4bd03b1"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f13936129ce841f2a5ddf6f126fea3c43cd128807b5a59588c37cf10178c2e64"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:28b5f2ef03416facccb1c6ef744c69793175fd27e44ef15669201601cf423acb"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_31_riscv64.whl", hash = "sha256:830d1247d77ad23852314f069e9d7ddafeec5f684baf9d7e7065ed46a049c4e6"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d0793c90c1a3c74966e7975eaef3ed30ebdff3260a0f815a62a22adc17e4c01c"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:d2d0aead851b66f5245ec0c4fb2612ef457f8bbafefdf65a2bf9d6bac6140f47"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:2f40e4246676beb31c5ce77c38a55ca4e465c6b38d11ea1bd935420568e0b1ab"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:cf489cf8986c543939aeee17a09c04d6ffb43bfef8ca16fcbcc5cfdcbed24dba"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-win32.whl", hash = 
"sha256:ffe0883b56cfc05798bf994164d2b2ff03efe2d22022a2bb080f3b626176dd56"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-win_amd64.whl", hash = "sha256:706d9d0ce9cf4593d07270d8e9f53b161f90c57d315aeec4fb4fd7a8b10240d8"}, + {file = "pydantic_core-2.46.3-cp314-cp314t-win_arm64.whl", hash = "sha256:77706aeb41df6a76568434701e0917da10692da28cb69d5fb6919ce5fdb07374"}, + {file = "pydantic_core-2.46.3-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:fa3eb7c2995aa443687a825bc30395c8521b7c6ec201966e55debfd1128bcceb"}, + {file = "pydantic_core-2.46.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3d08782c4045f90724b44c95d35ebec0d67edb8a957a2ac81d5a8e4b8a200495"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:831eb19aa789a97356979e94c981e5667759301fb708d1c0d5adf1bc0098b873"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4335e87c7afa436a0dfa899e138d57a72f8aad542e2cf19c36fb428461caabd0"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99421e7684a60f7f3550a1d159ade5fdff1954baedb6bdd407cba6a307c9f27d"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd81f6907932ebac3abbe41378dac64b2380db1287e2aa64d8d88f78d170f51a"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9f247596366f4221af52beddd65af1218797771d6989bc891a0b86ccaa019168"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_31_riscv64.whl", hash = "sha256:6dff8cc884679df229ebc6d8eb2321ea6f8e091bc7d4886d4dc2e0e71452843c"}, + {file = "pydantic_core-2.46.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:68ef2f623dda6d5a9067ac014e406c020c780b2a358930a7e5c1b73702900720"}, + {file = "pydantic_core-2.46.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = 
"sha256:d56bdb4af1767cc15b0386b3c581fdfe659bb9ee4a4f776e92c1cd9d074000d6"}, + {file = "pydantic_core-2.46.3-cp39-cp39-musllinux_1_1_armv7l.whl", hash = "sha256:91249bcb7c165c2fb2a2f852dbc5c91636e2e218e75d96dfdd517e4078e173dd"}, + {file = "pydantic_core-2.46.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:4b068543bdb707f5d935dab765d99227aa2545ef2820935f2e5dd801795c7dbd"}, + {file = "pydantic_core-2.46.3-cp39-cp39-win32.whl", hash = "sha256:dcda6583921c05a40533f982321532f2d8db29326c7b95c4026941fa5074bd79"}, + {file = "pydantic_core-2.46.3-cp39-cp39-win_amd64.whl", hash = "sha256:a35cc284c8dd7edae8a31533713b4d2467dfe7c4f1b5587dd4031f28f90d1d13"}, + {file = "pydantic_core-2.46.3-graalpy311-graalpy242_311_native-macosx_10_12_x86_64.whl", hash = "sha256:9715525891ed524a0a1eb6d053c74d4d4ad5017677fb00af0b7c2644a31bae46"}, + {file = "pydantic_core-2.46.3-graalpy311-graalpy242_311_native-macosx_11_0_arm64.whl", hash = "sha256:9d2f400712a99a013aff420ef1eb9be077f8189a36c1e3ef87660b4e1088a874"}, + {file = "pydantic_core-2.46.3-graalpy311-graalpy242_311_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:bd2aab0e2e9dc2daf36bd2686c982535d5e7b1d930a1344a7bb6e82baab42a76"}, + {file = "pydantic_core-2.46.3-graalpy311-graalpy242_311_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e9d76736da5f362fabfeea6a69b13b7f2be405c6d6966f06b2f6bfff7e64531"}, + {file = "pydantic_core-2.46.3-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:b12dd51f1187c2eb489af8e20f880362db98e954b54ab792fa5d92e8bcc6b803"}, + {file = "pydantic_core-2.46.3-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:f00a0961b125f1a47af7bcc17f00782e12f4cd056f83416006b30111d941dfa3"}, + {file = "pydantic_core-2.46.3-graalpy312-graalpy250_312_native-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:57697d7c056aca4bbb680200f96563e841a6386ac1129370a0102592f4dddff5"}, + {file = 
"pydantic_core-2.46.3-graalpy312-graalpy250_312_native-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd35aa21299def8db7ef4fe5c4ff862941a9a158ca7b63d61e66fe67d30416b4"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:13afdd885f3d71280cf286b13b310ee0f7ccfefd1dbbb661514a474b726e2f25"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:f91c0aff3e3ee0928edd1232c57f643a7a003e6edf1860bc3afcdc749cb513f3"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6529d1d128321a58d30afcc97b49e98836542f68dd41b33c2e972bb9e5290536"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:975c267cff4f7e7272eacbe50f6cc03ca9a3da4c4fbd66fffd89c94c1e311aa1"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:2b8e4f2bbdf71415c544b4b1138b8060db7b6611bc927e8064c769f64bed651c"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:e61ea8e9fff9606d09178f577ff8ccdd7206ff73d6552bcec18e1033c4254b85"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:b504bda01bafc69b6d3c7a0c7f039dcf60f47fab70e06fe23f57b5c75bdc82b8"}, + {file = "pydantic_core-2.46.3-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:b00b76f7142fc60c762ce579bd29c8fa44aaa56592dd3c54fab3928d0d4ca6ff"}, + {file = "pydantic_core-2.46.3.tar.gz", hash = "sha256:41c178f65b8c29807239d47e6050262eb6bf84eb695e41101e62e38df4a5bc2c"}, ] [package.dependencies] @@ -1638,14 +1581,14 @@ typing-extensions = ">=4.14.1" [[package]] name = "pygments" -version = "2.19.2" +version = "2.20.0" description = "Pygments is a syntax highlighting package written in Python." 
optional = false -python-versions = ">=3.8" +python-versions = ">=3.9" groups = ["main", "dev"] files = [ - {file = "pygments-2.19.2-py3-none-any.whl", hash = "sha256:86540386c03d588bb81d44bc3928634ff26449851e99741617ecb9037ee5ec0b"}, - {file = "pygments-2.19.2.tar.gz", hash = "sha256:636cb2477cec7f8952536970bc533bc43743542f70392ae026374600add5b887"}, + {file = "pygments-2.20.0-py3-none-any.whl", hash = "sha256:81a9e26dd42fd28a23a2d169d86d7ac03b46e2f8b59ed4698fb4785f946d0176"}, + {file = "pygments-2.20.0.tar.gz", hash = "sha256:6757cd03768053ff99f3039c1a36d6c0aa0b263438fcab17520b30a303a82b5f"}, ] [package.extras] @@ -1710,16 +1653,44 @@ packaging = ">=25" docs = ["furo (>=2025.9.25)", "sphinx-autodoc-typehints (>=3.5.1)"] testing = ["covdefaults (>=2.3)", "pytest (>=8.4.2)", "pytest-cov (>=7)", "pytest-mock (>=3.15.1)", "setuptools (>=80.9)"] +[[package]] +name = "pystow" +version = "0.8.5" +description = "Easily pick a place to store data for your Python code" +optional = false +python-versions = ">=3.10" +groups = ["main"] +files = [ + {file = "pystow-0.8.5-py3-none-any.whl", hash = "sha256:a8593a22ec6a16c39ee0458b393db30abbba1e9f95f2865c5793df7e81c51e9e"}, + {file = "pystow-0.8.5.tar.gz", hash = "sha256:c918ead173ed5d0234a888e3d480e00d3fe3ee608c9fc0722796d72aa4e44438"}, +] + +[package.dependencies] +tqdm = "*" +typing-extensions = "*" + +[package.extras] +aws = ["boto3"] +bs4 = ["bs4", "requests"] +cli = ["click"] +pandas = ["pandas"] +pydantic = ["pydantic"] +ratelimit = ["ratelimit", "requests"] +rdf = ["rdflib"] +requests = ["requests"] +xml = ["lxml"] +yaml = ["pyyaml"] + [[package]] name = "pytest" -version = "9.0.2" +version = "9.0.3" description = "pytest: simple powerful testing with Python" optional = false python-versions = ">=3.10" groups = ["main", "dev"] files = [ - {file = "pytest-9.0.2-py3-none-any.whl", hash = "sha256:711ffd45bf766d5264d487b917733b453d917afd2b0ad65223959f59089f875b"}, - {file = "pytest-9.0.2.tar.gz", hash = 
"sha256:75186651a92bd89611d1d9fc20f0b4345fd827c41ccd5c299a868a05d70edf11"}, + {file = "pytest-9.0.3-py3-none-any.whl", hash = "sha256:2c5efc453d45394fdd706ade797c0a81091eccd1d6e4bccfcd476e2b8e0ab5d9"}, + {file = "pytest-9.0.3.tar.gz", hash = "sha256:b86ada508af81d19edeb213c681b1d48246c1a91d304c6c81a427674c17eb91c"}, ] [package.dependencies] @@ -1804,14 +1775,14 @@ six = ">=1.5" [[package]] name = "python-discovery" -version = "1.1.0" +version = "1.2.2" description = "Python interpreter discovery" optional = false python-versions = ">=3.8" groups = ["dev"] files = [ - {file = "python_discovery-1.1.0-py3-none-any.whl", hash = "sha256:a162893b8809727f54594a99ad2179d2ede4bf953e12d4c7abc3cc9cdbd1437b"}, - {file = "python_discovery-1.1.0.tar.gz", hash = "sha256:447941ba1aed8cc2ab7ee3cb91be5fc137c5bdbb05b7e6ea62fbdcb66e50b268"}, + {file = "python_discovery-1.2.2-py3-none-any.whl", hash = "sha256:e1ae95d9af875e78f15e19aed0c6137ab1bb49c200f21f5061786490c9585c7a"}, + {file = "python_discovery-1.2.2.tar.gz", hash = "sha256:876e9c57139eb757cb5878cbdd9ae5379e5d96266c99ef731119e04fffe533bb"}, ] [package.dependencies] @@ -2008,14 +1979,14 @@ rdf4j = ["httpx (>=0.28.1,<0.29.0)"] [[package]] name = "redis" -version = "7.2.1" +version = "7.4.0" description = "Python client for Redis database and key-value store" optional = false python-versions = ">=3.10" groups = ["main", "dev"] files = [ - {file = "redis-7.2.1-py3-none-any.whl", hash = "sha256:49e231fbc8df2001436ae5252b3f0f3dc930430239bfeb6da4c7ee92b16e5d33"}, - {file = "redis-7.2.1.tar.gz", hash = "sha256:6163c1a47ee2d9d01221d8456bc1c75ab953cbda18cfbc15e7140e9ba16ca3a5"}, + {file = "redis-7.4.0-py3-none-any.whl", hash = "sha256:a9c74a5c893a5ef8455a5adb793a31bb70feb821c86eccb62eebef5a19c429ec"}, + {file = "redis-7.4.0.tar.gz", hash = "sha256:64a6ea7bf567ad43c964d2c30d82853f8df927c5c9017766c55a1d1ed95d18ad"}, ] [package.extras] @@ -2045,36 +2016,36 @@ typing-extensions = {version = ">=4.4.0", markers = "python_version < \"3.13\""} 
[[package]] name = "requests" -version = "2.32.5" +version = "2.33.1" description = "Python HTTP for Humans." optional = false -python-versions = ">=3.9" +python-versions = ">=3.10" groups = ["main", "dev"] files = [ - {file = "requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6"}, - {file = "requests-2.32.5.tar.gz", hash = "sha256:dbba0bac56e100853db0ea71b82b4dfd5fe2bf6d3754a8893c3af500cec7d7cf"}, + {file = "requests-2.33.1-py3-none-any.whl", hash = "sha256:4e6d1ef462f3626a1f0a0a9c42dd93c63bad33f9f1c1937509b8c5c8718ab56a"}, + {file = "requests-2.33.1.tar.gz", hash = "sha256:18817f8c57c6263968bc123d237e3b8b08ac046f5456bd1e307ee8f4250d3517"}, ] [package.dependencies] -certifi = ">=2017.4.17" +certifi = ">=2023.5.7" charset_normalizer = ">=2,<4" idna = ">=2.5,<4" -urllib3 = ">=1.21.1,<3" +urllib3 = ">=1.26,<3" [package.extras] socks = ["PySocks (>=1.5.6,!=1.5.7)"] -use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] +use-chardet-on-py3 = ["chardet (>=3.0.2,<8)"] [[package]] name = "rich" -version = "14.3.3" +version = "15.0.0" description = "Render rich text, tables, progress bars, syntax highlighting, markdown and more to the terminal" optional = false -python-versions = ">=3.8.0" +python-versions = ">=3.9.0" groups = ["dev"] files = [ - {file = "rich-14.3.3-py3-none-any.whl", hash = "sha256:793431c1f8619afa7d3b52b2cdec859562b950ea0d4b6b505397612db8d5362d"}, - {file = "rich-14.3.3.tar.gz", hash = "sha256:b8daa0b9e4eef54dd8cf7c86c03713f53241884e814f4e2f5fb342fe520f639b"}, + {file = "rich-15.0.0-py3-none-any.whl", hash = "sha256:33bd4ef74232fb73fe9279a257718407f169c09b78a87ad3d296f548e27de0bb"}, + {file = "rich-15.0.0.tar.gz", hash = "sha256:edd07a4824c6b40189fb7ac9bc4c52536e9780fbbfbddf6f1e2502c31b068c36"}, ] [package.dependencies] @@ -2211,30 +2182,30 @@ files = [ [[package]] name = "ruff" -version = "0.15.4" +version = "0.15.11" description = "An extremely fast Python linter and code formatter, written in 
Rust." optional = false python-versions = ">=3.7" groups = ["dev"] files = [ - {file = "ruff-0.15.4-py3-none-linux_armv6l.whl", hash = "sha256:a1810931c41606c686bae8b5b9a8072adac2f611bb433c0ba476acba17a332e0"}, - {file = "ruff-0.15.4-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:5a1632c66672b8b4d3e1d1782859e98d6e0b4e70829530666644286600a33992"}, - {file = "ruff-0.15.4-py3-none-macosx_11_0_arm64.whl", hash = "sha256:a4386ba2cd6c0f4ff75252845906acc7c7c8e1ac567b7bc3d373686ac8c222ba"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2496488bdfd3732747558b6f95ae427ff066d1fcd054daf75f5a50674411e75"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:3f1c4893841ff2d54cbda1b2860fa3260173df5ddd7b95d370186f8a5e66a4ac"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:820b8766bd65503b6c30aaa6331e8ef3a6e564f7999c844e9a547c40179e440a"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c9fb74bab47139c1751f900f857fa503987253c3ef89129b24ed375e72873e85"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f80c98765949c518142b3a50a5db89343aa90f2c2bf7799de9986498ae6176db"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:451a2e224151729b3b6c9ffb36aed9091b2996fe4bdbd11f47e27d8f2e8888ec"}, - {file = "ruff-0.15.4-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:a8f157f2e583c513c4f5f896163a93198297371f34c04220daf40d133fdd4f7f"}, - {file = "ruff-0.15.4-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:917cc68503357021f541e69b35361c99387cdbbf99bd0ea4aa6f28ca99ff5338"}, - {file = "ruff-0.15.4-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:e9737c8161da79fd7cfec19f1e35620375bd8b2a50c3e77fa3d2c16f574105cc"}, - {file = "ruff-0.15.4-py3-none-musllinux_1_2_i686.whl", hash = 
"sha256:291258c917539e18f6ba40482fe31d6f5ac023994ee11d7bdafd716f2aab8a68"}, - {file = "ruff-0.15.4-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:3f83c45911da6f2cd5936c436cf86b9f09f09165f033a99dcf7477e34041cbc3"}, - {file = "ruff-0.15.4-py3-none-win32.whl", hash = "sha256:65594a2d557d4ee9f02834fcdf0a28daa8b3b9f6cb2cb93846025a36db47ef22"}, - {file = "ruff-0.15.4-py3-none-win_amd64.whl", hash = "sha256:04196ad44f0df220c2ece5b0e959c2f37c777375ec744397d21d15b50a75264f"}, - {file = "ruff-0.15.4-py3-none-win_arm64.whl", hash = "sha256:60d5177e8cfc70e51b9c5fad936c634872a74209f934c1e79107d11787ad5453"}, - {file = "ruff-0.15.4.tar.gz", hash = "sha256:3412195319e42d634470cc97aa9803d07e9d5c9223b99bcb1518f0c725f26ae1"}, + {file = "ruff-0.15.11-py3-none-linux_armv6l.whl", hash = "sha256:e927cfff503135c558eb581a0c9792264aae9507904eb27809cdcff2f2c847b7"}, + {file = "ruff-0.15.11-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:7a1b5b2938d8f890b76084d4fa843604d787a912541eae85fd7e233398bbb73e"}, + {file = "ruff-0.15.11-py3-none-macosx_11_0_arm64.whl", hash = "sha256:d4176f3d194afbdaee6e41b9ccb1a2c287dba8700047df474abfbe773825d1cb"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3b17c886fb88203ced3afe7f14e8d5ae96e9d2f4ccc0ee66aa19f2c2675a27e4"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:49fafa220220afe7758a487b048de4c8f9f767f37dfefad46b9dd06759d003eb"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f2ab8427e74a00d93b8bda1307b1e60970d40f304af38bccb218e056c220120d"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:195072c0c8e1fc8f940652073df082e37a5d9cb43b4ab1e4d0566ab8977a13b7"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a3a0996d486af3920dec930a2e7daed4847dfc12649b537a9335585ada163e9e"}, + {file = 
"ruff-0.15.11-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:1bef2cb556d509259f1fe440bb9cd33c756222cf0a7afe90d15edf0866702431"}, + {file = "ruff-0.15.11-py3-none-manylinux_2_31_riscv64.whl", hash = "sha256:030d921a836d7d4a12cf6e8d984a88b66094ccb0e0f17ddd55067c331191bf19"}, + {file = "ruff-0.15.11-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:0e783b599b4577788dbbb66b9addcef87e9a8832f4ce0c19e34bf55543a2f890"}, + {file = "ruff-0.15.11-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:ae90592246625ba4a34349d68ec28d4400d75182b71baa196ddb9f82db025ef5"}, + {file = "ruff-0.15.11-py3-none-musllinux_1_2_i686.whl", hash = "sha256:1f111d62e3c983ed20e0ca2e800f8d77433a5b1161947df99a5c2a3fb60514f0"}, + {file = "ruff-0.15.11-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:06f483d6646f59eaffba9ae30956370d3a886625f511a3108994000480621d1c"}, + {file = "ruff-0.15.11-py3-none-win32.whl", hash = "sha256:476a2aa56b7da0b73a3ee80b6b2f0e19cce544245479adde7baa65466664d5f3"}, + {file = "ruff-0.15.11-py3-none-win_amd64.whl", hash = "sha256:8b6756d88d7e234fb0c98c91511aae3cd519d5e3ed271cae31b20f39cb2a12a3"}, + {file = "ruff-0.15.11-py3-none-win_arm64.whl", hash = "sha256:063fed18cc1bbe0ee7393957284a6fe8b588c6a406a285af3ee3f46da2391ee4"}, + {file = "ruff-0.15.11.tar.gz", hash = "sha256:f092b21708bf0e7437ce9ada249dfe688ff9a0954fc94abab05dcea7dcd29c33"}, ] [[package]] @@ -2251,14 +2222,14 @@ files = [ [[package]] name = "splink" -version = "4.0.15" +version = "4.0.16" description = "Fast probabilistic data linkage at scale" optional = false python-versions = "<4.0.0,>=3.9.0" groups = ["main"] files = [ - {file = "splink-4.0.15-py3-none-any.whl", hash = "sha256:828a86a05433bec5c1b22a04e6558610602d31f0ca35003918b303a9af9188b7"}, - {file = "splink-4.0.15.tar.gz", hash = "sha256:7d3769d5771e5b91970511479fc1771882ac2f9c02e24e8882f8e898d223afa5"}, + {file = "splink-4.0.16-py3-none-any.whl", hash = 
"sha256:32b0ad5ab171fb4524337c64b3265ef7432fc7bcb3fff4245d172a042c354bb8"}, + {file = "splink-4.0.16.tar.gz", hash = "sha256:a4be4ab1ed4de350667418ed29ad98aac9239e97f5135cc666c45f3043f3aae8"}, ] [package.dependencies] @@ -2278,94 +2249,76 @@ spark = ["pyspark (>=3.5.0)"] [[package]] name = "sqlglot" -version = "29.0.1" +version = "30.6.0" description = "An easily customizable SQL parser and transpiler" optional = false python-versions = ">=3.9" groups = ["main"] files = [ - {file = "sqlglot-29.0.1-py3-none-any.whl", hash = "sha256:06a473ea6c2b3632ac67bd38e687a6860265bf4156e66b54adeda15d07f00c65"}, - {file = "sqlglot-29.0.1.tar.gz", hash = "sha256:0010b4f77fb996c8d25dd4b16f3654e6da163ff1866ceabc70b24e791c203048"}, -] - -[package.extras] -c = ["sqlglotc"] -dev = ["duckdb (>=0.6)", "mypy", "pandas", "pandas-stubs", "pdoc", "pre-commit", "pyperf", "python-dateutil", "pytz", "ruff (==0.7.2)", "types-python-dateutil", "types-pytz", "typing_extensions"] -rs = ["sqlglotrs (==0.13.0)"] - -[[package]] -name = "starlette" -version = "0.52.1" -description = "The little ASGI library that shines." 
-optional = false -python-versions = ">=3.10" -groups = ["dev"] -files = [ - {file = "starlette-0.52.1-py3-none-any.whl", hash = "sha256:0029d43eb3d273bc4f83a08720b4912ea4b071087a3b48db01b7c839f7954d74"}, - {file = "starlette-0.52.1.tar.gz", hash = "sha256:834edd1b0a23167694292e94f597773bc3f89f362be6effee198165a35d62933"}, + {file = "sqlglot-30.6.0-py3-none-any.whl", hash = "sha256:e005fc2f47994f90d7d8df341f1cbe937518497b0b7b1507d4c03c4c9dfd2778"}, + {file = "sqlglot-30.6.0.tar.gz", hash = "sha256:246d34d39927422a50a3fa155f37b2f6346fba85f1a755b13c941eb32ef93361"}, ] -[package.dependencies] -anyio = ">=3.6.2,<5" -typing-extensions = {version = ">=4.10.0", markers = "python_version < \"3.13\""} - [package.extras] -full = ["httpx (>=0.27.0,<0.29.0)", "itsdangerous", "jinja2", "python-multipart (>=0.0.18)", "pyyaml"] +c = ["sqlglotc (==30.6.0)"] +dev = ["duckdb (>=0.6)", "pandas", "pandas-stubs", "pdoc", "pre-commit", "pyperf", "python-dateutil", "pytz", "ruff (==0.15.6)", "setuptools_scm", "sqlglot-mypy", "types-python-dateutil", "types-pytz", "typing_extensions"] +rs = ["sqlglotc (==30.6.0)", "sqlglotrs (==0.13.0)"] [[package]] name = "testcontainers" -version = "4.14.1" +version = "4.14.2" description = "Python library for throwaway instances of anything that can run in a Docker container" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "testcontainers-4.14.1-py3-none-any.whl", hash = "sha256:03dfef4797b31c82e7b762a454b6afec61a2a512ad54af47ab41e4fa5415f891"}, - {file = "testcontainers-4.14.1.tar.gz", hash = "sha256:316f1bb178d829c003acd650233e3ff3c59a833a08d8661c074f58a4fbd42a64"}, + {file = "testcontainers-4.14.2-py3-none-any.whl", hash = "sha256:0d0522c3cd8f8d9627cda41f7a6b51b639fa57bdc492923c045117933c668d68"}, + {file = "testcontainers-4.14.2.tar.gz", hash = "sha256:1340ccf16fe3acd9389a6c9e1d9ab21d9fe99a8afdf8165f89c3e69c1967d239"}, ] [package.dependencies] docker = "*" python-dotenv = "*" -redis = {version = ">=7,<8", optional = 
true, markers = "extra == \"generic\" or extra == \"redis\""} +redis = {version = ">=7", optional = true, markers = "extra == \"redis\""} typing-extensions = "*" urllib3 = "*" wrapt = "*" [package.extras] -arangodb = ["python-arango (>=8,<9)"] -aws = ["boto3 (>=1,<2)", "httpx"] -azurite = ["azure-storage-blob (>=12,<13)"] -chroma = ["chromadb-client (>=1,<2)"] -cosmosdb = ["azure-cosmos (>=4,<5)"] -db2 = ["ibm_db_sa ; platform_machine != \"aarch64\" and platform_machine != \"arm64\"", "sqlalchemy (>=2,<3)"] -generic = ["httpx", "redis (>=7,<8)"] -google = ["google-cloud-datastore (>=2,<3)", "google-cloud-pubsub (>=2,<3)"] -influxdb = ["influxdb (>=5,<6)", "influxdb-client (>=1,<2)"] +arangodb = ["python-arango (>=8)"] +aws = ["boto3 (>=1)", "httpx"] +azurite = ["azure-storage-blob (>=12)"] +chroma = ["chromadb-client (>=1)"] +clickhouse = ["clickhouse-driver"] +cosmosdb = ["azure-cosmos (>=4)"] +db2 = ["ibm-db-sa ; platform_machine != \"aarch64\" and platform_machine != \"arm64\"", "sqlalchemy (>=2)"] +generic = ["httpx", "redis (>=7)"] +google = ["google-cloud-datastore (>=2)", "google-cloud-pubsub (>=2)"] +influxdb = ["influxdb (>=5)", "influxdb-client (>=1)"] k3s = ["kubernetes", "pyyaml (>=6.0.3)"] -keycloak = ["python-keycloak (>=6,<7) ; python_version < \"4.0\""] -localstack = ["boto3 (>=1,<2)"] +keycloak = ["python-keycloak (>=6) ; python_version < \"4.0\""] +localstack = ["boto3 (>=1)"] mailpit = ["cryptography"] -minio = ["minio (>=7,<8)"] -mongodb = ["pymongo (>=4,<5)"] -mssql = ["pymssql (>=2,<3)", "sqlalchemy (>=2,<3)"] -mysql = ["pymysql[rsa] (>=1,<2)", "sqlalchemy (>=2,<3)"] -nats = ["nats-py (>=2,<3)"] -neo4j = ["neo4j (>=6,<7)"] +minio = ["minio (>=7)"] +mongodb = ["pymongo (>=4)"] +mssql = ["pymssql (>=2)", "sqlalchemy (>=2)"] +mysql = ["pymysql[rsa] (>=1)", "sqlalchemy (>=2)"] +nats = ["nats-py (>=2)"] +neo4j = ["neo4j (>=6)"] openfga = ["openfga-sdk"] -opensearch = ["opensearch-py (>=3,<4) ; python_version < \"4.0\""] -oracle = ["oracledb 
(>=3,<4)", "sqlalchemy (>=2,<3)"] -oracle-free = ["oracledb (>=3,<4)", "sqlalchemy (>=2,<3)"] -qdrant = ["qdrant-client (>=1,<2)"] -rabbitmq = ["pika (>=1,<2)"] -redis = ["redis (>=7,<8)"] -registry = ["bcrypt (>=5,<6)"] -scylla = ["cassandra-driver (>=3,<4)"] -selenium = ["selenium (>=4,<5)"] +opensearch = ["opensearch-py (>=3) ; python_version < \"4.0\""] +oracle = ["oracledb (>=3)", "sqlalchemy (>=2)"] +oracle-free = ["oracledb (>=3)", "sqlalchemy (>=2)"] +qdrant = ["qdrant-client (>=1)"] +rabbitmq = ["pika (>=1)"] +redis = ["redis (>=7)"] +registry = ["bcrypt (>=5)"] +scylla = ["cassandra-driver (>=3)"] +selenium = ["selenium (>=4)"] sftp = ["cryptography"] test-module-import = ["httpx"] trino = ["trino"] -weaviate = ["weaviate-client (>=4,<5)"] +weaviate = ["weaviate-client (>=4)"] [[package]] name = "texttable" @@ -2379,6 +2332,18 @@ files = [ {file = "texttable-1.7.0.tar.gz", hash = "sha256:2d2068fb55115807d3ac77a4ca68fa48803e84ebb0ee2340f858107a36522638"}, ] +[[package]] +name = "tomli-w" +version = "1.2.0" +description = "A lil' TOML writer" +optional = false +python-versions = ">=3.9" +groups = ["dev"] +files = [ + {file = "tomli_w-1.2.0-py3-none-any.whl", hash = "sha256:188306098d013b691fcadc011abd66727d3c414c571bb01b1a174ba8c983cf90"}, + {file = "tomli_w-1.2.0.tar.gz", hash = "sha256:2dd14fac5a47c27be9cd4c976af5a12d87fb1f0b4512f81d69cce3b35ae25021"}, +] + [[package]] name = "tomlkit" version = "0.14.0" @@ -2393,29 +2358,53 @@ files = [ [[package]] name = "tox" -version = "4.47.3" +version = "4.53.0" description = "tox is a generic virtualenv management and test command line tool" optional = false python-versions = ">=3.10" groups = ["dev"] files = [ - {file = "tox-4.47.3-py3-none-any.whl", hash = "sha256:e447862a6821b421bbbfb8cbac071818c0a6884907a4c964d8322516d0b19b34"}, - {file = "tox-4.47.3.tar.gz", hash = "sha256:57643508d4c218ad312457a3b0ce3135c50fa1f9f1e4d40867683d880cad1c37"}, + {file = "tox-4.53.0-py3-none-any.whl", hash = 
"sha256:cc4e716d18c4889aa179d785175c438fa60c35deef20ce689ec288d8fb656096"}, + {file = "tox-4.53.0.tar.gz", hash = "sha256:62c780e42f87d34ee60f2ea20342156253794fdcbd6885fd797d98ee05009f22"}, ] [package.dependencies] -cachetools = ">=7.0.1" +cachetools = ">=7.0.3" colorama = ">=0.4.6" -filelock = ">=3.24.3" +filelock = ">=3.25" packaging = ">=26" -platformdirs = ">=4.9.2" +platformdirs = ">=4.9.4" pluggy = ">=1.6" pyproject-api = ">=1.10" -virtualenv = ">=20.39" +python-discovery = ">=1.2.2" +tomli-w = ">=1.2" +virtualenv = ">=21.1" [package.extras] completion = ["argcomplete (>=3.6.3)"] +[[package]] +name = "tqdm" +version = "4.67.3" +description = "Fast, Extensible Progress Meter" +optional = false +python-versions = ">=3.7" +groups = ["main"] +files = [ + {file = "tqdm-4.67.3-py3-none-any.whl", hash = "sha256:ee1e4c0e59148062281c49d80b25b67771a127c85fc9676d3be5f243206826bf"}, + {file = "tqdm-4.67.3.tar.gz", hash = "sha256:7d825f03f89244ef73f1d4ce193cb1774a8179fd96f31d7e1dcde62092b960bb"}, +] + +[package.dependencies] +colorama = {version = "*", markers = "platform_system == \"Windows\""} + +[package.extras] +dev = ["nbval", "pytest (>=6)", "pytest-asyncio (>=0.24)", "pytest-cov", "pytest-timeout"] +discord = ["requests"] +notebook = ["ipywidgets (>=6)"] +slack = ["slack-sdk"] +telegram = ["requests"] + [[package]] name = "typing-extensions" version = "4.15.0" @@ -2434,7 +2423,7 @@ version = "0.4.2" description = "Runtime typing introspection tools" optional = false python-versions = ">=3.9" -groups = ["main", "dev"] +groups = ["main"] files = [ {file = "typing_inspection-0.4.2-py3-none-any.whl", hash = "sha256:4ed1cacbdc298c220f1bd249ed5287caa16f34d44ef4e9c3d0cbad5b521545e7"}, {file = "typing_inspection-0.4.2.tar.gz", hash = "sha256:ba561c48a67c5958007083d386c3295464928b01faa735ab8547c5692e87f464"}, @@ -2445,14 +2434,14 @@ typing-extensions = ">=4.12.0" [[package]] name = "tzdata" -version = "2025.3" +version = "2026.1" description = "Provider of IANA time zone 
data" optional = false python-versions = ">=2" groups = ["main"] files = [ - {file = "tzdata-2025.3-py2.py3-none-any.whl", hash = "sha256:06a47e5700f3081aab02b2e513160914ff0694bce9947d6b76ebd6bf57cfc5d1"}, - {file = "tzdata-2025.3.tar.gz", hash = "sha256:de39c2ca5dc7b0344f2eba86f49d614019d29f060fc4ebc8a417896a620b56a7"}, + {file = "tzdata-2026.1-py2.py3-none-any.whl", hash = "sha256:4b1d2be7ac37ceafd7327b961aa3a54e467efbdb563a23655fbfe0d39cfc42a9"}, + {file = "tzdata-2026.1.tar.gz", hash = "sha256:67658a1903c75917309e753fdc349ac0efd8c27db7a0cb406a25be4840f87f98"}, ] [[package]] @@ -2473,125 +2462,122 @@ h2 = ["h2 (>=4,<5)"] socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] zstd = ["backports-zstd (>=1.0.0) ; python_version < \"3.14\""] -[[package]] -name = "uvicorn" -version = "0.41.0" -description = "The lightning-fast ASGI server." -optional = false -python-versions = ">=3.10" -groups = ["dev"] -files = [ - {file = "uvicorn-0.41.0-py3-none-any.whl", hash = "sha256:29e35b1d2c36a04b9e180d4007ede3bcb32a85fbdfd6c6aeb3f26839de088187"}, - {file = "uvicorn-0.41.0.tar.gz", hash = "sha256:09d11cf7008da33113824ee5a1c6422d89fbc2ff476540d69a34c87fab8b571a"}, -] - -[package.dependencies] -click = ">=7.0" -h11 = ">=0.8" - -[package.extras] -standard = ["colorama (>=0.4) ; sys_platform == \"win32\"", "httptools (>=0.6.3)", "python-dotenv (>=0.13)", "pyyaml (>=5.1)", "uvloop (>=0.15.1) ; sys_platform != \"win32\" and sys_platform != \"cygwin\" and platform_python_implementation != \"PyPy\"", "watchfiles (>=0.20)", "websockets (>=10.4)"] - [[package]] name = "virtualenv" -version = "21.1.0" +version = "21.2.4" description = "Virtual Python Environment builder" optional = false python-versions = ">=3.8" groups = ["dev"] files = [ - {file = "virtualenv-21.1.0-py3-none-any.whl", hash = "sha256:164f5e14c5587d170cf98e60378eb91ea35bf037be313811905d3a24ea33cc07"}, - {file = "virtualenv-21.1.0.tar.gz", hash = "sha256:1990a0188c8f16b6b9cf65c9183049007375b26aad415514d377ccacf1e4fb44"}, + {file = 
"virtualenv-21.2.4-py3-none-any.whl", hash = "sha256:29d21e941795206138d0f22f4e45ff7050e5da6c6472299fb7103318763861ac"}, + {file = "virtualenv-21.2.4.tar.gz", hash = "sha256:b294ef68192638004d72524ce7ef303e9d0cf5a44c95ce2e54a7500a6381cada"}, ] [package.dependencies] distlib = ">=0.3.7,<1" filelock = {version = ">=3.24.2,<4", markers = "python_version >= \"3.10\""} platformdirs = ">=3.9.1,<5" -python-discovery = ">=1" +python-discovery = ">=1.2.2" [[package]] name = "wrapt" -version = "2.1.1" +version = "2.1.2" description = "Module for decorators, wrappers and monkey patching." optional = false python-versions = ">=3.9" groups = ["main", "dev"] files = [ - {file = "wrapt-2.1.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:7e927375e43fd5a985b27a8992327c22541b6dede1362fc79df337d26e23604f"}, - {file = "wrapt-2.1.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:e1c99544b6a7d40ca22195563b6d8bc3986ee8bb82f272f31f0670fe9440c869"}, - {file = "wrapt-2.1.1-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b2be3fa5f4efaf16ee7c77d0556abca35f5a18ad4ac06f0ef3904c3399010ce9"}, - {file = "wrapt-2.1.1-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:67c90c1ae6489a6cb1a82058902caa8006706f7b4e8ff766f943e9d2c8e608d0"}, - {file = "wrapt-2.1.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:05c0db35ccffd7480143e62df1e829d101c7b86944ae3be7e4869a7efa621f53"}, - {file = "wrapt-2.1.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:0c2ec9f616755b2e1e0bf4d0961f59bb5c2e7a77407e7e2c38ef4f7d2fdde12c"}, - {file = "wrapt-2.1.1-cp310-cp310-win32.whl", hash = "sha256:203ba6b3f89e410e27dbd30ff7dccaf54dcf30fda0b22aa1b82d560c7f9fe9a1"}, - {file = "wrapt-2.1.1-cp310-cp310-win_amd64.whl", hash = "sha256:6f9426d9cfc2f8732922fc96198052e55c09bb9db3ddaa4323a18e055807410e"}, - {file = "wrapt-2.1.1-cp310-cp310-win_arm64.whl", hash = 
"sha256:69c26f51b67076b40714cff81bdd5826c0b10c077fb6b0678393a6a2f952a5fc"}, - {file = "wrapt-2.1.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6c366434a7fb914c7a5de508ed735ef9c133367114e1a7cb91dfb5cd806a1549"}, - {file = "wrapt-2.1.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:5d6a2068bd2e1e19e5a317c8c0b288267eec4e7347c36bc68a6e378a39f19ee7"}, - {file = "wrapt-2.1.1-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:891ab4713419217b2aed7dd106c9200f64e6a82226775a0d2ebd6bef2ebd1747"}, - {file = "wrapt-2.1.1-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c8ef36a0df38d2dc9d907f6617f89e113c5892e0a35f58f45f75901af0ce7d81"}, - {file = "wrapt-2.1.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:76e9af3ebd86f19973143d4d592cbf3e970cf3f66ddee30b16278c26ae34b8ab"}, - {file = "wrapt-2.1.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:ff562067485ebdeaef2fa3fe9b1876bc4e7b73762e0a01406ad81e2076edcebf"}, - {file = "wrapt-2.1.1-cp311-cp311-win32.whl", hash = "sha256:9e60a30aa0909435ec4ea2a3c53e8e1b50ac9f640c0e9fe3f21fd248a22f06c5"}, - {file = "wrapt-2.1.1-cp311-cp311-win_amd64.whl", hash = "sha256:7d79954f51fcf84e5ec4878ab4aea32610d70145c5bbc84b3370eabfb1e096c2"}, - {file = "wrapt-2.1.1-cp311-cp311-win_arm64.whl", hash = "sha256:d3ffc6b0efe79e08fd947605fd598515aebefe45e50432dc3b5cd437df8b1ada"}, - {file = "wrapt-2.1.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ab8e3793b239db021a18782a5823fcdea63b9fe75d0e340957f5828ef55fcc02"}, - {file = "wrapt-2.1.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:7c0300007836373d1c2df105b40777986accb738053a92fe09b615a7a4547e9f"}, - {file = "wrapt-2.1.1-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2b27c070fd1132ab23957bcd4ee3ba707a91e653a9268dc1afbd39b77b2799f7"}, - {file = 
"wrapt-2.1.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b0e36d845e8b6f50949b6b65fc6cd279f47a1944582ed4ec8258cd136d89a64"}, - {file = "wrapt-2.1.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4aeea04a9889370fcfb1ef828c4cc583f36a875061505cd6cd9ba24d8b43cc36"}, - {file = "wrapt-2.1.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:d88b46bb0dce9f74b6817bc1758ff2125e1ca9e1377d62ea35b6896142ab6825"}, - {file = "wrapt-2.1.1-cp312-cp312-win32.whl", hash = "sha256:63decff76ca685b5c557082dfbea865f3f5f6d45766a89bff8dc61d336348833"}, - {file = "wrapt-2.1.1-cp312-cp312-win_amd64.whl", hash = "sha256:b828235d26c1e35aca4107039802ae4b1411be0fe0367dd5b7e4d90e562fcbcd"}, - {file = "wrapt-2.1.1-cp312-cp312-win_arm64.whl", hash = "sha256:75128507413a9f1bcbe2db88fd18fbdbf80f264b82fa33a6996cdeaf01c52352"}, - {file = "wrapt-2.1.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ce9646e17fa7c3e2e7a87e696c7de66512c2b4f789a8db95c613588985a2e139"}, - {file = "wrapt-2.1.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:428cfc801925454395aa468ba7ddb3ed63dc0d881df7b81626cdd433b4e2b11b"}, - {file = "wrapt-2.1.1-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5797f65e4d58065a49088c3b32af5410751cd485e83ba89e5a45e2aa8905af98"}, - {file = "wrapt-2.1.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5a2db44a71202c5ae4bb5f27c6d3afbc5b23053f2e7e78aa29704541b5dad789"}, - {file = "wrapt-2.1.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:8d5350c3590af09c1703dd60ec78a7370c0186e11eaafb9dda025a30eee6492d"}, - {file = "wrapt-2.1.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2d9b076411bed964e752c01b49fd224cc385f3a96f520c797d38412d70d08359"}, - {file = "wrapt-2.1.1-cp313-cp313-win32.whl", hash = "sha256:0bb7207130ce6486727baa85373503bf3334cc28016f6928a0fa7e19d7ecdc06"}, - {file = 
"wrapt-2.1.1-cp313-cp313-win_amd64.whl", hash = "sha256:cbfee35c711046b15147b0ae7db9b976f01c9520e6636d992cd9e69e5e2b03b1"}, - {file = "wrapt-2.1.1-cp313-cp313-win_arm64.whl", hash = "sha256:7d2756061022aebbf57ba14af9c16e8044e055c22d38de7bf40d92b565ecd2b0"}, - {file = "wrapt-2.1.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4814a3e58bc6971e46baa910ecee69699110a2bf06c201e24277c65115a20c20"}, - {file = "wrapt-2.1.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:106c5123232ab9b9f4903692e1fa0bdc231510098f04c13c3081f8ad71c3d612"}, - {file = "wrapt-2.1.1-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:1a40b83ff2535e6e56f190aff123821eea89a24c589f7af33413b9c19eb2c738"}, - {file = "wrapt-2.1.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:789cea26e740d71cf1882e3a42bb29052bc4ada15770c90072cb47bf73fb3dbf"}, - {file = "wrapt-2.1.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:ba49c14222d5e5c0ee394495a8655e991dc06cbca5398153aefa5ac08cd6ccd7"}, - {file = "wrapt-2.1.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ac8cda531fe55be838a17c62c806824472bb962b3afa47ecbd59b27b78496f4e"}, - {file = "wrapt-2.1.1-cp313-cp313t-win32.whl", hash = "sha256:b8af75fe20d381dd5bcc9db2e86a86d7fcfbf615383a7147b85da97c1182225b"}, - {file = "wrapt-2.1.1-cp313-cp313t-win_amd64.whl", hash = "sha256:45c5631c9b6c792b78be2d7352129f776dd72c605be2c3a4e9be346be8376d83"}, - {file = "wrapt-2.1.1-cp313-cp313t-win_arm64.whl", hash = "sha256:da815b9263947ac98d088b6414ac83507809a1d385e4632d9489867228d6d81c"}, - {file = "wrapt-2.1.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:9aa1765054245bb01a37f615503290d4e207e3fd59226e78341afb587e9c1236"}, - {file = "wrapt-2.1.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:feff14b63a6d86c1eee33a57f77573649f2550935981625be7ff3cb7342efe05"}, - {file = 
"wrapt-2.1.1-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:81fc5f22d5fcfdbabde96bb3f5379b9f4476d05c6d524d7259dc5dfb501d3281"}, - {file = "wrapt-2.1.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:951b228ecf66def855d22e006ab9a1fc12535111ae7db2ec576c728f8ddb39e8"}, - {file = "wrapt-2.1.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:0ddf582a95641b9a8c8bd643e83f34ecbbfe1b68bc3850093605e469ab680ae3"}, - {file = "wrapt-2.1.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:fc5c500966bf48913f795f1984704e6d452ba2414207b15e1f8c339a059d5b16"}, - {file = "wrapt-2.1.1-cp314-cp314-win32.whl", hash = "sha256:4aa4baadb1f94b71151b8e44a0c044f6af37396c3b8bcd474b78b49e2130a23b"}, - {file = "wrapt-2.1.1-cp314-cp314-win_amd64.whl", hash = "sha256:860e9d3fd81816a9f4e40812f28be4439ab01f260603c749d14be3c0a1170d19"}, - {file = "wrapt-2.1.1-cp314-cp314-win_arm64.whl", hash = "sha256:3c59e103017a2c1ea0ddf589cbefd63f91081d7ce9d491d69ff2512bb1157e23"}, - {file = "wrapt-2.1.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:9fa7c7e1bee9278fc4f5dd8275bc8d25493281a8ec6c61959e37cc46acf02007"}, - {file = "wrapt-2.1.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:39c35e12e8215628984248bd9c8897ce0a474be2a773db207eb93414219d8469"}, - {file = "wrapt-2.1.1-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:94ded4540cac9125eaa8ddf5f651a7ec0da6f5b9f248fe0347b597098f8ec14c"}, - {file = "wrapt-2.1.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:da0af328373f97ed9bdfea24549ac1b944096a5a71b30e41c9b8b53ab3eec04a"}, - {file = "wrapt-2.1.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4ad839b55f0bf235f8e337ce060572d7a06592592f600f3a3029168e838469d3"}, - {file = "wrapt-2.1.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = 
"sha256:0d89c49356e5e2a50fa86b40e0510082abcd0530f926cbd71cf25bee6b9d82d7"}, - {file = "wrapt-2.1.1-cp314-cp314t-win32.whl", hash = "sha256:f4c7dd22cf7f36aafe772f3d88656559205c3af1b7900adfccb70edeb0d2abc4"}, - {file = "wrapt-2.1.1-cp314-cp314t-win_amd64.whl", hash = "sha256:f76bc12c583ab01e73ba0ea585465a41e48d968f6d1311b4daec4f8654e356e3"}, - {file = "wrapt-2.1.1-cp314-cp314t-win_arm64.whl", hash = "sha256:7ea74fc0bec172f1ae5f3505b6655c541786a5cabe4bbc0d9723a56ac32eb9b9"}, - {file = "wrapt-2.1.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:9e03b3d486eb39f5d3f562839f59094dcee30c4039359ea15768dc2214d9e07c"}, - {file = "wrapt-2.1.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:0fdf3073f488ce4d929929b7799e3b8c52b220c9eb3f4a5a51e2dc0e8ff07881"}, - {file = "wrapt-2.1.1-cp39-cp39-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0cb4f59238c6625fae2eeb72278da31c9cfba0ff4d9cbe37446b73caa0e9bcf7"}, - {file = "wrapt-2.1.1-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7f794a1c148871b714cb566f5466ec8288e0148a1c417550983864b3981737cd"}, - {file = "wrapt-2.1.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:95ef3866631c6da9ce1fc0f1e17b90c4c0aa6d041fc70a11bc90733aee122e1a"}, - {file = "wrapt-2.1.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:66bc1b2446f01cbbd3c56b79a3a8435bcd4178ac4e06b091913f7751a7f528b8"}, - {file = "wrapt-2.1.1-cp39-cp39-win32.whl", hash = "sha256:1b9e08e57cabc32972f7c956d10e85093c5da9019faa24faf411e7dd258e528c"}, - {file = "wrapt-2.1.1-cp39-cp39-win_amd64.whl", hash = "sha256:e75ad48c3cca739f580b5e14c052993eb644c7fa5b4c90aa51193280b30875ae"}, - {file = "wrapt-2.1.1-cp39-cp39-win_arm64.whl", hash = "sha256:9ccd657873b7f964711447d004563a2bc08d1476d7a1afcad310f3713e6f50f4"}, - {file = "wrapt-2.1.1-py3-none-any.whl", hash = "sha256:3b0f4629eb954394a3d7c7a1c8cca25f0b07cefe6aa8545e862e9778152de5b7"}, - {file = "wrapt-2.1.1.tar.gz", hash = 
"sha256:5fdcb09bf6db023d88f312bd0767594b414655d58090fc1c46b3414415f67fac"}, + {file = "wrapt-2.1.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:4b7a86d99a14f76facb269dc148590c01aaf47584071809a70da30555228158c"}, + {file = "wrapt-2.1.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a819e39017f95bf7aede768f75915635aa8f671f2993c036991b8d3bfe8dbb6f"}, + {file = "wrapt-2.1.2-cp310-cp310-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:5681123e60aed0e64c7d44f72bbf8b4ce45f79d81467e2c4c728629f5baf06eb"}, + {file = "wrapt-2.1.2-cp310-cp310-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2b8b28e97a44d21836259739ae76284e180b18abbb4dcfdff07a415cf1016c3e"}, + {file = "wrapt-2.1.2-cp310-cp310-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:cef91c95a50596fcdc31397eb6955476f82ae8a3f5a8eabdc13611b60ee380ba"}, + {file = "wrapt-2.1.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:dad63212b168de8569b1c512f4eac4b57f2c6934b30df32d6ee9534a79f1493f"}, + {file = "wrapt-2.1.2-cp310-cp310-musllinux_1_2_riscv64.whl", hash = "sha256:d307aa6888d5efab2c1cde09843d48c843990be13069003184b67d426d145394"}, + {file = "wrapt-2.1.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:c87cf3f0c85e27b3ac7d9ad95da166bf8739ca215a8b171e8404a2d739897a45"}, + {file = "wrapt-2.1.2-cp310-cp310-win32.whl", hash = "sha256:d1c5fea4f9fe3762e2b905fdd67df51e4be7a73b7674957af2d2ade71a5c075d"}, + {file = "wrapt-2.1.2-cp310-cp310-win_amd64.whl", hash = "sha256:d8f7740e1af13dff2684e4d56fe604a7e04d6c94e737a60568d8d4238b9a0c71"}, + {file = "wrapt-2.1.2-cp310-cp310-win_arm64.whl", hash = "sha256:1c6cc827c00dc839350155f316f1f8b4b0c370f52b6a19e782e2bda89600c7dc"}, + {file = "wrapt-2.1.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:96159a0ee2b0277d44201c3b5be479a9979cf154e8c82fa5df49586a8e7679bb"}, + {file = "wrapt-2.1.2-cp311-cp311-macosx_11_0_arm64.whl", hash = 
"sha256:98ba61833a77b747901e9012072f038795de7fc77849f1faa965464f3f87ff2d"}, + {file = "wrapt-2.1.2-cp311-cp311-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:767c0dbbe76cae2a60dd2b235ac0c87c9cccf4898aef8062e57bead46b5f6894"}, + {file = "wrapt-2.1.2-cp311-cp311-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9c691a6bc752c0cc4711cc0c00896fcd0f116abc253609ef64ef930032821842"}, + {file = "wrapt-2.1.2-cp311-cp311-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f3b7d73012ea75aee5844de58c88f44cf62d0d62711e39da5a82824a7c4626a8"}, + {file = "wrapt-2.1.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:577dff354e7acd9d411eaf4bfe76b724c89c89c8fc9b7e127ee28c5f7bcb25b6"}, + {file = "wrapt-2.1.2-cp311-cp311-musllinux_1_2_riscv64.whl", hash = "sha256:3d7b6fd105f8b24e5bd23ccf41cb1d1099796524bcc6f7fbb8fe576c44befbc9"}, + {file = "wrapt-2.1.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:866abdbf4612e0b34764922ef8b1c5668867610a718d3053d59e24a5e5fcfc15"}, + {file = "wrapt-2.1.2-cp311-cp311-win32.whl", hash = "sha256:5a0a0a3a882393095573344075189eb2d566e0fd205a2b6414e9997b1b800a8b"}, + {file = "wrapt-2.1.2-cp311-cp311-win_amd64.whl", hash = "sha256:64a07a71d2730ba56f11d1a4b91f7817dc79bc134c11516b75d1921a7c6fcda1"}, + {file = "wrapt-2.1.2-cp311-cp311-win_arm64.whl", hash = "sha256:b89f095fe98bc12107f82a9f7d570dc83a0870291aeb6b1d7a7d35575f55d98a"}, + {file = "wrapt-2.1.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:ff2aad9c4cda28a8f0653fc2d487596458c2a3f475e56ba02909e950a9efa6a9"}, + {file = "wrapt-2.1.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:6433ea84e1cfacf32021d2a4ee909554ade7fd392caa6f7c13f1f4bf7b8e8748"}, + {file = "wrapt-2.1.2-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:c20b757c268d30d6215916a5fa8461048d023865d888e437fab451139cad6c8e"}, + {file = 
"wrapt-2.1.2-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:79847b83eb38e70d93dc392c7c5b587efe65b3e7afcc167aa8abd5d60e8761c8"}, + {file = "wrapt-2.1.2-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f8fba1bae256186a83d1875b2b1f4e2d1242e8fac0f58ec0d7e41b26967b965c"}, + {file = "wrapt-2.1.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e3d3b35eedcf5f7d022291ecd7533321c4775f7b9cd0050a31a68499ba45757c"}, + {file = "wrapt-2.1.2-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:6f2c5390460de57fa9582bc8a1b7a6c86e1a41dfad74c5225fc07044c15cc8d1"}, + {file = "wrapt-2.1.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:7dfa9f2cf65d027b951d05c662cc99ee3bd01f6e4691ed39848a7a5fffc902b2"}, + {file = "wrapt-2.1.2-cp312-cp312-win32.whl", hash = "sha256:eba8155747eb2cae4a0b913d9ebd12a1db4d860fc4c829d7578c7b989bd3f2f0"}, + {file = "wrapt-2.1.2-cp312-cp312-win_amd64.whl", hash = "sha256:1c51c738d7d9faa0b3601708e7e2eda9bf779e1b601dce6c77411f2a1b324a63"}, + {file = "wrapt-2.1.2-cp312-cp312-win_arm64.whl", hash = "sha256:c8e46ae8e4032792eb2f677dbd0d557170a8e5524d22acc55199f43efedd39bf"}, + {file = "wrapt-2.1.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:787fd6f4d67befa6fe2abdffcbd3de2d82dfc6fb8a6d850407c53332709d030b"}, + {file = "wrapt-2.1.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:4bdf26e03e6d0da3f0e9422fd36bcebf7bc0eeb55fdf9c727a09abc6b9fe472e"}, + {file = "wrapt-2.1.2-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:bbac24d879aa22998e87f6b3f481a5216311e7d53c7db87f189a7a0266dafffb"}, + {file = "wrapt-2.1.2-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:16997dfb9d67addc2e3f41b62a104341e80cac52f91110dece393923c0ebd5ca"}, + {file = "wrapt-2.1.2-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:162e4e2ba7542da9027821cb6e7c5e068d64f9a10b5f15512ea28e954893a267"}, + {file = "wrapt-2.1.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f29c827a8d9936ac320746747a016c4bc66ef639f5cd0d32df24f5eacbf9c69f"}, + {file = "wrapt-2.1.2-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:a9dd9813825f7ecb018c17fd147a01845eb330254dff86d3b5816f20f4d6aaf8"}, + {file = "wrapt-2.1.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6f8dbdd3719e534860d6a78526aafc220e0241f981367018c2875178cf83a413"}, + {file = "wrapt-2.1.2-cp313-cp313-win32.whl", hash = "sha256:5c35b5d82b16a3bc6e0a04349b606a0582bc29f573786aebe98e0c159bc48db6"}, + {file = "wrapt-2.1.2-cp313-cp313-win_amd64.whl", hash = "sha256:f8bc1c264d8d1cf5b3560a87bbdd31131573eb25f9f9447bb6252b8d4c44a3a1"}, + {file = "wrapt-2.1.2-cp313-cp313-win_arm64.whl", hash = "sha256:3beb22f674550d5634642c645aba4c72a2c66fb185ae1aebe1e955fae5a13baf"}, + {file = "wrapt-2.1.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:0fc04bc8664a8bc4c8e00b37b5355cffca2535209fba1abb09ae2b7c76ddf82b"}, + {file = "wrapt-2.1.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a9b9d50c9af998875a1482a038eb05755dfd6fe303a313f6a940bb53a83c3f18"}, + {file = "wrapt-2.1.2-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2d3ff4f0024dd224290c0eabf0240f1bfc1f26363431505fb1b0283d3b08f11d"}, + {file = "wrapt-2.1.2-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3278c471f4468ad544a691b31bb856374fbdefb7fee1a152153e64019379f015"}, + {file = "wrapt-2.1.2-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a8914c754d3134a3032601c6984db1c576e6abaf3fc68094bb8ab1379d75ff92"}, + {file = "wrapt-2.1.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:ff95d4264e55839be37bafe1536db2ab2de19da6b65f9244f01f332b5286cfbf"}, + {file = "wrapt-2.1.2-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = 
"sha256:76405518ca4e1b76fbb1b9f686cff93aebae03920cc55ceeec48ff9f719c5f67"}, + {file = "wrapt-2.1.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:c0be8b5a74c5824e9359b53e7e58bef71a729bacc82e16587db1c4ebc91f7c5a"}, + {file = "wrapt-2.1.2-cp313-cp313t-win32.whl", hash = "sha256:f01277d9a5fc1862f26f7626da9cf443bebc0abd2f303f41c5e995b15887dabd"}, + {file = "wrapt-2.1.2-cp313-cp313t-win_amd64.whl", hash = "sha256:84ce8f1c2104d2f6daa912b1b5b039f331febfeee74f8042ad4e04992bd95c8f"}, + {file = "wrapt-2.1.2-cp313-cp313t-win_arm64.whl", hash = "sha256:a93cd767e37faeddbe07d8fc4212d5cba660af59bdb0f6372c93faaa13e6e679"}, + {file = "wrapt-2.1.2-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:1370e516598854e5b4366e09ce81e08bfe94d42b0fd569b88ec46cc56d9164a9"}, + {file = "wrapt-2.1.2-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:6de1a3851c27e0bd6a04ca993ea6f80fc53e6c742ee1601f486c08e9f9b900a9"}, + {file = "wrapt-2.1.2-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:de9f1a2bbc5ac7f6012ec24525bdd444765a2ff64b5985ac6e0692144838542e"}, + {file = "wrapt-2.1.2-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:970d57ed83fa040d8b20c52fe74a6ae7e3775ae8cff5efd6a81e06b19078484c"}, + {file = "wrapt-2.1.2-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3969c56e4563c375861c8df14fa55146e81ac11c8db49ea6fb7f2ba58bc1ff9a"}, + {file = "wrapt-2.1.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:57d7c0c980abdc5f1d98b11a2aa3bb159790add80258c717fa49a99921456d90"}, + {file = "wrapt-2.1.2-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:776867878e83130c7a04237010463372e877c1c994d449ca6aaafeab6aab2586"}, + {file = "wrapt-2.1.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:fab036efe5464ec3291411fabb80a7a39e2dd80bae9bcbeeca5087fdfa891e19"}, + {file = "wrapt-2.1.2-cp314-cp314-win32.whl", hash = 
"sha256:e6ed62c82ddf58d001096ae84ce7f833db97ae2263bff31c9b336ba8cfe3f508"}, + {file = "wrapt-2.1.2-cp314-cp314-win_amd64.whl", hash = "sha256:467e7c76315390331c67073073d00662015bb730c566820c9ca9b54e4d67fd04"}, + {file = "wrapt-2.1.2-cp314-cp314-win_arm64.whl", hash = "sha256:da1f00a557c66225d53b095a97eace0fc5349e3bfda28fa34ffae238978ee575"}, + {file = "wrapt-2.1.2-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:62503ffbc2d3a69891cf29beeaccdb4d5e0a126e2b6a851688d4777e01428dbb"}, + {file = "wrapt-2.1.2-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c7e6cd120ef837d5b6f860a6ea3745f8763805c418bb2f12eeb1fa6e25f22d22"}, + {file = "wrapt-2.1.2-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:3769a77df8e756d65fbc050333f423c01ae012b4f6731aaf70cf2bef61b34596"}, + {file = "wrapt-2.1.2-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a76d61a2e851996150ba0f80582dd92a870643fa481f3b3846f229de88caf044"}, + {file = "wrapt-2.1.2-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:6f97edc9842cf215312b75fe737ee7c8adda75a89979f8e11558dfff6343cc4b"}, + {file = "wrapt-2.1.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:4006c351de6d5007aa33a551f600404ba44228a89e833d2fadc5caa5de8edfbf"}, + {file = "wrapt-2.1.2-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:a9372fc3639a878c8e7d87e1556fa209091b0a66e912c611e3f833e2c4202be2"}, + {file = "wrapt-2.1.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3144b027ff30cbd2fca07c0a87e67011adb717eb5f5bd8496325c17e454257a3"}, + {file = "wrapt-2.1.2-cp314-cp314t-win32.whl", hash = "sha256:3b8d15e52e195813efe5db8cec156eebe339aaf84222f4f4f051a6c01f237ed7"}, + {file = "wrapt-2.1.2-cp314-cp314t-win_amd64.whl", hash = "sha256:08ffa54146a7559f5b8df4b289b46d963a8e74ed16ba3687f99896101a3990c5"}, + {file = "wrapt-2.1.2-cp314-cp314t-win_arm64.whl", hash = 
"sha256:72aaa9d0d8e4ed0e2e98019cea47a21f823c9dd4b43c7b77bba6679ffcca6a00"}, + {file = "wrapt-2.1.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5e0fa9cc32300daf9eb09a1f5bdc6deb9a79defd70d5356ba453bcd50aef3742"}, + {file = "wrapt-2.1.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:710f6e5dfaf6a5d5c397d2d6758a78fecd9649deb21f1b645f5b57a328d63050"}, + {file = "wrapt-2.1.2-cp39-cp39-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:305d8a1755116bfdad5dda9e771dcb2138990a1d66e9edd81658816edf51aed1"}, + {file = "wrapt-2.1.2-cp39-cp39-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f0d8fc30a43b5fe191cf2b1a0c82bab2571dadd38e7c0062ee87d6df858dd06e"}, + {file = "wrapt-2.1.2-cp39-cp39-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:a5d516e22aedb7c9c1d47cba1c63160b1a6f61ec2f3948d127cd38d5cfbb556f"}, + {file = "wrapt-2.1.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:45914e8efbe4b9d5102fcf0e8e2e3258b83a5d5fba9f8f7b6d15681e9d29ffe0"}, + {file = "wrapt-2.1.2-cp39-cp39-musllinux_1_2_riscv64.whl", hash = "sha256:478282ebd3795a089154fb16d3db360e103aa13d3b2ad30f8f6aac0d2207de0e"}, + {file = "wrapt-2.1.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:3756219045f73fb28c5d7662778e4156fbd06cf823c4d2d4b19f97305e52819c"}, + {file = "wrapt-2.1.2-cp39-cp39-win32.whl", hash = "sha256:b8aefb4dbb18d904b96827435a763fa42fc1f08ea096a391710407a60983ced8"}, + {file = "wrapt-2.1.2-cp39-cp39-win_amd64.whl", hash = "sha256:e5aeab8fe15c3dff75cfee94260dcd9cded012d4ff06add036c28fae7718593b"}, + {file = "wrapt-2.1.2-cp39-cp39-win_arm64.whl", hash = "sha256:f069e113743a21a3defac6677f000068ebb931639f789b5b226598e247a4c89e"}, + {file = "wrapt-2.1.2-py3-none-any.whl", hash = "sha256:b8fd6fa2b2c4e7621808f8c62e8317f4aae56e59721ad933bac5239d913cf0e8"}, + {file = "wrapt-2.1.2.tar.gz", hash = "sha256:3996a67eecc2c68fd47b4e3c564405a5777367adfd9b8abb58387b63ee83b21e"}, ] [package.extras] @@ -2617,4 +2603,4 
@@ requests = ">=2.0,<3.0" [metadata] lock-version = "2.1" python-versions = ">=3.12,<3.15" -content-hash = "2c1b6a212c3df0c246a0034fc078744f439083bcde884c372775e7cae24c532c" +content-hash = "20ffe085a5cbd9c52761d87a22e43ea196582dea345b0c5e8253c7000c87b6c0" diff --git a/pyproject.toml b/src/pyproject.toml similarity index 88% rename from pyproject.toml rename to src/pyproject.toml index 7520dbf..af654bf 100644 --- a/pyproject.toml +++ b/src/pyproject.toml @@ -1,11 +1,11 @@ [project] -name = "ere" -version = "0.1.0" +name = "ere-basic" +version = "1.0.0" description = "A basic implementation of the Entity Resolution Engine (ERE)." authors = [ {name = "Meaningfy",email = "hi@meaningfy.ws"} ] -readme = "README.md" +readme = "../README.md" requires-python = ">=3.12,<3.15" @@ -13,9 +13,9 @@ requires-python = ">=3.12,<3.15" requires = ["poetry-core>=2.0.0,<3.0.0"] build-backend = "poetry.core.masonry.api" -# Needed when the root doesn't contain $project_name +[tool.poetry] packages = [ - { include = "ere", from = "src" } + { include = "ere" } ] @@ -48,18 +48,18 @@ pandas = ">=2.0,<3.0" splink = ">=4.0,<5.0" # TODO: should we have a registry? 
-ers-spec = { git = "https://github.com/OP-TED/entity-resolution-spec.git", branch = "0.3.0-rc.1" } +ers-spec = { git = "https://github.com/OP-TED/entity-resolution-spec.git", branch = "release/1.0.0", subdirectory = "src" } [tool.pytest.ini_options] addopts = [ "-v", "--basetemp=/tmp/pytest", - "--cov=src", + "--cov=ere", "--cov-report=term-missing", "--cov-fail-under=80", ] -testpaths = ["test"] +testpaths = ["../test"] # Skips warning from 3rd party libs, such as rdflib filterwarnings = [ "once", @@ -86,7 +86,7 @@ warn_return_any = true [tool.coverage.run] -source = ["src"] +source = ["ere"] omit = ["*/__init__.py"] [tool.coverage.report] diff --git a/test/conftest.py b/test/conftest.py index 4cdc4d2..ee4c9ba 100644 --- a/test/conftest.py +++ b/test/conftest.py @@ -30,8 +30,8 @@ def pytest_configure(config: pytest.Config): # Setup logging from YAML config file cfg_path = str(TEST_RESOURCES_DIR / "logging-test.yml") with open(cfg_path, encoding="utf-8") as f: - config = yaml.safe_load(f) - logging.config.dictConfig(config) + logging_cfg = yaml.safe_load(f) + logging.config.dictConfig(logging_cfg) # ============================================================================ @@ -215,6 +215,7 @@ def rdf_mapper(rdf_mapping_path): # pylint: disable=redefined-outer-name # pyt # Redis fixture # ============================================================================ + @pytest.fixture(scope="module") def redis_client(): """ diff --git a/test/e2e/test_ere.py b/test/e2e/test_ere.py index e5bb5ee..c8d05e5 100644 --- a/test/e2e/test_ere.py +++ b/test/e2e/test_ere.py @@ -141,7 +141,9 @@ def test_single_request_resolution_flow(redis_client, redis_queues, queue_worker redis_client.rpush(request_queue, request_bytes) # 2. Process message using worker - assert queue_worker.process_single_message() is True, "Worker should process message" + assert queue_worker.process_single_message() is True, ( + "Worker should process message" + ) # 3. 
Verify response in queue result = redis_client.brpop(response_queue, timeout=1) diff --git a/test/features/steps/test_direct_service_resolution_steps.py b/test/features/steps/test_direct_service_resolution_steps.py index 678efea..b39d6b8 100644 --- a/test/features/steps/test_direct_service_resolution_steps.py +++ b/test/features/steps/test_direct_service_resolution_steps.py @@ -2,6 +2,7 @@ Tests resolve_entity_mention(EntityMention) -> ClusterReference directly. """ + import pytest from assertpy import assert_that from erspec.models.core import ClusterReference, EntityMention, EntityMentionIdentifier @@ -41,6 +42,7 @@ def outcome(): # store either "result" or "exception" return {"result": None, "exception": None} + # --------------------------------------------------------------------------- # Background # --------------------------------------------------------------------------- @@ -58,11 +60,23 @@ def fresh_service(entity_resolution_service): @given( - parsers.parse('entity mention "{mention_id}" of type "{entity_type}" was already resolved with content from "{rdf_file_first}"'), + parsers.parse( + 'entity mention "{mention_id}" of type "{entity_type}" was already resolved with content from "{rdf_file_first}"' + ), target_fixture="seed_result", ) -def pre_resolve(mention_id: str, entity_type: str, rdf_file_first: str, entity_resolution_service, rdf_mapper) -> ClusterReference: - return resolve_entity_mention(_make_mention(mention_id, entity_type, load_rdf(rdf_file_first)), entity_resolution_service, rdf_mapper) +def pre_resolve( + mention_id: str, + entity_type: str, + rdf_file_first: str, + entity_resolution_service, + rdf_mapper, +) -> ClusterReference: + return resolve_entity_mention( + _make_mention(mention_id, entity_type, load_rdf(rdf_file_first)), + entity_resolution_service, + rdf_mapper, + ) # --------------------------------------------------------------------------- @@ -71,19 +85,43 @@ def pre_resolve(mention_id: str, entity_type: str, rdf_file_first: 
str, entity_r @when( - parsers.parse('I resolve the first entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"'), + parsers.parse( + 'I resolve the first entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"' + ), target_fixture="first_result", ) -def resolve_first(mention_id: str, entity_type: str, rdf_file: str, entity_resolution_service, rdf_mapper) -> ClusterReference: - return resolve_entity_mention(_make_mention(mention_id, entity_type, load_rdf(rdf_file)), entity_resolution_service, rdf_mapper) +def resolve_first( + mention_id: str, + entity_type: str, + rdf_file: str, + entity_resolution_service, + rdf_mapper, +) -> ClusterReference: + return resolve_entity_mention( + _make_mention(mention_id, entity_type, load_rdf(rdf_file)), + entity_resolution_service, + rdf_mapper, + ) @when( - parsers.parse('I resolve the second entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"'), + parsers.parse( + 'I resolve the second entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"' + ), target_fixture="second_result", ) -def resolve_second(mention_id: str, entity_type: str, rdf_file: str, entity_resolution_service, rdf_mapper) -> ClusterReference: - return resolve_entity_mention(_make_mention(mention_id, entity_type, load_rdf(rdf_file)), entity_resolution_service, rdf_mapper) +def resolve_second( + mention_id: str, + entity_type: str, + rdf_file: str, + entity_resolution_service, + rdf_mapper, +) -> ClusterReference: + return resolve_entity_mention( + _make_mention(mention_id, entity_type, load_rdf(rdf_file)), + entity_resolution_service, + rdf_mapper, + ) # --------------------------------------------------------------------------- @@ -92,20 +130,40 @@ def resolve_second(mention_id: str, entity_type: str, rdf_file: str, entity_reso @when( - parsers.parse('I resolve entity mention "{mention_id}" of type "{entity_type}" with content from 
"{rdf_file}"'), + parsers.parse( + 'I resolve entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"' + ), target_fixture="first_result", ) -def resolve_mention(mention_id: str, entity_type: str, rdf_file: str, entity_resolution_service, rdf_mapper) -> ClusterReference: +def resolve_mention( + mention_id: str, + entity_type: str, + rdf_file: str, + entity_resolution_service, + rdf_mapper, +) -> ClusterReference: mention = _make_mention(mention_id, entity_type, load_rdf(rdf_file)) return resolve_entity_mention(mention, entity_resolution_service, rdf_mapper) @when( - parsers.parse('I resolve entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}" again'), + parsers.parse( + 'I resolve entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}" again' + ), target_fixture="second_result", ) -def resolve_mention_again(mention_id: str, entity_type: str, rdf_file: str, entity_resolution_service, rdf_mapper) -> ClusterReference: - return resolve_entity_mention(_make_mention(mention_id, entity_type, load_rdf(rdf_file)), entity_resolution_service, rdf_mapper) +def resolve_mention_again( + mention_id: str, + entity_type: str, + rdf_file: str, + entity_resolution_service, + rdf_mapper, +) -> ClusterReference: + return resolve_entity_mention( + _make_mention(mention_id, entity_type, load_rdf(rdf_file)), + entity_resolution_service, + rdf_mapper, + ) # --------------------------------------------------------------------------- @@ -114,12 +172,25 @@ def resolve_mention_again(mention_id: str, entity_type: str, rdf_file: str, enti @when( - parsers.parse('I try to resolve entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"'), + parsers.parse( + 'I try to resolve entity mention "{mention_id}" of type "{entity_type}" with content from "{rdf_file}"' + ), target_fixture="raised_exception", ) -def try_resolve_conflict(mention_id: str, entity_type: str, rdf_file: str, 
outcome, entity_resolution_service, rdf_mapper) -> Exception | None: +def try_resolve_conflict( + mention_id: str, + entity_type: str, + rdf_file: str, + outcome, + entity_resolution_service, + rdf_mapper, +) -> Exception | None: try: - outcome["result"] = resolve_entity_mention(_make_mention(mention_id, entity_type, load_rdf(rdf_file)), entity_resolution_service, rdf_mapper) + outcome["result"] = resolve_entity_mention( + _make_mention(mention_id, entity_type, load_rdf(rdf_file)), + entity_resolution_service, + rdf_mapper, + ) return None except Exception as exc: outcome["exception"] = exc @@ -128,12 +199,25 @@ def try_resolve_conflict(mention_id: str, entity_type: str, rdf_file: str, outco @when( # parsers.re required: parsers.parse cannot match an empty string for {bad_content} - parsers.re(r'I try to resolve entity mention "(?P<mention_id>[^"]+)" of type "(?P<entity_type>[^"]+)" with invalid content "(?P<bad_content>.*)"'), + parsers.re( + r'I try to resolve entity mention "(?P<mention_id>[^"]+)" of type "(?P<entity_type>[^"]+)" with invalid content "(?P<bad_content>.*)"' + ), target_fixture="raised_exception", ) -def try_resolve_malformed(mention_id: str, entity_type: str, bad_content: str, outcome, entity_resolution_service, rdf_mapper) -> Exception | None: +def try_resolve_malformed( + mention_id: str, + entity_type: str, + bad_content: str, + outcome, + entity_resolution_service, + rdf_mapper, +) -> Exception | None: try: - outcome["result"] = resolve_entity_mention(_make_mention(mention_id, entity_type, bad_content), entity_resolution_service, rdf_mapper) + outcome["result"] = resolve_entity_mention( + _make_mention(mention_id, entity_type, bad_content), + entity_resolution_service, + rdf_mapper, + ) return None except Exception as exc: outcome["exception"] = exc @@ -146,7 +230,9 @@ def try_resolve_malformed(mention_id: str, entity_type: str, bad_content: str, o @then("both results are ClusterReference instances") -def check_cluster_reference_type(first_result: ClusterReference, second_result: ClusterReference): +def 
check_cluster_reference_type( + first_result: ClusterReference, second_result: ClusterReference +): assert_that(first_result).is_instance_of(ClusterReference) assert_that(second_result).is_instance_of(ClusterReference) @@ -157,12 +243,16 @@ def check_same_cluster(first_result: ClusterReference, second_result: ClusterRef @then("the cluster_ids are different") -def check_different_clusters(first_result: ClusterReference, second_result: ClusterReference): +def check_different_clusters( + first_result: ClusterReference, second_result: ClusterReference +): assert_that(first_result.cluster_id).is_not_equal_to(second_result.cluster_id) @then("both ClusterReference results are identical") -def check_identical_results(first_result: ClusterReference, second_result: ClusterReference): +def check_identical_results( + first_result: ClusterReference, second_result: ClusterReference +): assert_that(first_result).is_equal_to(second_result) assert_that(first_result).is_equal_to(second_result) @@ -183,7 +273,9 @@ def check_exception_raised(outcome): ) elif isinstance(raised_exception, ConflictError): # Conflict errors should contain mention_id and indicate content mismatch - assert_that(str(raised_exception)).contains("was already resolved with different content") + assert_that(str(raised_exception)).contains( + "was already resolved with different content" + ) @then("the result is a ClusterReference") @@ -193,7 +285,9 @@ def check_single_result_type(first_result: ClusterReference): @then("the cluster_id matches the seed cluster") -def check_matches_seed_cluster(first_result: ClusterReference, seed_result: ClusterReference): +def check_matches_seed_cluster( + first_result: ClusterReference, seed_result: ClusterReference +): """Verify new mention joined the pre-established cluster (not a new one).""" assert_that(first_result.cluster_id).is_equal_to(seed_result.cluster_id) @@ -207,4 +301,6 @@ def check_unsupported_entity_type_exception(outcome): f"Result was: {outcome['result']!r}" ) 
assert_that(raised_exception).is_instance_of(ValueError) - assert_that(str(raised_exception)).matches(r"No rdf_mapping configured for entity_type") + assert_that(str(raised_exception)).matches( + r"No rdf_mapping configured for entity_type" + ) diff --git a/test/features/steps/test_entity_resolution_algorithm_steps.py b/test/features/steps/test_entity_resolution_algorithm_steps.py index 89d0984..c41e42c 100644 --- a/test/features/steps/test_entity_resolution_algorithm_steps.py +++ b/test/features/steps/test_entity_resolution_algorithm_steps.py @@ -81,7 +81,7 @@ def resolve_mention(mention_id: str, algorithm_context): # Create mention mention = Mention( id=MentionId(value=mention_id), - attributes={"legal_name": f"Company {mention_id}", "country_code": "US"} + attributes={"legal_name": f"Company {mention_id}", "country_code": "US"}, ) # Update linker with new similarities @@ -102,7 +102,9 @@ def resolve_mention(mention_id: str, algorithm_context): algorithm_context["last_result"] = result -@when(parsers.parse('I set similarity between "{left_id}" and "{right_id}" to {score:f}')) +@when( + parsers.parse('I set similarity between "{left_id}" and "{right_id}" to {score:f}') +) def set_similarity(left_id: str, right_id: str, score: float, algorithm_context): """Set similarity between two mentions.""" pair_set = frozenset([left_id, right_id]) @@ -114,8 +116,14 @@ def set_similarity(left_id: str, right_id: str, score: float, algorithm_context) # =============================================================================== -@then(parsers.parse('mention "{mention_id}" is in cluster "{cluster_id}" with score {score:f}')) -def check_mention_cluster(mention_id: str, cluster_id: str, score: float, algorithm_context): +@then( + parsers.parse( + 'mention "{mention_id}" is in cluster "{cluster_id}" with score {score:f}' + ) +) +def check_mention_cluster( + mention_id: str, cluster_id: str, score: float, algorithm_context +): """Verify that a mention is assigned to a cluster with 
the expected score.""" result = algorithm_context["last_result"] assert_that(result.top.cluster_id.value).is_equal_to(cluster_id) @@ -129,7 +137,9 @@ def check_candidate_count(count: int, algorithm_context): assert_that(len(result.candidates)).is_equal_to(count) -@then(parsers.parse('candidate {index:d} is cluster "{cluster_id}" with score {score:f}')) +@then( + parsers.parse('candidate {index:d} is cluster "{cluster_id}" with score {score:f}') +) def check_candidate(index: int, cluster_id: str, score: float, algorithm_context): """Verify a specific candidate cluster and its score.""" result = algorithm_context["last_result"] @@ -139,7 +149,9 @@ def check_candidate(index: int, cluster_id: str, score: float, algorithm_context assert_that(candidate.score).is_close_to(score, 0.01) -@then(parsers.parse('the cluster assignment for mention "{mention_id}" is "{cluster_id}"')) +@then( + parsers.parse('the cluster assignment for mention "{mention_id}" is "{cluster_id}"') +) def check_cluster_assignment(mention_id: str, cluster_id: str, algorithm_context): """Verify the cluster assignment from state.""" service = algorithm_context["service"] diff --git a/test/integration/test_entity_resolver.py b/test/integration/test_entity_resolver.py index 5470190..abf5e7e 100644 --- a/test/integration/test_entity_resolver.py +++ b/test/integration/test_entity_resolver.py @@ -122,7 +122,9 @@ def test_first_mention_resolves_to_singleton(service, con): # Verify persistence mention_count = con.execute("SELECT COUNT(*) FROM mentions").fetchone()[0] assert mention_count == 1 - cluster_count = con.execute("SELECT COUNT(DISTINCT cluster_id) FROM clusters").fetchone()[0] + cluster_count = con.execute( + "SELECT COUNT(DISTINCT cluster_id) FROM clusters" + ).fetchone()[0] assert cluster_count == 1 @@ -169,7 +171,9 @@ def test_below_threshold_creates_new_cluster(service, con): assert mention_count == 2 # Verify cluster assignments persist - cluster_count = con.execute("SELECT COUNT(DISTINCT 
cluster_id) FROM clusters").fetchone()[0] + cluster_count = con.execute( + "SELECT COUNT(DISTINCT cluster_id) FROM clusters" + ).fetchone()[0] assert cluster_count >= 1 @@ -243,7 +247,9 @@ def test_train_succeeds_with_sufficient_records(service, con): service.train() # Verify linker is still functional - query = Mention(mention_id="test_q", legal_name="Acme Technologies", country_code="US") + query = Mention( + mention_id="test_q", legal_name="Acme Technologies", country_code="US" + ) result = service.resolve(query) assert result.top is not None @@ -436,11 +442,15 @@ def test_multiple_resolves_accumulate_state(service, con): state = service.state() # Verify state accumulates - assert state.mention_count == i, f"After resolving {i} mentions, should have {i} in DB" + assert state.mention_count == i, ( + f"After resolving {i} mentions, should have {i} in DB" + ) # Later mentions should see earlier mentions in results if i > 1: - assert len(result.candidates) >= 1, "Should see candidates from earlier mentions" + assert len(result.candidates) >= 1, ( + "Should see candidates from earlier mentions" + ) @pytest.mark.integration @@ -452,14 +462,22 @@ def test_end_to_end_realistic_scenario(service, con): # Stream of mentions: 3 companies with variants mentions = [ # Company A - Mention(mention_id="acme_1", legal_name="Acme Corporation Ltd", country_code="US"), + Mention( + mention_id="acme_1", legal_name="Acme Corporation Ltd", country_code="US" + ), Mention(mention_id="acme_2", legal_name="Acme Corp", country_code="US"), Mention(mention_id="acme_3", legal_name="Acme", country_code="US"), # Company B - Mention(mention_id="bestco_1", legal_name="BestCo Industries Inc", country_code="US"), + Mention( + mention_id="bestco_1", legal_name="BestCo Industries Inc", country_code="US" + ), Mention(mention_id="bestco_2", legal_name="BestCo Inc", country_code="US"), # Company C - Mention(mention_id="techsoft_1", legal_name="TechSoft Solutions Limited", country_code="US"), + Mention( + 
mention_id="techsoft_1", + legal_name="TechSoft Solutions Limited", + country_code="US", + ), Mention(mention_id="techsoft_2", legal_name="TechSoft Ltd", country_code="US"), Mention(mention_id="techsoft_3", legal_name="TechSoft", country_code="US"), ] @@ -481,9 +499,14 @@ def test_end_to_end_realistic_scenario(service, con): # Verify all mentions are assigned assert set(mention_to_cluster.keys()) == { - "acme_1", "acme_2", "acme_3", - "bestco_1", "bestco_2", - "techsoft_1", "techsoft_2", "techsoft_3" + "acme_1", + "acme_2", + "acme_3", + "bestco_1", + "bestco_2", + "techsoft_1", + "techsoft_2", + "techsoft_3", }, "All mentions should be assigned to clusters" # Verify different companies are in different clusters @@ -492,5 +515,6 @@ def test_end_to_end_realistic_scenario(service, con): bestco_cluster = mention_to_cluster["bestco_1"] techsoft_cluster = mention_to_cluster["techsoft_1"] - assert len({acme_cluster, bestco_cluster, techsoft_cluster}) == 3, \ + assert len({acme_cluster, bestco_cluster, techsoft_cluster}) == 3, ( "Different companies should be in different clusters" + ) diff --git a/test/integration/test_redis_integration.py b/test/integration/test_redis_integration.py index 2b22234..2b0fac4 100644 --- a/test/integration/test_redis_integration.py +++ b/test/integration/test_redis_integration.py @@ -15,7 +15,9 @@ import pytest -def create_test_request(request_id: str = "test-001", content: str = "John Smith") -> dict: +def create_test_request( + request_id: str = "test-001", content: str = "John Smith" +) -> dict: """Create a valid EntityMentionResolutionRequest for testing.""" return { "type": "EntityMentionResolutionRequest", @@ -80,14 +82,20 @@ def test_receive_response(self, redis_client): if new_response_count == 0: pytest.skip("ERE service not running — skipping response test") - assert new_response_count == 1, f"Expected 1 new response, got {new_response_count}" + assert new_response_count == 1, ( + f"Expected 1 new response, got 
{new_response_count}" + ) # Retrieve and verify response format (latest response is at index 0) response_raw = redis_client.lindex("ere_responses", 0) assert response_raw is not None, "Response is empty" # response_raw is bytes, decode it - response_str = response_raw.decode("utf-8") if isinstance(response_raw, bytes) else response_raw + response_str = ( + response_raw.decode("utf-8") + if isinstance(response_raw, bytes) + else response_raw + ) response = json.loads(response_str) # Verify response structure @@ -115,7 +123,9 @@ def test_multiple_requests(self, redis_client): if new_response_count == 0: pytest.skip("ERE service not running — skipping response verification") - assert new_response_count == 3, f"Expected 3 new responses, got {new_response_count}" + assert new_response_count == 3, ( + f"Expected 3 new responses, got {new_response_count}" + ) def test_redis_authentication(self, redis_client): """Test: Verify Redis connection works with authentication.""" @@ -140,4 +150,4 @@ def test_malformed_request_handling(self, redis_client): if __name__ == "__main__": """Allow running tests directly: python test/integration/test_redis_integration.py""" - pytest.main([__file__, "-v"]) \ No newline at end of file + pytest.main([__file__, "-v"]) diff --git a/test/stress/README.md b/test/stress/README.md index 0d69f7b..cd33711 100644 --- a/test/stress/README.md +++ b/test/stress/README.md @@ -45,7 +45,7 @@ poetry run python3 test/stress/stress_test.py \ ### Optional **`--config PATH`** -- Path to resolver config YAML (default: `infra/config/resolver.yaml`) +- Path to resolver config YAML (default: `config/resolver.yaml`) - Determines blocking rules, thresholds, and Splink settings **`--seed N`** diff --git a/test/stress/stress_test.py b/test/stress/stress_test.py index 4588c63..96ce807 100644 --- a/test/stress/stress_test.py +++ b/test/stress/stress_test.py @@ -14,7 +14,7 @@ --dataset test/stress/data/org-mid.csv \ --seed 200 \ --records 500 \ - --config 
infra/config/resolver.yaml \ + --config src/config/resolver.yaml \ --output /tmp/stress_mid.json """ @@ -141,7 +141,10 @@ def create_resolver( def seed_and_train( - resolver: EntityResolver, mentions: list[Mention], n_seed: int, skip_train: bool = False + resolver: EntityResolver, + mentions: list[Mention], + n_seed: int, + skip_train: bool = False, ): """ Seed resolver with first n_seed mentions and optionally trigger training. @@ -409,7 +412,7 @@ def main(): ) parser.add_argument( "--config", - default="infra/config/resolver.yaml", + default="src/config/resolver.yaml", help="Path to resolver config YAML", ) parser.add_argument( diff --git a/test/unit/adapters/stubs.py b/test/unit/adapters/stubs.py index 5529b81..d2054bc 100644 --- a/test/unit/adapters/stubs.py +++ b/test/unit/adapters/stubs.py @@ -2,6 +2,9 @@ from typing import Protocol, runtime_checkable +from erspec.models.core import EntityMention + +from ere.adapters.rdf_mapper_port import RDFMapper from ere.models.resolver import ( ClusterId, ClusterMembership, @@ -15,12 +18,14 @@ def _get_repository_types(): """Lazy import to avoid circular dependency with services.__init__.""" from ere.adapters import repositories + return repositories def _get_linker_type(): """Lazy import to avoid circular dependency.""" from ere.services import linker + return linker @@ -76,6 +81,9 @@ def save(self, mention: Mention) -> None: def load_all(self) -> list[Mention]: return list(self._mentions.values()) + def find_by_id(self, mention_id: MentionId) -> Mention | None: + return self._mentions.get(mention_id) + def count(self) -> int: return len(self._mentions) @@ -95,9 +103,7 @@ def count(self) -> int: def find_for(self, mention_id: MentionId) -> list[MentionLink]: """Find all links involving the given mention (either side).""" return [ - link - for link in self._links - if mention_id in (link.left_id, link.right_id) + link for link in self._links if mention_id in (link.left_id, link.right_id) ] @@ -193,3 +199,28 @@ def 
register_mention(self, mention: Mention) -> None: def train(self) -> None: """No-op for fixed linker (scores are pre-configured).""" pass + + +class StubRDFMapper(RDFMapper): + """ + RDFMapper stub for unit testing. + + Returns a pre-configured Mention without performing any RDF parsing. + Optionally raises a configured exception to test error paths. + """ + + def __init__( + self, + mention_to_return: Mention = None, + error: Exception = None, + ): + self._mention = mention_to_return or Mention( + id=MentionId(value="stub-mention-id"), + attributes={"legal_name": "Stub Corp", "country_code": "US"}, + ) + self._error = error + + def map_entity_mention_to_domain(self, entity_mention: EntityMention) -> Mention: + if self._error is not None: + raise self._error + return self._mention diff --git a/test/unit/adapters/test_adapter_factories.py b/test/unit/adapters/test_adapter_factories.py new file mode 100644 index 0000000..e339df4 --- /dev/null +++ b/test/unit/adapters/test_adapter_factories.py @@ -0,0 +1,18 @@ +"""Unit tests for adapters.factories: RDFMapper construction.""" + +from pathlib import Path + +from ere.adapters.factories import build_rdf_mapper +from ere.adapters.rdf_mapper_port import RDFMapper + +TEST_RDF_MAPPING = Path(__file__).parent.parent.parent / "resources" / "rdf_mapping.yaml" + + +def test_build_rdf_mapper_with_explicit_path_returns_mapper(): + mapper = build_rdf_mapper(rdf_mapping_path=TEST_RDF_MAPPING) + assert isinstance(mapper, RDFMapper) + + +def test_build_rdf_mapper_without_path_uses_default(): + mapper = build_rdf_mapper() + assert isinstance(mapper, RDFMapper) diff --git a/test/unit/adapters/test_duckdb_adapters.py b/test/unit/adapters/test_duckdb_adapters.py index 8087bc9..fa6cb4b 100644 --- a/test/unit/adapters/test_duckdb_adapters.py +++ b/test/unit/adapters/test_duckdb_adapters.py @@ -80,7 +80,7 @@ def test_resolve_first_mention_persists_to_db(service, con): """ mention = Mention( id=MentionId(value="m1"), - 
attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) result = service.resolve(mention) @@ -89,7 +89,9 @@ def test_resolve_first_mention_persists_to_db(service, con): mention_count = con.execute("SELECT COUNT(*) FROM mentions").fetchone()[0] assert mention_count == 1 - cluster_count = con.execute("SELECT COUNT(DISTINCT cluster_id) FROM clusters").fetchone()[0] + cluster_count = con.execute( + "SELECT COUNT(DISTINCT cluster_id) FROM clusters" + ).fetchone()[0] assert cluster_count == 1 # Check state @@ -109,11 +111,11 @@ def test_resolve_strong_match_joins_cluster_in_db(service, con): """ m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) # Set up linker to return high score @@ -144,11 +146,11 @@ def test_resolve_weak_match_creates_separate_cluster(service, con): """ m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Similar but different", "country_code": "US"} + attributes={"legal_name": "Similar but different", "country_code": "US"}, ) # Linker returns score below clustering threshold (0.8) @@ -179,11 +181,11 @@ def test_resolve_no_match_creates_singleton_cluster(service, con): """ m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Completely Different", "country_code": "UK"} + attributes={"legal_name": "Completely Different", "country_code": "UK"}, ) # No similarity map entry = no 
match @@ -202,12 +204,10 @@ def test_resolve_no_match_creates_singleton_cluster(service, con): def test_state_returns_correct_counts(service, con): """Verify that service.state() returns accurate counts.""" m1 = Mention( - id=MentionId(value="m1"), - attributes={"legal_name": "A", "country_code": "US"} + id=MentionId(value="m1"), attributes={"legal_name": "A", "country_code": "US"} ) m2 = Mention( - id=MentionId(value="m2"), - attributes={"legal_name": "B", "country_code": "US"} + id=MentionId(value="m2"), attributes={"legal_name": "B", "country_code": "US"} ) service._linker._similarity_map = {frozenset(["m1", "m2"]): 0.9} @@ -224,12 +224,10 @@ def test_state_returns_correct_counts(service, con): def test_cluster_membership_mapping(service, con): """Verify cluster_membership dict is correctly structured.""" m1 = Mention( - id=MentionId(value="m1"), - attributes={"legal_name": "A", "country_code": "US"} + id=MentionId(value="m1"), attributes={"legal_name": "A", "country_code": "US"} ) m2 = Mention( - id=MentionId(value="m2"), - attributes={"legal_name": "B", "country_code": "US"} + id=MentionId(value="m2"), attributes={"legal_name": "B", "country_code": "US"} ) service._linker._similarity_map = {frozenset(["m1", "m2"]): 0.9} @@ -246,3 +244,29 @@ def test_cluster_membership_mapping(service, con): assert len(memberships[cluster_id]) == 2 assert MentionId(value="m1") in memberships[cluster_id] assert MentionId(value="m2") in memberships[cluster_id] + + +def test_mention_repository_load_all_returns_persisted_mentions(con, entity_fields): + """load_all should return all mentions previously saved.""" + repo = DuckDBMentionRepository(con, entity_fields) + m1 = Mention(id=MentionId(value="la1"), attributes={"legal_name": "Alpha", "country_code": "DE"}) + m2 = Mention(id=MentionId(value="la2"), attributes={"legal_name": "Beta", "country_code": "FR"}) + + repo.save(m1) + repo.save(m2) + + loaded = repo.load_all() + + assert len(loaded) == 2 + ids = {m.id.value for m in 
loaded} + assert ids == {"la1", "la2"} + + +def test_similarity_repository_save_all_empty_is_noop(con): + """save_all with an empty list should not raise and not write any rows.""" + repo = DuckDBSimilarityRepository(con) + + repo.save_all([]) # must not raise + + count = con.execute("SELECT COUNT(*) FROM similarities").fetchone()[0] + assert count == 0 diff --git a/test/unit/adapters/test_utils.py b/test/unit/adapters/test_utils.py new file mode 100644 index 0000000..80fb431 --- /dev/null +++ b/test/unit/adapters/test_utils.py @@ -0,0 +1,74 @@ +"""Unit tests for adapters.utils: message parsing utilities.""" + +import json +from datetime import datetime, timezone + +import pytest +from erspec.models.core import EntityMention, EntityMentionIdentifier +from erspec.models.ere import ( + EREErrorResponse, + EntityMentionResolutionRequest, + EntityMentionResolutionResponse, +) +from linkml_runtime.dumpers import JSONDumper + +from ere.adapters.utils import ( + get_message_object, + get_request_from_message, + get_response_from_message, +) + +_dumper = JSONDumper() + + +def _make_request(request_id: str = "utils-test-001") -> EntityMentionResolutionRequest: + return EntityMentionResolutionRequest( + entity_mention=EntityMention( + identifiedBy=EntityMentionIdentifier( + request_id=request_id, + source_id="utils-test-src", + entity_type="http://test.org/Org", + ), + content_type="text/turtle", + content="<>", + ), + ere_request_id=request_id, + timestamp=datetime.now(timezone.utc).isoformat(), + ) + + +def _serialise(obj) -> bytes: + return _dumper.dumps(obj).encode("utf-8") + + +def test_get_request_from_message_returns_request(): + raw = _serialise(_make_request("req-parse-01")) + result = get_request_from_message(raw) + assert isinstance(result, EntityMentionResolutionRequest) + assert result.ere_request_id == "req-parse-01" + + +def test_get_response_from_message_returns_error_response(): + response = EREErrorResponse( + ere_request_id="resp-parse-01", + 
error_type="TestError", + error_title="Test", + error_detail="detail", + timestamp=datetime.now(timezone.utc).isoformat(), + ) + raw = _serialise(response) + result = get_response_from_message(raw) + assert isinstance(result, EREErrorResponse) + assert result.ere_request_id == "resp-parse-01" + + +def test_get_message_object_raises_on_missing_type(): + raw = json.dumps({"ere_request_id": "no-type"}).encode("utf-8") + with pytest.raises(ValueError, match="message without 'type' field"): + get_message_object(raw, {}) + + +def test_get_message_object_raises_on_unsupported_type(): + raw = json.dumps({"type": "UnknownClass", "ere_request_id": "x"}).encode("utf-8") + with pytest.raises(ValueError, match='unsupported message class: "UnknownClass"'): + get_message_object(raw, {}) diff --git a/test/unit/entrypoints/__init__.py b/test/unit/entrypoints/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/test/unit/entrypoints/test_queue_worker.py b/test/unit/entrypoints/test_queue_worker.py new file mode 100644 index 0000000..b55f143 --- /dev/null +++ b/test/unit/entrypoints/test_queue_worker.py @@ -0,0 +1,124 @@ +"""Unit tests for RedisQueueWorker entrypoint (mocked Redis and service).""" + +import json +from datetime import datetime, timezone +from unittest.mock import MagicMock + +import pytest +from erspec.models.core import EntityMention, EntityMentionIdentifier +from erspec.models.ere import ( + EREErrorResponse, + EntityMentionResolutionRequest, + EntityMentionResolutionResponse, +) +from linkml_runtime.dumpers import JSONDumper + +from ere.entrypoints.queue_worker import RedisQueueWorker + +_dumper = JSONDumper() + + +def _make_request(request_id: str = "qw-test-001") -> EntityMentionResolutionRequest: + return EntityMentionResolutionRequest( + entity_mention=EntityMention( + identifiedBy=EntityMentionIdentifier( + request_id=request_id, + source_id="qw-src", + entity_type="http://test.org/Org", + ), + content_type="text/turtle", + content="<>", + ), + 
ere_request_id=request_id, + timestamp=datetime.now(timezone.utc).isoformat(), + ) + + +def _make_response(request_id: str = "qw-test-001") -> EntityMentionResolutionResponse: + return EntityMentionResolutionResponse( + entity_mention_id=EntityMentionIdentifier( + request_id=request_id, + source_id="qw-src", + entity_type="http://test.org/Org", + ), + candidates=[], + ere_request_id=request_id, + timestamp=datetime.now(timezone.utc).isoformat(), + ) + + +@pytest.fixture +def mock_redis(): + return MagicMock() + + +@pytest.fixture +def mock_service(): + return MagicMock() + + +@pytest.fixture +def worker(mock_redis, mock_service) -> RedisQueueWorker: + return RedisQueueWorker( + redis_client=mock_redis, + entity_resolution_service=mock_service, + request_queue="ere_requests", + response_queue="ere_responses", + queue_timeout=1, + ) + + +def test_process_single_message_returns_false_on_timeout(worker, mock_redis): + mock_redis.brpop.return_value = None + + result = worker.process_single_message() + + assert result is False + + +def test_process_single_message_returns_true_on_success(worker, mock_redis, mock_service): + request = _make_request("qw-happy") + raw_msg = _dumper.dumps(request).encode("utf-8") + mock_redis.brpop.return_value = ("ere_requests", raw_msg) + mock_service.process_request.return_value = _make_response("qw-happy") + + result = worker.process_single_message() + + assert result is True + mock_service.process_request.assert_called_once() + mock_redis.lpush.assert_called_once() + + +def test_process_single_message_sends_error_response_on_parse_failure( + worker, mock_redis, mock_service +): + mock_redis.brpop.return_value = ("ere_requests", b"not valid json at all") + + result = worker.process_single_message() + + assert result is True + mock_redis.lpush.assert_called_once() + pushed_payload = mock_redis.lpush.call_args[0][1] + pushed_json = json.loads(pushed_payload) + assert pushed_json.get("error_type") == "ProcessingError" + + +def 
test_send_response_logs_error_on_redis_failure(worker, mock_redis): + mock_redis.lpush.side_effect = ConnectionError("redis down") + response = EREErrorResponse( + ere_request_id="err-resp", + error_type="TestError", + error_title="Test", + error_detail="detail", + timestamp=datetime.now(timezone.utc).isoformat(), + ) + worker._send_response(response) # must not raise + + +def test_build_error_response_returns_ere_error_response(): + response = RedisQueueWorker._build_error_response("something broke", "req-err") + + assert isinstance(response, EREErrorResponse) + assert response.ere_request_id == "req-err" + assert response.error_type == "ProcessingError" + assert "something broke" in response.error_detail diff --git a/test/unit/services/test_entity_resolution_service.py b/test/unit/services/test_entity_resolution_service.py index dfca7d2..617cd42 100644 --- a/test/unit/services/test_entity_resolution_service.py +++ b/test/unit/services/test_entity_resolution_service.py @@ -1,6 +1,14 @@ -"""Unit tests for EntityResolver (no DuckDB, no Splink).""" +"""Unit tests for EntityResolver and EntityResolutionService (no DuckDB, no Splink).""" import pytest +from datetime import datetime, timezone + +from erspec.models.core import EntityMention, EntityMentionIdentifier +from erspec.models.ere import ( + EREErrorResponse, + EntityMentionResolutionRequest, + EntityMentionResolutionResponse, +) from ere.models.resolver import ( ClusterId, @@ -8,13 +16,18 @@ MentionId, MentionLink, ) -from ere.services.entity_resolution_service import EntityResolver +from ere.services.entity_resolution_service import ( + EntityResolutionService, + EntityResolver, + resolve_entity_mention, +) from ere.services.resolver_config import DuckDBConfig, ResolverConfig from test.unit.adapters.stubs import ( FixedSimilarityLinker, InMemoryClusterRepository, InMemoryMentionRepository, InMemorySimilarityRepository, + StubRDFMapper, ) @@ -57,7 +70,7 @@ def test_first_mention_is_singleton(service): 
"""Resolving the first mention should create a singleton cluster.""" mention = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) result = service.resolve(mention) @@ -79,7 +92,7 @@ def test_strong_match_joins_cluster(service): # Resolve m1 first m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) result1 = service.resolve(m1) assert result1.top.cluster_id.value == "m1" @@ -87,7 +100,7 @@ def test_strong_match_joins_cluster(service): # Now resolve m2 with strong match to m1 m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) # Set up the linker to return a strong match (m1, m2, 0.95) @@ -116,14 +129,14 @@ def test_below_threshold_becomes_singleton(service): # Resolve m1 first m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) service.resolve(m1) # Resolve m2 with weak match to m1 m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "ACME Inc", "country_code": "US"} + attributes={"legal_name": "ACME Inc", "country_code": "US"}, ) # Set up weak match (0.7 < threshold 0.8) @@ -136,7 +149,9 @@ def test_below_threshold_becomes_singleton(service): # m2 should be assigned to its own cluster (cluster "m2"), # but genCand still includes m1's cluster (via the below-threshold link) - assert result2.top.cluster_id.value == "m1" # Still top by score, but own cluster also present + assert ( + result2.top.cluster_id.value == "m1" + ) # Still top by score, but own cluster also present assert result2.top.score == pytest.approx(0.7, abs=0.01) # Verify the new invariant: own cluster is always included @@ -165,11 +180,11 @@ def 
test_gen_cand_includes_below_threshold_links(service): # Resolve m1 and m3 in cluster 1, m3 in cluster 3 m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m3 = Mention( id=MentionId(value="m3"), - attributes={"legal_name": "Globex", "country_code": "US"} + attributes={"legal_name": "Globex", "country_code": "US"}, ) service.resolve(m1) service.resolve(m3) # m3 forms its own cluster @@ -179,7 +194,7 @@ def test_gen_cand_includes_below_threshold_links(service): # - weak link (0.7) to m3 (cluster "m3") -> below threshold m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) service._linker = FixedSimilarityLinker( @@ -210,11 +225,11 @@ def test_gen_cand_groups_by_cluster(service): # Cluster 1: m1, m2 m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) service.resolve(m1) service._linker = FixedSimilarityLinker({frozenset(["m1", "m2"]): 0.95}) @@ -224,7 +239,7 @@ def test_gen_cand_groups_by_cluster(service): # m3 has weak links to both m1 (0.75) and m2 (0.85) in the same cluster m3 = Mention( id=MentionId(value="m3"), - attributes={"legal_name": "Acme Industries", "country_code": "US"} + attributes={"legal_name": "Acme Industries", "country_code": "US"}, ) service._linker = FixedSimilarityLinker( @@ -258,7 +273,7 @@ def test_train_can_be_called_anytime(service): attributes={ "legal_name": "Company 1", "country_code": "US", - } + }, ) service.resolve(mention) @@ -313,7 +328,7 @@ def counting_train(): attributes={ "legal_name": f"Company {i}", "country_code": "US", - } + }, ) 
service.resolve(mention) service._linker.register_mention(mention) @@ -326,11 +341,11 @@ def test_state_reflects_mentions(service): """State should reflect all resolved mentions.""" m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) service.resolve(m1) @@ -348,7 +363,7 @@ def test_state_reflects_clusters(service): """State should reflect cluster membership.""" m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) service.resolve(m1) @@ -362,11 +377,11 @@ def test_state_reflects_similarities(service): """State should reflect all stored similarities.""" m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Acme Corp", "country_code": "US"} + attributes={"legal_name": "Acme Corp", "country_code": "US"}, ) service.resolve(m1) @@ -392,7 +407,7 @@ def test_resolution_result_never_empty(service): """Every resolve() call should return non-empty ResolutionResult.""" m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) result = service.resolve(m1) @@ -437,7 +452,7 @@ def test_resolution_result_always_top_n_pruned(service): for i in range(2, 7): mention = Mention( id=MentionId(value=f"m{i}"), - attributes={"legal_name": f"Company {i}", "country_code": "US"} + attributes={"legal_name": f"Company {i}", "country_code": "US"}, ) service.resolve(mention) @@ -447,7 +462,7 @@ def test_resolution_result_always_top_n_pruned(service): 
m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Company 1", "country_code": "US"} + attributes={"legal_name": "Company 1", "country_code": "US"}, ) result = service.resolve(m1) @@ -459,15 +474,15 @@ def test_multiple_independent_clusters(service): """Mentions with no links should form independent clusters.""" m1 = Mention( id=MentionId(value="m1"), - attributes={"legal_name": "Acme", "country_code": "US"} + attributes={"legal_name": "Acme", "country_code": "US"}, ) m2 = Mention( id=MentionId(value="m2"), - attributes={"legal_name": "Globex", "country_code": "US"} + attributes={"legal_name": "Globex", "country_code": "US"}, ) m3 = Mention( id=MentionId(value="m3"), - attributes={"legal_name": "Initech", "country_code": "US"} + attributes={"legal_name": "Initech", "country_code": "US"}, ) # No links between any of them @@ -482,3 +497,105 @@ def test_multiple_independent_clusters(service): state = service.state() assert state.cluster_count == 3 assert state.mention_count == 3 + + +# =============================================================================== +# resolve_entity_mention guard tests +# =============================================================================== + + +def test_resolve_entity_mention_raises_when_resolver_is_none(): + mention = EntityMention( + identifiedBy=EntityMentionIdentifier( + request_id="m1", + source_id="src", + entity_type="http://test.org/Org", + ), + content_type="text/turtle", + content="<>", + ) + with pytest.raises(ValueError, match="resolver must be provided"): + resolve_entity_mention(mention, resolver=None, mapper=StubRDFMapper()) + + +def test_resolve_entity_mention_raises_when_mapper_is_none(service): + mention = EntityMention( + identifiedBy=EntityMentionIdentifier( + request_id="m1", + source_id="src", + entity_type="http://test.org/Org", + ), + content_type="text/turtle", + content="<>", + ) + with pytest.raises(ValueError, match="mapper must be provided"): + resolve_entity_mention(mention, 
resolver=service, mapper=None) + + +# =============================================================================== +# EntityResolutionService tests +# =============================================================================== + + +@pytest.fixture +def stub_mapper() -> StubRDFMapper: + return StubRDFMapper() + + +@pytest.fixture +def resolution_service(service: EntityResolver, stub_mapper: StubRDFMapper) -> EntityResolutionService: + return EntityResolutionService(resolver=service, mapper=stub_mapper) + + +def _make_request(request_id: str = "req-001") -> EntityMentionResolutionRequest: + return EntityMentionResolutionRequest( + entity_mention=EntityMention( + identifiedBy=EntityMentionIdentifier( + request_id=request_id, + source_id="test-src", + entity_type="http://test.org/Org", + ), + content_type="text/turtle", + content="<>", + ), + ere_request_id=request_id, + timestamp=datetime.now(timezone.utc).isoformat(), + ) + + +def test_process_request_unsupported_type_returns_error_response(resolution_service): + class UnknownRequest: + ere_request_id = "unknown-001" + + response = resolution_service.process_request(UnknownRequest()) + + assert isinstance(response, EREErrorResponse) + assert response.error_type == "UnsupportedRequestType" + + +def test_process_request_happy_path_returns_resolution_response(resolution_service): + request = _make_request("req-happy") + + response = resolution_service.process_request(request) + + assert isinstance(response, EntityMentionResolutionResponse) + assert response.ere_request_id == "req-happy" + assert len(response.candidates) >= 1 + + +def test_process_request_mapper_error_returns_error_response(service: EntityResolver): + failing_mapper = StubRDFMapper(error=ValueError("RDF parse failure")) + svc = EntityResolutionService(resolver=service, mapper=failing_mapper) + + response = svc.process_request(_make_request("req-fail")) + + assert isinstance(response, EREErrorResponse) + assert response.error_type == "ValueError" + 
assert "RDF parse failure" in response.error_detail + + +def test_call_delegates_to_process_request(resolution_service): + request = _make_request("req-call") + response = resolution_service(request) + assert isinstance(response, EntityMentionResolutionResponse) + assert response.ere_request_id == "req-call" diff --git a/test/unit/services/test_services_factories.py b/test/unit/services/test_services_factories.py new file mode 100644 index 0000000..46398d8 --- /dev/null +++ b/test/unit/services/test_services_factories.py @@ -0,0 +1,67 @@ +"""Unit tests for services.factories: construction of resolver and service.""" + +from pathlib import Path + +import pytest +import yaml + +from ere.services.entity_resolution_service import EntityResolutionService, EntityResolver +from ere.services.factories import build_entity_resolution_service, build_entity_resolver +from test.unit.adapters.stubs import StubRDFMapper + +TEST_RESOLVER_CONFIG = Path(__file__).parent.parent.parent / "resources" / "resolver.yaml" + + +def test_build_entity_resolver_returns_entity_resolver(): + resolver = build_entity_resolver(resolver_config_path=TEST_RESOLVER_CONFIG) + assert isinstance(resolver, EntityResolver) + + +def test_build_entity_resolver_uses_default_config_when_no_path_given(): + resolver = build_entity_resolver() + assert isinstance(resolver, EntityResolver) + + +def test_build_entity_resolver_with_explicit_entity_fields(): + resolver = build_entity_resolver( + entity_fields=["legal_name"], + resolver_config_path=TEST_RESOLVER_CONFIG, + ) + assert isinstance(resolver, EntityResolver) + + +def test_build_entity_resolver_with_persistent_duckdb(tmp_path): + db_file = str(tmp_path / "test.duckdb") + with open(TEST_RESOLVER_CONFIG, encoding="utf-8") as f: + raw = yaml.safe_load(f) + raw["duckdb"] = {"type": "persistent", "path": db_file} + config = tmp_path / "persistent.yaml" + config.write_text(yaml.dump(raw), encoding="utf-8") + + resolver = 
build_entity_resolver(resolver_config_path=config, duckdb_path=db_file) + assert isinstance(resolver, EntityResolver) + + +def test_build_entity_resolver_raises_on_invalid_duckdb_type(tmp_path): + bad_config = tmp_path / "bad.yaml" + bad_config.write_text( + "threshold: 0.8\n" + "match_weight_threshold: -10\n" + "top_n: 10\n" + "entity_fields: [legal_name]\n" + "duckdb:\n" + " type: invalid_type\n" + " path: ':memory:'\n", + encoding="utf-8", + ) + with pytest.raises(ValueError, match="Invalid duckdb type"): + build_entity_resolver(resolver_config_path=bad_config) + + +def test_build_entity_resolution_service_returns_service(): + resolver = build_entity_resolver(resolver_config_path=TEST_RESOLVER_CONFIG) + mapper = StubRDFMapper() + + service = build_entity_resolution_service(resolver, mapper) + + assert isinstance(service, EntityResolutionService) diff --git a/test/unit/test_models.py b/test/unit/test_models.py new file mode 100644 index 0000000..d984596 --- /dev/null +++ b/test/unit/test_models.py @@ -0,0 +1,111 @@ +"""Unit tests for domain model edge cases (error paths and utility methods).""" + +import pytest +from unittest.mock import MagicMock, patch + +from ere.models.resolver import ClusterId, MentionId +from ere.models.resolver.cluster import CandidateCluster, ResolutionResult +from ere.models.resolver.similarity import MentionLink + + +# ============================================================================ +# MentionLink +# ============================================================================ + + +def test_mention_link_rejects_same_left_and_right_id(): + m = MentionId(value="x") + with pytest.raises(ValueError, match="left_id and right_id must differ"): + MentionLink(left_id=m, right_id=m, score=0.9) + + +def test_mention_link_other_returns_right_when_from_is_left(): + left = MentionId(value="a") + right = MentionId(value="b") + link = MentionLink(left_id=left, right_id=right, score=0.5) + assert link.other(left) == right + + +def 
test_mention_link_other_returns_left_when_from_is_right(): + left = MentionId(value="a") + right = MentionId(value="b") + link = MentionLink(left_id=left, right_id=right, score=0.5) + assert link.other(right) == left + + +def test_mention_link_other_raises_when_id_not_in_link(): + left = MentionId(value="a") + right = MentionId(value="b") + unknown = MentionId(value="z") + link = MentionLink(left_id=left, right_id=right, score=0.5) + with pytest.raises(ValueError): + link.other(unknown) + + +# ============================================================================ +# ResolutionResult / CandidateCluster +# ============================================================================ + + +def test_resolution_result_rejects_empty_candidates(): + with pytest.raises(ValueError, match="must be non-empty"): + ResolutionResult(candidates=()) + + +def test_candidate_cluster_as_tuple_returns_id_and_score(): + c = CandidateCluster(cluster_id=ClusterId(value="c1"), score=0.75) + assert c.as_tuple() == ("c1", 0.75) + + +def test_resolution_result_as_tuples_returns_list(): + candidates = ( + CandidateCluster(cluster_id=ClusterId(value="c1"), score=0.9), + CandidateCluster(cluster_id=ClusterId(value="c2"), score=0.6), + ) + result = ResolutionResult(candidates=candidates) + assert result.as_tuples() == [("c1", 0.9), ("c2", 0.6)] + + +# ============================================================================ +# app.main() failure paths +# ============================================================================ + + +def test_main_exits_when_redis_connection_fails(monkeypatch): + monkeypatch.setattr("sys.argv", ["ere"]) + with patch("redis.Redis") as mock_redis_cls, \ + patch("ere.entrypoints.app.configure_logging"): + mock_redis_cls.return_value.ping.side_effect = ConnectionError("no redis") + with pytest.raises(SystemExit) as exc: + from ere.entrypoints.app import main + main() + assert exc.value.code == 1 + + +def test_main_exits_when_service_build_fails(monkeypatch): 
+ monkeypatch.setattr("sys.argv", ["ere"]) + with patch("redis.Redis") as mock_redis_cls, \ + patch("ere.entrypoints.app.configure_logging"), \ + patch("ere.entrypoints.app.build_entity_resolver", side_effect=RuntimeError("build fail")): + mock_redis_cls.return_value.ping.return_value = True + with pytest.raises(SystemExit) as exc: + from ere.entrypoints.app import main + main() + assert exc.value.code == 1 + + +def test_main_runs_loop_until_keyboard_interrupt(monkeypatch): + monkeypatch.setattr("sys.argv", ["ere"]) + mock_resolver = MagicMock() + mock_resolver._mention_repo._con = MagicMock() + + with patch("redis.Redis") as mock_redis_cls, \ + patch("ere.entrypoints.app.configure_logging"), \ + patch("ere.entrypoints.app.build_entity_resolver", return_value=mock_resolver), \ + patch("ere.entrypoints.app.build_rdf_mapper", return_value=MagicMock()), \ + patch("ere.entrypoints.app.build_entity_resolution_service", return_value=MagicMock()), \ + patch("ere.entrypoints.app.RedisQueueWorker") as mock_worker_cls: + mock_redis_cls.return_value.ping.return_value = True + mock_worker_cls.return_value.process_single_message.side_effect = KeyboardInterrupt() + from ere.entrypoints.app import main + main() # must return cleanly (KeyboardInterrupt caught internally) diff --git a/test/unit/utils/__init__.py b/test/unit/utils/__init__.py new file mode 100644 index 0000000..04dba96 --- /dev/null +++ b/test/unit/utils/__init__.py @@ -0,0 +1 @@ +# pylint: disable=disallowed-name # mirrors src/ere/utils/ package structure diff --git a/test/unit/utils/test_logging.py b/test/unit/utils/test_logging.py new file mode 100644 index 0000000..f84762c --- /dev/null +++ b/test/unit/utils/test_logging.py @@ -0,0 +1,52 @@ +"""Unit tests for utils.logging: log-level setup and TRACE level.""" + +import logging +from unittest.mock import patch + +from ere.utils.logging import TRACE_LEVEL_NUM, configure_logging + + +def test_configure_logging_passes_warning_level_to_basicconfig(): + with 
patch("logging.basicConfig") as mock_bc: + configure_logging("WARNING") + mock_bc.assert_called_once() + assert mock_bc.call_args[1]["level"] == logging.WARNING + + +def test_configure_logging_passes_trace_level_to_basicconfig(): + with patch("logging.basicConfig") as mock_bc: + configure_logging("TRACE") + mock_bc.assert_called_once() + assert mock_bc.call_args[1]["level"] == TRACE_LEVEL_NUM + + +def test_configure_logging_reads_env_var(monkeypatch): + monkeypatch.setenv("ERE_LOG_LEVEL", "ERROR") + with patch("logging.basicConfig") as mock_bc: + configure_logging() + assert mock_bc.call_args[1]["level"] == logging.ERROR + + +def test_configure_logging_defaults_to_info(monkeypatch): + monkeypatch.delenv("ERE_LOG_LEVEL", raising=False) + with patch("logging.basicConfig") as mock_bc: + configure_logging() + assert mock_bc.call_args[1]["level"] == logging.INFO + + +def test_trace_method_exists_on_logger(): + log = logging.getLogger("test.trace") + assert callable(getattr(log, "trace", None)) + + +def test_trace_method_logs_when_enabled(caplog): + log = logging.getLogger("test.trace.enabled") + with caplog.at_level(TRACE_LEVEL_NUM, logger="test.trace.enabled"): + log.trace("trace message sent") + assert "trace message sent" in caplog.text + + +def test_trace_method_does_not_log_when_disabled(): + log = logging.getLogger("test.trace.silent") + log.setLevel(logging.INFO) + log.trace("this should not explode") diff --git a/tox.ini b/tox.ini index 0100926..f70414f 100644 --- a/tox.ini +++ b/tox.ini @@ -14,6 +14,7 @@ isolated_build = True envlist = py312, architecture, clean-code skip_missing_interpreters = True +package_root = src [testenv] description = Base environment configuration @@ -29,7 +30,7 @@ setenv = allowlist_externals = poetry commands_pre = - poetry sync + poetry -C {toxinidir}/src sync #============================================================================= # py312: Unit Tests + Coverage @@ -38,17 +39,19 @@ commands_pre = [testenv:py312] description = 
Run unit tests with coverage analysis commands = - pytest test \ - --cov={toxinidir}/src/ere \ + pytest test -m "not integration" \ + --cov=src/ere \ --cov-report=term \ --cov-report=term-missing:skip-covered \ - --cov-report=xml:coverage.xml \ + --cov-report=xml:{toxinidir}/coverage.xml \ -v \ {posargs} [coverage:run] branch = True source = src/ere +relative_files = True +omit = src/demo/* [coverage:report] precision = 2 @@ -68,6 +71,7 @@ fail_under = 80 [coverage:xml] output = coverage.xml +# Note: actual output path is set explicitly via --cov-report=xml:{toxinidir}/coverage.xml #============================================================================= # pytest: Shared Configuration @@ -104,17 +108,17 @@ deps = xenon>=0.9.3 commands = # Pylint: Check code style, naming conventions, SOLID principles - pylint --rcfile=.pylintrc src/ test/ + pylint --rcfile=.pylintrc src/ere/ test/ # Radon: Cyclomatic Complexity - show report - radon cc src/ -a --total-average --show-complexity + radon cc src/ere/ -a --total-average --show-complexity # Radon: Maintainability Index - higher is better (A=best, C=worst) - radon mi src/ --show --sort + radon mi src/ere/ --show --sort # Xenon: Enforce complexity thresholds and fail if exceeded # A = 1-5 (simple), B = 6-10 (manageable), C = 11-20 (complex) - xenon src/ \ + xenon src/ere/ \ --max-absolute C \ --max-modules C \ --max-average B \