Give your AI agent eyes to see the entire internet. Read & search Twitter, Reddit, YouTube, GitHub, Bilibili, XiaoHongShu — one CLI, zero API fees.
🔥 Official Firecrawl MCP Server - Adds powerful web scraping and search to Cursor, Claude and any other LLM clients.
The open-source LLMOps platform: prompt playground, prompt management, LLM evaluation, and LLM observability all in one place.
Templates and workflows for generating PRDs, Tech Designs, MVPs, and more using LLMs in AI IDEs
14-stage Fusion Pipeline for LLM token compression — reversible compression, AST-aware code analysis, intelligent content routing. Zero LLM inference cost. MIT licensed.
Control Gmail, Google Calendar, Docs, Sheets, Slides, Chat, Forms, Tasks, Search & Drive with AI - Comprehensive Google Workspace / G Suite MCP Server & CLI Tool
Automated TDD enforcement for Claude Code
The LLM Anti-Framework
Your autonomous engineering team in a CLI. Point Zeroshot at an issue, walk away, and return to production-grade code. Supports Claude Code, OpenAI Codex, OpenCode, and Gemini CLI.
Full computer-use for AI agents. Self-learning workflows. Native macOS. No screenshots required.
NyaProxy acts as a smart, central manager for accessing various online services (APIs) — AI tools (such as OpenAI, Gemini, Anthropic), image generators, or almost any web service that uses access keys. It helps you use these services more reliably, efficiently, and securely.
The best way to create, deploy, and share MCP Servers
Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.
Agent Skills for Solopreneurs
Prismer Cloud
Open-WebUI Tools is a modular toolkit designed to extend and enrich your Open WebUI instance, turning it into a powerful AI workstation. With a suite of over 15 specialized tools, function pipelines, and filters, this project supports academic research, agentic autonomy, multimodal creativity, workflows, and more.
MCP-NixOS - Model Context Protocol Server for NixOS resources
A command-line interface tool for serving LLMs using vLLM.
Run coding agents in hardened Incus containers with real-time network threat detection, automatic threat response (pause/kill), credential isolation, protected paths, session persistence, and multi-slot support.