PraisonAI
io.github.MervinPraison/praisonai · v2.3.42
AI Agents Framework with Self Reflection and MCP support
Install PraisonAI
Ready-to-paste config for every major MCP client
claude mcp add praisonai -- npx io.github.MervinPraison/praisonai # or uvx io.github.MervinPraison/praisonai for Python
Why Frontier?
500+ GitHub stars · battle-tested, widely adopted, safe default.
FAQ
Auto-generated from the dataset · updated daily
What is PraisonAI?
PraisonAI is an MCP (Model Context Protocol) server in the AI / ML category. It is an AI agents framework with self-reflection and MCP support.
How is PraisonAI rated?
BenchGecko rates PraisonAI as Frontier on the Gecko Rating ladder (85/100): 500+ GitHub stars, battle-tested, widely adopted, a safe default. The underlying GitHub repository has 6,899 stars.
How do I install PraisonAI?
PraisonAI supports Claude Code, Cursor, Windsurf, and VS Code. Each client needs a different config format; BenchGecko shows ready-to-paste snippets for all four above. Stdio servers run locally via npx; streamable-http and SSE servers connect over the network.
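For stdio servers like this one, most MCP clients share a common config shape. A minimal sketch of a client config entry, assuming the standard `mcpServers` key and the npx invocation from the install command above (the server name `praisonai` is just a local label you choose):

```json
{
  "mcpServers": {
    "praisonai": {
      "command": "npx",
      "args": ["io.github.MervinPraison/praisonai"]
    }
  }
}
```

For the Python variant, swap `"command": "npx"` for `"command": "uvx"` with the same package argument. Exact file locations and key names vary by client, so check each client's MCP documentation before pasting.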