Show HN: Simple to build MCP servers that easily connect with custom LLM calls (mirascope.com)
57 points by wbakst 10 months ago | 25 comments
Hi!

After learning about MCP, I'm really excited about the future of provider-agnostic, re-usable tooling.

Unfortunately, I've found that while it's easy to implement an MCP server for use with tools that support it (such as Claude Desktop), it's not as easy to implement your own support (such as integrating an MCP server into your own LLM application).

We implemented a thin MCP wrapper that integrates with Mirascope calls, so you can hook an MCP server and client up to any supported LLM provider with very little code.

Excited to see what people build with this!
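
To give a rough feel for the client side, here's a sketch using the official MCP Python SDK to connect to a server over stdio and list its tools. The server script name is just a placeholder, and this is the generic SDK pattern rather than our wrapper's exact API, so check the docs for that:

    import asyncio

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Placeholder server script; swap in your own MCP server.
    server_params = StdioServerParameters(command="python", args=["my_mcp_server.py"])

    async def main() -> None:
        # Open a stdio transport to the server and start a client session.
        async with stdio_client(server_params) as (read, write):
            async with ClientSession(read, write) as session:
                await session.initialize()
                # List the tools the server exposes; a wrapper like the one
                # described above turns these into tools an LLM call can use.
                tools = await session.list_tools()
                print([tool.name for tool in tools.tools])

    asyncio.run(main())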



I hadn't heard of Mirascope before, so thanks for sharing. I like how the website shows examples of source code for interacting with different providers, both with and without Mirascope. That was exactly what I was interested in.

With respect to MCP servers, I saw the comment below about integration with LibreChat. Is there any other documentation on integrations with other self-hostable solutions, such as Ollama?


We have some docs for local / OSS models:

https://mirascope.com/learn/local_models/

You can also set a custom client to use any model that's compatible with, e.g., the OpenAI API:

https://mirascope.com/learn/calls/
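
It ends up looking roughly like this (a rough sketch; the Ollama endpoint, model name, and api_key value here are just examples, so check the docs for exact details):

    from mirascope.core import openai
    from openai import OpenAI

    # Point the OpenAI client at a local, OpenAI-compatible endpoint (Ollama here).
    local_client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

    @openai.call("llama3.1", client=local_client)
    def recommend_book(genre: str) -> str:
        return f"Recommend a {genre} book"

    response = recommend_book("fantasy")
    print(response.content)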


I came here to share the same link after finding it, but saw you'd already provided it :)

Thanks!


You got it!


This is great! I personally found the original Anthropic MCP documentation pretty lacking in terms of how Claude Desktop used the MCP server(s), constraints, etc. For example, there is a pretty hard timeout which will cause the MCP server to crash.

Glad there's a simple-to-use solution for creating my own server where I can make some different design choices!


Totally! When it was first released, I really liked the idea, but it felt super early (which is to be expected!)

I immediately wanted to write a custom agent that used an MCP server and that inspired building this. Glad you like it!


I’ve recently started moving all our structured output calls over from Instructor to Mirascope and won’t be looking back. So this is gonna give me the push to take MCP more seriously, especially on our internal workflows and tasks.
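
For anyone curious, the kind of structured output call I mean looks roughly like this (just a sketch; the model and fields are made-up examples):

    from mirascope.core import openai
    from pydantic import BaseModel

    class Task(BaseModel):
        title: str
        priority: int

    # response_model parses the LLM output into the Pydantic model,
    # similar to the Instructor-style usage mentioned above.
    @openai.call("gpt-4o-mini", response_model=Task)
    def extract_task(text: str) -> str:
        return f"Extract the task from: {text}"

    task = extract_task("Ship the Q3 report, it's urgent")
    print(task.title, task.priority)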


Best part of MCP is that it’ll just work with anything else that supports it (like Claude, Cursor, etc.)


I won’t tell Jason…


Just chiming in to say how much I've enjoyed using mirascope! It really seems to nail that sweet spot of abstracting juuuust enough to get the busy work out of the way, without ever being a pain in the rear.

Two thumbs up!


This put such a big smile on my face!

“Abstractions that aren’t obstructions” is always the goal, so it feels good to hear you say we’ve hit that sweet spot :)


I’m glad I learned about Mirascope just when I needed to implement custom support for MCP. Thank you!


I hope Mirascope makes it easy!

What are you building?


We're building an AI agent called "Giselle".

https://giselles.ai/

https://github.com/giselles-ai/giselle

Currently, it's in public beta. I'll also do a Show HN around the time of the product release!


Super cool!

Excited for the release and to see how you're using Mirascope!


This looks very useful.

Unimportantly, I wonder how the copyright date is 2023 when MCP didn't exist until 2024.


We released the first version of Mirascope in Dec 2023. I’m under the impression that would make it the right date?


Did Mirascope exist before MCP? MCP was announced in November 2024.


Yeah, we built support for MCP as part of the existing library, which contains other LLM-related abstractions.


Aren't LLMs smart enough now to communicate with *any* API?


Unfortunately not quite :(

You can definitely have them generate tools for APIs on the fly, but it’s not nearly robust enough yet


Can you integrate this with LibreChat?



Sweeeeet, love that it's supported. Standardization for the win!


If they support MCP servers it should work!

Can't guarantee it, as I haven't tried connecting them myself.




