Our Stance on Privacy
Let's be clear: our business is selling API access, not user data. We built Conduit.im as a tool we wanted to use ourselves, which means we have a strict, minimalist approach to data.
LLM Prompts & Responses
We do not store the content of your prompts or AI responses. Period.
Our system acts as a pass-through proxy: your request goes straight to the upstream model provider, and the response comes straight back to you.
We do meter your usage (counting input/output tokens per API key) for two essential reasons:
- To charge you the right amount.
- To see which models are popular so we can optimize the service.
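To make the distinction concrete, here is a minimal sketch of what per-key metering can look like. The `UsageEvent` and `Meter` names are hypothetical illustrations, not our actual internals; the point is that only token counts are aggregated, never prompt or response text.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class UsageEvent:
    """Hypothetical metering record: counts only, no content."""
    api_key: str
    model: str
    input_tokens: int
    output_tokens: int


class Meter:
    """Aggregates token totals per API key for billing and model popularity."""

    def __init__(self):
        self.totals = defaultdict(lambda: {"input": 0, "output": 0})

    def record(self, event: UsageEvent) -> None:
        # Note what is NOT here: event carries no prompt or response text.
        t = self.totals[event.api_key]
        t["input"] += event.input_tokens
        t["output"] += event.output_tokens
```

A record like `UsageEvent("key-123", "gpt-4", 120, 45)` is all the billing pipeline needs.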
Optional Performance Caching
To help you reduce costs and latency, we offer an optional caching feature. When enabled, it stores the results of identical requests so we can serve them instantly without re-calling the LLM.
This feature is entirely within your control.
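One common way such a cache works (a sketch under assumptions, not our exact implementation) is to hash the canonicalized request body and use that digest as the lookup key, so byte-identical requests hit the cache while anything else falls through to the LLM. The `ResponseCache` class below is illustrative only; note it does nothing unless explicitly enabled.

```python
import hashlib
import json


class ResponseCache:
    """Optional cache keyed by a hash of the full request. Off by default."""

    def __init__(self, enabled: bool = False):
        self.enabled = enabled
        self._store = {}

    def _key(self, request: dict) -> str:
        # Canonical JSON (sorted keys) so identical requests with the same
        # model, messages, and parameters always hash to the same key.
        canonical = json.dumps(request, sort_keys=True)
        return hashlib.sha256(canonical.encode()).hexdigest()

    def get(self, request: dict):
        # With caching disabled, every request goes to the LLM.
        if not self.enabled:
            return None
        return self._store.get(self._key(request))

    def put(self, request: dict, response: str) -> None:
        if self.enabled:
            self._store[self._key(request)] = response
```

Because the key is a one-way hash and storage is opt-in, disabling the feature means no request content is retained at all.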
Website Analytics
We use Microsoft Clarity and LogRocket to see how people use our marketing site and dashboard. It tells us things like "a lot of people are clicking this button" or "this page is confusing."
This is aggregated data that helps us improve the UI. If you're not comfortable with it, you can opt out using the preferences link in the footer.
Don't Trust, Verify.
We're an open-source company. You can read the code that powers this entire platform and verify that we do what we say we do.
Check it out on GitHub.
Questions About Our Privacy Practices?
We believe in transparency. If you have any questions about how we handle your data, don't hesitate to reach out to our team.