JavaScript / Node.js
Use the official OpenAI SDK with Conduit.im — just change two lines of configuration.
Installation
Conduit.im is fully compatible with the official OpenAI Node.js SDK. Install it with your preferred package manager:
npm install openai
Configuration
Point the SDK at the Conduit.im API by setting the baseURL and using your Conduit.im API key:
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.CONDUIT_API_KEY,
  baseURL: "https://api.conduit.im/v1",
});
Note: If you have existing code that uses the OpenAI SDK, these two properties (apiKey and baseURL) are the only changes needed to switch to Conduit.im.
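If the environment variable is missing, the failure can otherwise surface later in a less obvious place. A small guard that fails fast at startup makes misconfiguration easier to spot; this is a sketch, and `requireEnv` is an illustrative helper, not part of the SDK:

```javascript
// Illustrative helper: throw early if a required environment variable is unset,
// so a misconfigured key is caught at startup rather than deep inside a request.
function requireEnv(name) {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing environment variable: ${name}`);
  }
  return value;
}
```

You would then construct the client with `apiKey: requireEnv("CONDUIT_API_KEY")` instead of reading `process.env` directly.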
Chat Completions
Send a chat completion request just like you would with the OpenAI API:
const response = await client.chat.completions.create({
  model: "gpt-4",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "What is the capital of France?" },
  ],
});

console.log(response.choices[0].message.content);
Streaming
The SDK has built-in support for streaming. Set stream: true and iterate over the response:
const stream = await client.chat.completions.create({
  model: "gpt-4",
  messages: [{ role: "user", content: "Write a short poem about APIs." }],
  stream: true,
});

for await (const chunk of stream) {
  const token = chunk.choices[0]?.delta?.content || "";
  process.stdout.write(token);
}
console.log(); // trailing newline
Error Handling
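Transient failures, such as HTTP 429 when you are rate limited, are often worth retrying with exponential backoff before giving up. A minimal sketch; `withBackoff` and its defaults are illustrative, not part of the SDK:

```javascript
// Illustrative retry wrapper: calls fn, retrying with exponential backoff.
// Only retries statuses that suggest a transient condition (429 and 5xx).
async function withBackoff(fn, { attempts = 4, baseMs = 500 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (error) {
      const retryable =
        error.status === 429 || (error.status >= 500 && error.status < 600);
      if (!retryable || attempt >= attempts - 1) throw error;
      const delay = baseMs * 2 ** attempt; // 500 ms, 1 s, 2 s, ...
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}
```

For example: `const response = await withBackoff(() => client.chat.completions.create({ model: "gpt-4", messages }));`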
The SDK throws typed errors that you can catch and handle individually:
import OpenAI from "openai";

try {
  const response = await client.chat.completions.create({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello!" }],
  });
  console.log(response.choices[0].message.content);
} catch (error) {
  if (error instanceof OpenAI.APIError) {
    console.error("API error:", error.status, error.message);
    if (error.status === 401) {
      console.error("Invalid API key. Check your CONDUIT_API_KEY.");
    } else if (error.status === 429) {
      console.error("Rate limited. Retry after a short delay.");
    } else if (error.status === 402) {
      console.error("Insufficient balance. Add funds to your account.");
    }
  } else {
    throw error;
  }
}
Using Fetch Directly
If you prefer not to use the SDK, you can call the API directly with fetch. Note that fetch does not throw on HTTP error statuses, so check response.ok yourself:
const response = await fetch("https://api.conduit.im/v1/chat/completions", {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${process.env.CONDUIT_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "gpt-4",
    messages: [{ role: "user", content: "Hello!" }],
  }),
});

if (!response.ok) {
  throw new Error(`Request failed: ${response.status} ${await response.text()}`);
}

const data = await response.json();
console.log(data.choices[0].message.content);
TypeScript Support
The OpenAI SDK ships with full TypeScript type definitions out of the box. All request parameters and response objects are fully typed:
import OpenAI from "openai";
import type { ChatCompletionMessageParam } from "openai/resources";

const messages: ChatCompletionMessageParam[] = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Hello!" },
];

const response = await client.chat.completions.create({
  model: "gpt-4",
  messages,
});
Next Steps
You're ready to build with Conduit.im and Node.js. Explore further: