Get Started with NixAPI in 5 Minutes: Zero-friction Access to GPT-4o and Claude
A step-by-step guide to integrating GPT-4o, Claude 3.5, and other leading LLMs through NixAPI, with complete Python and Node.js code examples.
NixAPI is an LLM API relay service that is 100% compatible with the OpenAI API format. All you need to do is replace one line — base_url — and you can start using GPT-4o, Claude 3.5 Sonnet, and other leading models in your existing project at a lower cost.
This guide walks you from registration to your first API call, about 5 minutes end-to-end.
Step 1: Register and Get an API Key
- Visit nixapi.com and create an account
- Go to the console → API Keys → click “New”
- Copy the generated key (format: nix-xxxxxxxxxxxx)
Note: NixAPI charges per token with no monthly fee. Free credits are included on registration — more than enough to get started.
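A common pattern (not required by NixAPI, but good hygiene) is to keep the key out of source code by reading it from an environment variable. A minimal sketch, where the variable name `NIXAPI_KEY` is our own convention, not something NixAPI mandates:

```python
import os

def load_nixapi_key(env_var: str = "NIXAPI_KEY") -> str:
    """Read the NixAPI key from the environment and sanity-check its format.

    The env var name is an illustrative convention; use whatever fits
    your deployment setup.
    """
    key = os.environ.get(env_var)
    if key is None:
        raise RuntimeError(f"Set the {env_var} environment variable first")
    if not key.startswith("nix-"):
        raise ValueError("NixAPI keys start with the nix- prefix")
    return key
```

You can then pass `load_nixapi_key()` as the `api_key` argument in the snippets below instead of hard-coding the key.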
Step 2: Python Integration
Using the official openai SDK — only the base_url needs to change:
```python
from openai import OpenAI

client = OpenAI(
    api_key="your-NixAPI-key",          # nix-xxxx format
    base_url="https://api.nixapi.com/v1",  # replace this line
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello, give me a brief introduction of yourself."},
    ],
    temperature=0.7,
    max_tokens=500,
)

print(response.choices[0].message.content)
```
That’s it! Your original OpenAI code works unchanged — just swap base_url and api_key.
Step 3: Node.js / TypeScript Integration
```typescript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'your-NixAPI-key',
  baseURL: 'https://api.nixapi.com/v1',
});

async function main() {
  const response = await client.chat.completions.create({
    model: 'claude-3-5-sonnet-20241022', // Claude works the same way!
    messages: [
      { role: 'user', content: 'Explain Retrieval-Augmented Generation in 100 words' },
    ],
  });
  console.log(response.choices[0].message.content);
}

main();
```
Supported Models
NixAPI supports all major LLMs. Pass the model ID directly in your requests:
| Model | Model ID | Highlights |
|---|---|---|
| GPT-4o | gpt-4o | OpenAI flagship, vision + text |
| GPT-4o mini | gpt-4o-mini | Low cost, fast |
| Claude 3.5 Sonnet | claude-3-5-sonnet-20241022 | Excellent coding ability |
| Claude 3.5 Haiku | claude-3-5-haiku-20241022 | Ultra-fast responses |
| Gemini 1.5 Pro | gemini-1.5-pro | Extra-long context window |
The full model list is available in the NixAPI console and is updated regularly.
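Because only the `model` string changes between requests, you can route different tasks to different models through the same client. A sketch using the model IDs from the table above; the task-to-model choices here are illustrative, not NixAPI recommendations:

```python
# Illustrative routing table: task type -> model ID from the table above.
MODEL_FOR_TASK = {
    "coding": "claude-3-5-sonnet-20241022",  # strong coding ability
    "chat": "gpt-4o-mini",                   # low cost, fast
    "long_context": "gemini-1.5-pro",        # extra-long context window
}

def pick_model(task: str) -> str:
    """Return a model ID for a task, falling back to gpt-4o-mini."""
    return MODEL_FOR_TASK.get(task, "gpt-4o-mini")
```

You would then pass `model=pick_model("coding")` in the `chat.completions.create` call from Step 2, with no other changes.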
FAQ
Q: I’m currently using an OpenAI key. Can I simply swap it for a NixAPI key?
A: Yes. The interface is 100% compatible — just change base_url and api_key.
Q: Does it support streaming output?
A: Yes. Add stream: true to your request parameters. The format is identical to OpenAI.
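In the Python SDK, for example, you pass `stream=True` to `chat.completions.create` and iterate over the chunks. A small SDK-agnostic helper for assembling the streamed deltas into the full reply (the helper name is ours; the chunk layout follows the standard OpenAI streaming shape):

```python
def assemble_stream(chunks) -> str:
    """Print and collect the text deltas from an OpenAI-style streaming response.

    Each chunk is expected to expose chunk.choices[0].delta.content, which
    may be None for role-only or metadata chunks.
    """
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta:
            print(delta, end="", flush=True)  # typewriter-style output
            parts.append(delta)
    print()
    return "".join(parts)
```

With the client from Step 2, usage looks like `assemble_stream(client.chat.completions.create(model="gpt-4o", messages=..., stream=True))`.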
Q: Is there a rate limit?
A: Limits vary by plan. Check the NixAPI console for your current plan details.
Next Steps
- 👉 Register at NixAPI — free to start
- 📖 Streaming Tutorial — implement typewriter-style output
- 💸 Cost Optimization Guide — cut your API bill by 60%
Try NixAPI Now
Reliable LLM API relay for OpenAI, Claude, Gemini, DeepSeek, Qwen, and Grok with ¥1 = $1 top-up
Sign Up Free