kforge

Custom Provider (OpenAI-compatible Endpoints)

Last updated: 28/01/2026


What is the Custom provider?

The Custom provider lets you connect any OpenAI-compatible API endpoint to KForge.

If your provider exposes an OpenAI-compatible Chat Completions API (the same request and response shape as OpenAI's /v1/chat/completions), it will work with Custom.
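Concretely, "OpenAI-compatible" means the endpoint accepts the standard Chat Completions request shape. A minimal sketch of such a request body (the model ID here is a placeholder, not a KForge default):

```python
import json

def build_chat_request(model: str, user_message: str) -> str:
    """Build a minimal OpenAI-compatible Chat Completions request body."""
    body = {
        "model": model,  # a model ID your endpoint actually serves
        "messages": [
            {"role": "user", "content": user_message},
        ],
    }
    return json.dumps(body)

payload = build_chat_request("example-model", "Hello")
```

Any endpoint that accepts this JSON via POST and returns the standard `choices` array is, for KForge's purposes, OpenAI-compatible.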

This makes KForge future-proof and vendor-agnostic: when a new OpenAI-compatible provider appears, you can start using it immediately, with no KForge update required.

Custom is designed for aggregators, self-hosted or local servers, and any provider KForge does not ship built-in support for.


What Custom is NOT

Custom does not validate model IDs, guarantee capabilities, or control pricing and rate limits.

All of this depends entirely on the provider behind the endpoint you configure.

KForge is honest about this by design.


Endpoint requirements

Your endpoint must be OpenAI-compatible.

Required

  • A base URL that serves the OpenAI Chat Completions API
  • An API key, if your endpoint requires one

KForge will automatically call:

  {base URL}/v1/chat/completions

If you enter a URL ending in /v1, KForge will normalize it for you.
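The normalization described above can be sketched as follows. This mirrors the documented behavior; the function names and exact rules are illustrative, not KForge's actual implementation:

```python
def normalize_base_url(url: str) -> str:
    """Strip trailing slashes and a trailing /v1 so only the bare base URL is stored."""
    url = url.rstrip("/")
    if url.endswith("/v1"):
        url = url[: -len("/v1")]
    return url

def chat_completions_url(base_url: str) -> str:
    """The full endpoint a KForge-style client would call for chat completions."""
    return normalize_base_url(base_url) + "/v1/chat/completions"
```

Either form of the URL, with or without /v1, resolves to the same request target.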


How to configure Custom in KForge

  1. Open Settings
  2. Select Custom Endpoint (OpenAI-compatible) as the provider
  3. Enter:
    • API Key: your provider’s API key
    • Endpoint URL: your provider’s base URL (no /v1)
  4. Save settings
  5. Add or select a model ID supported by your endpoint

That’s it.
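Once configured, you can sanity-check your endpoint and key outside KForge with nothing but the standard library. The URL, key, and model ID below are placeholders; actually sending the request will incur whatever your provider charges:

```python
import json
import urllib.request

BASE_URL = "https://api.example.com"  # placeholder: your base URL, no /v1
API_KEY = "sk-..."                    # placeholder: your provider's API key

def make_request(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Build (but do not send) a one-shot chat completions request."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 8,  # keep the test call cheap
    }).encode()
    return urllib.request.Request(
        base_url + "/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = make_request(BASE_URL, API_KEY, "example-model")
# To actually send it:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

If that round trip succeeds, the same URL, key, and model ID will work in KForge's settings.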


Suggested models (examples only)

These are suggestions, not presets.

Availability, pricing, and limits may change at any time depending on the provider. Use them as a starting point, then adjust based on your usage.


OpenRouter

Endpoint:

  https://openrouter.ai/api

Example models:

  • openai/gpt-4o
  • anthropic/claude-3.5-sonnet
  • meta-llama/llama-3.1-70b-instruct

Notes:

OpenRouter is an aggregator: one API key routes to many upstream providers, and model IDs are prefixed with the upstream vendor's name.


Groq

Endpoint:

  https://api.groq.com/openai

Example models:

  • llama-3.3-70b-versatile
  • llama-3.1-8b-instant

Notes:

Groq serves open-weight models on custom hardware and is known for very fast inference.


DeepSeek

Endpoint:

  https://api.deepseek.com

Example models:

  • deepseek-chat
  • deepseek-reasoner

Notes:

deepseek-chat is the general-purpose model; deepseek-reasoner is a reasoning model and typically responds more slowly.


Mistral

Endpoint:

  https://api.mistral.ai

Example models:

  • mistral-large-latest
  • mistral-small-latest
  • codestral-latest

Notes:

The -latest aliases track the newest release of each model family; pin a dated version if you need stable behavior.
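Most OpenAI-compatible endpoints, including the ones above, also expose GET /v1/models, which you can use to discover exact model IDs before adding them in KForge (support varies by provider). A sketch of parsing the standard response shape, using an inline sample instead of a live call:

```python
import json

def model_ids(models_response: str) -> list[str]:
    """Extract model IDs from a standard OpenAI-style /v1/models JSON response."""
    return [m["id"] for m in json.loads(models_response)["data"]]

# Sample response body; a real one comes from GET {base URL}/v1/models
# with your Authorization header set.
sample = '{"object": "list", "data": [{"id": "example-model-a"}, {"id": "example-model-b"}]}'
```

Copy the ID exactly as listed; KForge passes it through to your endpoint unchanged.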


Model tags and usage modes (important)

Model tags in KForge are organizational hints, not guarantees.

They reflect how you intend to use the model — not what the provider promises.

Suggested interpretation

Treat a tag as a note to yourself about how you plan to use the model (chat, code, reasoning, and so on), then verify its actual capabilities against your provider's documentation.

For Custom endpoints, you are the source of truth.


Why Custom matters

Custom exists so that new and niche providers work from day one: if an endpoint speaks the OpenAI Chat Completions API, KForge can talk to it without a code change.

Many tools hard-code providers. KForge deliberately does not.


When should I use Custom?

Use Custom if:

  • your provider is not built into KForge
  • you run a self-hosted or local OpenAI-compatible server
  • you want full control over the endpoint URL and model IDs

If you want “plug-and-play”, use a built-in provider instead.


Final note

Custom is powerful by design.

With power comes responsibility — and flexibility.

That trade-off is intentional.