How to Integrate OpenRouter with Jan

OpenRouter is a service that aggregates AI models behind a single API. Developers can use it to access a variety of large language models, generative image models, and generative 3D object models.

To connect Jan with OpenRouter and access remote Large Language Models (LLMs), follow the steps below:

Step 1: Configure OpenRouter API key

  1. Find your API key on the OpenRouter API Keys page.
  2. Set the OpenRouter API key in the ~/jan/engines/openai.json file, as shown in the sketch below.
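
The exact contents of ~/jan/engines/openai.json may differ between Jan versions; the snippet below is a minimal sketch that assumes the engine file accepts a full_url and an api_key field. It points the engine at OpenRouter's OpenAI-compatible chat completions endpoint and uses a placeholder for your key:

~/jan/engines/openai.json
{
  "full_url": "https://openrouter.ai/api/v1/chat/completions",
  "api_key": "<your-openrouter-api-key>"
}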

Step 2: Model Configuration

  1. Go to the directory ~/jan/models.
  2. Make a new folder named openrouter-<modelname>, for example openrouter-dolphin-mixtral-8x7b.
  3. Inside the folder, create a model.json file with the following settings:
  • Set the id property to the model id obtained from OpenRouter.
  • Set the format property to api.
  • Set the engine property to openai.
  • Ensure the state property is set to ready.
~/jan/models/openrouter-dolphin-mixtral-8x7b/model.json
{
  "sources": [
    {
      "filename": "openrouter",
      "url": "https://openrouter.ai/"
    }
  ],
  "id": "cognitivecomputations/dolphin-mixtral-8x7b",
  "object": "model",
  "name": "Dolphin 2.6 Mixtral 8x7B",
  "version": "1.0",
  "description": "This is a 16k context fine-tune of Mixtral-8x7b. It excels in coding tasks due to extensive training with coding data and is known for its obedience, although it lacks DPO tuning. The model is uncensored and is stripped of alignment and bias. It requires an external alignment layer for ethical use. Users are cautioned to use this highly compliant model responsibly, as detailed in a blog post about uncensored models at erichartford.com/uncensored-models.",
  "format": "api",
  "settings": {},
  "parameters": {},
  "metadata": {
    "tags": ["General", "Big Context Length"]
  },
  "engine": "openai"
}
Note: For more details about the model.json settings and parameters fields, please see here.

Step 3: Start the Model

  1. Restart Jan and go to the Hub.
  2. Find your model and click the Use button.