---
title: Using your own LLM models in GitHub Copilot CLI
shortTitle: Use your own model provider
intro: 'Use a model from an external provider of your choice in {% data variables.product.prodname_copilot_short %} by supplying your own API key.'
allowTitleToDifferFromFilename: true
versions:
  feature: copilot
contentType: how-tos
category:
  - Configure Copilot
  - Configure Copilot CLI
---

You can configure {% data variables.copilot.copilot_cli_short %} to use your own LLM provider, also called BYOK (Bring Your Own Key), instead of {% data variables.product.github %}-hosted models. This lets you connect to OpenAI-compatible endpoints (including locally running models such as Ollama), Azure OpenAI, or Anthropic.

## Prerequisites

* {% data variables.copilot.copilot_cli_short %} is installed. See [AUTOTITLE](/copilot/how-tos/copilot-cli/set-up-copilot-cli/install-copilot-cli).
* You have an API key from a supported LLM provider, or you have a local model running (such as Ollama).

## Supported providers

{% data variables.copilot.copilot_cli_short %} supports three provider types:

| Provider type | Compatible services |
|---|---|
| `openai` | OpenAI, Ollama, vLLM, Foundry Local, and any other OpenAI Chat Completions API-compatible endpoint. This is the default provider type. |
| `azure` | Azure OpenAI Service. |
| `anthropic` | Anthropic (Claude models). |

For additional examples, run `copilot help providers` in your terminal.

## Model requirements

Models must support **tool calling** (also called function calling) and **streaming**. If a model does not support either capability, {% data variables.copilot.copilot_cli_short %} returns an error. For best results, use a model with a context window of at least 128k tokens.
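
For a local Ollama model, you can check what the model supports before pointing the CLI at it. Recent Ollama versions print a capabilities section (including tool support) in the model details; this sketch assumes Ollama is installed and that you have pulled `llama3.2`:

```shell
# Show model details, including its capabilities (for example, tools)
ollama show llama3.2
```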

## Configuring your provider

You configure your model provider by setting environment variables before starting {% data variables.copilot.copilot_cli_short %}.

| Environment variable | Required | Description |
|---|---|---|
| `COPILOT_PROVIDER_BASE_URL` | Yes | The base URL of your model provider's API endpoint. |
| `COPILOT_PROVIDER_TYPE` | No | The provider type: `openai` (default), `azure`, or `anthropic`. |
| `COPILOT_PROVIDER_API_KEY` | No | Your API key for the provider. Not required for providers that do not use authentication, such as a local Ollama instance. |
| `COPILOT_MODEL` | Yes | The model identifier to use. You can also set this with the `--model` command-line flag. |
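
Before launching the CLI, you can confirm which of these variables are set in your current shell:

```shell
# List the Copilot provider variables exported in this shell
env | grep '^COPILOT_'
```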

## Connecting to an OpenAI-compatible endpoint

Use the following steps if you are connecting to OpenAI, Ollama, vLLM, Foundry Local, or any other endpoint that is compatible with the OpenAI Chat Completions API.

1. Set environment variables for your provider. For example, for a local Ollama instance:

   ```shell
   export COPILOT_PROVIDER_BASE_URL=http://localhost:11434
   export COPILOT_MODEL=YOUR-MODEL-NAME
   ```

   Replace `YOUR-MODEL-NAME` with the name of the model you have pulled in Ollama (for example, `llama3.2`).

1. For a remote OpenAI endpoint, also set your API key:

   ```shell
   export COPILOT_PROVIDER_BASE_URL=https://api.openai.com
   export COPILOT_PROVIDER_API_KEY=YOUR-OPENAI-API-KEY
   export COPILOT_MODEL=YOUR-MODEL-NAME
   ```

   Replace `YOUR-OPENAI-API-KEY` with your OpenAI API key and `YOUR-MODEL-NAME` with the model you want to use (for example, `gpt-4o`).

{% data reusables.copilot.copilot-cli.start-cli %}
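
If the CLI cannot reach a local Ollama server, you can verify that Ollama is listening on the expected port. The `/api/tags` endpoint lists the models you have pulled:

```shell
# Confirm the Ollama server is reachable and list locally pulled models
curl -s http://localhost:11434/api/tags
```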

## Connecting to Azure OpenAI

1. Set the environment variables for Azure OpenAI:

   ```shell
   export COPILOT_PROVIDER_BASE_URL=https://YOUR-RESOURCE-NAME.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT-NAME
   export COPILOT_PROVIDER_TYPE=azure
   export COPILOT_PROVIDER_API_KEY=YOUR-AZURE-API-KEY
   export COPILOT_MODEL=YOUR-DEPLOYMENT-NAME
   ```

   Replace the following placeholders:

   * `YOUR-RESOURCE-NAME`: your Azure OpenAI resource name
   * `YOUR-DEPLOYMENT-NAME`: the name of your model deployment
   * `YOUR-AZURE-API-KEY`: your Azure OpenAI API key
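
   You can sanity-check the deployment name and key by sending a minimal request directly to the Azure endpoint. This is a sketch using the Azure OpenAI Chat Completions REST call; the `api-version` value shown is an assumption, so use one that your resource supports:

   ```shell
   # Minimal chat request to verify the deployment and API key
   curl -s "https://YOUR-RESOURCE-NAME.openai.azure.com/openai/deployments/YOUR-DEPLOYMENT-NAME/chat/completions?api-version=2024-02-01" \
     -H "Content-Type: application/json" \
     -H "api-key: YOUR-AZURE-API-KEY" \
     -d '{"messages":[{"role":"user","content":"ping"}]}'
   ```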

{% data reusables.copilot.copilot-cli.start-cli %}

## Connecting to Anthropic

1. Set the environment variables for Anthropic:

   ```shell
   export COPILOT_PROVIDER_TYPE=anthropic
   export COPILOT_PROVIDER_API_KEY=YOUR-ANTHROPIC-API-KEY
   export COPILOT_MODEL=YOUR-MODEL-NAME
   ```

   Replace `YOUR-ANTHROPIC-API-KEY` with your Anthropic API key and `YOUR-MODEL-NAME` with the Claude model you want to use (for example, `claude-opus-4-5`).

{% data reusables.copilot.copilot-cli.start-cli %}
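
Alternatively, you can scope the variables to a single invocation instead of exporting them. This sketch assumes the `copilot` executable is on your `PATH`:

```shell
# Set the provider variables for one CLI invocation only
COPILOT_PROVIDER_TYPE=anthropic \
COPILOT_PROVIDER_API_KEY=YOUR-ANTHROPIC-API-KEY \
COPILOT_MODEL=claude-opus-4-5 \
copilot
```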

## Running in offline mode

You can run {% data variables.copilot.copilot_cli_short %} in offline mode to prevent it from contacting {% data variables.product.github %}'s servers. This is designed for isolated environments where the CLI should communicate only with your local or on-premises model provider.

> [!IMPORTANT]
> Offline mode only guarantees full network isolation if your provider is also local or within the same isolated environment. If `COPILOT_PROVIDER_BASE_URL` points to a remote endpoint, your prompts and code context are still sent over the network to that provider.

1. Configure your provider environment variables as described in [Configuring your provider](#configuring-your-provider).

1. Set the offline mode environment variable:

   ```shell
   export COPILOT_OFFLINE=true
   ```

{% data reusables.copilot.copilot-cli.start-cli %}
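
Exported variables last only for the current shell session. To make the configuration persistent, you can append the exports to your shell profile. This is a sketch assuming bash and a local Ollama setup; adjust the file and values for your shell and provider:

```shell
# Append provider settings to ~/.bashrc so new shells pick them up
cat >> ~/.bashrc <<'EOF'
export COPILOT_PROVIDER_BASE_URL=http://localhost:11434
export COPILOT_MODEL=llama3.2
export COPILOT_OFFLINE=true
EOF
```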