Setting Up Your Provider
AI models can live in the cloud or run locally on your machine. Cloud providers like Anthropic host models on their servers and charge per use. Local options like Ollama let you run models directly on your hardware for complete privacy and no per-request costs. Here’s how to configure any provider in Cline:

1. Open Cline Settings
Click the settings icon in the top-right corner of the Cline panel. This opens the configuration view where you’ll set up your provider and model.
2. Select Your Provider
Click the API Provider dropdown at the top. You’ll see a list of all supported providers. Select the one you want to use. If you’re not sure which to pick:
- Choose Cline if you want the simplest setup with no API key management
- Choose OpenRouter if you want access to many models through one API key
- Choose Anthropic if you want direct access to Claude models
- Choose Ollama or LM Studio if you want to run models locally on your machine
3. Enter Your API Key
Most cloud providers require an API key. After selecting a provider, you’ll see an API key field. To get a key:
- Go to your provider’s website and create an account
- Navigate to the API keys section (usually under Settings or API)
- Generate a new API key
- Copy and paste it into the API key field in Cline
If you’re using a local provider like Ollama, you don’t need an API key. Just make sure the local server is running and Cline will connect to it automatically.
API keys are stored securely in your system’s credential manager. Cline never sends your keys anywhere except directly to the provider you’ve selected.
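If you want to confirm a key works before pasting it into Cline, you can probe the provider’s API directly. Here is a hedged sketch for Anthropic, assuming your key is exported as `ANTHROPIC_API_KEY` (the model ID below is only an example; any current Claude model ID works):

```shell
# Probe the Anthropic Messages API with a minimal request.
# 200 = key is valid; 401 = key is wrong or revoked; 000 = no network.
status=$(curl --silent --output /dev/null --write-out "%{http_code}" \
  https://api.anthropic.com/v1/messages \
  --header "x-api-key: $ANTHROPIC_API_KEY" \
  --header "anthropic-version: 2023-06-01" \
  --header "content-type: application/json" \
  --data '{"model":"claude-sonnet-4-20250514","max_tokens":1,"messages":[{"role":"user","content":"hi"}]}')
echo "HTTP status: $status"
```

Other cloud providers have similar lightweight endpoints; check their API docs for the exact headers they expect.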
4. Choose Your Model
Once your provider is configured, the Model dropdown populates with available options. Click it to see all models your provider offers.

Things to consider when choosing a model:
- Context window size determines how much code and conversation the model can “see” at once. Larger is generally better for complex tasks.
- Speed varies significantly between models. Smaller models respond faster but may be less capable.
- Cost differs across cloud models. Local models are free to run after initial download.
5. Verify Your Setup
Type a simple message and press Enter. If you see a response, you’re all set. If you get an error:
- Double-check that your API key is correct (no extra spaces or missing characters)
- Verify your provider account has billing set up or credits available
- For local models, confirm the server is running and accessible
- Check that you’ve selected a model that’s available on your plan
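For the local-server case, the checklist above can be scripted. A minimal sketch, assuming Ollama on its default port (11434) — its `/api/tags` endpoint lists installed models and is a cheap way to probe whether the server is up:

```shell
# Quick reachability check for a local Ollama server before connecting Cline.
if curl --silent --fail --max-time 2 http://localhost:11434/api/tags > /dev/null; then
  status="running"
else
  status="not reachable"
fi
echo "Ollama server: $status"
```

If the server is not reachable, starting it with `ollama serve` (or launching the Ollama app) usually resolves the error.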
Cline Provider
The easiest way to get started is selecting Cline as your provider. This handles all API key management and billing through a single account. Benefits of using the Cline provider:
- No API key juggling. You don’t need to create accounts with Anthropic, OpenAI, or other providers individually.
- Access to multiple models. Switch between Claude, GPT, and other models without configuring separate providers.
- Built-in billing. Add credits once and use them across any available model.
- Free options available. Some models are completely free to use for learning and experimentation.
Finding Free Models
Search “free” in the model selector to find no-cost options. Free models display a FREE tag, and some include `:free` in their model name. These are great for learning Cline’s capabilities before committing to paid usage.
Adding Credits
When you’re ready to use paid models, click the Add Credits button in the Cline settings or visit your account page. Credits work across all models available through the Cline provider.

Other Providers
If you prefer using your own API keys directly with providers, Cline supports all major options.

| Provider | Best For | Setup Guide |
|---|---|---|
| OpenRouter | Multiple models, competitive pricing | OpenRouter Setup |
| Anthropic | Direct Claude access, reliable tool usage | Anthropic Setup |
| Claude Code | Use your Claude Max/Pro subscription | Claude Code Setup |
| OpenAI | GPT models | OpenAI Setup |
| Google Gemini | Large context windows | Gemini Setup |
| DeepSeek | Great value for complex tasks | DeepSeek Setup |
Running Models Locally
For complete privacy or to avoid per-request costs, you can run models on your own hardware.

| Provider | Best For | Setup Guide |
|---|---|---|
| Ollama | Easy local setup, wide model selection | Ollama Setup |
| LM Studio | GUI-based local model management | LM Studio Setup |
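Before Cline can use a local model, it has to be downloaded. A sketch of the usual Ollama workflow (the model name below is only an example; any model from the Ollama library works):

```shell
# Download a model and confirm it is available locally before
# selecting it in Cline. Requires the ollama CLI to be installed.
if command -v ollama > /dev/null 2>&1; then
  ollama pull qwen2.5-coder || true   # download an example coding model
  ollama list                         # confirm it appears in the local library
  installed="yes"
else
  echo "ollama is not installed -- see https://ollama.com"
  installed="no"
fi
```

LM Studio offers the same steps through its GUI: search for a model, download it, then start the local server from the Developer tab.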
When to Use Your Own API Key
You might prefer managing your own API keys if:
- Your organization requires specific billing arrangements
- You need higher rate limits than aggregated providers offer
- You want access to beta features or models not yet available through Cline
- You’re running models locally for privacy or cost reasons
Which Model Should I Choose?
| Your Priority | Recommended Model |
|---|---|
| Reliability | Claude Sonnet 4 |
| Value | Qwen3 Coder |
| Speed | Cerebras GLM 4.6 |
| Privacy | Any Ollama/LM Studio model |
| Use existing subscription | Claude Code with Max/Pro |
Want to learn more about LLMs and models? Check out Chapter 2 of AI Coding University with Cline.

