Cline needs an AI model to understand your requests and write code. This guide walks you through selecting a provider and model so you can start coding with AI assistance.
Watch: Selecting Your Model walks through choosing and configuring your first model.

Setting Up Your Provider

AI models can live in the cloud or run locally on your machine. Cloud providers like Anthropic host models on their servers and charge per use. Local options like Ollama let you run models directly on your hardware for complete privacy and no per-request costs. Here’s how to configure any provider in Cline:
1. Open Cline Settings

Click the settings icon in the top-right corner of the Cline panel. This opens the configuration view where you’ll set up your provider and model.
2. Select Your Provider

Click the API Provider dropdown at the top. You’ll see a list of all supported providers. Select the one you want to use.
If you’re not sure which to pick:
  • Choose Cline if you want the simplest setup with no API key management
  • Choose OpenRouter if you want access to many models through one API key
  • Choose Anthropic if you want direct access to Claude models
  • Choose Ollama or LM Studio if you want to run models locally on your machine
3. Enter Your API Key

Most cloud providers require an API key. After selecting a provider, you’ll see an API key field. To get a key:
  1. Go to your provider’s website and create an account
  2. Navigate to their API keys section (usually under Settings or API)
  3. Generate a new API key
  4. Copy and paste it into the API key field in Cline
API keys are stored securely in your system’s credential manager. Cline never sends your keys anywhere except directly to the provider you’ve selected.
If you’re using a local provider like Ollama, you don’t need an API key. Just make sure the local server is running and Cline will connect to it automatically.
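If you want to confirm the local server is reachable before pointing Cline at it, a quick request to its API will show both that it’s running and which models are downloaded. This is a minimal sketch assuming Ollama’s default address (http://localhost:11434); adjust the host or port if you’ve changed them (LM Studio exposes its own local endpoint instead).

```python
# Minimal check that a local Ollama server is running and has models available.
# Assumes Ollama's default address (http://localhost:11434); adjust if yours differs.
import requests

OLLAMA_URL = "http://localhost:11434"

try:
    # /api/tags lists the models currently downloaded on the local server
    response = requests.get(f"{OLLAMA_URL}/api/tags", timeout=5)
    response.raise_for_status()
    models = response.json().get("models", [])
    if models:
        print("Ollama is running. Downloaded models:")
        for model in models:
            print(f"  - {model['name']}")
    else:
        print("Ollama is running, but no models are downloaded yet.")
except requests.ConnectionError:
    print("Could not reach Ollama. Is the server running?")
```
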
4. Choose Your Model

Once your provider is configured, the Model dropdown populates with available options. Click it to see all models your provider offers.
Things to consider when choosing a model:
  • Context window size determines how much code and conversation the model can “see” at once. Larger is generally better for complex tasks (a rough way to estimate your own token counts is sketched after this list).
  • Speed varies significantly between models. Smaller models respond faster but may be less capable.
  • Cost differs across cloud models. Local models are free to run after initial download.
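If you’re unsure how context window sizes relate to your own project, a rough estimate helps. The sketch below uses the common approximation of about four characters per token; the file path and window sizes are illustrative placeholders, not figures for any specific model.

```python
# Rough estimate of how much of a file fits in a model's context window.
# Uses the common ~4 characters-per-token rule of thumb; real tokenizers vary,
# and the window sizes below are illustrative round numbers, not exact specs.
from pathlib import Path

CHARS_PER_TOKEN = 4  # rough heuristic, not an exact tokenizer

def estimate_tokens(path: str) -> int:
    text = Path(path).read_text(encoding="utf-8", errors="ignore")
    return len(text) // CHARS_PER_TOKEN

tokens = estimate_tokens("src/main.py")  # hypothetical file path
for window in (32_000, 128_000, 200_000):  # illustrative context window sizes
    share = tokens / window * 100
    print(f"~{tokens} tokens uses about {share:.1f}% of a {window:,}-token window")
```
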
5. Verify Your Setup

Type a simple message and press Enter. If you see a response, you’re all set.
If you get an error:
  • Double-check that your API key is correct (no extra spaces or missing characters); you can also test the key outside Cline, as sketched after this list
  • Verify your provider account has billing set up or credits available
  • For local models, confirm the server is running and accessible
  • Check that you’ve selected a model that’s available on your plan
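If you suspect the key itself, you can test it outside Cline with a one-off request to your provider. The sketch below assumes an Anthropic key read from an environment variable and uses a placeholder model ID; other providers have similar minimal endpoints, so check their API docs for the exact URL and headers.

```python
# Minimal sanity check for an Anthropic API key outside of Cline.
# The model ID below is a placeholder; substitute one your account has access to.
import os
import requests

API_KEY = os.environ["ANTHROPIC_API_KEY"]  # export your key rather than hardcoding it

response = requests.post(
    "https://api.anthropic.com/v1/messages",
    headers={
        "x-api-key": API_KEY,
        "anthropic-version": "2023-06-01",
        "content-type": "application/json",
    },
    json={
        "model": "claude-sonnet-4-20250514",  # placeholder; use a model available to you
        "max_tokens": 16,
        "messages": [{"role": "user", "content": "Reply with OK."}],
    },
    timeout=30,
)

if response.ok:
    print("Key works:", response.json()["content"][0]["text"])
else:
    # 401 usually means a bad key; a 400 model error means the key is fine
    # but that model ID isn't available to your account.
    print(response.status_code, response.text)
```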

Cline Provider

The easiest way to get started is to select Cline as your provider. This handles all API key management and billing through a single account.
Benefits of using the Cline provider:
  • No API key juggling. You don’t need to create accounts with Anthropic, OpenAI, or other providers individually.
  • Access to multiple models. Switch between Claude, GPT, and other models without configuring separate providers.
  • Built-in billing. Add credits once and use them across any available model.
  • Free options available. Some models are completely free to use for learning and experimentation.

Finding Free Models

Search “free” in the model selector to find no-cost options. Free models display a FREE tag, and some include :free in their model name. These are great for learning Cline’s capabilities before committing to paid usage.
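The “:free” suffix matches the convention used in OpenRouter’s public catalog, which you can also query directly if you’re curious what’s currently free. A minimal sketch, assuming OpenRouter’s public /api/v1/models endpoint and its current response shape; inside the editor, the FREE tag in the model selector is the simpler route.

```python
# List models whose IDs carry the ":free" suffix from OpenRouter's public catalog.
# Assumes the /api/v1/models endpoint and its current JSON shape ({"data": [{"id": ...}]}).
import requests

response = requests.get("https://openrouter.ai/api/v1/models", timeout=10)
response.raise_for_status()

free_ids = [m["id"] for m in response.json()["data"] if m["id"].endswith(":free")]
print(f"{len(free_ids)} free models found, for example:")
for model_id in free_ids[:10]:
    print(" ", model_id)
```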

Adding Credits

When you’re ready to use paid models, click the Add Credits button in the Cline settings or visit your account page. Credits work across all models available through the Cline provider.

Other Providers

If you prefer using your own API keys directly with providers, Cline supports all major options.
Provider         Best For                                      Setup Guide
OpenRouter       Multiple models, competitive pricing          OpenRouter Setup
Anthropic        Direct Claude access, reliable tool usage     Anthropic Setup
Claude Code      Use your Claude Max/Pro subscription          Claude Code Setup
OpenAI           GPT models                                    OpenAI Setup
Google Gemini    Large context windows                         Gemini Setup
DeepSeek         Great value for complex tasks                 DeepSeek Setup

Running Models Locally

For complete privacy or to avoid per-request costs, you can run models on your own hardware.
Provider     Best For                                  Setup Guide
Ollama       Easy local setup, wide model selection    Ollama Setup
LM Studio    GUI-based local model management          LM Studio Setup
Local models require sufficient hardware (especially GPU memory) to run effectively. Check the Running Models Locally guide for hardware recommendations.
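Before relying on a local model inside Cline, it can be worth timing a small generation to see how your hardware copes. A minimal sketch, again assuming Ollama’s default address; the model name is only an example, so substitute one you’ve actually pulled.

```python
# Rough throughput check for a locally hosted model before using it in Cline.
# Assumes Ollama's default address; the model name is an example, so replace it
# with one from `ollama list`.
import time
import requests

OLLAMA_URL = "http://localhost:11434"
MODEL = "qwen2.5-coder:7b"  # example only; use a model you've pulled

start = time.perf_counter()
response = requests.post(
    f"{OLLAMA_URL}/api/generate",
    json={"model": MODEL, "prompt": "Write a one-line Python hello world.", "stream": False},
    timeout=300,
)
response.raise_for_status()
elapsed = time.perf_counter() - start

data = response.json()
tokens = data.get("eval_count", 0)  # output tokens, if the server reports them
if tokens:
    print(f"Generated {tokens} tokens in {elapsed:.1f}s (~{tokens / elapsed:.1f} tokens/s)")
else:
    print(f"Finished in {elapsed:.1f}s")
```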

When to Use Your Own API Key

You might prefer managing your own API keys if:
  • Your organization requires specific billing arrangements
  • You need higher rate limits than aggregated providers offer
  • You want access to beta features or models not yet available through Cline
  • You’re running models locally for privacy or cost reasons

Which Model Should I Choose?

Your Priority                Recommended Model
Reliability                  Claude Sonnet 4
Value                        Qwen3 Coder
Speed                        Cerebras GLM 4.6
Privacy                      Any Ollama or LM Studio model
Use existing subscription    Claude Code with Max/Pro
Want to learn more about LLMs and models? Check out Chapter 2 of AI Coding University with Cline.