Introduction
To build AI agents, you need access to large language model APIs. This section walks you through obtaining and configuring API keys for the major providers: Anthropic (Claude), OpenAI, and Google (Gemini).
Security First: API keys are like passwords to your AI infrastructure. Never commit them to version control, share them publicly, or expose them in client-side code.
Anthropic API (Claude)
Getting Your API Key
- Go to console.anthropic.com
- Create an account or sign in
- Navigate to API Keys in the left sidebar
- Click Create Key
- Name your key (e.g., "agentic-ai-book") and copy it immediately
Copy Your Key Now
Anthropic only shows your API key once. If you lose it, you'll need to create a new one.
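Before running the test script, a quick sanity check confirms the key is actually visible to Python. This helper is a small illustration (the function name is my own, not part of any SDK):

```python
import os

def key_status(var: str = "ANTHROPIC_API_KEY") -> str:
    """Report whether an API key variable is visible to this Python process."""
    key = os.environ.get(var)
    if not key:
        return f"{var} is not set - export it or add it to your .env file"
    return f"{var} is set ({len(key)} characters)"

print(key_status())
```

Note that this only checks visibility, not validity; the test scripts below verify the key actually works against the API.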
Testing Your Key
🐍test_anthropic.py
```python
import anthropic

# Initialize the client (reads ANTHROPIC_API_KEY from environment)
client = anthropic.Anthropic()

# Test with a simple message
response = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "Say 'Hello, Agent Builder!' in exactly 3 words."}
    ]
)

print(response.content[0].text)
# Expected output: "Hello, Agent Builder!"
```
Available Models
| Model | Identifier | Best For |
|---|---|---|
| Claude Opus 4 | claude-opus-4-20250514 | Complex reasoning, research, coding |
| Claude Sonnet 4 | claude-sonnet-4-20250514 | Balanced performance and cost |
| Claude Haiku 3.5 | claude-3-5-haiku-20241022 | Fast, lightweight tasks |
Start with Sonnet
Claude Sonnet 4 offers the best balance of capability and cost for agent development. Use Opus for complex tasks, Haiku for high-volume simple operations.
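The tip above can be sketched as a small selection helper. The task categories here are my own illustration, not an official taxonomy:

```python
# Map task categories to Claude model identifiers (categories are illustrative)
CLAUDE_MODELS = {
    "complex": "claude-opus-4-20250514",     # deep reasoning, research, coding
    "balanced": "claude-sonnet-4-20250514",  # default for agent development
    "simple": "claude-3-5-haiku-20241022",   # high-volume, lightweight tasks
}

def pick_model(task_type: str = "balanced") -> str:
    """Return a model identifier, falling back to Sonnet for unknown task types."""
    return CLAUDE_MODELS.get(task_type, CLAUDE_MODELS["balanced"])

print(pick_model("simple"))   # claude-3-5-haiku-20241022
```

Centralizing identifiers like this also means a model upgrade is a one-line change rather than a find-and-replace across your codebase.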
OpenAI API
Getting Your API Key
- Go to platform.openai.com
- Sign in or create an account
- Navigate to API keys in the left sidebar
- Click Create new secret key
- Name it and copy immediately
Testing Your Key
🐍test_openai.py
```python
from openai import OpenAI

# Initialize the client (reads OPENAI_API_KEY from environment)
client = OpenAI()

# Test with a simple completion
response = client.chat.completions.create(
    model="gpt-4o",
    max_tokens=100,
    messages=[
        {"role": "user", "content": "Say 'Hello, Agent Builder!' in exactly 3 words."}
    ]
)

print(response.choices[0].message.content)
# Expected output: "Hello, Agent Builder!"
```
Available Models
| Model | Identifier | Best For |
|---|---|---|
| GPT-4o | gpt-4o | General-purpose, multimodal |
| GPT-4o Mini | gpt-4o-mini | Faster, more affordable |
| o3 | o3 | Complex reasoning, agentic tasks |
| o4-mini | o4-mini | Efficient reasoning |
Google AI (Gemini)
Getting Your API Key
- Go to aistudio.google.com
- Sign in with your Google account
- Click Get API Key
- Create a key in a new or existing Google Cloud project
- Copy the generated key
Testing Your Key
🐍test_gemini.py
```python
import os

import google.generativeai as genai

# Configure with your API key
genai.configure(api_key=os.environ["GOOGLE_API_KEY"])

# Initialize the model
model = genai.GenerativeModel("gemini-2.0-flash")

# Test with a simple prompt
response = model.generate_content("Say 'Hello, Agent Builder!' in exactly 3 words.")

print(response.text)
# Expected output: "Hello, Agent Builder!"
```
Available Models
| Model | Identifier | Best For |
|---|---|---|
| Gemini 2.0 Flash | gemini-2.0-flash | Fast, multimodal, agentic |
| Gemini 2.0 Pro | gemini-2.0-pro | Advanced reasoning |
| Gemini 1.5 Pro | gemini-1.5-pro | Long context, code |
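With all three providers covered, a small lookup table keeps provider/model pairs in one place. The defaults below are illustrative choices drawn from the tables above, not official recommendations:

```python
# Illustrative default model per provider (not official recommendations)
DEFAULT_MODELS = {
    "anthropic": "claude-sonnet-4-20250514",
    "openai": "gpt-4o",
    "google": "gemini-2.0-flash",
}

def default_model(provider: str) -> str:
    """Look up the default model for a provider, case-insensitively."""
    try:
        return DEFAULT_MODELS[provider.lower()]
    except KeyError:
        raise ValueError(f"Unknown provider: {provider}") from None

print(default_model("OpenAI"))  # gpt-4o
```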
Environment Variables
The .env File
Store your API keys in a .env file at the root of your project:
⚡.env
```
# LLM API Keys
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here
OPENAI_API_KEY=sk-proj-your-key-here
GOOGLE_API_KEY=AIzaSy-your-key-here

# Optional: Default model preferences
DEFAULT_MODEL=claude-sonnet-4-20250514
DEFAULT_MAX_TOKENS=4096

# Optional: Logging level
LOG_LEVEL=INFO
```
Loading Environment Variables
🐍load_env.py
```python
import os

from dotenv import load_dotenv

# Load .env file at the start of your application
load_dotenv()

# Access keys (with validation)
def get_api_key(provider: str) -> str:
    key_map = {
        "anthropic": "ANTHROPIC_API_KEY",
        "openai": "OPENAI_API_KEY",
        "google": "GOOGLE_API_KEY",
    }

    env_var = key_map.get(provider.lower())
    if not env_var:
        raise ValueError(f"Unknown provider: {provider}")

    key = os.environ.get(env_var)
    if not key:
        raise ValueError(f"{env_var} not set in environment")

    return key

# Usage
anthropic_key = get_api_key("anthropic")
openai_key = get_api_key("openai")
```
Add .env to .gitignore
Never commit your .env file to version control. Make sure it's listed in your .gitignore.
Creating a .env.example
Create a template file that others can copy:
⚡.env.example
```
# Copy this file to .env and fill in your API keys
# Do NOT commit .env to version control

# Required: At least one LLM API key
ANTHROPIC_API_KEY=
OPENAI_API_KEY=
GOOGLE_API_KEY=

# Optional settings
DEFAULT_MODEL=claude-sonnet-4-20250514
LOG_LEVEL=INFO
```
API Best Practices
Key Management Rules
- Rotate regularly - Generate new keys periodically
- Use separate keys - Different keys for dev, staging, production
- Set spending limits - Configure billing alerts and caps
- Monitor usage - Track API calls for anomalies
- Revoke if compromised - Immediately regenerate exposed keys
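For the monitoring rule in particular, never write a raw key to logs. A small masking helper like this one (my own sketch, not part of any SDK) keeps log output safe:

```python
def mask_key(key: str, visible: int = 4) -> str:
    """Show only the last few characters of an API key, e.g. for log output."""
    if len(key) <= visible:
        return "*" * len(key)
    return "*" * (len(key) - visible) + key[-visible:]

print(mask_key("sk-ant-api03-abcd1234"))  # *****************1234
```

The trailing characters are enough to tell keys apart in logs without exposing anything usable.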
Rate Limiting and Retries
🐍api_wrapper.py
```python
import time
from typing import Callable, TypeVar

import anthropic

T = TypeVar("T")

def with_retry(
    func: Callable[[], T],
    max_retries: int = 3,
    base_delay: float = 1.0,
) -> T:
    """Execute a function with exponential backoff retry."""
    for attempt in range(max_retries):
        try:
            return func()
        except anthropic.RateLimitError:
            if attempt == max_retries - 1:
                raise
            delay = base_delay * (2 ** attempt)
            print(f"Rate limited. Retrying in {delay}s...")
            time.sleep(delay)
        except anthropic.APIError as e:
            if attempt == max_retries - 1:
                raise
            print(f"API error: {e}. Retrying...")
            time.sleep(base_delay)
    raise RuntimeError("Max retries exceeded")

# Usage
def call_claude():
    client = anthropic.Anthropic()
    return client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=100,
        messages=[{"role": "user", "content": "Hello"}]
    )

response = with_retry(call_claude)
```
Cost Management
| Strategy | Description |
|---|---|
| Use smaller models first | Start with Haiku/GPT-4o-mini, escalate if needed |
| Cache responses | Store and reuse results for identical queries |
| Truncate context | Only include relevant context to reduce tokens |
| Set max_tokens | Always limit response length appropriately |
| Batch requests | Combine multiple queries when possible |
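The caching strategy above can be sketched with an in-memory dict keyed by a hash of the prompt. `call_model` here is a hypothetical stand-in for a real API call, not a library function:

```python
import hashlib

_cache: dict[str, str] = {}

def cached_call(prompt: str, call_model) -> str:
    """Return a cached response for identical prompts, calling the model only once."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    if key not in _cache:
        _cache[key] = call_model(prompt)
    return _cache[key]

# Usage with a stand-in model function that records how often it is called
calls = []
def fake_model(prompt: str) -> str:
    calls.append(prompt)
    return f"response to: {prompt}"

cached_call("What is an agent?", fake_model)
cached_call("What is an agent?", fake_model)  # served from cache
print(len(calls))  # 1
```

In production you would likely add an expiry policy and persist the cache, but the principle is the same: identical queries should only be billed once.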
Development vs Production
During development, use shorter max_tokens and lower-cost models. Switch to higher-capability models for production or complex tasks.
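One way to implement this switch is an environment flag. The variable name `APP_ENV` and the specific model/token choices below are assumptions for illustration:

```python
import os

def model_config() -> tuple[str, int]:
    """Pick a (model, max_tokens) pair from a hypothetical APP_ENV variable."""
    if os.environ.get("APP_ENV", "development") == "production":
        return ("claude-sonnet-4-20250514", 4096)
    return ("claude-3-5-haiku-20241022", 512)  # cheap defaults for development

print(model_config())
```

Because the flag lives in the environment rather than in code, switching tiers requires no redeploy, only a config change.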
Summary
You should now have:
- API keys for Anthropic, OpenAI, and/or Google
- .env file configured with your keys
- .env.example template for sharing
- Understanding of rate limiting and cost management
Next Step: With your API keys configured, let's look at how to structure your agent projects for maintainability and scalability.