Integrating the Anthropic Claude API with a Website
The Claude API from Anthropic provides access to the Claude model family (Haiku, Sonnet, Opus). Claude distinguishes itself with a long context window (200K tokens in Claude 3), precise instruction following, and strong performance on code and structured text.
Basic Integration
// Community PHP SDK in a Laravel app (constructor signature varies by package)
use Anthropic\Anthropic;

$client = new Anthropic(['apiKey' => config('services.anthropic.api_key')]);
$message = $client->messages->create([
'model' => 'claude-3-5-haiku-20241022',
'max_tokens' => 1024,
'system' => 'You are a product documentation assistant.',
'messages' => [
['role' => 'user', 'content' => $userQuestion],
],
]);
$answer = $message->content[0]->text;
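In the Python backend used in the sections below, it helps to keep request construction in one place so the model, system prompt, and limits are configured consistently. A minimal sketch (the helper name `build_doc_assistant_request` is illustrative, not part of the SDK):

```python
def build_doc_assistant_request(user_question: str,
                                model: str = 'claude-3-5-haiku-20241022',
                                max_tokens: int = 1024) -> dict:
    """Assemble keyword arguments for client.messages.create()."""
    return {
        'model': model,
        'max_tokens': max_tokens,
        'system': 'You are a product documentation assistant.',
        'messages': [{'role': 'user', 'content': user_question}],
    }

# Usage (client is an anthropic.Anthropic() instance):
# message = client.messages.create(**build_doc_assistant_request('How do I reset my password?'))
# answer = message.content[0].text
```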
Streaming
import anthropic

client = anthropic.AsyncAnthropic()

async def stream_response(user_message: str):
    async with client.messages.stream(
        model='claude-3-5-haiku-20241022',
        max_tokens=1024,
        messages=[{'role': 'user', 'content': user_message}],
    ) as stream:
        async for text in stream.text_stream:
            yield text
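To surface the stream on a website, each text chunk is typically wrapped as a server-sent event for the browser. A framework-agnostic sketch (`format_sse` is an illustrative name; plug it between the stream and your HTTP response):

```python
def format_sse(chunks):
    """Wrap text chunks as server-sent events ('data: ...' lines).

    Multi-line chunks are split so each SSE data line stays valid;
    a [DONE] sentinel tells the client the stream has ended.
    """
    for chunk in chunks:
        for line in chunk.split('\n'):
            yield f'data: {line}\n'
        yield '\n'
    yield 'data: [DONE]\n\n'
```

On the frontend, an `EventSource` (or a fetch reader) consumes these events and appends each chunk to the page as it arrives.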
Structured Output via Tools (tool_use)
Claude supports tool_use for obtaining structured JSON:
tools = [{
    'name': 'extract_product_info',
    'description': 'Extract product information from text',
    'input_schema': {
        'type': 'object',
        'properties': {
            'name': {'type': 'string'},
            'price': {'type': 'number'},
            'sku': {'type': 'string'},
            'description': {'type': 'string'},
            'features': {'type': 'array', 'items': {'type': 'string'}},
        },
        'required': ['name', 'price'],
    },
}]

response = client.messages.create(
    model='claude-3-5-sonnet-20241022',
    max_tokens=1024,
    tools=tools,
    messages=[{
        'role': 'user',
        'content': f'Extract product information:\n\n{product_description}',
    }],
)

# Extract the tool_use block from the response
product_data = None
for block in response.content:
    if block.type == 'tool_use' and block.name == 'extract_product_info':
        product_data = block.input
        break
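Even with `input_schema`, it is prudent to validate the returned input before persisting it: models occasionally omit a field or return the wrong type. A minimal check mirroring the schema above (the helper name is illustrative):

```python
# Required fields and accepted types, mirroring the input_schema above
REQUIRED_FIELDS = {'name': str, 'price': (int, float)}

def validate_product_input(data: dict) -> list:
    """Return a list of problems; an empty list means the input is usable."""
    problems = []
    for field, expected in REQUIRED_FIELDS.items():
        if field not in data:
            problems.append(f'missing required field: {field}')
        elif not isinstance(data[field], expected):
            problems.append(f'wrong type for {field}: {type(data[field]).__name__}')
    return problems
```

If the list is non-empty, retry the request or fall back to manual review rather than writing bad data to the catalog.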
Long Context (200K tokens)
A key advantage of Claude is the ability to pass an entire document or codebase into the context:
# Analyze a long document (e.g., a PDF converted to text)
with open('contract.txt', 'r') as f:
    document = f.read()

response = client.messages.create(
    model='claude-3-5-sonnet-20241022',
    max_tokens=2048,
    messages=[{
        'role': 'user',
        'content': f"Read the contract and answer: what are the termination conditions?\n\nContract:\n{document}",
    }],
)
Vision: Image Analysis
import base64

with open('screenshot.png', 'rb') as f:
    image_data = base64.standard_b64encode(f.read()).decode('utf-8')

response = client.messages.create(
    model='claude-3-5-sonnet-20241022',
    max_tokens=1024,
    messages=[{
        'role': 'user',
        'content': [
            {'type': 'image', 'source': {'type': 'base64', 'media_type': 'image/png', 'data': image_data}},
            {'type': 'text', 'text': 'Describe what is shown in the screenshot'},
        ],
    }],
)
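The API accepts JPEG, PNG, GIF, and WebP, and the `media_type` must match the actual file. A small helper that builds the image content block from a path, inferring the media type from the extension (the helper name is illustrative):

```python
import base64
import mimetypes

SUPPORTED = ('image/jpeg', 'image/png', 'image/gif', 'image/webp')

def image_block(path: str) -> dict:
    """Build a base64 image content block for the messages API."""
    media_type, _ = mimetypes.guess_type(path)
    if media_type not in SUPPORTED:
        raise ValueError(f'unsupported image type: {media_type}')
    with open(path, 'rb') as f:
        data = base64.standard_b64encode(f.read()).decode('utf-8')
    return {'type': 'image',
            'source': {'type': 'base64', 'media_type': media_type, 'data': data}}
```

For user uploads on a website, validate the MIME type server-side from the file contents as well, not just the extension.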
Comparison with OpenAI
| Parameter | Claude 3.5 Sonnet | GPT-4o |
|---|---|---|
| Context | 200K tokens | 128K tokens |
| Instruction Following | High | High |
| Code | Excellent | Excellent |
| Price (input) | $3 / 1M | $2.50 / 1M |
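The per-million prices in the table translate directly into per-request budgets. A quick estimator (defaults use Claude 3.5 Sonnet's listed $3/1M input price; the $15/1M output price is stated here as an assumption to verify against current Anthropic pricing):

```python
def request_cost_usd(input_tokens: int, output_tokens: int,
                     input_per_m: float = 3.00,
                     output_per_m: float = 15.00) -> float:
    """Estimate the cost of one request in USD from token counts
    and per-million-token prices."""
    return (input_tokens * input_per_m + output_tokens * output_per_m) / 1_000_000

# A full 200K-token contract plus a 1K-token answer:
# request_cost_usd(200_000, 1_000)  # → 0.615
```

At roughly $0.60 per full-context request, long-context analysis is the cost driver; routine chat on Haiku is orders of magnitude cheaper.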
Timeline
Basic chat integration: 1–2 days. Structured output and vision: +2 days.