Overview
defineModel creates a model configuration that can be referenced by prompts and agents. Models use provider factory functions imported from provider packages for typed configuration.
import { defineModel } from '@standardagents/spec';
import { openai } from '@standardagents/openai';
export default defineModel({
name: 'heavy-thinking',
provider: openai,
model: 'gpt-4o',
inputPrice: 2.5,
outputPrice: 10,
});
Type Definition
import type { ProviderFactoryWithOptions, ModelCapabilities } from '@standardagents/spec';
import type { ZodTypeAny, z } from 'zod';
function defineModel<
N extends string,
P extends ProviderFactoryWithOptions<ZodTypeAny>
>(options: ModelDefinition<N, P>): ModelDefinition<N, P>;
interface ModelDefinition<
N extends string = string,
P extends ProviderFactoryWithOptions<ZodTypeAny> = ProviderFactoryWithOptions<ZodTypeAny>
> {
name: N;
provider: P;
model: string;
inputPrice?: number;
outputPrice?: number;
cachedPrice?: number;
fallbacks?: string[];
includedProviders?: string[];
capabilities?: ModelCapabilities;
providerOptions?: InferProviderOptions<P>;
providerTools?: string[];
}
interface ModelCapabilities {
reasoningLevels?: Record<number, string | null>;
supportsImages?: boolean;
supportsToolCalls?: boolean;
supportsStreaming?: boolean;
supportsJsonMode?: boolean;
maxContextTokens?: number;
maxOutputTokens?: number;
}
// Automatically infers the correct type from the provider's schema
type InferProviderOptions<P> = P extends ProviderFactoryWithOptions<infer S>
? S extends ZodTypeAny ? z.input<S> : Record<string, unknown>
: Record<string, unknown>;
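The inference performed by InferProviderOptions can be exercised with simplified stand-in types. The sketch below is illustrative only; Schema, toyProvider, and the inlined factory interface are stand-ins, not the real '@standardagents/spec' or zod internals:

```typescript
// Simplified stand-ins for illustration only (not the real
// '@standardagents/spec' or zod internals).
type Schema<T> = { _input: T };
interface ProviderFactoryWithOptions<S> { optionsSchema: S }

// Same conditional-type shape as InferProviderOptions above,
// written against the stand-ins.
type InferProviderOptions<P> = P extends ProviderFactoryWithOptions<infer S>
  ? S extends Schema<infer T> ? T : Record<string, unknown>
  : Record<string, unknown>;

// A toy provider whose schema allows only an optional `seed`.
const toyProvider = {
  optionsSchema: { _input: undefined as unknown as { seed?: number } },
};

// ToyOptions resolves to { seed?: number }: valid keys type-check,
// unknown keys are rejected at compile time.
type ToyOptions = InferProviderOptions<typeof toyProvider>;
const opts: ToyOptions = { seed: 42 };
// const bad: ToyOptions = { temprature: 1 }; // type error
```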
Parameters
name (string, required)
Unique identifier for this model configuration. Used to reference the model in prompts and agents.
Best Practice: Name models by use case, not by model ID:
- heavy-thinking - Complex reasoning tasks
- fast-response - Quick, simple responses
- code-generation - Code-focused tasks
provider (required)
Provider factory function imported from a provider package.
import { openai } from '@standardagents/openai';
import { openrouter } from '@standardagents/openrouter';
provider: openai // Direct OpenAI API
provider: openrouter // Multi-provider gateway
The provider determines which providerOptions are available. TypeScript provides autocompletion based on the provider’s schema.
model (string, required)
The specific model identifier sent to the provider API.
OpenAI examples:
- gpt-4o
- gpt-4o-mini
- o1-mini
OpenRouter format: provider/model-name
- anthropic/claude-sonnet-4
- google/gemini-2.0-flash-exp
- meta-llama/llama-3.3-70b-instruct
inputPrice (number, optional)
Price per 1 million input tokens in USD. Required for the OpenAI provider. OpenRouter models fetch pricing automatically from the API.
outputPrice (number, optional)
Price per 1 million output tokens in USD. Required for the OpenAI provider.
cachedPrice (number, optional)
Price per 1 million cached tokens in USD (if the provider supports caching).
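Since prices are quoted per 1 million tokens, a request's dollar cost is a simple proportion. The sketch below is illustrative arithmetic only; requestCostUSD is not a framework API:

```typescript
// Illustrative cost arithmetic only (requestCostUSD is not a framework API).
// Prices are USD per 1,000,000 tokens, so cost = tokens / 1e6 * price.
function requestCostUSD(
  inputTokens: number,
  outputTokens: number,
  inputPrice: number,   // USD per 1M input tokens
  outputPrice: number,  // USD per 1M output tokens
): number {
  return (inputTokens / 1e6) * inputPrice + (outputTokens / 1e6) * outputPrice;
}

// gpt-4o at $2.50 in / $10 out, for 10k input + 2k output tokens:
const cost = requestCostUSD(10_000, 2_000, 2.5, 10); // ≈ $0.045
```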
fallbacks (string[], optional)
Array of model names to try if the primary model fails. Must reference other models defined with defineModel.
Fallbacks are tried for:
- Network errors
- Rate limits (429)
- Server errors (5xx)
- Authentication errors (401)
includedProviders (string[], optional)
OpenRouter only. Only route requests to the listed providers for this model.
includedProviders: ['anthropic'] // Only route to Anthropic
capabilities (ModelCapabilities, optional)
Model capability flags for framework behavior. See the Capabilities section.
providerOptions (optional)
Provider-specific options. The type is inferred from the provider’s schema. See the Provider Options section for available options per provider.
providerTools (string[], optional)
Enable built-in provider tools by name. Available tools depend on the provider and model.
providerTools: ['web_search', 'code_interpreter']
Provider Options
OpenAI
import { openai } from '@standardagents/openai';
defineModel({
name: 'gpt-4o',
provider: openai,
model: 'gpt-4o',
providerOptions: {
service_tier: 'auto', // 'auto' | 'default' | 'flex'
user: 'user-123',
seed: 42,
frequency_penalty: 0.5, // -2.0 to 2.0
presence_penalty: 0.5, // -2.0 to 2.0
logprobs: true,
top_logprobs: 5, // 0-20
store: true,
metadata: { key: 'value' },
},
});
OpenRouter
import { openrouter } from '@standardagents/openrouter';
defineModel({
name: 'claude-sonnet',
provider: openrouter,
model: 'anthropic/claude-sonnet-4',
providerOptions: {
provider: {
order: ['anthropic', 'openai'],
allow_fallbacks: true,
require_parameters: true,
data_collection: 'allow', // 'allow' | 'deny'
zdr: true,
only: ['anthropic'],
ignore: ['azure'],
sort: 'price', // 'price' | 'throughput' | 'latency'
max_price: {
prompt: 1,
completion: 5,
request: 0.01,
image: 0.05,
},
quantizations: ['fp16', 'bf16'],
preferred_min_throughput: 100,
preferred_max_latency: 1,
},
},
});
Capabilities
capabilities: {
// Reasoning support (0-100 scale → model values)
reasoningLevels: {
0: null, // No reasoning
33: 'low', // Low effort
66: 'medium', // Medium effort
100: 'high', // High effort
},
// Feature support
supportsImages: true, // Can process image inputs
supportsToolCalls: true, // Supports function calling
supportsStreaming: true, // Supports streaming responses
supportsJsonMode: true, // Supports structured JSON output
// Token limits
maxContextTokens: 128000, // Max context window
maxOutputTokens: 16384, // Max output tokens
}
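One plausible way to resolve a 0-100 reasoning request against reasoningLevels is to take the value attached to the highest defined key at or below the request. This is an assumption for illustration, not the framework's documented algorithm:

```typescript
// Assumption for illustration: pick the value of the highest defined level
// at or below the requested 0-100 value. Not the framework's documented
// algorithm.
const reasoningLevels: Record<number, string | null> = {
  0: null,
  33: 'low',
  66: 'medium',
  100: 'high',
};

function resolveReasoning(requested: number): string | null {
  const keys = Object.keys(reasoningLevels)
    .map(Number)
    .sort((a, b) => a - b);
  let chosen = keys[0];
  for (const k of keys) {
    if (k <= requested) chosen = k;
  }
  return reasoningLevels[chosen];
}

resolveReasoning(50); // 'low'  (33 is the highest level <= 50)
resolveReasoning(80); // 'medium'
```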
Return Value
Returns the same ModelDefinition object passed in, enabling the build system to register it.
Examples
OpenAI Model
import { defineModel } from '@standardagents/spec';
import { openai } from '@standardagents/openai';
export default defineModel({
name: 'heavy-thinking',
provider: openai,
model: 'gpt-4o',
inputPrice: 2.5,
outputPrice: 10,
capabilities: {
supportsImages: true,
supportsToolCalls: true,
supportsJsonMode: true,
maxContextTokens: 128000,
},
providerOptions: {
service_tier: 'default',
},
});
OpenRouter Model
import { defineModel } from '@standardagents/spec';
import { openrouter } from '@standardagents/openrouter';
export default defineModel({
name: 'claude-sonnet',
provider: openrouter,
model: 'anthropic/claude-sonnet-4',
capabilities: {
supportsImages: true,
supportsToolCalls: true,
maxContextTokens: 200000,
},
providerOptions: {
provider: {
zdr: true,
only: ['anthropic'],
},
},
});
OpenRouter models automatically fetch pricing from the OpenRouter API. You do not need to specify inputPrice or outputPrice.
With Fallbacks
import { defineModel } from '@standardagents/spec';
import { openai } from '@standardagents/openai';
// Define fallback model first
export const fallbackCheap = defineModel({
name: 'fallback-cheap',
provider: openai,
model: 'gpt-4o-mini',
inputPrice: 0.15,
outputPrice: 0.60,
});
// Primary model with fallback
export default defineModel({
name: 'primary',
provider: openai,
model: 'gpt-4o',
fallbacks: ['fallback-cheap'],
inputPrice: 2.5,
outputPrice: 10,
});
With Provider Tools
import { defineModel } from '@standardagents/spec';
import { openai } from '@standardagents/openai';
export default defineModel({
name: 'gpt-4o-with-tools',
provider: openai,
model: 'gpt-4o',
inputPrice: 2.5,
outputPrice: 10,
providerTools: ['web_search', 'code_interpreter', 'file_search'],
});
Reasoning Model
import { defineModel } from '@standardagents/spec';
import { openai } from '@standardagents/openai';
export default defineModel({
name: 'deep-reasoning',
provider: openai,
model: 'o1',
inputPrice: 15,
outputPrice: 60,
capabilities: {
supportsToolCalls: true,
reasoningLevels: {
0: null,
33: 'low',
66: 'medium',
100: 'high',
},
},
});
File Location
Models are auto-discovered from agents/models/:
agents/
└── models/
├── heavy_thinking.ts
├── fast_response.ts
└── budget.ts
Requirements:
- Use snake_case for file names
- One model per file
- Default export required
Generated Types
After running the build, the StandardAgents.Models type is generated:
declare namespace StandardAgents {
type Models = 'heavy-thinking' | 'fast-response' | 'budget';
}
This enables type-safe model references in prompts.
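For illustration, a function constrained to the generated union rejects unknown model names at compile time. pickModel is a hypothetical helper, and the alias below inlines an equivalent union so the sketch stands alone:

```typescript
// Hypothetical usage sketch. StandardAgents.Models is emitted by the build;
// this alias inlines an equivalent union so the example is self-contained.
type Models = 'heavy-thinking' | 'fast-response' | 'budget';

// A function constrained to the generated union (pickModel is illustrative).
function pickModel(name: Models): Models {
  return name;
}

pickModel('heavy-thinking');   // OK
// pickModel('heavy-thinkng'); // type error: not in the generated union
```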
Runtime Validation
defineModel performs runtime validation:
- name is required and must be a non-empty string
- provider must be a valid ProviderFactory function
- model is required
- Pricing values must be non-negative if provided
- providerOptions are validated against the provider’s schema
// This will throw at runtime:
defineModel({
name: 'my-model',
provider: 'openai', // Error: Must be a ProviderFactory, not a string
model: 'gpt-4o',
});