pub fn send_request(
provider: &Provider,
system_prompt: &str,
user_prompt: &str,
) -> Result<InferenceResponse>
Send a request to an AI inference provider
This function handles the complete process of sending a request to an AI provider, including:
- Building the appropriate HTTP client with timeout configuration
- Formatting the request according to the provider’s API specification
- Sending the request and handling HTTP errors
- Parsing the response into a unified InferenceResponse structure
- Tracking response time and token usage
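The timeout configuration in the first step can be sketched with std::time::Duration; the millisecond field names mirror the Provider example below, but the helper itself is illustrative, not the crate's code:

```rust
use std::time::Duration;

// Sketch: convert the provider's millisecond settings into Durations,
// in the form an HTTP client builder would consume them (builder API omitted).
fn timeouts(connection_timeout_ms: u64, request_timeout_ms: u64) -> (Duration, Duration) {
    (
        Duration::from_millis(connection_timeout_ms),
        Duration::from_millis(request_timeout_ms),
    )
}

fn main() {
    let (connect, request) = timeouts(30_000, 120_000);
    assert_eq!(connect.as_secs(), 30);
    assert_eq!(request.as_secs(), 120);
}
```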
§Arguments
- provider - The AI provider configuration containing API details, model settings, and timeouts
- system_prompt - The system prompt that defines the AI’s behavior or context
- user_prompt - The user prompt containing the actual request or question
§Returns
Returns a Result<InferenceResponse> containing either:
- Ok(InferenceResponse) with the generated text, token usage, and timing information
- Err(anyhow::Error) with detailed error context if the request fails
§Errors
This function will return an error if:
- The HTTP client cannot be created with the specified timeout settings
- The HTTP request fails (network errors, invalid URLs, etc.)
- The API returns a non-successful HTTP status code
- The response cannot be parsed into the expected JSON format
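The failure modes above can be summarized as a simple classification; this is illustrative only (the names are not from the crate, and the real function wraps each case with anyhow error context):

```rust
// Illustrative classification of send_request's documented failure modes.
// The status and JSON checks mirror the list above; the helper is hypothetical.
fn classify(status: u16, body_is_json: bool) -> Result<(), String> {
    if !(200..300).contains(&status) {
        return Err(format!("API returned non-success status {status}"));
    }
    if !body_is_json {
        return Err("response could not be parsed into the expected JSON format".to_string());
    }
    Ok(())
}

fn main() {
    assert!(classify(200, true).is_ok());
    assert!(classify(500, true).is_err());
    assert!(classify(200, false).is_err());
}
```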
§Supported Providers
The function currently supports:
- OpenAI: Uses the /chat/completions endpoint with proper authentication headers
- Ollama: Uses the /api/chat endpoint with local inference support
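The endpoint selection described above can be sketched as follows; only the two endpoint paths come from this documentation, while the helper and the OpenAI base URL are assumptions for illustration:

```rust
// Hypothetical sketch: derive the full endpoint URL from the provider kind.
// Only the two paths (/chat/completions and /api/chat) are documented.
enum Api {
    OpenAi,
    Ollama,
}

fn endpoint(api: &Api, base_url: &str) -> String {
    let base = base_url.trim_end_matches('/');
    match api {
        Api::OpenAi => format!("{base}/chat/completions"),
        Api::Ollama => format!("{base}/api/chat"),
    }
}

fn main() {
    assert_eq!(
        endpoint(&Api::Ollama, "http://localhost:11434"),
        "http://localhost:11434/api/chat"
    );
    assert_eq!(
        endpoint(&Api::OpenAi, "https://api.openai.com/v1"),
        "https://api.openai.com/v1/chat/completions"
    );
}
```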
§Example
use aimx::inference::{send_request, Provider, Api, Model, Capability};
let provider = Provider {
    api: Api::Ollama,
    url: "http://localhost:11434".to_string(),
    key: "".to_string(), // No key needed for local Ollama
    model: Model::Standard,
    capability: Capability::Standard,
    fast: "llama3.2".to_string(),
    standard: "llama3.2".to_string(),
    planning: "llama3.2".to_string(),
    temperature: 0.7,
    max_tokens: 1000,
    connection_timeout_ms: 30000,
    request_timeout_ms: 120000,
};
let response = send_request(&provider, "You are a helpful assistant", "Tell me a joke");
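A caller will typically match on the returned Result. A minimal sketch with a stand-in response type (the text and total_tokens field names are assumptions, not necessarily the crate's exact fields):

```rust
// Stand-in for aimx's InferenceResponse; field names here are assumptions.
struct InferenceResponse {
    text: String,
    total_tokens: u32,
}

// Turn the outcome into a displayable line, covering both arms of the Result.
fn render(response: Result<InferenceResponse, String>) -> String {
    match response {
        Ok(r) => format!("{} ({} tokens)", r.text, r.total_tokens),
        Err(e) => format!("inference failed: {e}"),
    }
}

fn main() {
    // In real code this value would come from send_request(...).
    let response: Result<InferenceResponse, String> = Ok(InferenceResponse {
        text: "Here is a joke.".to_string(),
        total_tokens: 42,
    });
    println!("{}", render(response));
}
```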