# Radkit LLM Agent SDK
This example integrates radkit — a Rust SDK for building reliable AI agent systems — with Zart’s durable execution. The workflow takes a natural language query like “Find breweries in Portland, Oregon”, uses an LLM to extract the location, fetches brewery data from an API, and generates a conversational summary.
Because the task needs an LLM provider as a dependency, this example uses the manual DurableExecution trait instead of the #[zart_durable] macro. This lets us define a struct with fields and inject the provider at construction time.
Features demonstrated:

- Manual `DurableExecution` trait with dependency injection
- radkit `LlmFunction<T>` structured outputs
- `#[zart_step]` step functions
- Mixed AI + API workflows
## Data types

```rust
#[derive(Debug, Clone, Serialize, Deserialize)]
struct AgentInput {
    query: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
struct ExtractedLocation {
    city: String,
    state: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
struct BreweryInfo {
    name: String,
    brewery_type: String,
    city: String,
    state: String,
}

#[derive(Debug, Clone, Serialize, Deserialize)]
struct AgentOutput {
    query: String,
    location: ExtractedLocation,
    breweries: Vec<BreweryInfo>,
    summary: String,
    generated_at: String,
}
```

## LLM schema type
Radkit uses `schemars::JsonSchema` and `radkit::macros::LLMOutput` to drive structured outputs. The LLM populates this type directly:

```rust
#[derive(Debug, Clone, Serialize, Deserialize, LLMOutput, schemars::JsonSchema)]
struct LocationExtraction {
    city: String,
    state: String,
}
```

## The Durable Execution struct
Instead of using `#[zart_durable]`, we define the struct manually. The LLM provider is stored as `Arc<dyn BaseLlm>` so it can be cloned into each step, and any radkit provider type can be injected:

```rust
use radkit::models::BaseLlm;
use std::sync::Arc;

struct RadkitAgent {
    llm: Arc<dyn BaseLlm>,
}
```

Step functions use `#[zart_step]` to receive the LLM as a parameter:
```rust
#[zart_step("extract-location", retry = "exponential(3, 2s)")]
async fn extract_location(
    llm: Arc<dyn BaseLlm>,
    query: &str,
) -> Result<ExtractedLocation, StepError> {
    println!("[extract-location] Attempt {}", zart::context().current_attempt + 1);

    let prompt = format!(
        r#"Extract the city and state from this query. Return valid JSON.

Query: "{query}"

Respond with only a JSON object with "city" and "state" fields."#
    );

    let function = LlmFunction::<LocationExtraction>::new_with_system_instructions(
        llm,
        "You are a location extraction assistant. \
         Always return valid JSON with city and state fields.",
    );

    let result = function
        .run(Thread::from_user(&prompt))
        .await
        .map_err(|e| StepError::Failed {
            step: "extract-location".into(),
            reason: format!("LLM extraction failed: {e}"),
        })?;

    Ok(ExtractedLocation { city: result.city, state: result.state })
}
```
```rust
#[zart_step("find-breweries", retry = "exponential(3, 1s)")]
async fn find_breweries(city: &str) -> Result<Vec<BreweryRaw>, StepError> {
    println!("[find-breweries] Attempt {}", zart::context().current_attempt + 1);
    let client = reqwest::Client::new();
    client
        .get("https://api.openbrewerydb.org/v1/breweries")
        .query(&[("by_city", city)])
        .send().await
        .map_err(|e| StepError::Failed { step: "find-breweries".into(), reason: e.to_string() })?
        .json::<Vec<BreweryRaw>>().await
        .map_err(|e| StepError::Failed {
            step: "find-breweries".into(),
            reason: format!("failed to parse response: {e}"),
        })
}
```
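`BreweryRaw` itself is not shown in this example. A minimal definition consistent with how the transform step reads it (name required, the rest optional) might look like the following sketch; the exact field set in the real example is an assumption, and serde simply ignores the many extra fields the Open Brewery DB API returns:

```rust
use serde::{Deserialize, Serialize};

// Hypothetical sketch of BreweryRaw: only the fields the workflow touches.
// Unknown fields in the API response are skipped by serde's default behavior.
#[derive(Debug, Clone, Serialize, Deserialize)]
struct BreweryRaw {
    name: String,
    brewery_type: Option<String>,
    city: Option<String>,
    state: Option<String>,
}
```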
```rust
#[zart_step("transform-results")]
async fn transform_results(
    raw: Vec<BreweryRaw>,
    city: &str,
    state: &str,
) -> Result<Vec<BreweryInfo>, StepError> {
    Ok(raw.into_iter().map(|b| BreweryInfo {
        name: b.name,
        brewery_type: b.brewery_type.unwrap_or_else(|| "unknown".into()),
        city: b.city.unwrap_or_else(|| city.to_string()),
        state: b.state.unwrap_or_else(|| state.to_string()),
    }).collect())
}
```
```rust
#[zart_step("generate-summary", retry = "exponential(3, 2s)")]
async fn generate_summary(
    llm: Arc<dyn BaseLlm>,
    query: &str,
    location: &ExtractedLocation,
    breweries: &[BreweryInfo],
) -> Result<String, StepError> {
    println!("[generate-summary] Attempt {}", zart::context().current_attempt + 1);

    let brewery_list = breweries.iter().take(5)
        .map(|b| format!("- {} ({})", b.name, b.brewery_type))
        .collect::<Vec<_>>().join("\n");

    let prompt = format!(
        "You're a friendly beer enthusiast. Write a short summary about: {query}\n\
         Found {} breweries in {}, {}.\n{brewery_list}",
        breweries.len(), location.city, location.state
    );

    let response = llm
        .generate_content(Thread::from_user(&prompt), None)
        .await
        .map_err(|e| StepError::Failed {
            step: "generate-summary".into(),
            reason: format!("LLM summary generation failed: {e}"),
        })?;

    Ok(response.into_content().joined_texts().unwrap_or_default())
}
```

The `DurableExecution` impl composes steps by calling them directly:
```rust
#[async_trait::async_trait]
impl DurableExecution for RadkitAgent {
    type Data = AgentInput;
    type Output = AgentOutput;

    async fn run(&self, data: Self::Data) -> Result<Self::Output, TaskError> {
        let location = extract_location(self.llm.clone(), data.query.clone()).await?;

        let raw_breweries: Vec<BreweryRaw> = find_breweries(location.city.clone()).await?;

        let breweries: Vec<BreweryInfo> = transform_results(
            raw_breweries,
            location.city.clone(),
            location.state.clone(),
        ).await?;

        let summary = generate_summary(
            self.llm.clone(),
            data.query.clone(),
            location.clone(),
            breweries.clone(),
        ).await?;

        Ok(AgentOutput {
            query: data.query,
            location,
            breweries,
            summary,
            generated_at: Utc::now().to_rfc3339(),
        })
    }
}
```

## Registering and running
```rust
let api_key = std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");
let llm = Arc::new(OpenAILlm::new("gpt-4o", &api_key));

let mut registry = TaskRegistry::new();
registry.register("radkit-agent", RadkitAgent { llm });
let registry = Arc::new(registry);

let durable = DurableScheduler::new(sched.clone());
durable
    .start_for::<RadkitAgent>(&execution_id, "radkit-agent", &AgentInput {
        query: "Find breweries in Portland, Oregon".to_string(),
    })
    .await?;
```

## Running the example
Section titled “Running the example”# Ensure PostgreSQL is runningjust up
# Set your OpenAI API keyexport OPENAI_API_KEY="your-key-here"
# Run the examplejust example-radkit-agentWhat you’ll see
```text
=== Zart Radkit Agent Example ===

Starting execution 'radkit-demo-...'...
  Query: Find breweries in Portland, Oregon

Initial execution status: Pending

Waiting for execution to complete...

[extract-location] Attempt 1
  Extracted location: Portland, Oregon
[find-breweries] Attempt 1
  Found 20 raw brewery results
[generate-summary] Attempt 1

Execution completed!
  Query: Find breweries in Portland, Oregon
  Location: Portland, Oregon
  Breweries: 20

Summary: Portland is a craft beer paradise with 20 fantastic breweries to explore! ...

Breweries found:
  1. Breakside Brewery (micro) — Portland, Oregon
  2. Cascade Brewing (brewpub) — Portland, Oregon
  ...
```

## Key concepts
**Manual `DurableExecution` for dependency injection** — When a task needs external dependencies (like an LLM provider), implementing `DurableExecution` manually lets you define struct fields. The `#[zart_durable]` macro generates a unit struct with no fields.
**`Arc<dyn BaseLlm>` for shareable providers** — Radkit’s `Arc<dyn BaseLlm>` implements `BaseLlm` directly via a blanket impl, so it can be passed to `LlmFunction::new_with_system_instructions` and `generate_content` without unwrapping.
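The blanket-impl pattern can be sketched in miniature. The trait below is a stand-in for radkit’s `BaseLlm` (the real trait has async methods and a richer API), and `FakeLlm`/`describe` are illustrative names, not radkit items:

```rust
use std::sync::Arc;

// Stand-in trait; radkit's real BaseLlm is async and more elaborate.
trait BaseLlm {
    fn model_name(&self) -> String;
}

// Blanket-style impl: the Arc wrapper itself satisfies the trait by
// delegating through the pointer, so Arc<dyn BaseLlm> can be passed
// wherever an `impl BaseLlm` is expected -- no unwrapping needed.
impl BaseLlm for Arc<dyn BaseLlm> {
    fn model_name(&self) -> String {
        (**self).model_name()
    }
}

struct FakeLlm;
impl BaseLlm for FakeLlm {
    fn model_name(&self) -> String {
        "fake-model".to_string()
    }
}

// Generic consumer, analogous to LlmFunction taking a provider by value.
fn describe(llm: impl BaseLlm) -> String {
    format!("using {}", llm.model_name())
}

fn main() {
    let llm: Arc<dyn BaseLlm> = Arc::new(FakeLlm);
    // Each step clones the Arc and passes it by value.
    println!("{}", describe(llm.clone()));
}
```

This is why the steps above can take `llm: Arc<dyn BaseLlm>` and hand it straight to radkit APIs: cloning the `Arc` is cheap, and the blanket impl makes the clone itself a usable provider.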
**`LlmFunction<T>` for structured output** — Requires `T: LLMOutput + JsonSchema`. Derive `LLMOutput` from `radkit::macros` alongside `schemars::JsonSchema`. The LLM response is parsed and deserialized into `T` automatically.

**`generate_content` for free-form text** — For steps that return a plain `String`, call `llm.generate_content(Thread::from_user(&prompt), None)` and extract text via `.into_content().joined_texts()`.

**Mixed durable workflows** — AI steps (LLM calls) and traditional steps (HTTP APIs, data transformations) coexist in the same durable workflow. Each step is independently persisted and retried.
**Retry strategy** — LLM steps use `retry = "exponential(3, 2s)"` since they’re prone to transient failures. The deterministic `transform_results` step needs no retry.
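Read literally, `exponential(3, 2s)` suggests up to 3 retries starting from a 2-second delay that doubles each attempt. The exact semantics of Zart’s retry string are an assumption here, but the schedule that reading implies is easy to compute:

```rust
use std::time::Duration;

// Hypothetical expansion of `exponential(max_retries, base)`:
// delay(n) = base * 2^n for attempt n (0-indexed).
fn backoff_schedule(max_retries: u32, base: Duration) -> Vec<Duration> {
    (0..max_retries).map(|n| base * 2u32.pow(n)).collect()
}

fn main() {
    // Under this reading, `exponential(3, 2s)` waits 2s, 4s, then 8s
    // between attempts before the step is marked failed.
    let delays = backoff_schedule(3, Duration::from_secs(2));
    let secs: Vec<u64> = delays.iter().map(|d| d.as_secs()).collect();
    println!("{:?}", secs); // [2, 4, 8]
}
```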
## Changing the LLM provider

Because the struct stores `Arc<dyn BaseLlm>`, swapping providers only requires changing the initialization:

```rust
// Anthropic
let llm = Arc::new(AnthropicLlm::from_env("claude-sonnet-4-5-20250929")?);
registry.register("radkit-agent", RadkitAgent { llm });

// OpenRouter
let llm = Arc::new(OpenRouterLlm::from_env("openai/gpt-4o")?);
registry.register("radkit-agent", RadkitAgent { llm });
```