PlanOptions

Configuration options for the research planning step.

This interface defines all the configurable aspects of the planning step, including the language model to use, prompt customization, and result handling.

Optional custom
Custom system prompt to override the default

Optional llm
Language model to use for generating the plan (falls back to state.defaultLLM if not provided)

Optional temperature
Temperature setting for the language model (0.0-1.0)

Optional include
Whether to include the plan in the final research results

Optional retry
Retry configuration for language model calls. Useful for handling transient errors in LLM services.

Optional maxRetries?: number
Maximum number of retries (default: 2)

Optional baseDelay?: number
Base delay between retries in ms (default: 1000)
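As a rough illustration of how these options fit together, the sketch below mirrors the documented fields in a standalone TypeScript shape. The property names and defaults come from the documentation above, but the concrete types (e.g. of `llm` and `custom`), the `RetryConfigSketch`/`PlanOptionsSketch` names, and the example values are assumptions for illustration, not the library's actual definitions.

```ts
// Sketch only: field names follow the docs above; types and names are assumed.
interface RetryConfigSketch {
  maxRetries?: number; // maximum number of retries (default: 2)
  baseDelay?: number;  // base delay between retries in ms (default: 1000)
}

interface PlanOptionsSketch {
  custom?: string;           // custom system prompt to override the default
  llm?: unknown;             // language model; falls back to state.defaultLLM if not provided
  temperature?: number;      // temperature setting for the language model (0.0-1.0)
  include?: boolean;         // include the plan in the final research results
  retry?: RetryConfigSketch; // retry configuration for language model calls
}

// Example: override the system prompt and tighten retry behaviour.
const planOptions: PlanOptionsSketch = {
  custom: "You are a meticulous research planner.",
  temperature: 0.2,
  include: true,
  retry: { maxRetries: 3, baseDelay: 500 },
};
```

Since every property is optional, an empty object is also valid configuration and the planning step would then rely entirely on its defaults and on state.defaultLLM.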