Options for the summarization step:

maxLength (Optional)
Maximum length of the generated summary (characters)

llm (Optional)
Model to use for summarization (from the AI SDK)

temperature (Optional)
Temperature for the LLM generation (0.0 to 1.0)

format (Optional)
Format for the summary (paragraph, bullet, or structured)

focus (Optional)
Focus areas for the summary (aspects to emphasize)

includeCitations (Optional)
Whether to include citations in the summary

includeInResults (Optional)
Whether to add the summary to the final results

customPrompt (Optional)
Custom prompt for summary generation

additionalInstructions (Optional)
Additional instructions for summary generation

retryConfig (Optional)
Retry configuration for LLM calls

    maxRetries?: number (Optional)
    Maximum number of retries

    baseDelay?: number (Optional)
    Base delay between retries in ms
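For orientation, here is a minimal TypeScript sketch of how these options could be assembled. The interface name SummarizationOptions, the reconstructed option names (maxLength, includeCitations, includeInResults, customPrompt, additionalInstructions, retryConfig), and the surrounding usage shape are assumptions for illustration, not the library's confirmed API; the llm value uses the AI SDK's openai provider as an example.

```ts
// Sketch of the summarization options, assuming the option names reconstructed
// in the reference above. SummarizationOptions is a hypothetical name used
// only for illustration.
import { openai } from "@ai-sdk/openai";
import type { LanguageModel } from "ai";

interface SummarizationOptions {
  maxLength?: number;                              // Maximum summary length, in characters
  llm?: LanguageModel;                             // Model from the AI SDK
  temperature?: number;                            // 0.0 to 1.0
  format?: "paragraph" | "bullet" | "structured";  // Summary format
  focus?: string[];                                // Aspects to emphasize
  includeCitations?: boolean;                      // Include citations in the summary
  includeInResults?: boolean;                      // Add the summary to the final results (name assumed)
  customPrompt?: string;                           // Custom prompt for summary generation
  additionalInstructions?: string;                 // Additional instructions for summary generation
  retryConfig?: {
    maxRetries?: number;                           // Maximum number of retries
    baseDelay?: number;                            // Base delay between retries, in ms
  };
}

// Example options object; values are illustrative only.
const summarizationOptions: SummarizationOptions = {
  maxLength: 1200,
  llm: openai("gpt-4o-mini"),
  temperature: 0.2,
  format: "bullet",
  focus: ["key findings", "limitations"],
  includeCitations: true,
  retryConfig: { maxRetries: 3, baseDelay: 500 },
};
```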