LLM

Query a chosen LLM with a structured prompt and retrieve its response

Input Parameters

  • Prompt Template: The prompt sent to the model; it can combine fixed text with variables from the workflow to shape the output
  • Generative Model Name: The LLM to query, chosen from your activated models

Expected Output

Text generated by the chosen model in response to the prompt template
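
As a rough mental model of what the node does, here is a minimal Python sketch. It assumes an OpenAI-compatible client as the backend purely for illustration; the real node routes the request to whichever model you have activated, and `run_llm_node` is a hypothetical name rather than part of the product.

```python
from string import Template

from openai import OpenAI


def run_llm_node(prompt_template: str, model_name: str, variables: dict) -> str:
    """Fill the prompt template with workflow variables, then query the chosen model."""
    # Workflow variables are substituted into the template, e.g. "$region" -> "Asia".
    prompt = Template(prompt_template).safe_substitute(variables)

    # Illustrative backend only: the actual node calls your activated model.
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model=model_name,
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```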

Example Use Case

In this sample workflow, an LLM "travel agent" is asked for the top 5 sights to see in the world. The node queries the LLM with the prompt and returns the response.
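
Building on the `run_llm_node` sketch above, the travel-agent example might be set up roughly like this; the prompt wording and model name are illustrative, not taken from the sample workflow itself.

```python
travel_agent_prompt = (
    "You are an experienced travel agent. "
    "List the top 5 sights to see in the world, with one sentence explaining each."
)

# "gpt-4o-mini" stands in for whichever generative model you have activated.
print(run_llm_node(travel_agent_prompt, "gpt-4o-mini", variables={}))
```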

Workflow View
