This document explains the Agent node, which interacts with a Large Language Model (LLM) in Agent mode to perform tasks, answer questions, or process data dynamically.
Instructions:
Custom instructions provided to the LLM for task execution. Use dynamic variables ({{myVariable}}) to personalize input.
Example: "Write a story about the F1 driver {{driver}} losing the championship with the team {{team}}"
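As a sketch of how dynamic variables might be interpolated into the instructions (the `render_instructions` helper is illustrative, not the product's actual API; the `{{driver}}` and `{{team}}` names come from the example above):

```python
import re

def render_instructions(template: str, variables: dict) -> str:
    """Replace {{name}} placeholders with values from `variables`."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(variables[m.group(1)]), template)

# Substitute example values into the instruction template.
prompt = render_instructions(
    "Write a story about the F1 driver {{driver}} losing the championship "
    "with the team {{team}}",
    {"driver": "A. Driver", "team": "Team X"},
)
```

At execution time, the node performs this kind of substitution before sending the instructions to the LLM.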
Tools:
Defines the tools or functions the agent can call during its execution.
Example: "Scrapper"
Datasources:
Specifies the data sources the agent can access or query.
Example:
Datasource: "Internal company knowledge"
Advanced Settings:
Allows configuration of advanced LLM parameters such as temperature, token limits, and output formats. (see “Node AI Settings” for more information)
Model:
Select the LLM model to be used.
Example: GOOGLE_VERTEXAI - gemini-1.5-pro-002
Temperature:
Controls creativity and randomness in responses. Lower values (e.g., 0.2) make outputs more deterministic; higher values (e.g., 0.8) increase creativity.
Max Output Tokens:
Sets the maximum number of tokens in the response.
Example: 1000
Top K:
Restricts the response to the top K most likely tokens.
Example: 40
Top P:
Adjusts the probability distribution for token selection.
Example: 1 (uses the full distribution).
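The advanced parameters above can be collected into a single generation configuration. A minimal sketch using the example values from this section (the key names follow a common generation-config convention; the exact names your platform expects may differ):

```python
# Hypothetical generation config combining the advanced settings above.
generation_config = {
    "temperature": 0.2,         # lower -> more deterministic output
    "max_output_tokens": 1000,  # cap on response length in tokens
    "top_k": 40,                # sample only from the 40 most likely tokens
    "top_p": 1.0,               # 1.0 uses the full probability distribution
}
```

Such a config would typically be passed alongside the model selection when the node invokes the LLM.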
Output JSON Schema:
Define the format of the response in JSON schema if structured output is required.
Example:
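For instance, a schema that constrains the story response to a fixed shape might look like the following (field names are illustrative; consult your platform's structured-output documentation for the schema features it supports):

```python
import json

# Illustrative JSON schema for a structured story response.
story_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "story": {"type": "string"},
        "driver": {"type": "string"},
    },
    "required": ["title", "story"],
}

# Serialize the schema as it would appear in the node's settings field.
schema_text = json.dumps(story_schema, indent=2)
```

With this schema in place, the LLM is asked to return a JSON object containing at least `title` and `story` keys instead of free-form text.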
Output:
The response generated by the LLM based on the provided instructions, tools, and datasources.
Example Output: