Chat Completion
- Updated on 13 Mar 2025
Step Details | |
---|---|
Introduced in Version | 9.3 |
Last Modified in Version | 9.3 |
Location | AI > Anthropic |
This step submits prompts to one of Anthropic's large language models (LLMs) and returns the response that the model provides. Users can either enter a prompt directly or pull it from the Flow, and can select the specific model that processes the prompt. This makes the step useful for automating processes that require language generation, such as creating chat responses, generating content, and performing other natural language processing tasks within the Flow.
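For context, the request the step submits is conceptually similar to a call against Anthropic's Messages API. The sketch below is illustrative only and is not the step's implementation: it assumes the anthropic Python SDK, a placeholder prompt, the claude-2.1 model, and an API key exported as an ANTHROPIC_API_KEY environment variable (in Decisions, the key is configured as described under Prerequisites).

```python
import os
import anthropic

# Illustrative only: the API key stands in for the one stored under
# System > Settings > Anthropic AI Settings (assumed env var here).
client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])

# Submit a prompt to a Claude model, as the Chat Completion step does
# with its Prompt input, and read back the text of the response.
message = client.messages.create(
    model="claude-2.1",   # assumed model choice
    max_tokens=1024,      # corresponds to the step's Max Tokens input
    messages=[{"role": "user", "content": "Summarize this support ticket in one sentence."}],
)
print(message.content[0].text)  # analogous to the step's Chat Response output
```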
Prerequisites
This step requires installing the AI Common and AI Anthropic modules and adding them as dependencies to the Project before it appears in the toolbox. An Anthropic account is also required, and its API key must be added under System > Settings > Anthropic AI Settings.
Decisions makes the claude-2.0, claude-2.1, and claude-instant-1.2 models available. However, claude-instant-1.2 has been retired, and claude-2.0 and claude-2.1 are deprecated.
For more information on model status, please visit: https://docs.anthropic.com/en/docs/resources/model-deprecations#model-status.
Properties
Settings
Property | Description | Data Type |
---|---|---|
Select Model | Checkbox to enable the Model selection in settings. | Boolean |
Model | List of Anthropic models available for use in the Flow. | --- |
Get Model from Flow | Checkbox to enable the Model Name input property. Disables Select Model and Model settings when checked. | Boolean |
Prompt | Input prompt for the LLM. | String |
Select Prompt | Checkbox to enable selecting a prompt stored in the AI Prompt Manager. Disables the Prompt setting when checked. | Boolean |
Get Prompt from Flow | Checkbox to enable the Prompt input property. Disables Select Prompt when checked. | Boolean |
Inputs
Property | Description | Data Type |
---|---|---|
Max Tokens | Maximum number of tokens allowed in the input string. | Int32 |
Model Name | Name of the desired Anthropic model for the given use case in the Flow. | String |
Prompt | Input prompt for the LLM. | String |
Outputs
Property | Description | Data Type |
---|---|---|
Chat Response | Response from Anthropic's LLM based on the input prompt. | String |
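Taken together, the Inputs and Outputs above describe a simple mapping from Prompt, Model Name, and Max Tokens to a single Chat Response string. The hypothetical helper below sketches that mapping with the anthropic Python SDK; the function name, defaults, and environment variable are assumptions for illustration and are not part of the step itself.

```python
import os
import anthropic

def chat_completion(prompt: str, model_name: str = "claude-2.1", max_tokens: int = 1024) -> str:
    """Hypothetical helper mirroring the step's Inputs -> Chat Response mapping."""
    client = anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    message = client.messages.create(
        model=model_name,       # Model Name input
        max_tokens=max_tokens,  # Max Tokens input
        messages=[{"role": "user", "content": prompt}],  # Prompt input
    )
    # Join the text blocks into the single string returned as Chat Response.
    return "".join(block.text for block in message.content if block.type == "text")

# Example usage with an assumed prompt:
# reply = chat_completion("Draft a polite follow-up email about a late invoice.")
```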