Utilizing AI Modules in Decisions
- Updated on 27 Dec 2024
Overview
Decisions offers several AI Modules that integrate Artificial Intelligence capabilities into the platform. Each AI Module has its own purpose, and in the v9.3 Release, the AI Common Module was introduced to handle background work for the other AI Modules.
The AI Common Module provides foundational functionality, offering the tools and interfaces needed by various AI services, and can be used in conjunction with other AI Modules, such as the Open AI Module, which lets Users integrate with OpenAI's ChatGPT.
AI Modules provide pre-built steps, tools, and interfaces for a wide range of AI functionality, enabling Users to leverage AI in their Workflows.
AI Module Glossary
| Term | Description |
|---|---|
| Chat Completion | This step submits a prompt to a large language model (LLM) and returns the LLM's response. Prompts can be entered directly or pulled from the Flow. Users can also select the model that will process the prompt. |
| Get Embeddings for Text | This step generates embeddings for the given text. Embeddings represent semantic relationships and contextual information about the text, enabling advanced natural language processing. |
| Vector Embedding | A list of numbers that represents unstructured data so that computers can process the information. |
| Vector Database | A specialized database that indexes vector embeddings, optimized for efficient retrieval and similarity search. |
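To make these terms concrete, the short sketch below shows, in plain Python, what a vector embedding and a similarity comparison look like conceptually. The embed_text function and its three-dimensional vectors are hypothetical stand-ins for a real embedding model (which returns hundreds or thousands of dimensions); this is not how the Get Embeddings for Text step is implemented.

```python
from math import sqrt

def embed_text(text: str) -> list[float]:
    # Hypothetical stand-in for an embedding model. Real models return
    # hundreds or thousands of dimensions; three are used here for readability.
    fake_vectors = {
        "invoice overdue": [0.91, 0.12, 0.33],
        "payment is late": [0.88, 0.15, 0.30],
        "team picnic": [0.05, 0.97, 0.62],
    }
    return fake_vectors[text]

def cosine_similarity(a: list[float], b: list[float]) -> float:
    # The similarity measure a Vector Database typically uses to rank matches.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

# Phrases with similar meaning produce nearby vectors (score close to 1)...
print(cosine_similarity(embed_text("invoice overdue"), embed_text("payment is late")))
# ...while unrelated phrases score much lower.
print(cosine_similarity(embed_text("invoice overdue"), embed_text("team picnic")))
```

Indexing many such vectors and answering "which stored texts are closest to this one?" efficiently is exactly what a Vector Database is built for.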
Installation and Setup
Navigate to System > Administration > Features to view the available AI Modules. The AI Common Module is installed automatically whenever any other AI-related Module is installed, since those Modules depend on it.
For more information on installing Modules and to see a list of our available modules, please visit:
Core Functionality
Although each AI Module has its purpose, the core functionality of AI Modules as a whole involves integrating, managing, and leveraging various artificial intelligence capabilities within the platform to enhance workflows and enable efficient process automation.
We currently offer the following AI Modules, as well as Modules that have AI integrations:
- AI Common
- Google Vertex AI
- Open AI
- AWS Bedrock
- AI.GoogleGemini
- AI.Anthropic
Connecting a Vector Database
To set up a Vector Database in Decisions:
- Navigate to Manage > Integrations > Databases.
- Select Create Connection, enter a name for the connection, and then select POSTGRES from the Database type dropdown menu.
- Under the POSTGRESQL SERVER section, enter the server name, database name, username, and password associated with the external database.
- Select Test to confirm the connection or Ok to save it.
- Once the connection is saved, a new Folder will appear for the connection under Databases. Make note of the Folder ID for this connection, as it corresponds to the Vector Database ID.
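Decisions manages this connection internally once the form is saved. Purely as a conceptual illustration, the sketch below shows what an equivalent connection and similarity search against a pgvector-enabled PostgreSQL database might look like in Python; the host, database, table, and column names are hypothetical, and the pgvector extension is assumed to be installed on the external server.

```python
import psycopg2  # third-party PostgreSQL driver (pip install psycopg2-binary)

# The same details entered in the POSTGRESQL SERVER section of the form.
conn = psycopg2.connect(
    host="postgres.example.com",  # hypothetical server name
    dbname="vector_store",        # hypothetical database name
    user="decisions_svc",
    password="********",
)

# A pgvector-style nearest-neighbour query. The documents table and its
# embedding column are illustrative; <-> is pgvector's distance operator.
query_vector = "[0.12, 0.88, 0.34]"
with conn.cursor() as cur:
    cur.execute(
        "SELECT id, content FROM documents ORDER BY embedding <-> %s::vector LIMIT 5;",
        (query_vector,),
    )
    for row in cur.fetchall():
        print(row)
```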
Connecting AI Common to Other AI Modules
Adding Modules as a Project Dependency
- Ensure the AI Common Module is installed.
- Install the AI Modules that will be connected to AI Common.
- After both AI Common and the AI Module(s) have been installed, they must be added as dependencies to any Project that will utilize them.
- This can be accomplished by navigating to Manage > Configuration > Dependencies within the desired Project.
Configure AI Defaults
AI Defaults can be configured by navigating to System > Integrations > AI and selecting Add AI Defaults.
- This action allows Users to add and manage different AI vendors and models.
- A model can be selected for use in all chat completion steps within a process.
- If a change is needed, the default model can be updated, and any changes will be reflected in all AI predictive steps. 
Using Exposed and Module-Specific Steps
Using Exposed Steps:
The AI Common Module exposes steps that can be located in the Toolbox under AI > Common, AI > Text Manipulation, AI > Prompt Management, AI > PostgreSQL, and AI > Vector Preparation.
- These Steps enable Users to perform various AI tasks such as retrieving text embeddings and managing vector data in PostgreSQL.
Using Specific Steps:
Once the AI Common Module is installed, Users can utilize steps from the other AI Modules that have been installed, such as the Chat Completion Step in Google Gemini.
- To use a specific step, navigate to the desired AI Module and select the corresponding Step.
- Drag the Step onto the workspace and select it to view the Properties Panel.
- Navigate to the Settings section and check the Override Default Provider box.
- Select the desired AI Module from the Provider dropdown menu. In this example, Google Gemini is selected.
- Users can then customize the remaining settings to utilize the selected Step in the Flow; the sketch following this list illustrates how the per-step override interacts with the AI Defaults.
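The interaction between AI Defaults and the Override Default Provider setting can be summarized in a few lines of illustrative Python. The class names, vendors, and models below are hypothetical and only mirror the resolution order described above; they are not Decisions' internal API.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AIDefaults:
    vendor: str  # e.g. the vendor chosen under System > Integrations > AI
    model: str

@dataclass
class ChatCompletionStep:
    override_default_provider: bool = False
    provider: Optional[str] = None  # set when the Override Default Provider box is checked
    model: Optional[str] = None

def resolve_provider(step: ChatCompletionStep, defaults: AIDefaults) -> tuple[str, str]:
    # A step-level override wins; otherwise the step falls back to the AI Defaults.
    if step.override_default_provider and step.provider:
        return step.provider, step.model or defaults.model
    return defaults.vendor, defaults.model

defaults = AIDefaults(vendor="Open AI", model="gpt-4o")
gemini_step = ChatCompletionStep(override_default_provider=True,
                                 provider="Google Gemini", model="gemini-1.5-pro")

print(resolve_provider(ChatCompletionStep(), defaults))  # uses the AI Defaults
print(resolve_provider(gemini_step, defaults))           # uses the per-step override
```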
The AI Prompt Manager
What is the AI Prompt Manager?
The AI Prompt Manager is a feature that appears in the Decisions environment after the AI Common Module is installed and can be used with other AI Modules. Although prompts can be created within a Flow, such as through the Chat Completion Step, the AI Prompt Manager allows Users to create a prompt ahead of time and use it immediately within a Flow instead of building it in the Flow itself. The following example demonstrates an AI Prompt being created for use in the Open AI Module.
Utilizing the AI Prompt Manager
Once both the Open AI and AI Common Modules are installed and a dependency for these Modules has been declared on the desired Project, a Prompts folder will be created. This folder houses created prompts for later reference in a Workflow, such as within a Chat Completion Step. To access this folder, navigate to Integrations > AI > Prompts.
- After verifying that the folder has been generated, select Add Prompt.
- This action will populate the Add Prompt Form. Fill out the desired details for the name, description, and prompt fields, then select Save.
- In this example, the variable used is $colorname. Users can add as many variables as they wish before saving the prompt.
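Conceptually, a saved Prompt behaves like a reusable text template whose $-prefixed variables are filled in later. The sketch below illustrates the idea with Python's string.Template, which happens to use the same $variable syntax; the prompt text is made up for the example, and this is not how Decisions stores Prompts internally.

```python
from string import Template

# A stored prompt roughly like one created on the Add Prompt Form,
# using the $colorname variable from the example above.
stored_prompt = Template("List three flowers that are naturally $colorname.")

# At runtime the variable is replaced with a concrete value.
print(stored_prompt.substitute(colorname="pink"))
# -> List three flowers that are naturally pink.
```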
Use Case: Using an AI Prompt in a Flow with Google Gemini
- Ensure every variable in the AI Prompt begins with a dollar sign ($); this makes the variable searchable at runtime.
- Once the AI Prompt has been created and saved, it can be used in an existing or new Flow.
- Open a new or existing Flow and navigate to the Toolbox.
- Navigate to AI > AI Common and drag the Chat Completion Step under Providers onto the workspace.
- Select the Chat Completion Step and scroll down the Properties panel to Settings.
- From here, check the Override Default Provider box, select the desired Model, and check the Select Prompt box.
- This action will populate a Select Prompt box. Select the plus icon and navigate to the AI Prompts Folder to select the created AI Prompt.
- Select Pick to save the prompt into the Flow.
- Once the Prompt has been saved, navigate to the Inputs section.
- Select Build Array on Override Variables and then select Build Data on Item 0.
- In the Find field, enter the name of the variable defined earlier in the created prompt.
- In the Replace field, enter the value that will replace the variable. In this example, the $colorname variable will be replaced with pink.
- Once this information is saved, connect the Chat Completion Step to the End Step and select Debug.
- Once the Debugger has run, the Flow should generate a Chat Response that includes the selected prompt and any designated variables; a conceptual sketch of the full sequence follows below.
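The sketch below strings the pieces together in plain Python: the saved prompt, the Find/Replace override variable from the Inputs section, and a call to the selected provider. The send_chat_completion function and the model name are hypothetical placeholders for the call Decisions makes to Google Gemini; the sketch only mirrors the order of operations in the Flow above.

```python
from string import Template

def send_chat_completion(prompt: str, provider: str, model: str) -> str:
    # Hypothetical stand-in for the Chat Completion Step's call to the
    # selected provider; a real implementation would use that provider's SDK.
    return f"[{provider}/{model}] response to: {prompt}"

# 1. The saved AI Prompt, with its $colorname variable.
prompt_template = Template("List three flowers that are naturally $colorname.")

# 2. The Override Variables mapping built in the Inputs section
#    (Find = variable name, Replace = value).
override_variables = {"colorname": "pink"}

# 3. Substitute the variable and send the finished prompt; roughly what the
#    Debugger run exercises end to end.
final_prompt = prompt_template.substitute(override_variables)
print(send_chat_completion(final_prompt, provider="Google Gemini", model="gemini-1.5-pro"))
```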