Azure Foundry Module


Overview

Module Details

Core or GitHub Module: Core
Restart Required: No
Step Location: AI > Azure Foundry
Settings Location: Settings > Azure Foundry Settings
Prerequisites:

The Azure Foundry Module enables Users to access Azure Foundry through the integrated Chat Completion step. 

Once prerequisites are installed, the Module automatically exposes the Chat Completion step in the Toolbox. This step includes a properties panel with fields that can be configured according to user preference.

Please visit Utilizing AI Modules in Decisions for more information on AI Modules.

Note for Third-Party Systems and Subscriptions
Customers are responsible for securing and maintaining accounts with third-party systems and subscriptions.

Configuration/Properties

Installation

  • To set up the Module in Decisions, navigate to Settings > Administration > Features, locate the Azure Foundry Module, and select Install. 
    • The AI Common Module will automatically install in the background if this is the first AI Module installed in the environment. 

Creating a Dependency:

  • After installing the Module, navigate to the Project that will utilize Azure Foundry.
  • Within the Project, navigate to Manage > Configuration > Dependencies.
  • Select Manage, navigate to Modules, and select Decisions.AI.AzureFoundry.
  • Select Confirm to save the change.

For more information on installing Modules, please visit Installing Modules in Decisions.

For more information about creating a project dependency for a Module, please visit Adding Modules as a Project Dependency.


Available Steps

The table below lists all the steps in the Module.

Step Name: Chat Completion
Description: Enables prompts to be submitted to a large language model (LLM) and returns the LLM's response. Prompts can be added directly or pulled from the Flow. Users can also select which model processes the prompt.

Available Models

The Azure Foundry Module includes the following models:

  • claude-sonnet-4-5
  • gpt-5-mini
  • gpt-5-nano
  • grok-4
  • model-router

Users can also specify a model by selecting Get Model from Flow.  
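For orientation, the request the Chat Completion step ultimately submits follows the OpenAI-style chat completions format accepted by Azure-hosted models. The sketch below is illustrative only: Decisions builds and sends this payload internally, and the `build_chat_request` helper and prompt text are assumptions for the example, not part of the Module.

```python
import json

def build_chat_request(model: str, prompt: str) -> str:
    """Build an OpenAI-style chat-completion request body for the chosen model."""
    payload = {
        "model": model,  # e.g. one of the models listed above, or one pulled from the Flow
        "messages": [
            {"role": "user", "content": prompt},  # the prompt entered in the step's properties panel
        ],
    }
    return json.dumps(payload)

# Example: submit a prompt to gpt-5-mini (endpoint and authentication omitted;
# the Module handles those via Azure Foundry Settings).
body = build_chat_request("gpt-5-mini", "Summarize this ticket in one sentence.")
print(body)
```

The model name is the only step-level choice reflected here; endpoint and credential details come from Settings > Azure Foundry Settings rather than the Flow.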


Overriding Provider Settings for Gen AI Features (v9.22+)

Users can now override the LLM provider for Decisions' generative AI features (Create Forms, Rules, Formulas, etc. from AI) and specify the desired provider from a list of currently implemented LLMs. Users can also specify which model will be utilized. Authentication is performed using the corresponding LLM's settings in Decisions.


Feature Changes

Description: Added the new Azure Foundry Integration Module, which allows Users to utilize LLMs available through Azure.
Version: 9.13 | Release Date: July 2025 | Developer Task: [DT-044872]

Description: Users can now override the LLM provider for generative AI features (Create Forms, Rules, Formulas, etc. from AI) and specify the desired LLM provider from a list of currently implemented LLMs.
Version: 9.22 | Release Date: March 2026 | Developer Task: [DT-047084]