Grooper 2025 - LLM Connector Config

LLM Connector is a Repository Option that enables large language model (LLM) powered AI features for a Grooper Repository.

About

LLM Connectors enable Grooper's AI-based features, including AI Extract and AI Assistants. Adding an LLM Connector gives Grooper access to large language models (LLMs) such as OpenAI's GPT models and models in Microsoft Azure's Model Catalog (including Azure OpenAI models). An LLM Connector is added from the Grooper Root node's Options editor.

LLM Connector is a Repository Option in Grooper. Repository Options enable optional features, so whether to enable LLM connectivity is entirely up to you and your organization. An LLM Connector enhances Grooper with AI features, but Grooper operates fully without one.

Once added to a Grooper Repository, LLM Connector is configured by adding an LLM Provider. The LLM Provider connects Grooper to service providers that offer LLMs, such as OpenAI, Microsoft Azure, and other providers that use OpenAI's API standard.
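Providers that follow OpenAI's API standard differ mainly in which endpoint they expose and which credential they expect. As a minimal sketch (the class and field names below are illustrative, not Grooper's internal types), a provider entry boils down to a name, a base URL, and an API key:

```python
from dataclasses import dataclass

@dataclass
class LLMProviderConfig:
    """Hypothetical sketch of what an LLM Provider entry captures."""
    name: str      # e.g. "OpenAI" or "Azure"
    base_url: str  # endpoint the provider exposes
    api_key: str   # secret used to authorize requests

# An OpenAI-standard provider is identified by its base URL;
# the key shown here is a placeholder, not a real credential.
openai_provider = LLMProviderConfig(
    name="OpenAI",
    base_url="https://api.openai.com/v1",
    api_key="sk-...",
)
```

Swapping in a different OpenAI-compatible service is then just a matter of pointing `base_url` at that service's endpoint.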

LLM Provider Options

LLM-enabled extraction capabilities

LLM-enabled Data Section Extract Methods

LLM-enabled separation and classification capabilities

Other LLM-enabled capabilities

LLM connection options

Grooper primarily connects to LLMs using OpenAI and Azure providers, supporting a wide range of models and hosting options.

OpenAI API

Grooper's LLM features were designed around OpenAI models. Connecting to the OpenAI API is considered the standard method.

An API key and active payment method are required.

  1. Go to the Grooper Root node.
  2. Open the Options editor.
  3. Add the LLM Connector option.
  4. Open Service Providers and add OpenAI.
  5. Enter your OpenAI API Key.
  6. Enable Use System Messages (recommended).
  7. Save changes.
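To make step 6 concrete, here is a minimal Python sketch of the chat-completions payload the connector would send; the function name, model id, and system prompt are illustrative assumptions, not Grooper internals. With Use System Messages enabled, instructions travel in a dedicated "system" message instead of being folded into the user prompt:

```python
import json

def build_chat_request(user_prompt: str, use_system_message: bool = True) -> dict:
    """Assemble an OpenAI chat-completions payload (sketch only)."""
    messages = []
    if use_system_message:
        # Mirrors the Use System Messages option: instructions go in
        # a separate system-role message.
        messages.append({
            "role": "system",
            "content": "You are a document-extraction assistant.",
        })
    messages.append({"role": "user", "content": user_prompt})
    return {"model": "gpt-4o", "messages": messages}  # model id is illustrative

payload = build_chat_request("List the invoice totals in this document.")
print(json.dumps(payload, indent=2))
```

Sending this payload to the OpenAI API requires the API key from step 5 in an `Authorization: Bearer` header.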

Azure AI Foundry deployments

Grooper connects to Azure OpenAI and Azure AI Foundry model deployments using the Azure provider.

Both Chat Services and Embeddings Services can be configured depending on the feature requirements.

  1. Go to the Grooper Root node.
  2. Open the Options editor.
  3. Add the LLM Connector option.
  4. Add an Azure provider under Service Providers.
  5. Configure Chat Service and/or Embeddings Service deployments.
  6. Set Model Id, URL, and Authorization method.
  7. Enter the API Key or configure Bearer authentication.
  8. Enable Use System Messages (recommended).
  9. Save changes.
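Steps 6 and 7 can be sketched in Python as follows; the resource name, deployment name, and api-version below are placeholders for your own Azure values. The sketch shows how the Model Id and URL combine into an Azure OpenAI deployment endpoint, and how the two authorization methods differ:

```python
def azure_request_headers(auth_method: str, secret: str) -> dict:
    """Headers for the two authorization methods in step 7.

    Azure OpenAI accepts either an "api-key" header or a standard
    OAuth bearer token, depending on how the deployment is secured.
    """
    if auth_method == "api_key":
        return {"api-key": secret}
    if auth_method == "bearer":
        return {"Authorization": f"Bearer {secret}"}
    raise ValueError(f"unknown auth method: {auth_method}")

def azure_deployment_url(resource: str, deployment: str, api_version: str) -> str:
    """Chat-completions URL for an Azure OpenAI deployment.

    resource and deployment stand in for your Azure resource name
    and the deployment's Model Id.
    """
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

url = azure_deployment_url("myresource", "gpt-4o", "2024-02-01")
headers = azure_request_headers("api_key", "placeholder-key")
```

An Embeddings Service deployment follows the same URL pattern with an `embeddings` path segment in place of `chat/completions`.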