
iris-AgenticAI

InterSystems does not provide technical support for this project. Please contact its developer for technical assistance.
Next generation of autonomous AI Agentic Application

What's new in this version

Added Vector Search Agent (RAG functionality)

Iris-AgenticAI 🤖⚡

The rise of agentic AI marks a transformative leap in how artificial intelligence interacts with the world, moving beyond static responses to dynamic, goal-driven problem-solving. This application is built on the OpenAI Agents SDK, which lets you build agentic AI apps in a lightweight, easy-to-use package with very few abstractions. The SDK is a production-ready upgrade of OpenAI's earlier agent experiment, Swarm.

This application showcases the next generation of autonomous AI systems capable of reasoning, collaborating, and executing complex tasks with human-like adaptability.


Application Structure

image

Application Interface

image

Features

  • Agent Loop 🔄
    A built-in loop that autonomously manages tool execution, sends results back to the LLM, and iterates until task completion.

  • Python-First 🐍
    Leverage native Python syntax (decorators, generators, etc.) to orchestrate and chain agents without external DSLs.

  • Handoffs 🤝
    Seamlessly coordinate multi-agent workflows by delegating tasks between specialized agents.

  • Function Tools ⚒️
    Decorate any Python function with @function_tool to instantly integrate it into the agent’s toolkit.

  • Vector Search (RAG) 🧠
    Native integration of vector store (IRIS) for RAG retrieval.

  • Tracing 🔍
    Built-in tracing to visualize, debug, and monitor agent workflows in real time (think LangSmith alternatives).

  • MCP Servers 🌐
    Support for Model Context Protocol (MCP) via stdio and HTTP, enabling cross-process agent communication.

  • Chainlit UI 🖥️
    Integrated Chainlit framework for building interactive chat interfaces with minimal code.

  • Stateful Memory 🧠
    Preserve chat history, context, and agent state across sessions for continuity and long-running tasks.

Installation

  1. Clone/git pull the repo into any local directory
git clone https://github.com/mwaseem75/iris-AgenticAI.git

Requirement

The application requires an OpenAI API key. Sign up for the OpenAI API on this page. Once you have signed up and logged in, click on Personal and select View API keys from the drop-down menu, then create and copy the API key.
image

Create a .env file in the root directory and add your OpenAI API key:
image
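In text form, the .env file typically contains a single line; the variable name below assumes the default OPENAI_API_KEY name that the OpenAI client libraries read, and the value is a placeholder:

    OPENAI_API_KEY=sk-your-api-key-here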

  2. Open a Docker terminal in this directory and run:
docker-compose build
  3. Run the IRIS container:
docker-compose up -d

Run Chainlit Web Application

To run the application, navigate to http://localhost:8002
image
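Under the hood, a Chainlit app forwards each chat message to the agent loop and returns the final answer to the chat window. A rough sketch of such a handler, not the project's actual code (the import of triage_agent is hypothetical):

    import chainlit as cl
    from agents import Runner
    from triage import triage_agent  # hypothetical import of the Triage Agent described below

    @cl.on_message
    async def on_message(message: cl.Message):
        # Run the agent loop on the incoming chat message
        result = await Runner.run(triage_agent, message.content)
        # Send the agent's final answer back to the Chainlit UI
        await cl.Message(content=result.final_output).send()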

Agent

Agents are the core building block in your apps. An agent is a large language model (LLM), configured with instructions and tools.
Basic configuration
The most common properties of an agent you’ll configure are:

  • instructions: also known as a developer message or system prompt.
  • model: which LLM to use, and optional model_settings to configure model tuning parameters like temperature, top_p, etc.
  • tools: Tools that the agent can use to achieve its tasks.

from agents import Agent, ModelSettings, function_tool

@function_tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny"

agent = Agent(
    name="Haiku agent",
    instructions="Always respond in haiku form",
    model="o3-mini",
    tools=[get_weather],
)
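To execute an agent, pass it to the SDK's Runner together with the user input. A minimal sketch (the question is just an example):

    from agents import Runner

    # Runs the agent loop until the LLM produces a final answer
    result = Runner.run_sync(agent, "What's the weather in Paris?")
    print(result.final_output)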

The application contains 7 agents:

  • Triage Agent 🤖: Main agent; receives user input and delegates to the other agents using handoffs
  • Vector Search Agent 🤖: Provides IRIS 2025.1 Release Notes details (RAG functionality)
  • IRIS Dashboard Agent 🤖: Assists in providing the following Management Portal dashboard details:
    (ApplicationErrors, CSPSessions, CacheEfficiency, DatabaseSpace, DiskReads, DiskWrites, ECPAppServer, ECPAppSrvRate, ECPDataServer, ECPDataSrvRate, GloRefs, GloRefsPerSec, GloSets,
    JournalEntries, JournalSpace, JournalStatus, last_backup, LicenseCurrent, LicenseCurrentPct, LicenseHigh, LicenseHighPct, LicenseLimit, LicenseType, LockTable, LogicalReads, Processes, RouRefs, SeriousAlerts, ShadowServer, ShadowSource, SystemUpTime, WriteDaemon)
  • IRIS Running Process Agent 🤖: Assists in providing details of running IRIS processes (Process ID, NameSpace, Routine, State, PidExternal)
  • IRIS Production Agent 🤖: Assists in providing production information and starting/stopping the production
  • WebSearch Agent 🤖: Performs web searches to find relevant information
  • Order Agent 🤖: Checks the status of an order with the given order ID

Handoffs

Handoffs allow an agent to delegate tasks to another agent. This is particularly useful in scenarios where different agents specialize in distinct areas. For example, a customer support app might have agents that each specifically handle tasks like order status, refunds, FAQs, etc.

The Triage Agent is our main agent, which delegates tasks to the other agents based on user input.

    #TRIAGE AGENT: main agent receives user input and delegates to other agents by using handoffs
    triage_agent = Agent(
        name="Triage agent",
        instructions=(
            "Handoff to the appropriate agent based on the user query."
            "If they ask about Release Notes, handoff to the vector_search_agent."
            "If they ask about production, handoff to the production agent."
            "If they ask about dashboard, handoff to the dashboard agent."
            "If they ask about process, handoff to the processes agent."
            "Use the WebSearchAgent tool to find information related to the user’s query, and do not use this agent if the query is about Release Notes."
            "If they ask about order, handoff to the order_agent."
        ),
        handoffs=[vector_search_agent, production_agent, dashboard_agent, processes_agent, order_agent, web_search_agent]
    )
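At runtime the triage agent selects a handoff target and the chosen specialist produces the final answer. A hedged sketch of invoking it asynchronously (the sample question is illustrative):

    import asyncio
    from agents import Runner

    async def main():
        # This question should be handed off to the vector_search_agent
        result = await Runner.run(triage_agent, "What is new in InterSystems IRIS 2025.1?")
        print(result.last_agent.name)  # the agent that answered after the handoff
        print(result.final_output)

    asyncio.run(main())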

Application Workflow Process

Vector Search Agent

The Vector Search Agent automatically ingests the “New in InterSystems IRIS 2025.1” text into the IRIS Vector Store, only once, if the data doesn’t already exist.
image

Use the query below to retrieve the data

SELECT id, embedding, document, metadata
FROM SQLUser.AgenticAIRAG

image
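For a similarity lookup against the same table, IRIS vector search can order rows by cosine similarity to a query embedding. A sketch, assuming the embedding column stores 1536-dimensional OpenAI embeddings (the dimension and the parameter placeholder are assumptions):

    SELECT TOP 3 document, metadata
    FROM SQLUser.AgenticAIRAG
    ORDER BY VECTOR_COSINE(embedding, TO_VECTOR(?, DOUBLE, 1536)) DESC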

The Triage Agent receives user input, routing the question to the Vector Search Agent.
image

IRIS Dashboard Agent

The Triage Agent receives user input, routing the question to the IRIS Dashboard Agent.
image

IRIS Processes Agent

The Triage Agent receives user input, routing the question to the IRIS Processes Agent.
image

IRIS Production Agent

Start and Stop the Production.
image

Get Production Details.
image

Local Agent

The Triage Agent receives user input, routing the question to the Local Order Agent.
image

WebSearch Agent

Here, the Triage Agent receives two questions, routing both to the WebSearch Agent.
image

Tracing

The Agents SDK includes built-in tracing, collecting a comprehensive record of events during an agent run: LLM generations, tool calls, handoffs, guardrails, and even custom events that occur. Using the Traces dashboard, you can debug, visualize, and monitor your workflows during development and in production.
https://platform.openai.com/logs
image

MCP Server

MCP Server is running at https://localhost:8000/sse
image

NOTE: The MCP Server is configured to start automatically. If the server fails to launch, manually start it using the following command:

uv run python /irisdev/app/src/python/aai/runMCPServer.py

The MCP Server is equipped with the following tools:

  • Provide IRIS 2025.1 Release notes details (RAG Functionality)
  • IRIS Info tool
  • Check Weather tool
  • Find secret word tool (Local function)
  • Addition Tool (Local function)
import random
import iris
import requests
from mcp.server.fastmcp import FastMCP

# Create server
mcp = FastMCP("Echo Server")

# Local function
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    print(f"[debug-server] add({a}, {b})")
    return a + b

# Local function
@mcp.tool()
def get_secret_word() -> str:
    print("[debug-server] get_secret_word()")
    return random.choice(["apple", "banana", "cherry"])

# Get IRIS version details
@mcp.tool()
def get_iris_version() -> str:
    print("[debug-server] get_iris_version()")
    return iris.system.Version.GetVersion()

# Get current weather
@mcp.tool()
def get_current_weather(city: str) -> str:
    print(f"[debug-server] get_current_weather({city})")
    endpoint = "https://wttr.in"
    response = requests.get(f"{endpoint}/{city}")
    return response.text

if __name__ == "__main__":
    mcp.run(transport="sse")
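On the agent side, the Agents SDK can attach this server over SSE so that its tools become available to an agent. A minimal sketch, assuming the server is reachable at http://localhost:8000/sse (the agent name, instructions, and question are illustrative):

    import asyncio
    from agents import Agent, Runner
    from agents.mcp import MCPServerSse

    async def main():
        # Connect to the MCP server over SSE and expose its tools to the agent
        async with MCPServerSse(params={"url": "http://localhost:8000/sse"}) as mcp_server:
            agent = Agent(
                name="MCP assistant",
                instructions="Use the MCP tools to answer the user.",
                mcp_servers=[mcp_server],
            )
            result = await Runner.run(agent, "Which IRIS version is the server running?")
            print(result.final_output)

    asyncio.run(main())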

MCP application

The application communicates with the MCP Server, which runs locally at localhost.
MCP application is running at http://localhost:8001
image

Starting the MCP application

NOTE: In case of a “Page isn’t working” error, manually start the application by using the following Docker command:

chainlit run /irisdev/app/src/python/aai/MCPapp.py -h --port 8001 --host 0.0.0.0

image

MCP Server Vector Search (RAG) functionality

The MCP Server is equipped with InterSystems IRIS vector search ingestion capabilities and Retrieval-Augmented Generation (RAG) functionality.
image

MCP Server other functionality

The MCP Server dynamically delegates tasks to the appropriate tool based on user input.
image

Thanks

Version: 1.0.3 (05 Apr, 2025)
Python package: openai-agents
Category: Frameworks
Works with: InterSystems IRIS
First published: 30 Mar, 2025
Last edited: 06 Apr, 2025