Added Vector Search Agent (RAG functionality)
The rise of agentic AI marks a transformative leap in how artificial intelligence interacts with the world, moving beyond static responses to dynamic, goal-driven problem-solving. This application is powered by the OpenAI Agents SDK, which lets you build agentic AI apps in a lightweight, easy-to-use package with very few abstractions. The SDK is a production-ready upgrade of OpenAI's previous experiment for agents, Swarm.
This application showcases the next generation of autonomous AI systems capable of reasoning, collaborating, and executing complex tasks with human-like adaptability.
Agent Loop 🔄
A built-in loop that autonomously manages tool execution, sends results back to the LLM, and iterates until task completion.
Python-First 🐍
Leverage native Python syntax (decorators, generators, etc.) to orchestrate and chain agents without external DSLs.
Handoffs 🤝
Seamlessly coordinate multi-agent workflows by delegating tasks between specialized agents.
Function Tools ⚒️
Decorate any Python function with @function_tool to instantly integrate it into the agent’s toolkit.
Vector Search (RAG) 🧠
Native integration with the InterSystems IRIS vector store for RAG retrieval.
Tracing 🔍
Built-in tracing to visualize, debug, and monitor agent workflows in real time (comparable to tools like LangSmith).
MCP Servers 🌐
Support for Model Context Protocol (MCP) via stdio and HTTP, enabling cross-process agent communication.
Chainlit UI 🖥️
Integrated Chainlit framework for building interactive chat interfaces with minimal code (see the minimal sketch after this feature list).
Stateful Memory 🧠
Preserve chat history, context, and agent state across sessions for continuity and long-running tasks.
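To illustrate how little code a Chainlit chat interface needs, here is a minimal, self-contained sketch; it is illustrative only and not taken from this application’s source, where the handler forwards messages to the agents described below:

```python
import chainlit as cl

@cl.on_message
async def on_message(message: cl.Message):
    # A real handler would pass message.content to an agent and
    # return the answer; here we simply echo the input back.
    await cl.Message(content=f"You said: {message.content}").send()
```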
git clone https://github.com/mwaseem75/iris-AgenticAI.git
The application requires an OpenAI API key. Sign up for the OpenAI API on this page. Once you have signed up and logged in, click on Personal, select View API keys from the drop-down menu, then create and copy the API key.
Create a .env file in the root directory and add your OpenAI API key:
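For example (the value below is a placeholder; OPENAI_API_KEY is the environment variable the OpenAI client reads):

```
OPENAI_API_KEY=sk-your-api-key-here
```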
docker-compose build
docker-compose up -d
To run the application, navigate to http://localhost:8002
Agents are the core building block in your apps. An agent is a large language model (LLM), configured with instructions and tools.
Basic configuration
The most common properties of an agent you’ll configure are:
instructions: also known as a developer message or system prompt.
model: which LLM to use, and optional model_settings to configure model tuning parameters like temperature, top_p, etc.
tools: Tools that the agent can use to achieve its tasks.
```python
from agents import Agent, ModelSettings, function_tool

@function_tool
def get_weather(city: str) -> str:
    return f"The weather in {city} is sunny"

agent = Agent(
    name="Haiku agent",
    instructions="Always respond in haiku form",
    model="o3-mini",
    tools=[get_weather],
)
```
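Once an agent is configured, the SDK’s built-in agent loop takes care of tool execution and iteration. A minimal sketch of running the agent above (the input string is just an example):

```python
from agents import Runner

# The runner drives the agent loop: it calls the LLM, executes any
# tool calls (get_weather here), and repeats until a final answer.
result = Runner.run_sync(agent, "What's the weather in Tokyo?")
print(result.final_output)
```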
The application contains 7 agents: the Triage Agent plus six specialist agents (vector search, production, dashboard, processes, order, and web search).
Handoffs allow an agent to delegate tasks to another agent. This is particularly useful in scenarios where different agents specialize in distinct areas. For example, a customer support app might have agents that each specifically handle tasks like order status, refunds, FAQs, etc.
The Triage Agent is our main agent; it delegates tasks to the other agents based on user input.
```python
# TRIAGE AGENT: the main agent receives user input and delegates
# to the other agents using handoffs.
triage_agent = Agent(
    name="Triage agent",
    instructions=(
        "Handoff to the appropriate agent based on the user query. "
        "If they ask about Release Notes, handoff to the vector_search_agent. "
        "If they ask about production, handoff to the production agent. "
        "If they ask about dashboard, handoff to the dashboard agent. "
        "If they ask about process, handoff to the processes agent. "
        "Use the WebSearchAgent tool to find information related to the "
        "user's query, but do not use this agent if the query is about Release Notes. "
        "If they ask about order, handoff to the order_agent."
    ),
    handoffs=[
        vector_search_agent,
        production_agent,
        dashboard_agent,
        processes_agent,
        order_agent,
        web_search_agent,
    ],
)
```
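To see a handoff in action, you can run the Triage Agent directly. A minimal sketch, assuming the specialist agents referenced above are defined in the same module (the input string is illustrative):

```python
import asyncio
from agents import Runner

async def main():
    result = await Runner.run(triage_agent, "Please start the production.")
    # last_agent reveals which specialist the triage agent handed off to.
    print(result.last_agent.name)
    print(result.final_output)

asyncio.run(main())
```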
The Vector Search Agent automatically ingests the “New in InterSystems IRIS 2025.1” text into the IRIS vector store, and it does so only once: ingestion is skipped if the data already exists.
Use the query below to retrieve the data:

```sql
SELECT id, embedding, document, metadata
FROM SQLUser.AgenticAIRAG
```
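If you want to check from the container’s embedded Python whether ingestion has already happened, here is a small sketch (assuming the table name from the query above):

```python
import iris  # available inside the IRIS container's embedded Python

# Count the ingested rows; a non-zero count means the Vector Search
# Agent has already populated the store and will skip ingestion.
rs = iris.sql.exec("SELECT COUNT(*) FROM SQLUser.AgenticAIRAG")
for row in rs:
    print(f"Ingested rows: {row[0]}")
```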
The Triage Agent receives user input, routing the question to the Vector Search Agent.
The Triage Agent receives user input, routing the question to the IRIS Dashboard Agent.
The Triage Agent receives user input, routing the question to the IRIS Processes Agent.
Start and Stop the Production.
Get Production Details.
The Triage Agent receives user input, routing the question to the Local Order Agent.
Here, the Triage Agent receives two questions, routing both to the Web Search Agent.
The Agents SDK includes built-in tracing, collecting a comprehensive record of events during an agent run: LLM generations, tool calls, handoffs, guardrails, and even custom events that occur. Using the Traces dashboard, you can debug, visualize, and monitor your workflows during development and in production.
https://platform.openai.com/logs
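Related runs can also be grouped into a single trace. A small sketch using the SDK’s trace context manager, reusing the agent from the Basic configuration section (the workflow name is arbitrary):

```python
from agents import Runner, trace

# Both runs below appear under one trace in the dashboard.
with trace("Weather workflow"):
    first = Runner.run_sync(agent, "Weather in Paris?")
    second = Runner.run_sync(agent, "And in Rome?")
    print(first.final_output, second.final_output)
```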
The MCP Server is running at http://localhost:8000/sse
NOTE: The MCP Server is configured to start automatically. If the server fails to launch, manually start it using the following command:
uv run python /irisdev/app/src/python/aai/runMCPServer.py
The MCP Server is equipped with the following tools:
```python
import random

import iris
import requests
from mcp.server.fastmcp import FastMCP

# Create server
mcp = FastMCP("Echo Server")

# Local function
@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers"""
    print(f"[debug-server] add({a}, {b})")
    return a + b

# Local function
@mcp.tool()
def get_secret_word() -> str:
    print("[debug-server] get_secret_word()")
    return random.choice(["apple", "banana", "cherry"])

# Get IRIS version details
@mcp.tool()
def get_iris_version() -> str:
    print("[debug-server] get_iris_version()")
    return iris.system.Version.GetVersion()

# Get current weather
@mcp.tool()
def get_current_weather(city: str) -> str:
    print(f"[debug-server] get_current_weather({city})")
    endpoint = "https://wttr.in"
    response = requests.get(f"{endpoint}/{city}")
    return response.text

if __name__ == "__main__":
    mcp.run(transport="sse")
```
The application communicates with the MCP Server, which runs locally at localhost.
The MCP application is running at http://localhost:8001
NOTE: If you see a “Page isn’t working” error, manually start the application using the following Docker command:
chainlit run /irisdev/app/src/python/aai/MCPapp.py -h --port 8001 --host 0.0.0.0
The MCP Server is equipped with InterSystems IRIS vector search ingestion capabilities and Retrieval-Augmented Generation (RAG) functionality.
The MCP Server dynamically delegates tasks to the appropriate tool based on user input.
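For reference, this is roughly how an Agents SDK client can attach the local MCP server’s tools to an agent. A minimal sketch, assuming the server shown earlier is running on port 8000 (the agent name and input are illustrative):

```python
import asyncio
from agents import Agent, Runner
from agents.mcp import MCPServerSse

async def main():
    # Connect to the locally running MCP server over SSE.
    async with MCPServerSse(params={"url": "http://localhost:8000/sse"}) as server:
        agent = Agent(
            name="MCP assistant",
            instructions="Use the MCP tools to answer questions.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "What IRIS version is running?")
        print(result.final_output)

asyncio.run(main())
```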
Thanks