
Initial Release
🔥 Enhance your InterSystems IRIS global analytics with AI.
Understand database growth, detect global trends, and perform time-aware analysis using generative AI connected directly to your system metadata, helping you optimize disk usage, improve namespace organization, and prevent incidents before they happen.

For Database Administrators (DBAs) and System Managers working with InterSystems IRIS, monitoring the health of globals is often a reactive process.
In large environments, globals may grow silently for days or weeks. When disk pressure or performance issues finally appear, identifying which specific global is responsible, or whether the behavior matches a known failure pattern, can feel like finding a needle in a haystack.
Global Guard AI is an intelligent observability tool designed to give system managers a conversational and analytical view of global behavior.
By capturing periodic metadata-based snapshots (including size, disk location, growth, and relational mapping), Global Guard AI transforms static system metrics into a time-aware knowledge base, enabling proactive analysis instead of late-stage firefighting.
Instead of manually parsing dashboards or writing ad-hoc SQL queries, system managers can simply ask the AI agent. These are real examples of the questions you can ask about your InterSystems IRIS environment:
Questions about the globals stored under /usr/irissys/mgr/iristemp/.
Questions about globals whose growth behavior is similar to data.TblEventLogD (requires vectorized data).
Questions about the size and growth history of data.TblEventLogD.
Questions about the globals in the USER namespace.
Questions about the 'ENS*' globals in the USER namespace.

Global Guard AI is fully containerized and can be started locally using Docker Compose.
Prerequisites
Docker with Docker Compose, Git, and an OpenAI API key.
Clone the Git repository
git clone https://github.com/Davi-Massaru/iris-global-guard-ai.git
cd iris-global-guard-ai
Before starting the stack, define the OPENAI_API_KEY environment variable on your machine.
Linux / macOS
export OPENAI_API_KEY=your_openai_api_key_here
Windows (PowerShell)
setx OPENAI_API_KEY "your_openai_api_key_here"
⏳ The first startup may take a while if the necessary Docker images need to be downloaded.
From the project root directory, run:
docker compose up --build
Docker Compose will automatically build the required images and start the containers in the stack.
Inside an InterSystems IRIS session, you can execute the following commands
in the namespace where the project was installed
(use %SYS if you are running the provided docker-compose stack):
Do ##class(guard.SnapshotGenerator).run()        ; generate today's global metadata snapshot
Do ##class(guard.WeeklyVectorGenerator).run(90)  ; vectorize growth trends for the last 90 days
Do ##class(guard.FakerSeed).run()                ; seed sample data for demos and testing
Global Guard AI is built around a clear separation of responsibilities between data collection, analysis, and intelligence.
At the data layer, Embedded Python is responsible for generating daily snapshots and subsequently vectorizing growth trends with Vector Search to improve analysis.
These snapshots are created using native IRIS system views such as %SYS.GlobalQuery_NameSpaceList and %SYS.GlobalQuery_Size, ensuring that no global is scanned node by node. This approach guarantees low overhead and makes the solution safe even for systems with very large globals.
Each snapshot captures metadata such as global size, allocation, disk location, namespace, and growth relative to the previous snapshot. The data is persisted in a historical table, allowing precise temporal analysis.
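To make this concrete, here is a minimal Embedded Python sketch of how such a snapshot could be collected. The exact parameter order of %SYS.GlobalQuery_Size, its column names, and the guard_data.Snapshot table used below are illustrative assumptions, not the project's actual implementation or schema.

# Minimal sketch, not the project's actual code: the parameter order of
# %SYS.GlobalQuery_Size, its column names, and the guard_data.Snapshot
# table are assumptions for illustration only.
import iris
from datetime import date

def collect_snapshot(directory: str):
    # The system query returns per-global metadata (name, location, size)
    # without traversing any global node by node.
    rows = iris.sql.exec(
        "SELECT Name, Location, Size FROM %SYS.GlobalQuery_Size(?, ?, ?)",
        directory, 0, "*"
    )
    for name, location, size in rows:
        # Persist one metadata row per global; growth can later be derived
        # by comparing consecutive snapshots in this history table.
        iris.sql.exec(
            "INSERT INTO guard_data.Snapshot (SnapshotDate, GlobalName, Location, GlobalSize) "
            "VALUES (?, ?, ?, ?)",
            str(date.today()), name, location, size
        )

collect_snapshot("/usr/irissys/mgr/iristemp/")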
Once stored, snapshots can optionally be vectorized inside IRIS using vector search capabilities. This enables semantic-style comparisons between growth patterns over time, allowing the system to identify globals that behave similarly, even if their absolute sizes differ.
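As a rough sketch of the idea, the query below compares growth-trend vectors with IRIS Vector Search from Embedded Python. The guard_data.WeeklyVector table and its GrowthVector column are assumptions used only to illustrate the technique; the project's real table layout and embedding scheme may differ.

# Sketch only: guard_data.WeeklyVector and its GrowthVector column are
# assumed names, not the project's real schema.
import iris

def find_similar_globals(reference_global: str):
    # Rank other globals by cosine similarity between their growth-trend
    # vectors and the reference global's vector (VECTOR_COSINE is a
    # built-in IRIS SQL vector function).
    rows = iris.sql.exec(
        """
        SELECT TOP 5 other.GlobalName,
               VECTOR_COSINE(other.GrowthVector, ref.GrowthVector) AS Similarity
        FROM guard_data.WeeklyVector other, guard_data.WeeklyVector ref
        WHERE ref.GlobalName = ? AND other.GlobalName <> ?
        ORDER BY Similarity DESC
        """,
        reference_global, reference_global
    )
    return [(name, similarity) for name, similarity in rows]

print(find_similar_globals("data.TblEventLogD"))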
On top of this data, a Quarkus-based backend using LangChain4j exposes an AI-driven analytical layer.
The AI agent accesses IRIS through the Java Native SDK.
All interactions are strictly mediated by well-defined tools that execute predefined SQL queries against historical snapshot tables and IRIS system metadata.
Depending on the question, these tools may read previously generated snapshots and Vector Search or access native IRIS system views such as %SYS.GlobalQuery_NameSpaceList and %SYS.GlobalQuery_Size to retrieve authoritative metadata.
For security and determinism, the agent never generates SQL dynamically and never executes arbitrary code against IRIS. Instead, it invokes a controlled set of Java-based tools, implemented on top of the Java Native SDK, each responsible for executing validated, read-only queries and returning structured results to the agent.
All answers are therefore grounded in real system data, retrieved exclusively through explicit queries against native IRIS metadata and the historical snapshot tables.
If a requested snapshot or historical reference does not exist, the agent reports this explicitly instead of generating inferred or approximate results.
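The project's actual tools are Java classes built on the Java Native SDK and registered with LangChain4j; purely as a conceptual illustration of the same pattern (a fixed catalogue of validated, read-only, parameterized queries that the agent can invoke but never rewrite), a sketch in Python might look like this. The tool name, query text, and parameters below are assumptions.

# Conceptual illustration only: the real tools are Java, exposed to the
# agent via LangChain4j. Tool names, query text, and the parameter order
# of %SYS.GlobalQuery_NameSpaceList are assumptions.
import iris

# A fixed catalogue of read-only queries: the agent can only pick a tool
# and pass validated parameters, never free-form SQL.
TOOLS = {
    "globals_in_namespace": "SELECT * FROM %SYS.GlobalQuery_NameSpaceList(?, ?)",
}

def run_tool(tool_name: str, *params):
    sql = TOOLS[tool_name]  # unknown tool names fail fast with KeyError
    return [list(row) for row in iris.sql.exec(sql, *params)]

# Hypothetical call: list the 'ENS*' globals in the USER namespace.
print(run_tool("globals_in_namespace", "USER", "ENS*"))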
This architecture is designed to be predictable and auditable, making Global Guard AI well suited for observability studies, capacity analysis, and controlled operational scenarios.
It is particularly appropriate for use in mirror environments, lab setups, or production systems during low-load periods, where additional analytical workloads do not interfere with critical database operations.

Daily Metadata Snapshots [Embedded Python]
Global metrics are collected once per day using native InterSystems IRIS system views (%SYS.GlobalQuery_*).
No node-by-node traversal is performed, ensuring minimal impact on production workloads.
Historical Growth Analysis [vectorized data]
Each snapshot is linked to the previous one, enabling precise calculation of growth between snapshots.
This data is subsequently vectorized, enabling similarity queries over the last 90 days (the default value) to find system globals that exhibit similar growth behavior.
SQL-First Analytics Layer
All analysis is performed through explicit SQL queries executed via the Java Native SDK and exposed to the AI agent as LangChain4j tools.
This guarantees results that are predictable, explainable, and auditable.
Disciplined AI Agent with Tool Calling
The conversational AI agent never guesses: if the requested data does not exist, it says so explicitly instead of inferring a result.
