8 application results for #Business Operations
Application Name · Developer · Made with · Rating · Last updated · Views · Installs

WP-ES

Warfarin patient enrollment and stratification

Developer: Yangkun Fan
Rating: 0.0 (0) · Last updated: 27 Nov, 2025 · Views: 34

SharePoint Online SPO REST API

SharePoint API template

Developer: Mark OReilly
Rating: 0.0 (0) · Last updated: 13 Dec, 2024 · Views: 272

InterSystems Ideas Waiting to be Implemented

AI extensibility Prompt keyword for Class and Method implementation. Also Prompt macro generator.

To accelerate the growing capability of code generation, this proposal suggests new extensibility facilities and hooks that could be democratized to the community and / or fulfilled by commercial partners. To add training metadata for refining a Large Language Model for code, a "Prompt Input" is associated with an expected "Code Output" as part of a class definition. This provides structured keywords to describe:

* The expected output
* And / or the chain-of-thought needed to generate the correct output

    /// The following Prompt describes the full implementation of the class
    Class alwo.Calculator [ Abstract, Prompt = "Provides methods to Add, Subtract, Multiply and divide given numbers." ]
    {

    /// The following Prompt describes the full implementation of the method
    ClassMethod Add(arg1 As %Float, arg2 As %Float) As %Float [ Prompt = "Add numeric arguments and return result." ]
    {
        return arg1 + arg2
    }

    ClassMethod Subtract(arg1 As %Float, arg2 As %Float)
    {
        &Prompt("Subtract numeric arguments and return result")
    }

    }

The Prompt macro generates code based on the context of the method it is within. Once resolved, it automatically comments out the processed macro:

    ClassMethod Subtract(arg1 As %Float, arg2 As %Float)
    {
        //&Prompt("Subtract arguments and return the result")
        return arg1 - arg2
        //&Prompt("Model alwogen-objectscript-7.1.3")
    }

The generator leveraged at compilation time could be configured in a similar way to how source control is configured for a namespace; configuration could lock or exclude packages from being processed in this way. A "/prompt" compilation flag could be used to control the default environment behavior and editor compilation behavior. For example, to force reprocessing of previously resolved prompts with a newer, more capable code Large Language Model, "/prompt=2" could be applied (see the compile-time sketch after this idea entry). Different models or third-party services could be applied depending on the language of the given method. When redacting source code for deployment, the existing "deploy" facility could be extended to also ensure removal of "Prompt" metadata from the code.

by Alex Woodhead

3 Votes · 1 Comment
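A minimal sketch of how the flag proposed above might be invoked at compile time, assuming it were exposed as a standard compiler qualifier. $SYSTEM.OBJ.Compile is the existing ObjectScript compile entry point; the "/prompt=2" qualifier is hypothetical and exists only in the proposal.

    // Hypothetical usage of the proposed qualifier: recompile the example class
    // and force reprocessing of previously resolved &Prompt macros with a newer model.
    // "/prompt=2" comes from the proposal above and is not an existing IRIS qualifier.
    Do $SYSTEM.OBJ.Compile("alwo.Calculator", "ck /prompt=2")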

iris-health-coach

LLM Health Coach using InterSystems Vector DB

Developer: Zacchaeus Chok
Made with: IPM, AI, ML
Rating: 4.5 (1) · Last updated: 18 May, 2024 · Views: 391 · Installs: 9

iris-openai

Library for using OpenAI

Developer: Kurro Lopez
Made with: Docker, IPM, AI
Rating: 5.0 (5) · Last updated: 18 Apr, 2024 · Views: 1.1k · Installs: 50

iris-teams-adapter

Adapter to connect your IRIS production with Microsoft Teams

Developer: Kurro Lopez
Made with: Docker, IPM
Rating: 4.6 (5) · Last updated: 14 Mar, 2024 · Views: 497 · Installs: 12

Internal-SQL-Service

Query an internal SQL table and send a snapshot downstream

Developer: Mark OReilly
Rating: 5.0 (1) · Last updated: 08 Dec, 2023 · Views: 211

native-api-py-demo

This is a Native API for Python demo

Developer: shan yue
Rating: 3.5 (1) · Last updated: 17 Sep, 2023 · Views: 250

telegram-adapter-demo

This demo shows how to use the IRIS Telegram Adapter.

Developer: Nikolay Solovyev
Made with: Docker, IPM
Rating: 0.0 (0) · Last updated: 21 Jun, 2023 · Views: 256 · Installs: 19