Use sqlalchemy-iris to copy data to IRIS Cloud SQL
Many thanks to Robert Cemper for his support in bringing this idea to reality.
Many thanks to Dmitry Maslennikov for his help when I struggled to use sqlalchemy-iris in this project.
The IRIS Audit database logs many events, such as login failures, and it can be configured to log successful logins as well.
Why is this important? We have a rule to disable a user account if it has not been used to log in for a certain number of days.
We have IRIS clusters with many IRIS instances. I want to run queries against audit data from ALL IRIS instances and identify user accounts that have not logged into ANY IRIS instance.
To do that, I export audit data from each IRIS instance and consolidate it into ONE database table, so that I can run queries against the consolidated audit data (a sample query is sketched below, after the consolidation steps).
Command to run the Audit Export Task now AND schedule it to run daily
Do ##class(otw.audit.AuditExportTask).RunNow()
The Audit Export creates an XML file and stores it in the mgr directory: /usr/irissys/mgr
I created a persistent consolidator class to hold the audit data from ALL my IRIS instances.
Command to import all XML files in the mgr directory
Do ##class(otw.audit.consolidator).ImportAll()
Command to create the UserChange view over the consolidated audit data
Do ##class(otw.audit.Util).CreateViewUserChange()
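For reference, here is the kind of check the consolidated data makes possible: a query listing accounts whose most recent successful login across ALL instances is older than 30 days, run through sqlalchemy-iris. This is a minimal sketch, not code from this repo; the connection URL and the column names (Username, Event, UTCTimeStamp) are assumptions based on the standard IRIS audit fields, so adjust them to the actual DDL of the consolidator table.

# Sketch: list accounts whose latest successful login in the consolidated
# audit table is older than 30 days. Table and column names are assumptions.
from sqlalchemy import create_engine, text

# Local instance holding the consolidated table (placeholder credentials).
engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")

sql = text("""
    SELECT Username, MAX(UTCTimeStamp) AS LastLogin
    FROM otw_audit.consolidator1
    WHERE Event = 'Login'
    GROUP BY Username
    HAVING MAX(UTCTimeStamp) < DATEADD('dd', -30, GETUTCDATE())
""")

with engine.connect() as conn:
    for row in conn.execute(sql):
        print(row.Username, row.LastLogin)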
Set the IRIS Cloud SQL host and password as environment variables, then run the Python script that uses sqlalchemy-iris to copy the consolidated audit data to IRIS Cloud SQL
export ICSHOST='k8s-a34cb3c6-aa6428f3-181bcb4a5c-1d7a6ab2ab286107.elb.us-east-1.amazonaws.com'
export ICSPASSWORD='Passw0rd123!'
python3 python/audit.py
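To make the flow concrete, below is a minimal sketch of what a script like python/audit.py can do with sqlalchemy-iris: read the Cloud SQL connection details from ICSHOST and ICSPASSWORD, reflect the consolidated table from the local instance, and copy its rows to IRIS Cloud SQL. The SQLAdmin user, the USER namespace, port 443, the table name, and the batch size are assumptions, and any TLS/certificate setup your Cloud SQL deployment requires is not shown; the actual script in this repo may differ.

# Sketch: copy the consolidated audit table to IRIS Cloud SQL with sqlalchemy-iris.
import os
from sqlalchemy import create_engine, MetaData, Table, insert, select
from sqlalchemy.engine import URL

# Source: local IRIS instance holding the consolidated table (placeholder credentials).
src = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")

# Target: IRIS Cloud SQL deployment, built from the environment variables set above.
# SQLAdmin / USER / port 443 are assumptions; TLS configuration is omitted here.
dst_url = URL.create(
    "iris",
    username="SQLAdmin",
    password=os.environ["ICSPASSWORD"],
    host=os.environ["ICSHOST"],
    port=443,
    database="USER",
)
dst = create_engine(dst_url)

# Reflect the consolidated table from the source and create it on the target if needed.
meta = MetaData()
table = Table("consolidator1", meta, schema="otw_audit", autoload_with=src)
meta.create_all(dst)

# Copy rows in batches to keep memory use bounded.
with src.connect() as s, dst.begin() as d:
    result = s.execute(select(table))
    while True:
        rows = result.fetchmany(1000)
        if not rows:
            break
        d.execute(insert(table), [dict(r._mapping) for r in rows])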
You can find an online demo here - Management Portal
A sample DDL file is included in this repo. You can also load the audit data into the table from a CSV file with LOAD DATA
LOAD DATA FROM FILE 'C:/InterSystems/IRIS/mgr/audit.CSV'
INTO otw_audit.consolidator1
USING {"from": {"file": {"header": true}}}
go
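If you go the LOAD DATA route, you first need the audit.CSV file with a header row. One minimal way to produce it from the consolidated table is sketched below; the connection URL, table name, and output path are assumptions, so adjust them to your environment.

# Sketch: export the consolidated audit table to a CSV with a header row,
# ready for LOAD DATA FROM FILE (configured above with "header": true).
import csv
from sqlalchemy import create_engine, text

engine = create_engine("iris://_SYSTEM:SYS@localhost:1972/USER")

with engine.connect() as conn, open("audit.CSV", "w", newline="") as f:
    result = conn.execute(text("SELECT * FROM otw_audit.consolidator1"))
    writer = csv.writer(f)
    writer.writerow(result.keys())   # header row
    writer.writerows(result)         # data rows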
https://portal.sql-contest.isccloud.io/account/login
https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=ECLOUD_intro
https://docs.intersystems.com/irislatest/csp/docbook/DocBook.UI.Page.cls?KEY=ECLOUD_api
https://docs.intersystems.com/iris20223/csp/docbook/Doc.View.cls?KEY=GSQL_import
https://docs.intersystems.com/iris20223/csp/docbook/Doc.View.cls?KEY=GSQL_esql
https://community.intersystems.com/post/cache-writing-aws-s3-bucket
As part of the AWS Free Tier, you can get started with Amazon S3 for free. Upon sign-up, new AWS customers receive 5GB of Amazon S3 storage in the S3 Standard storage class; 20,000 GET Requests; 2,000 PUT, COPY, POST, or LIST Requests; and 100 GB of Data Transfer Out each month.