This is an IRIS Interoperability Production that enables a Telegram bot to talk to ChatGPT.
Create a bot using the @BotFather account and get the Bot Token. Then add the bot to a Telegram chat or channel and give it admin rights. Learn more at https://core.telegram.org/bots/api
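If you already have an IRIS terminal handy, you can double-check that the token works by calling the Telegram Bot API getMe method. A minimal sketch (the token value is a placeholder, and it reuses the ISC.FeatureTracker.SSL.Config SSL configuration from the IPM install one-liner below):
USER> s token="123456:ABC-your_bot_token"
USER> s r=##class(%Net.HttpRequest).%New(),r.Server="api.telegram.org",r.Https=1,r.SSLConfiguration="ISC.FeatureTracker.SSL.Config" d r.Get("/bot"_token_"/getMe") w r.HttpResponse.Data.Read()
A valid token returns a JSON document with "ok":true and your bot's username.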
Open an account on https://platform.openai.com/ (create one if you don't have it) and get your OpenAI API key and Organization id.
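The key and Organization id can be sanity-checked the same way against the OpenAI REST API, which lists the models available to your key at GET /v1/models. A minimal sketch with placeholder values, again reusing the same SSL configuration:
USER> s key="sk-your_openai_key",org="org-your_org_id"
USER> s r=##class(%Net.HttpRequest).%New(),r.Server="api.openai.com",r.Https=1,r.SSLConfiguration="ISC.FeatureTracker.SSL.Config" d r.SetHeader("Authorization","Bearer "_key),r.SetHeader("OpenAI-Organization",org),r.Get("/v1/models") w r.HttpResponse.StatusCode
A 200 status code means the credentials are accepted.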
Make sure you have IPM installed in your InterSystems IRIS instance. If not, here is a one-liner to install it:
USER> s r=##class(%Net.HttpRequest).%New(),r.Server="pm.community.intersystems.com",r.SSLConfiguration="ISC.FeatureTracker.SSL.Config" d r.Get("/packages/zpm/latest/installer"),$system.OBJ.LoadStream(r.HttpResponse.Data,"c")
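Once the one-liner finishes, you can confirm IPM is available by listing installed modules:
USER>zpm "list"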
Open an IRIS namespace with Interoperability enabled. Open the Terminal and call:
USER>zpm "install telegram-gpt -D TgToken=your_telegram_token -D GPTKey=your_ChatGPT_key"
Clone/git pull the repo into any local directory:
$ git clone https://github.com/evshvarov/openai-telegram-bot
Create a .env file in the root directory of the repo with:
TG_BOT_TOKEN=Your_telegrambot_token
OPENAPI_KEY=Your_chatGPT_key
Open the terminal in this directory and run:
$ docker-compose build
$ docker-compose up -d
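Once the containers are up, open an IRIS terminal session inside the container and initialize the production with the tokens from the environment variables. A minimal sketch, assuming the compose service is named iris and the instance inside the container is IRIS (check docker-compose.yml for the exact names):
$ docker-compose exec iris iris session IRIS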
USER>d ##class(shvarov.telegramgpt.Setup).Init($system.Util.GetEnviron("TG_BOT_TOKEN"),$system.Util.GetEnviron("OPENAPI_KEY"))
Open the production.
Put your bot's Telegram token into both the Telegram business service and the Telegram business operation.
Also initialize the St.OpenAi.BO.Api.Connect operation with your ChatGPT API key and Organization id.
Start the production.
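This can be done from the Interoperability > Production page, or from the terminal via Ens.Director. A minimal sketch, assuming the production class installed by the package is named shvarov.telegramgpt.Production (check the exact name in your namespace):
USER>d ##class(Ens.Director).StartProduction("shvarov.telegramgpt.Production")
Calling ##class(Ens.Director).StopProduction() stops it again.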
Ask any question in the Telegram chat. You'll get an answer via ChatGPT. Enjoy!
This example uses the GPT-3.5 version of OpenAI ChatGPT. The model can be changed via the Model parameter in the data transformation rule.
This application uses the Telegram adapter by Nikolay Soloviev and the Iris-OpenAI adapter by Kurro Lopez. Thank you both for making it easy to enable interoperability scenarios for OpenAI and Telegram!
Release history:
Telegram Send Message transformation fix
Optimized production
Workaround for IPM production startup (thanks to Nikolay's PR)
Make use of production-settings module to init the production
module version bump
Bugfix for a class name typo
bugfix
bugfix
Add production autostart when installing the IPM module
You can now install the module as:
zpm "install telegram-gpt -D TgToken=your_token -D GPTKey=your_chatgpt_api_key"
Add the production init parameters from env variables and as parameters to the IPM module
Production names bugfix
Talk to Open AI via Telegram!
Initial release