Initial Release
With the 2024.1 release, we’re adding a powerful Vector Search capability to the InterSystems IRIS Data Platform, to help you innovate faster and build intelligent applications powered by Generative AI. At the center of the new capability is a new VECTOR native datatype for IRIS SQL, along with similarity functions that leverage optimized chipset instructions (SIMD).
The same Vector Search capability is now also available with InterSystems IRIS Cloud SQL. Check out cloud_sql_demo.ipynb for instructions on setting up a connection from your Jupyter notebook. The notebooks exploring langchain and llama-index also support connecting to Cloud SQL deployments.
This repository offers code samples to get you started with the new features, and we’ll continue to add more. We also encourage you to let us know about your own experiments on the InterSystems Developer Community. At the bottom of this page, you’ll find links to a few demo repositories we liked a lot!
Clone this repository:

git clone https://github.com/intersystems-community/iris-vector-search.git
If you prefer just running the demos from your local Python environment, skip to Using your local Python environment.
For langchain_demo.ipynb and llama_demo.ipynb, you need an OpenAI API Key. Update the corresponding entry in docker-compose.yml:
OPENAI_API_KEY: xxxxxxxxx
Start the Docker containers (one for IRIS, one for Jupyter):
docker-compose up
Please note that building the container involves downloading the sentence_transformers module, which is over 2GB in size!
Install IRIS Community Edition in a container:
docker run -d --name iris-comm -p 1972:1972 -p 52773:52773 -e IRIS_PASSWORD=demo -e IRIS_USERNAME=demo intersystemsdc/iris-community:latest
:information_source: After running the above command, you can access the System Management Portal via http://localhost:52773/csp/sys/UtilHome.csp. Please note you may need to configure your web server separately when using another product edition.
Create a Python environment and activate it (conda, venv, or however you wish). For example:
conda create --name iris-vector-search python=3.10
conda activate iris-vector-search
Install packages for all demos:
pip install -r requirements.txt
For langchain_demo.ipynb and llama_demo.ipynb, you need an OpenAI API Key. Create a .env file in this repo to store the key:
OPENAI_API_KEY=xxxxxxxxx
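The notebooks can then read the key from that file. A minimal sketch of how that works, assuming python-dotenv is installed in your environment (the file name and variable are the ones above; everything else here is illustrative):

from dotenv import load_dotenv  # python-dotenv: loads key/value pairs from a .env file
import os

load_dotenv(override=True)                               # puts OPENAI_API_KEY into the process environment
print("key loaded:", os.getenv("OPENAI_API_KEY") is not None)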
The demos in this repository are formatted as Jupyter notebooks. To run them, just start Jupyter and navigate to the /demo/ folder:
jupyter lab
IRIS SQL now supports vector search alongside filters on other columns! In this demo, we’re searching a whiskey dataset for whiskeys that are priced under $100 and have a taste description similar to “earthy and creamy taste”.
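As a rough sketch of what the notebook does (the table name, column names, and embedding model below are assumptions for illustration, not the exact demo code): embed the query text, then combine a regular price filter with a similarity ordering on a VECTOR column.

from sqlalchemy import create_engine, text
from sentence_transformers import SentenceTransformer

engine = create_engine("iris://demo:demo@localhost:1972/USER")   # sqlalchemy-iris connection
model = SentenceTransformer("all-MiniLM-L6-v2")                  # 384-dimensional embeddings

query_vector = model.encode("earthy and creamy taste", normalize_embeddings=True).tolist()

with engine.connect() as conn:
    # hypothetical table with a VECTOR(DOUBLE, 384) column holding description embeddings
    results = conn.execute(
        text("""
            SELECT TOP 3 name, price, description
            FROM scotch_reviews
            WHERE price < 100
            ORDER BY VECTOR_DOT_PRODUCT(description_vector, TO_VECTOR(:v, DOUBLE)) DESC
        """),
        {"v": ",".join(str(x) for x in query_vector)},
    )
    for row in results:
        print(row)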
IRIS now has a langchain integration as a VectorDB! In this demo, we use the langchain framework with IRIS to ingest and search through a document.
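A minimal sketch of that flow, assuming the langchain-iris package’s IRISVector store and recent langchain package layouts (import paths, parameter names, and the input file are illustrative and may differ from the notebook):

from langchain_iris import IRISVector
from langchain_openai import OpenAIEmbeddings
from langchain_community.document_loaders import TextLoader
from langchain_text_splitters import CharacterTextSplitter

# hypothetical input document; split it into chunks before ingestion
docs = CharacterTextSplitter(chunk_size=400, chunk_overlap=0).split_documents(
    TextLoader("data/state_of_the_union.txt").load()
)

db = IRISVector.from_documents(
    documents=docs,
    embedding=OpenAIEmbeddings(),
    collection_name="state_of_the_union",
    connection_string="iris://demo:demo@localhost:1972/USER",
)
print(db.similarity_search("What did the president say about the economy?", k=3))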
IRIS now has a llama_index integration as a VectorDB! In this demo, we use the llama_index framework with IRIS to ingest and search through a document.
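Roughly, the notebook does something like the following; note that the llama_iris package name and the IRISVectorStore.from_params signature are assumptions based on the demo, so check the notebook for the exact API:

from llama_index.core import SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_iris import IRISVectorStore

documents = SimpleDirectoryReader("data").load_data()    # hypothetical folder of source documents

vector_store = IRISVectorStore.from_params(
    connection_string="iris://demo:demo@localhost:1972/USER",
    table_name="paul_graham_essay",
    embed_dim=1536,                                      # OpenAI text-embedding-ada-002 dimension
)
index = VectorStoreIndex.from_documents(
    documents,
    storage_context=StorageContext.from_defaults(vector_store=vector_store),
)
print(index.as_query_engine().query("What did the author work on?"))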
This notebook describes how to tap into the Vector Search capability when using InterSystems IRIS Cloud SQL instead of a local install or container. It covers the additional settings for establishing a secure connection to a Cloud SQL deployment.
If you need to use hybrid search (similarity search with other columns), use IRIS SQL.
If you’re building a genAI app that uses a variety of tools (agents, chained reasoning, API calls), go for langchain.
If you’re building a RAG app, go for llama_index.
Feel free to contact Alvin / Thomas or file an issue in this GitHub repository if you have any questions!
Neat shopping cart demo that leverages Vector Search to match your voice-recorded order to available items.
Uses langchain-iris to search YouTube audio transcriptions.
Original IRIS langchain demo, which runs containerized IRIS from within the notebook.
Original IRIS llama_index demo, which runs containerized IRIS from within the notebook.
Official page for InterSystems Documentation