ThanoSQL integrates with AI models such as large language models (LLMs), platforms like Hugging Face, and custom AI models. This lets organizations apply advanced AI capabilities directly within their distributed data infrastructure: alongside traditional data, ThanoSQL can manage and query the outputs of AI models, broadening the range of applications and insights that can be derived from the data.

ThanoSQL achieves cost efficiency by eliminating the repeated development work normally required to combine AI models with data. Because models and data are integrated within its distributed architecture, ThanoSQL reduces the need for custom integration efforts and shortens development cycles. This lowers development costs and accelerates time-to-market for applications that pair AI with traditional data processing, letting organizations focus on deriving insights and value from their data and models rather than on managing complex integrations and infrastructure.

How does it work?

ThanoSQL combines traditional relational databases with advanced AI models and vector databases. It provides a seamless interface for interacting with data, generating insights, and leveraging the power of large language models (LLMs). The following diagram illustrates the core components of ThanoSQL and how they work together:


ThanoSQL provides two primary interfaces for interacting with the platform:

  1. REST API: A RESTful API that allows developers to integrate ThanoSQL functionality into their applications programmatically.
  2. SDK: A software development kit that provides a convenient and easy-to-use interface for interacting with ThanoSQL from various programming languages.
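As a rough illustration of the REST API route, the sketch below builds an HTTP request that carries a SQL query to a workspace endpoint. The URL path, payload shape, and bearer-token header are assumptions for illustration, not the documented ThanoSQL API; consult the official API reference for the real endpoints.

```python
import json
import urllib.request

def build_query_request(base_url: str, api_token: str, query: str) -> urllib.request.Request:
    """Construct (but do not send) an HTTP request carrying a SQL query.

    The endpoint path and auth scheme are hypothetical placeholders.
    """
    payload = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(
        url=f"{base_url}/api/v1/query",              # assumed endpoint path
        data=payload,
        headers={
            "Authorization": f"Bearer {api_token}",  # assumed auth scheme
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_query_request(
    "https://workspace.example.com", "MY_TOKEN",
    "SELECT * FROM sales LIMIT 10",
)
print(req.get_full_url())
```

The SDK wraps the same request/response cycle behind language-native calls, so application code never assembles HTTP requests by hand.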

ThanoSQL Functions

At the core of ThanoSQL are three main functions that enable seamless integration of language models with traditional databases:


  1. Text Generation: Generate text based on a given input using a pre-trained text generation model.
  2. Embedding: Generate embeddings for given input data using a pre-trained model.
  3. Prediction: Perform various prediction tasks using pre-trained models.
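To give a feel for how such functions sit inside ordinary SQL, the sketch below renders hypothetical calls as SQL expressions. The `thanosql.*` function names and argument shapes are assumptions based on the descriptions above, not the exact documented syntax.

```python
def ai_call(function: str, argument: str) -> str:
    """Render a hypothetical ThanoSQL AI function call as a SQL expression.

    Example: ai_call("embed", "review_text") -> "thanosql.embed(review_text)"
    """
    return f"thanosql.{function}({argument})"

# An embedding call applied per row, alongside ordinary columns:
query = (
    f"SELECT review_id, {ai_call('embed', 'review_text')} AS review_vector "
    f"FROM product_reviews"
)
print(query)
```

The point of this pattern is that model invocations compose with the rest of SQL: filters, joins, and aggregations apply to model outputs just as they do to stored columns.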

Web Apps

ThanoSQL provides web applications for managing and interacting with the platform:

  1. Query Manager: A user-friendly interface for writing queries, executing them against the database, and retrieving results.
  2. Lab: An interactive environment for data exploration, AI/ML modeling, and application development, based on Jupyter Lab.
  3. File Manager: A tool for managing and uploading data files to the ThanoSQL Workspace.


Databases

ThanoSQL utilizes two types of databases to store and manage data:

  1. Relational Database (RDB): A traditional relational database for storing structured data in tables.
  2. Vector Database: A specialized database for storing and indexing dense vector representations of textual data, enabling efficient similarity search and retrieval.
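The core operation a vector database performs is similarity search: ranking stored embeddings by their closeness to a query vector. The minimal sketch below shows the idea with cosine similarity over a brute-force scan; a real vector database uses approximate nearest-neighbor indexes to make this fast at scale, and the document ids and vectors here are made up for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, stored, k=2):
    """Return the k document ids whose embeddings are closest to query_vec."""
    ranked = sorted(
        stored.items(),
        key=lambda item: cosine_similarity(query_vec, item[1]),
        reverse=True,
    )
    return [doc_id for doc_id, _ in ranked[:k]]

# Toy 3-dimensional "embeddings"; real models produce hundreds of dimensions.
docs = {
    "doc_a": [1.0, 0.0, 0.0],
    "doc_b": [0.9, 0.1, 0.0],
    "doc_c": [0.0, 1.0, 0.0],
}
print(top_k([1.0, 0.05, 0.0], docs))  # ['doc_a', 'doc_b']
```

Storing embeddings next to relational rows is what lets a single query mix exact predicates (the RDB side) with semantic similarity (the vector side).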


Models

ThanoSQL supports the use of pre-built models as well as large language models (LLMs):

  1. Pre-built Models: ThanoSQL provides access to pre-trained models for various tasks.
  2. LLMs: Users can integrate their own custom-trained large language models into the ThanoSQL platform, enabling specialized and domain-specific language model capabilities.

Demo Video

The “Query Everything” demo showcases the versatility of ThanoSQL, a SQL-based query engine designed to handle data from diverse sources seamlessly. The sample application demonstrates how ThanoSQL can query and analyze data from databases, APIs, files, and more through a unified SQL interface.