AI Query Engine (LLM)

The AI Query Engine is the intelligence core of Billx-Agent. It translates your natural language prompt into a valid, executable SQL query using a Large Language Model (LLM), currently powered by Gemini (via the Agno SDK).


🔍 What It Does

  • Understands the intent behind the user prompt

  • Analyzes your database schema

  • Selects or generates a relevant SQL query

  • Supports parameterized templates via reusable tools

  • Performs optional response refinement for natural answers


🧠 Powered by Gemini (via Agno)

Your prompt is sent to a Gemini LLM instance with rich context (see the sketch after this list), including:

  • Table and column metadata

  • Example SQL tools (templates)

  • Your exact user query
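As a rough illustration, that context might be assembled like the sketch below before the Agno-managed call to Gemini. The helper name, schema format, and prompt layout are assumptions for illustration, not the actual Billx-Agent implementation.

```python
# Illustrative sketch only: the helper name, schema format, and prompt layout
# are assumptions, not the actual Billx-Agent implementation.

def build_llm_context(schema: dict, tools: list[dict], user_query: str) -> str:
    """Assemble the context block sent to Gemini: schema, tool templates, and the prompt."""
    lines = ["Database schema:"]
    lines += [f"- {table}({', '.join(cols)})" for table, cols in schema.items()]
    lines += ["", "Available SQL tools (templates):"]
    lines += [f"- {tool['name']}: {tool['template']}" for tool in tools]
    lines += ["", "User question:", user_query]
    return "\n".join(lines)


context = build_llm_context(
    schema={"orders": ["id", "customer_id", "total", "created_at"]},
    tools=[{
        "name": "orders_in_range",
        "template": "SELECT COUNT(*) AS total_orders FROM orders "
                    "WHERE created_at BETWEEN '{start_date}' AND '{end_date}'",
    }],
    user_query="Show me total orders in March",
)
# `context` is then sent to the Gemini model via the Agno SDK.
```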


🧰 Tool Matching & Template Filling

Billx-Agent uses a tool-based architecture:

  1. Admins can define reusable SQL templates with named placeholders (see the example sketch below the warning note).

  2. The AI selects the best-matching tool (if any) for a given prompt.

  3. It extracts placeholder values (e.g. dates, limits) from the input.

  4. It fills the template and produces executable SQL.

⚠️ If no matching tool is found, the LLM generates raw SQL based on schema only.


🧼 Validation & Safety

  • The system checks for valid SQL syntax

  • Ensures it's a read-only SELECT query

  • Guards against unsupported operations (INSERT, DROP, etc.); a sketch of these checks follows below
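A minimal sketch of such checks, assuming the generated query arrives as a single plain-SQL string. The keyword list and function are illustrative; a production validator would typically rely on a real SQL parser.

```python
import re

# Illustrative sketch only: a real validator would typically rely on a SQL parser.
FORBIDDEN_KEYWORDS = {
    "INSERT", "UPDATE", "DELETE", "DROP", "ALTER", "TRUNCATE", "CREATE", "GRANT",
}


def validate_read_only(sql: str) -> None:
    """Raise ValueError unless `sql` looks like a single read-only SELECT statement."""
    statement = sql.strip().rstrip(";").strip()
    if ";" in statement:
        raise ValueError("Multiple statements are not allowed")
    if not statement.upper().startswith("SELECT"):
        raise ValueError("Only SELECT queries are allowed")
    tokens = set(re.findall(r"[A-Za-z_]+", statement.upper()))
    blocked = tokens & FORBIDDEN_KEYWORDS
    if blocked:
        raise ValueError(f"Unsupported operations detected: {sorted(blocked)}")


validate_read_only("SELECT COUNT(*) FROM orders")   # passes silently
# validate_read_only("DROP TABLE orders")           # raises ValueError
```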


📋 Output Example
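The exact response shape depends on your deployment; the fields and values below are an illustrative assumption that mirrors the summary flow in the next section.

```python
# Illustrative output only: field names and values are assumptions, not a guaranteed schema.
example_response = {
    "prompt": "Show me total orders in March",
    "tool_used": "orders_in_range",   # matched template, or None when raw SQL was generated
    "parameters": {"start_date": "2024-03-01", "end_date": "2024-03-31"},
    "sql": "SELECT COUNT(*) AS total_orders FROM orders "
           "WHERE created_at BETWEEN '2024-03-01' AND '2024-03-31'",
    "answer": "You had 1,248 orders in March.",  # optional refined plain-English response
}
```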


🤖 Summary Flow

  1. You send: "Show me total orders in March"

  2. AI receives:

    • Your prompt

    • Schema for orders table

    • SQL templates (tools)

  3. AI returns:

    • SQL statement

    • Used tool (if matched)

    • Parameters detected

    • Refined plain-English response (optional); see the refinement sketch below
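The optional refinement step turns raw query results back into a plain-English answer. The sketch below shows one way that refinement prompt might be built; the helper and wording are assumptions, with the refinement itself presumably handled by the same Gemini model via Agno.

```python
# Illustrative sketch only: the helper and prompt wording are assumptions.
def build_refinement_prompt(user_query: str, sql: str, rows: list[dict]) -> str:
    """Ask the LLM to phrase the raw query results as a natural-language answer."""
    return (
        f"The user asked: {user_query}\n"
        f"The SQL that was executed: {sql}\n"
        f"The result rows: {rows}\n"
        "Answer the user's question in one or two plain-English sentences."
    )


refinement_prompt = build_refinement_prompt(
    user_query="Show me total orders in March",
    sql="SELECT COUNT(*) AS total_orders FROM orders "
        "WHERE created_at BETWEEN '2024-03-01' AND '2024-03-31'",
    rows=[{"total_orders": 1248}],
)
# Sending `refinement_prompt` to Gemini (via Agno) would yield something like:
# "You had 1,248 orders in March."
```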


🔁 This module powers both /chat and /audio-chat, enabling natural language querying at scale.
