Imagine effortlessly translating ordinary language into structured SQL queries. Welcome to the world of **Hrida-T2SQL-3B-V0.1**! This small language model is designed specifically for converting user queries into SQL commands, making data interaction more intuitive than ever.
Getting Started with Hrida-T2SQL
The Hrida-T2SQL model is based on Microsoft's powerful Phi-3-mini-4k-instruct and is fine-tuned for text-to-SQL generation. To utilize this model, follow the steps outlined below:
How to Use the Model
- Prepare Your Environment: Ensure you have Python and the required libraries, such as transformers, installed.
- Load the Model: Here’s the code to get you started:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer
model_id = "HridaAI/Hrida-T2SQL-3B-V0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, trust_remote_code=True
)

# Instruction for the model
prompt = "Answer to the query will be in the form of an SQL query."

# Context contains the database schema
context = """
CREATE TABLE Employees (
    EmployeeID INT PRIMARY KEY,
    FirstName VARCHAR(50),
    LastName VARCHAR(50),
    Age INT,
    DepartmentID INT,
    Salary DECIMAL(10, 2),
    DateHired DATE,
    Active BOOLEAN,
    FOREIGN KEY (DepartmentID) REFERENCES Departments(DepartmentID)
);

CREATE TABLE Departments (
    DepartmentID INT PRIMARY KEY,
    DepartmentName VARCHAR(100),
    Location VARCHAR(100)
);
"""

input_query = "Write a SQL query to select all the employees who are active."

# Combine the instruction, schema, and question into a single user message,
# so the model actually sees the schema and the question
messages = [
    {"role": "user", "content": f"{prompt}\n\n{context}\n\n{input_query}"}
]
inputs = tokenizer.apply_chat_template(
    messages, return_tensors="pt", add_generation_prompt=True
)

# Generate the output (max_new_tokens bounds the generated SQL,
# independent of the prompt length)
outputs = model.generate(inputs, max_new_tokens=300)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
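Before running a generated query against real data, you can sanity-check it on a throwaway in-memory database built from the same schema. The snippet below is a sketch: the `generated_sql` string and the sample rows are assumptions standing in for actual model output (SQLite stores `BOOLEAN` values as 0/1).

```python
import sqlite3

# Build an in-memory database matching the schema from the prompt
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE Departments (
    DepartmentID INT PRIMARY KEY,
    DepartmentName VARCHAR(100),
    Location VARCHAR(100)
);
CREATE TABLE Employees (
    EmployeeID INT PRIMARY KEY,
    FirstName VARCHAR(50),
    LastName VARCHAR(50),
    Age INT,
    DepartmentID INT,
    Salary DECIMAL(10, 2),
    DateHired DATE,
    Active BOOLEAN,
    FOREIGN KEY (DepartmentID) REFERENCES Departments(DepartmentID)
);
INSERT INTO Departments VALUES (1, 'Engineering', 'Berlin');
INSERT INTO Employees VALUES (1, 'Ada', 'Lovelace', 36, 1, 90000.00, '2020-01-15', 1);
INSERT INTO Employees VALUES (2, 'Alan', 'Turing', 41, 1, 95000.00, '2019-06-01', 0);
""")

# Hypothetical model output for the "active employees" request
generated_sql = "SELECT * FROM Employees WHERE Active = 1;"
rows = conn.execute(generated_sql).fetchall()
print(rows)  # only the active employee row comes back
```

If the query raises an error or returns nonsense here, you know the problem is in the generated SQL rather than in your production data.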
Understanding the Code through Analogy
Think of the Hrida-T2SQL model as a translator that takes requests from users (like you asking for a pizza) and converts them into standardized recipes (SQL queries). Here’s a step-by-step analogy:
- Model Loading: Imagine loading a smart chef (the model) into your kitchen (your coding environment) equipped with all the necessary cooking tools (libraries).
- Context Setting: You provide the chef with a menu (database schema) and specific orders from patrons (user queries), ensuring they are aware of what’s available in the kitchen.
- Query Generation: Based on the orders, the chef quickly prepares and serves the dishes (SQL commands) that fulfill the patrons’ requests.
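The context-setting step above boils down to assembling one message the model sees. A small helper makes that explicit; the exact message layout (instruction, then schema, then question, all in one user turn) is an assumption, not a format mandated by the model:

```python
def build_messages(system_prompt: str, schema: str, question: str) -> list:
    """Combine the instruction, schema context, and user question
    into a chat message list suitable for apply_chat_template."""
    content = f"{system_prompt}\n\nSchema:\n{schema}\n\nQuestion: {question}"
    return [{"role": "user", "content": content}]

msgs = build_messages(
    "Answer to the query will be in the form of an SQL query.",
    "CREATE TABLE Employees (EmployeeID INT PRIMARY KEY, Active BOOLEAN);",
    "Write a SQL query to select all the employees who are active.",
)
print(msgs[0]["content"])
```

Keeping prompt assembly in one function makes it easy to experiment with different orderings of instruction, schema, and question.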
Troubleshooting
If you encounter any issues during setup or execution, consider the following:
- Make sure that you have the correct versions of Python and the transformers library installed.
- Double-check that the model ID is correct and that you have internet access to download the model from Hugging Face.
- If the code throws an error, ensure that your context and input formats are correct.
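The first two checks can be automated. A minimal sketch, assuming only that `torch` and `transformers` should be importable and that a reasonably recent Python is required:

```python
import importlib.util
import sys

def check_environment(min_python=(3, 8)):
    """Return a list of human-readable problems with the current setup;
    an empty list means the basics are in place."""
    problems = []
    if sys.version_info < min_python:
        problems.append(
            f"Python {sys.version_info.major}.{sys.version_info.minor} "
            f"is older than the required {min_python[0]}.{min_python[1]}"
        )
    for pkg in ("torch", "transformers"):
        if importlib.util.find_spec(pkg) is None:
            problems.append(f"missing package: {pkg}")
    return problems

print(check_environment())
```

Run this before loading the model; fixing anything it reports is cheaper than debugging a failed `from_pretrained` call.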
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Concluding Remarks
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.

