Prompt Manager¶
Usage¶
- prompt_manager.get_model_list()
Description
Returns a list of the models available in the local ollama installation.
- Returns:
List of model names
Note
The models are expected to be installed locally and accessible via the ollama Python library. The function returns a list of model names that can be used for generating SQL queries.
You can check whether ollama is installed by running the following command in your terminal:

ollama list
NAME                 ID            SIZE    MODIFIED
starcoder2:7b        1550ab21b10d  4.0 GB  5 weeks ago
codellama:7b         8fdf8f752f6e  3.8 GB  5 weeks ago
sqlcoder:7b          77ac14348387  4.1 GB  5 weeks ago
duckdb-nsql:latest   3ed734989690  3.8 GB  5 weeks ago
- prompt_manager.get_system_prompt()
Description
Returns a system prompt template for SQL query generation. The system prompt provides context to the model about its role and the task it needs to perform. The prompt includes a placeholder for metadata, which will be replaced with the actual database schema information. The metadata is expected to be a list of tuples, each containing information about a table and its columns. The system prompt also includes guidelines for the model to follow when generating the SQL query.
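Example
For illustration only, a system prompt template of the shape described above could look like the following. The {metadata} placeholder name, the prompt wording, and the exact tuple layout are assumptions; the real template is defined inside prompt_manager:

```python
# Hypothetical template; the actual wording comes from prompt_manager.get_system_prompt().
SYSTEM_PROMPT_TEMPLATE = """You are an assistant that writes SQL queries.
Using the database schema below, answer the user's request with a single SQL query.

Schema:
{metadata}
"""

# The metadata is described as a list of tuples with table and column information,
# e.g. (table_name, column_name, column_type); the exact shape is an assumption.
metadata = [
    ("orders", "id", "INTEGER"),
    ("orders", "amount", "DOUBLE"),
    ("customers", "id", "INTEGER"),
]
schema_text = "\n".join(f"{table}.{column} ({ctype})" for table, column, ctype in metadata)
system_prompt = SYSTEM_PROMPT_TEMPLATE.format(metadata=schema_text)
```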
- prompt_manager.get_guideline_prompts()
Description
Returns a set of guidelines for the model to follow when generating SQL queries. The guidelines instruct the model to return only the SQL query as a string, and to make it a valid, syntactically correct query that adheres to SQL standards and can be executed on the provided schema. They are appended to the system prompt to give the model additional context.
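Example
A sketch of what such guidelines might look like and how they are appended to a system prompt. The wording here is hypothetical; the actual guideline texts come from prompt_manager.get_guideline_prompts():

```python
# Hypothetical guideline texts, for illustration only.
GUIDELINES = [
    "Return only the SQL query as a string, with no explanation or markdown.",
    "The query must be valid, standard SQL that can be executed on the provided schema.",
    "Reference only tables and columns that appear in the schema.",
]

system_prompt = "You are an assistant that writes SQL queries for the schema below.\n..."
# Guidelines are appended to the system prompt to give the model additional context.
full_system_prompt = system_prompt + "\n\nGuidelines:\n" + "\n".join(f"- {g}" for g in GUIDELINES)
```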
- prompt_manager.generate_sql(prompt: str, model: str)
Description
Generates a SQL query from a natural language prompt using the specified model. The function first retrieves the metadata from the database using the get_all_metadata function. The metadata is then used to replace the placeholder in the system prompt template. The model is then called to generate the SQL query based on the provided prompt and system prompt. The generated SQL query is returned along with the metadata used for generation.
- Parameters:
prompt (str) – The natural language prompt.
model (str) – The model to use for query generation (default: duckdb-nsql:latest).
- Returns:
A tuple containing the generated SQL query and the metadata used for generation.
- Raises:
Exception – If the model is not found or if there is an error during SQL generation.
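Example
A usage sketch, assuming prompt_manager is importable as a Python module, the local ollama server is running, and the duckdb-nsql:latest model has been pulled:

```python
import prompt_manager

# Generate a SQL query from a natural-language prompt.
# Per the description above, generate_sql returns the query together with the
# schema metadata that was used to build the system prompt.
sql, metadata = prompt_manager.generate_sql(
    prompt="Total order amount per customer, highest first",
    model="duckdb-nsql:latest",
)
print(sql)
```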
Installation¶
Ollama Installation: https://ollama.com/download
Ollama CLI commands: https://github.com/ollama/ollama?tab=readme-ov-file#quickstart
ollama

Usage:
  ollama [flags]
  ollama [command]

Available Commands:
  serve       Start ollama
  create      Create a model from a Modelfile
  show        Show information for a model
  run         Run a model
  pull        Pull a model from a registry
  push        Push a model to a registry
  list        List models
  ps          List running models
  cp          Copy a model
  rm          Remove a model
  help        Help about any command

Flags:
  -h, --help      help for ollama
  -v, --version   Show version information

Use "ollama [command] --help" for more information about a command.
Ollama Python Package
pip install ollama
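Example
A minimal sketch of calling a local model through the Python package (assumes the ollama server is running and that the duckdb-nsql:latest model has already been pulled):

```python
import ollama

# Ask a locally installed model to generate a SQL query.
# The response exposes the generated text under the "response" key.
result = ollama.generate(
    model="duckdb-nsql:latest",
    prompt="Write a SQL query that counts the rows in the orders table.",
)
print(result["response"])
```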