Setting up the Prometheus SQL Exporter may seem like a daunting task, but fear not! This guide will walk you through the process step-by-step while providing troubleshooting tips along the way. By the end, you’ll have a powerful tool for running SQL queries and exporting metrics for Prometheus consumption.
What is Prometheus SQL Exporter?
Prometheus SQL Exporter is a service that runs user-defined SQL queries at flexible intervals and exports the resulting metrics via HTTP for Prometheus to consume. It currently supports several databases, including PostgreSQL, MySQL, and AWS Athena.
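To see where the exporter fits, here is a minimal sketch of how Prometheus might scrape it. The target is an assumption for illustration: it presumes the exporter is reachable at localhost:9237, the default port used later in this guide, so adjust it to match your deployment.

```yaml
# prometheus.yml (fragment) - hypothetical scrape job for the SQL Exporter
scrape_configs:
  - job_name: sql_exporter
    static_configs:
      - targets: ['localhost:9237']   # assumed host:port; change to wherever the exporter runs
```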

Getting Started
To get started with Prometheus SQL Exporter, follow these steps:
- First, install the exporter package:
go get github.com/justwatchcom/sql_exporter
- Next, copy the default configuration file that ships with the source to create your own _config.yml_:
cp config.yml.dist config.yml
- Finally, edit _config.yml_ to define your connections and queries; the full sequence is sketched below.
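Here is the full sequence as a shell sketch. It assumes a recent Go toolchain and a local checkout of the repository, since config.yml.dist ships with the source; building in the repository root should produce the sql_exporter binary used in the next section. Adjust paths to your environment.

```bash
# Clone the repository so config.yml.dist is available alongside the source
git clone https://github.com/justwatchcom/sql_exporter.git
cd sql_exporter

# Build the exporter binary from source (an alternative to go get above)
go build

# Create your own configuration from the shipped example and edit it
cp config.yml.dist config.yml
${EDITOR:-vi} config.yml
```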
Running the Service
You can run the service directly or using Docker. Here are both methods:
Running Directly
Once you have your configuration file ready, run the exporter like this:
./sql_exporter
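Once the exporter is running, you can confirm it is serving metrics. This check assumes the default listen port 9237 (the same port used in the Docker example below) and that your queries have had a chance to run at least once:

```bash
# Confirm the exporter is serving metrics on its default port
curl -s http://localhost:9237/metrics | head -n 20
```

If the endpoint responds but your metrics are missing, check the exporter's log output for query or connection errors.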
Running in Docker
If you prefer using Docker, run the following command:
docker run -v $(pwd)/config.yml:/config/config.yml -e CONFIG=/config/config.yml -d -p 9237:9237 --name sql_exporter ghcr.io/justwatchcom/sql_exporter
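If you manage containers with Docker Compose, the same run command can be expressed declaratively. This is a sketch under the same assumptions as the docker run command above (config.yml in the current directory, default port 9237), not an official compose file from the project:

```yaml
# docker-compose.yml - hypothetical equivalent of the docker run command above
services:
  sql_exporter:
    image: ghcr.io/justwatchcom/sql_exporter
    ports:
      - "9237:9237"
    volumes:
      - ./config.yml:/config/config.yml
    environment:
      CONFIG: /config/config.yml
    restart: unless-stopped
```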
Understanding the Configuration
Think of your configuration file as a recipe for a dish. Each component of the file is akin to an ingredient, determining the end result. The jobs section is where you define how frequently SQL queries should occur, much like how often you season your dish. Each job has a unique name and an interval or CRON schedule, determining when it runs.
For example:
```yaml
jobs:
- name: example
  interval: 5m
  cron_schedule: "0 0 * * *"
  connections:
  - postgres://postgres@localhost/postgres?sslmode=disable
  queries:
  - name: running_queries
    help: Number of running queries
    labels:
      - datname
      - usename
    values:
      - count
    query: |
      SELECT now() as created_at, datname::text, usename::text, COUNT(*)::float AS count
      FROM pg_stat_activity GROUP BY created_at, datname, usename;
    allow_zero_rows: false
```
Here, you instruct the exporter to execute an SQL query every 5 minutes, providing essential metrics like the number of running queries.
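Because connections is a YAML list, a single job can run the same queries against several databases. The snippet below is a hypothetical variation of the example above; the host names are placeholders and should be replaced with your own connection strings.

```yaml
jobs:
- name: example
  interval: 5m
  connections:                 # one job, several databases
  - postgres://postgres@primary.example.com/postgres?sslmode=disable   # placeholder host
  - postgres://postgres@replica.example.com/postgres?sslmode=disable   # placeholder host
  queries:
  - name: running_queries
    help: Number of running queries
    labels:
      - datname
      - usename
    values:
      - count
    query: |
      SELECT datname::text, usename::text, COUNT(*)::float AS count
      FROM pg_stat_activity GROUP BY datname, usename;
```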
Logging and Environment Variables
To change the log level, you can set the LOGLEVEL environment variable. This feature is like adjusting the volume on your radio; it helps you get the right amount of feedback from the system. For example:
LOGLEVEL=info
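In practice, the variable is set when you launch the exporter. The examples below reuse the run commands from earlier in this guide; debug is shown here, and the accepted level names (such as debug, info, warn, error) follow the exporter's logging library, so treat them as an assumption to verify against your version.

```bash
# Run the exporter directly with a more verbose log level
LOGLEVEL=debug ./sql_exporter

# Or pass it to the Docker container from the earlier example
docker run -v $(pwd)/config.yml:/config/config.yml \
  -e CONFIG=/config/config.yml -e LOGLEVEL=debug \
  -d -p 9237:9237 --name sql_exporter ghcr.io/justwatchcom/sql_exporter
```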
Troubleshooting
If you encounter issues during setup, consider these troubleshooting tips:
- Make sure that your configuration file is correctly formatted, as YAML is sensitive to indentation.
- Confirm that the database connections are properly set up and the necessary permissions are in place.
- Check that the Docker container is running and that port 9237 is correctly mapped; the quick checks sketched below can help verify each of these points.
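The quick checks below map to the three tips above. They assume the Docker setup from this guide (container named sql_exporter, port 9237) and, for the YAML check, a Python interpreter with PyYAML installed; any YAML linter will do instead.

```bash
# 1. Validate the YAML syntax of your configuration (requires PyYAML)
python -c "import yaml; yaml.safe_load(open('config.yml'))" && echo "config.yml parses"

# 2. Check that the container is running and inspect its logs for connection errors
docker ps --filter name=sql_exporter
docker logs sql_exporter

# 3. Confirm the port mapping works and metrics are reachable
curl -s http://localhost:9237/metrics | head -n 20
```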
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
With the exporter installed, configured, and scraped by Prometheus, any SQL query you can write becomes a metric you can graph and alert on. At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.