The Timescale NFT Starter Kit is an entry point into the world of NFTs: it lets you collect, store, analyze, and visualize NFT data from OpenSea using PostgreSQL and TimescaleDB. This guide walks you through setting up your environment, running the data ingestion scripts, and building dashboards to visualize your insights.
Understanding the Foundation
Imagine a treasure chest full of valuable collectibles, where each NFT is a unique item in the chest. The Timescale NFT Starter Kit is a map through that treasure trove of NFT data, helping you spot trends, make informed decisions, and take on more complex analytical projects.
Project Components
The kit consists of several standalone components that support your data exploration journey:
- Database Schema: A relational schema for storing NFT sales, assets, collections, and accounts (a simplified sketch follows this list).
- Data Ingestion:
  - A data ingestion script that collects historical data from OpenSea.
  - Sample data for quick ingestion.
- Dashboards:
  - A Streamlit dashboard for analyzing collection sales.
  - A Grafana dashboard template.
  - A Dockerized TimescaleDB + Apache Superset setup for storing and analyzing NFT data.
- Data Analysis: Sample queries to kick off your analysis.
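To make the schema component more concrete, here is a deliberately simplified sketch of what such a schema can look like. The table and column names below are illustrative only, not the kit's actual definitions; the authoritative version lives in the schema.sql script:

-- Illustrative sketch only; see schema.sql for the real schema
CREATE TABLE collections (
    id BIGSERIAL PRIMARY KEY,
    slug TEXT UNIQUE,    -- OpenSea collection slug
    name TEXT
);

CREATE TABLE nft_sales (
    time TIMESTAMPTZ NOT NULL,                          -- when the sale happened
    collection_id BIGINT REFERENCES collections (id),
    total_price DOUBLE PRECISION,                       -- sale price
    payment_symbol TEXT                                 -- e.g. ETH
);

-- Turn the sales table into a TimescaleDB hypertable partitioned by time
SELECT create_hypertable('nft_sales', 'time');

Storing sales in a hypertable is what makes time-based aggregations fast on large volumes of transaction data.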
Getting Started
To start your journey with the Timescale NFT Starter Kit, clone the repository and move into its directory:
git clone https://github.com/timescale/nft-starter-kit.git
cd nft-starter-kit
Setting Up Pre-built Superset Dashboards
This part of the kit is fully Dockerized. Completing the steps below gives you local TimescaleDB and Superset instances running in containers, preloaded with more than 500,000 NFT transactions.
Prerequisites
- Docker
- Docker Compose
Instructions
- Run the following commands from the pre-built dashboards folder:
  cd pre-built-dashboards
  docker-compose up --build
- Access the dashboard at http://0.0.0.0:8088 and log in with:
  user: admin
  password: admin
- Open the Databases page in Superset to confirm the connection to TimescaleDB.
- Explore your NFT dashboards (and try querying the data yourself; see the example below).
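Once the containers are up, you can also query the preloaded data directly. The following is a sketch that assumes the nft_sales and collections tables from the kit's schema, with illustrative column names (check schema.sql if yours differ):

-- Top 10 collections by number of recorded sales
SELECT c.name, count(*) AS sales
FROM nft_sales s
JOIN collections c ON c.id = s.collection_id
GROUP BY c.name
ORDER BY sales DESC
LIMIT 10;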
Running the Data Ingestion Script
To ingest data directly from the OpenSea API into your database, follow these instructions:
Prerequisites
- Python 3
- TimescaleDB installed
- Schema set up using the schema.sql script.
Instructions
- Navigate to the repository's root folder:
  cd nft-starter-kit
- Create a new Python virtual environment and install the requirements:
  virtualenv env
  source env/bin/activate
  pip install -r requirements.txt
- Update the config.py file with your parameters.
- Run the ingestion script:
  python opensea_ingest.py
- Monitor the process as it fetches transactions and ingests data in batches (the query below is a handy progress check).
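Ingesting from the OpenSea API can take a while. While the script runs, one way to watch its progress is to query the target table from a separate psql session; this assumes the nft_sales table defined by schema.sql:

-- Rows ingested so far and the most recent sale timestamp
SELECT count(*) AS rows_ingested, max(time) AS latest_sale
FROM nft_sales;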
Ingesting Sample Data
If you’re eager to get started without waiting for a full ingestion run, you can load a sample dataset instead. Here’s how:
Prerequisites
- TimescaleDB installed, with the schema set up using the schema.sql script
- psql, to connect to the database and import the CSV files
Instructions
- Go to the folder containing the sample CSV files:
  cd pre-built-dashboards/database/data
- Connect to your database:
  psql -x "postgres://host:port/tsdb?sslmode=require"
- Import the CSV files in order (note the leading backslash: \copy is psql's client-side import command):
  \copy accounts FROM '001_accounts.csv' CSV HEADER;
  \copy collections FROM '002_collections.csv' CSV HEADER;
  \copy assets FROM '003_assets.csv' CSV HEADER;
  \copy nft_sales FROM '004_nft_sales.csv' CSV HEADER;
- Run a query to verify the import (a more time-series-flavored example follows below):
  SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
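Beyond a basic row count, TimescaleDB's time_bucket function lets you ask time-series questions directly. As an illustrative example (the total_price column name is an assumption; verify it against schema.sql), here is daily sales count and volume:

-- Number of sales and total volume per day
SELECT time_bucket('1 day', time) AS day,
       count(*) AS sales,
       sum(total_price) AS volume
FROM nft_sales
GROUP BY day
ORDER BY day;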
Troubleshooting
If you encounter issues while setting up or running your scripts, try the following:
- Ensure Docker and Docker Compose are installed and running properly.
- Check that there are no other services using ports 8088 and 6543.
- Verify your PostgreSQL connection details in the config.py file (a quick sanity check follows this list).
- Monitor the logs for any error messages that might indicate where the issue lies.
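When you suspect a connection or extension problem, a minimal sanity check from psql can rule out the basics. Both queries below rely only on standard PostgreSQL catalogs:

-- Confirm the server responds and which version you are talking to
SELECT version();

-- Confirm the TimescaleDB extension is installed, and its version
SELECT extversion FROM pg_extension WHERE extname = 'timescaledb';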
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.