How to Get Started with the Timescale NFT Starter Kit

Aug 30, 2024 | Programming

The Timescale NFT Starter Kit is a stepping stone into the world of NFTs, enabling you to collect, store, analyze, and visualize NFT data from OpenSea using PostgreSQL and TimescaleDB. This user-friendly guide will help you set up your environment, run data ingestion scripts, and build engaging dashboards to visualize your insights.

Understanding the Foundation

Imagine diving into a treasure chest full of valuable collectibles; each NFT resembles a unique item in that chest. The Timescale NFT Starter Kit is like a map leading you through the treasure trove of NFT data, allowing you to identify trends, make informed decisions, and explore complex analytical projects.

Project Components

The kit consists of several standalone components that facilitate your data exploration journey:

  • Database Schema: A relational schema for storing NFT sales, assets, collections, and accounts (see the sketch after this list).
  • Data Ingestion:
    • A data ingestion script that collects historical data from OpenSea.
    • Sample data for quick ingestion.
  • Dashboards:
    • Streamlit dashboard for analyzing collection sales.
    • Grafana dashboard template.
    • Dockerized TimescaleDB + Apache Superset for storing and analyzing NFTs.
  • Data Analysis: Sample queries to kick off your analysis.
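
To make the schema component concrete, here is a minimal sketch of what the relational layout might look like. The four table names come from the kit itself; the column definitions are illustrative assumptions, and the kit's own schema file is authoritative:

    -- Sketch of the kit's relational schema (column names assumed)
    CREATE TABLE accounts (
        id      BIGSERIAL PRIMARY KEY,
        address TEXT NOT NULL              -- wallet address
    );

    CREATE TABLE collections (
        id   BIGSERIAL PRIMARY KEY,
        slug TEXT NOT NULL,                -- e.g. 'cryptokitties'
        name TEXT
    );

    CREATE TABLE assets (
        id            BIGSERIAL PRIMARY KEY,
        collection_id BIGINT REFERENCES collections (id),
        name          TEXT
    );

    CREATE TABLE nft_sales (
        time        TIMESTAMPTZ NOT NULL,  -- sale timestamp
        asset_id    BIGINT REFERENCES assets (id),
        seller_id   BIGINT REFERENCES accounts (id),
        buyer_id    BIGINT REFERENCES accounts (id),
        total_price NUMERIC                -- sale price
    );

    -- TimescaleDB: partition the sales table by time for fast time-series queries
    SELECT create_hypertable('nft_sales', 'time');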

Getting Started

To start your journey with the Timescale NFT Starter Kit, follow these steps:

git clone https://github.com/timescale/nft-starter-kit.git
cd nft-starter-kit

Setting Up Pre-built Superset Dashboards

This section is entirely Dockerized: completing the steps below spins up local TimescaleDB and Superset instances in containers, preloaded with over 500K NFT transactions.

Prerequisites

  • Docker and Docker Compose installed and running

Instructions

  1. Change into the pre-built dashboards folder and start the containers:

    cd pre-built-dashboards
    docker-compose up --build

  2. Access the dashboard at http://0.0.0.0:8088. Log in with:

    user: admin
    password: admin

  3. Open the Databases page in Superset to confirm the TimescaleDB connection is configured.
  4. Explore your NFT dashboards:
    • Collections dashboard
    • Assets dashboard

Running the Data Ingestion Script

To ingest data directly from the OpenSea API into your database, follow these instructions:

Prerequisites

  • Python 3 with virtualenv
  • A running TimescaleDB instance to ingest into (connection details go in config.py)

Instructions

  1. Navigate to the root folder:

    cd nft-starter-kit

  2. Create a new Python virtual environment and install the requirements:

    virtualenv env
    source env/bin/activate
    pip install -r requirements.txt

  3. Update the config.py file with your parameters, including your PostgreSQL connection details.
  4. Run the ingestion script:

    python opensea_ingest.py

  5. Monitor the process as it fetches transactions from OpenSea and ingests them into the database in batches.
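
Once the script has been running for a while, you can sanity-check that rows are landing in the database. A quick look from psql, assuming the nft_sales table and the column names sketched earlier:

    -- Latest ingested sales (total_price is an assumed column name)
    SELECT time, total_price
    FROM nft_sales
    ORDER BY time DESC
    LIMIT 5;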

Ingesting Sample Data

If you’re eager to get started without waiting for data, you can ingest a sample dataset. Here’s how:

Prerequisites

  • psql and a database with the kit's schema already created

Instructions

  1. Go to the folder containing the sample CSV files:

    cd pre-built-dashboards/database/data

  2. Connect to your database:

    psql -x "postgres://host:port/tsdb?sslmode=require"

  3. Import the CSV files in order (these are psql \copy meta-commands, run from within psql):

    \copy accounts FROM 001_accounts.csv CSV HEADER;
    \copy collections FROM 002_collections.csv CSV HEADER;
    \copy assets FROM 003_assets.csv CSV HEADER;
    \copy nft_sales FROM 004_nft_sales.csv CSV HEADER;

  4. Run a query to verify the import:

    SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
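
With the sample data loaded, you can go one step beyond verification and try a first analysis. Here is a sketch of a daily sales-volume query using TimescaleDB's time_bucket function (total_price is an assumed column name; adjust it to the kit's actual schema):

    -- Daily number of sales and average sale price
    SELECT time_bucket('1 day', time) AS day,
           count(*)                   AS sales,
           avg(total_price)           AS avg_price
    FROM nft_sales
    GROUP BY day
    ORDER BY day;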

Troubleshooting

If you encounter issues while setting up or running your scripts, try the following:

  • Ensure Docker and Docker Compose are installed and running properly.
  • Check that there are no other services using ports 8088 and 6543.
  • Verify your PostgreSQL connection details in the config.py file.
  • Monitor the logs for any error messages that might indicate where the issue lies.
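
If you suspect the database connection itself, a quick end-to-end test is a one-line psql check, reusing the placeholder connection string from the sample-data section:

    psql "postgres://host:port/tsdb?sslmode=require" -c "SELECT 1;"

If this returns a single row, your connection details are fine and the problem lies elsewhere.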

