If you’re diving into cryptocurrency development or looking to explore transaction data, Block Atlas is a clean explorer API and transaction observer. Although the repository is no longer maintained, you can still use it to build your own applications. This blog will guide you through the setup process, explaining how to configure and run Block Atlas effectively.
Understanding Block Atlas
Imagine you’re throwing a grand party, and you need to keep track of all the guests, their arrival times, and their interactions throughout the event. Block Atlas acts as the perfect party planner, watching over each transaction and notifying you of important activities. Instead of guest names, it handles cryptocurrency transaction data. By connecting to nodes and explorer APIs, it organizes the information into a neat JSON format, making it easy to work with.
Supported Coins
Block Atlas supports over 25 blockchains, including:
- Bitcoin
- Ethereum
- Binance Chain
- …and more!
You can find the complete feature matrix in the project’s README on GitHub.
Architecture Overview
The architecture of Block Atlas allows you to:
- Access information about transactions, tokens, staking details, and collectibles for supported coins.
- Subscribe to receive price notifications via RabbitMQ.
The Platform API runs independently of the other services and can be limited to specific blockchains such as Bitcoin or Ethereum. Transaction notifications flow from the parser through RabbitMQ to the notifier, so subscribers receive timely updates about activity on their addresses.
Setup Requirements
To get started with Block Atlas, you need the following prerequisites:
- Go Toolchain (version 1.14+)
- PostgreSQL for storing user subscriptions and block numbers
- RabbitMQ for passing subscriptions and sending transaction notifications
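If you don’t already have PostgreSQL and RabbitMQ running locally, one convenient option is to spin them up with Docker. The image tags, container names, password, and ports below are illustrative defaults, not values that Block Atlas requires:
# PostgreSQL, used for subscriptions and block numbers
docker run -d --name atlas-postgres -e POSTGRES_PASSWORD=postgres -p 5432:5432 postgres:12
# RabbitMQ, with the management UI exposed on port 15672
docker run -d --name atlas-rabbitmq -p 5672:5672 -p 15672:15672 rabbitmq:3-management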
Quick Start
Let’s get rolling with the setup! Follow these steps:
1. Download the Source Code
Open your terminal and run:
go get -u github.com/trustwallet/blockatlas
cd $(go env GOPATH)/src/github.com/trustwallet/blockatlas
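Note that go get -u into GOPATH is the older workflow; if your Go version complains about it, cloning the repository directly works just as well:
git clone https://github.com/trustwallet/blockatlas.git
cd blockatlas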
2. Build and Run the Services
You’ll need to build and start various components of Block Atlas:
go build -o api-bin cmd/api/main.go
./api-bin -p 8420 # Start Platform API server
go build -o parser-bin cmd/parser/main.go
./parser-bin # Start parser
go build -o notifier-bin cmd/notifier/main.go
./notifier-bin # Start notifier
go build -o subscriber-bin cmd/subscriber/main.go
./subscriber-bin # Start subscriber
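Once the Platform API is running, a quick smoke test with curl confirms it is listening on the port you chose. The Swagger UI path below is an assumption about how the API exposes its documentation; check your build if it returns a 404:
curl -i http://localhost:8420/swagger/index.html # expect an HTTP 200 if the API is up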
3. Alternatively, Use Make Commands
To build and start all services, you can simply use:
make go-build
make start
4. Using Docker
For those who prefer containerization, you can use Docker:
docker-compose build
docker-compose up # Start all services
docker-compose build api
docker-compose start api # Start individual service
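Whichever way you start the containers, the standard docker-compose commands are handy for confirming that everything came up cleanly:
docker-compose ps # list running services and their ports
docker-compose logs -f api # follow the API container's logs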
Configuration
By default, all services enable public RPC/explorer APIs for the supported coins, so you can start using Block Atlas right away without additional configuration. However, to run a service for only the blockchains you need, set the ATLAS_PLATFORM environment variable to the relevant platform(s):
ATLAS_PLATFORM=ethereum go run cmd/api/main.go
# For multiple platforms
ATLAS_PLATFORM=ethereum,binance,bitcoin go run cmd/api/main.go
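The same variable works with the binaries built in the Quick Start, so a Bitcoin-only API instance on the same port would look like this:
ATLAS_PLATFORM=bitcoin ./api-bin -p 8420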
Testing and Documentation
Before going live, it’s essential to test your setup. You can run:
make test # For unit tests
Mocked tests are also advised to verify API interactions without relying on external services.
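If you prefer to bypass the Makefile, the unit tests can also be run directly with the Go toolchain:
go test ./... # run the unit tests in every package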
Troubleshooting
Let’s address some common issues you might encounter:
- Service Not Starting: Ensure all prerequisites are installed correctly and that you’re using the right commands.
- Error in Configuration: Double-check the config.yml file for typos and make sure your environment variables are set properly.
- RabbitMQ Connection Issues: Verify that RabbitMQ is running and that the correct ports are open.
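A couple of standard client commands help narrow down the connection issues above; pg_isready ships with the PostgreSQL client tools and rabbitmqctl with RabbitMQ, so adjust hosts and ports to match your setup:
pg_isready -h localhost -p 5432 # check that PostgreSQL accepts connections
rabbitmqctl status # confirm the RabbitMQ node is running
rabbitmqctl list_queues # inspect the queues the services use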
For more insights, updates, or to collaborate on AI development projects, stay connected with fxis.ai.
Conclusion
At fxis.ai, we believe that such advancements are crucial for the future of AI, as they enable more comprehensive and effective solutions. Our team is continually exploring new methodologies to push the envelope in artificial intelligence, ensuring that our clients benefit from the latest technological innovations.