Subscan Q3/Q4 2024 Funding Proposal: Basic Service Fee & Development Costs for the Polkadot, Westend, and Rococo Ecosystems

> Due to potential display issues with some charts in markdown format across various devices and platforms, please refer to the detailed information in this Google Document:

https://docs.google.com/document/d/1qeOoJww2aXMzAcpCMyp5r_ETxcjSm1SFGCAEtNL2T9Y/edit?usp=sharing

------------

Subscan is a comprehensive and widely-used explorer and indexer within the Polkadot ecosystem. It provides developers, validators, parachain teams, and general users with critical data insights—enabling efficient on-chain operations, debugging, and decision-making. We have previously received funding from the Polkadot Treasury for infrastructure and development costs, ensuring that the platform remains stable, accessible, and continuously evolving.

Past proposals

Overview and Rationale

In earlier funding cycles, we relied on a reimbursement model closely tied to API usage and traffic distribution. This often resulted in cost volatility and estimation inaccuracies. To improve fairness, predictability, and transparency, we have refined our pricing model based on storage usage—an approach that correlates more reliably with total costs and has been successfully tested over the last year.

In addition to standard maintenance, Subscan invests heavily in continuous improvement. We detail our Q3 development efforts here, linking costs directly to feature enhancements and platform optimizations. By providing this level of transparency, we help the community understand how funds translate into tangible ecosystem benefits. Future development tasks for Q4 are ongoing; while their costs are not included here, we offer a brief roadmap to demonstrate what’s on the horizon.

Encouraging Community Participation
We strongly value community feedback. Stakeholders can suggest new features, raise issues, and influence development priorities via our GitHub Issue Tracker.

In the past, community feedback has led to UI improvements, enhanced token price display mechanisms, and more robust search functionalities. This iterative feedback loop ensures the ongoing alignment of our efforts with the community’s evolving needs.

Scope of This Proposal

1. Q3/Q4 Maintenance (14 Networks):

  • Polkadot and Parachains: Polkadot, Assethub-Polkadot (Statemint), Bridgehub-Polkadot, Collectives-Polkadot, Coretime-Polkadot, People-Polkadot
  • Westend and Parachains: Westend, Assethub-Westend, Bridgehub-Westend, Coretime-Westend
  • Rococo and Parachains (until Oct 18, 2024): Rococo, Assethub-Rococo (Rockmine), Bridgehub-Rococo, Coretime-Rococo
  • These networks benefit from high-quality data indexing, stable uptime, accurate block and transaction data, NFT indexing improvements, and reliable APIs that developers and validators depend on daily.

2. Q3 Development Costs:
This covers new feature development and improvements completed during Q3, including NFT module optimization, multi-address query APIs, login and account management enhancements, token pricing configurations, and performance and security audits. These changes have already been implemented and are improving the end-user experience, developer tooling, and platform reliability.

3. Future Work (Q4 Development Preview):
Although not included in this funding request, ongoing Q4 development tasks include improved cross-chain data retrieval, further performance optimizations, and enhanced data visualization features. We will submit these costs in Q1 2025. Transparency remains key: once Q4 tasks conclude, we will present their breakdown and associated costs.

Cost Optimization and Historical Metrics

Our previous proposal for 5 networks over 9 months totaled approximately USD 303,791.7, averaging USD 33,754.63 per month. Through targeted optimizations—improving indexing algorithms, refining caching strategies, and consolidating infrastructure—our monthly average costs have decreased by about 35.01% for the current cycle (Q3/Q4), landing at approximately USD 21,937.33 per month for Q3.
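(For reference, the reduction works out to USD 33,754.63 − USD 21,937.33 ≈ USD 11,817.30 per month, i.e., roughly 35.01%.)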

These optimizations have not come at the expense of performance. Over the past two quarters, we have:

  • Reduced average API response times by ~12% due to more efficient database queries and enhanced caching.
  • Decreased storage overhead by implementing automated data cleaning and improved indexing structures, ensuring sustained performance even as chain data grows.
  • Lowered bandwidth costs through better CDN (Cloudflare) and request-caching strategies, resulting in more stable page load times for end-users.

Benefits to Different User Groups

  • Developers: The enhanced APIs and improved indexing accuracy allow developers to more quickly access real-time and historical data, enabling them to build more reliable dApps and automation tools.
  • Validators & Parachain Teams: Up-to-date block information, node health monitoring, and clear notification pipelines help validators maintain stable operations and quickly address issues. Parachain teams receive timely insights into network performance, asset transfers, and governance activities, facilitating better decision-making.
  • End-Users & Traders: Subscan’s improved UI/UX, faster data retrieval, and more accurate token pricing empower end-users to confidently explore and evaluate chain data. Improved NFT module performance enhances the experience for users and creators alike, making the platform more accessible and engaging.

Service Fee Model & Pricing Methodology

Our maintenance packages are designed to address a variety of data needs, ensuring smooth and efficient operations. Given the complexity of our overall cost structure—which includes expenses for GCP, Cloudflare, Datadog, Onfinality, Dwellir, and Subscan’s team operations—we use storage as the primary metric to determine fees. Storage usage strongly correlates with total costs and fluctuates as new blocks are produced. To maintain fairness and accuracy, we use the median storage value measured on the 15th day of each quarter to establish the quarterly baseline fee.

Pricing Tiers:
a. Basic Plan

  • Up to 200 GB: $799/month
  • Beyond 200 GB: $5.3/GB
  • Automatically upgrades to the next tier after 350 GB

b. Advanced Plan

  • Up to 500 GB: $1,699/month
  • Beyond 500 GB: $5.2/GB
  • Automatically upgrades to the next tier after 750 GB

c. Professional Plan

  • Up to 1 TB: $2,999/month
  • Beyond 1 TB: $5/GB
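
To make the tier logic above concrete, here is a minimal fee-calculation sketch. It is illustrative only: the 1 TB = 1024 GB conversion, rounding to whole dollars, and selecting the plan via the auto-upgrade thresholds (350 GB / 750 GB) reflect our reading of the tiers rather than the exact billing implementation.

```python
def monthly_fee(storage_gb: float) -> int:
    """Approximate monthly service fee (USD) for a given median storage usage.

    Illustrative sketch of the pricing tiers above; assumes 1 TB = 1024 GB,
    plan selection via the auto-upgrade thresholds, and rounding to whole dollars.
    """
    if storage_gb <= 350:            # Basic Plan (auto-upgrades above 350 GB)
        base, included_gb, per_gb = 799, 200, 5.3
    elif storage_gb <= 750:          # Advanced Plan (auto-upgrades above 750 GB)
        base, included_gb, per_gb = 1699, 500, 5.2
    else:                            # Professional Plan
        base, included_gb, per_gb = 2999, 1024, 5.0
    overage_gb = max(0.0, storage_gb - included_gb)
    return round(base + overage_gb * per_gb)

# Spot checks against the fee table in the "Fee Details" section below:
assert monthly_fee(2569.03) == 10724  # Polkadot, Q3 (Professional with overage)
assert monthly_fee(242.68) == 1025    # Assethub-Polkadot, Q3 (Basic with overage)
assert monthly_fee(490.51) == 1699    # Westend, Q4 (Advanced, no overage)
```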

Monthly Service Fee Breakdown

Database Storage & Indexing

  • What It Covers: Continuous indexing of on-chain data—from blocks and transactions to event logs, governance info, and NFTs—so users can reliably access both real-time and historical records.
  • Why It Matters: As chain activity increases, so does data volume. By tying fees to storage needs, we accurately account for the costs of scaling infrastructure (e.g., additional database capacity).

Network Egress Bandwidth

  • What It Covers: The outgoing traffic required to serve user requests quickly and consistently, including API calls, UI data retrieval, and integrations with third-party tools.
  • Why It Matters: A stable, high-bandwidth connection ensures that explorers, wallets, and dApps receive timely data responses, even during peak activity.

Monitoring & DevOps

  • What It Covers: Round-the-clock health checks, internal alert pipelines, CI/CD automation, and system auditing. This includes running multiple nodes, maintaining synchronization, and ensuring high availability.
  • Why It Matters: These processes help detect issues early (e.g., node crashes, indexer slowdowns) and streamline updates or deployments, minimizing service disruption for the community.

Node Status Monitoring & Notifications

  • What It Covers: Dedicated tools and alert mechanisms to track the real-time status of network nodes—particularly those relating to the Polkadot, Westend, and Rococo relay chains and their associated parachains (Assethub, Coretime, Bridgehub, People).
  • Why It Matters: Quick notifications enable parachain teams, validators, and the wider ecosystem to react rapidly to performance changes, forks, or potential security incidents.

Technical Support & Troubleshooting

  • What It Covers: Responsive assistance for users, developers, and validators, including issue resolution, feature guidance, and integration help.
  • Why It Matters: Timely support keeps the ecosystem running smoothly and fosters a positive development environment—especially crucial for teams building dApps, explorers, and automation tools.

Why We Use Storage as a Pricing Metric

  • Correlation to Real Costs: Storage growth scales in tandem with block production and transaction activity, making it a more stable metric than raw API traffic.
  • Quarterly Baseline: By measuring the median storage on the 15th day of each quarter, we capture a fair snapshot of usage to determine the appropriate fee tier.
  • Predictable Billing: This approach reduces wild fluctuations tied to sudden traffic spikes and provides a more transparent cost basis for the community and treasury.

Beyond Storage: The Full Cost Landscape

Our pricing also factors in overhead for hosting and operational services such as GCP, Cloudflare, Datadog, Onfinality, and Dwellir, as well as Subscan’s own team efforts in maintenance, development, and user support. By pegging fees to storage usage, we ensure a straightforward, usage-based cost structure while continuing to deliver reliable, feature-rich explorer services to the Polkadot, Westend, and Rococo ecosystems.

Fee Details

Q3/Q4 Maintenance Fees

  • Total Maintenance Fees for Q3 & Q4 (14 networks): USD 131,624

| Network | Actual Usage (GB) | Measurement Date | Package | Fees/Month (USD) | Billing Period | Fees (USD) |
| --- | --- | --- | --- | --- | --- | --- |
| Polkadot | 2569.03 | 15/08/2024 | Professional | 10,724 | Q3: 01/07/2024-30/09/2024 | 32,172 |
| Polkadot | 2752.98 | 15/11/2024 | Professional | 11,644 | Q4: 01/10/2024-31/12/2024 | 34,932 |
| Assethub-Polkadot (Statemint) | 242.68 | 15/08/2024 | Basic+ | 1,025 | Q3: 01/07/2024-30/09/2024 | 3,076 |
| Assethub-Polkadot (Statemint) | 268.53 | 15/11/2024 | Basic+ | 1,162 | Q4: 01/10/2024-31/12/2024 | 3,487 |
| Bridgehub-Polkadot | 112 | 15/08/2024 | Basic | 799 | Q3: 01/07/2024-30/09/2024 | 2,397 |
| Bridgehub-Polkadot | 139.33 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Collectives-Polkadot | 132.19 | 15/08/2024 | Basic | 799 | Q3: 01/07/2024-30/09/2024 | 2,397 |
| Collectives-Polkadot | 148.9 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| People-Polkadot | 3.51 | 01/09/2024 | Basic | 799 | Q3: 01/08/2024-30/09/2024 | 1,598 |
| People-Polkadot | 9.94 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Coretime-Polkadot | 6.91 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Westend | 276.57 | 15/08/2024 | Basic+ | 1,205 | Q3: 01/07/2024-30/09/2024 | 3,614 |
| Westend | 490.51 | 15/11/2024 | Advanced | 1,699 | Q4: 01/10/2024-31/12/2024 | 5,097 |
| Assethub-Westend | 185.99 | 15/08/2024 | Basic | 799 | Q3: 01/07/2024-30/09/2024 | 2,397 |
| Assethub-Westend | 159.7 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Bridgehub-Westend | 100.9 | 15/08/2024 | Basic | 799 | Q3: 01/07/2024-30/09/2024 | 2,397 |
| Bridgehub-Westend | 119.3 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Coretime-Westend | 35.51 | 15/08/2024 | Basic | 799 | Q3: 01/07/2024-30/09/2024 | 2,397 |
| Coretime-Westend | 67.54 | 15/11/2024 | Basic | 799 | Q4: 01/10/2024-31/12/2024 | 2,397 |
| Rococo | 1009.25 | 15/02/2024 | Professional | 2,999 | 01/07/2024-18/10/2024 (3.5 months) | 10,496 |
| Assethub-Rococo (Rockmine) | 138.56 | 15/02/2024 | Basic | 799 | 01/07/2024-18/10/2024 (3.5 months) | 2,796 |
| Bridgehub-Rococo | 93.93 | 01/05/2024 | Basic | 799 | 01/07/2024-18/10/2024 (3.5 months) | 2,796 |
| Coretime-Rococo | 42.54 | 01/05/2024 | Basic | 799 | 01/07/2024-18/10/2024 (3.5 months) | 2,796 |

Here “Basic+” indicates the Basic Plan with the per-GB overage applied above 200 GB.

Q3 New Features Development Costs

Below are the new features implemented during Q3. Additionally, we have dedicated custom development efforts underway for Coretime and Bridgehub. Since these features have not yet reached their final milestones, the related tasks are not included in this request. We will submit a separate reimbursement proposal once the corresponding milestones are completed.

  • Completed Feature & Performance Improvements: USD 34,630 (466 hours of work with transparent hourly rates)
  • These improvements directly translate to tangible platform enhancements, already deployed and benefiting users today.

| Task | Task Name | Description | Product Manager/Test (hours) | Designer (hours) | Backend Developer (hours) | Frontend Developer (hours) | DevOps (hours) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Task 1 | NFT Module Expansion | Optimize the indexing and thumbnail loading logic of NFTs to improve image loading speed and reduce bandwidth consumption. By improving image compression algorithms and caching strategies, ensure that users have a smoother experience when browsing NFTs, while reducing server load and enhancing overall system performance. | 4 | - | 26 | 10 | - |
| Task 2 | Multi-Address Query Aggregation API | Develop a new API that allows users to query asset, transaction, and activity information for multiple addresses at once. Optimize API performance and response speed to ensure stability under large-scale queries. Update developer documentation with detailed usage examples and parameter descriptions to facilitate third-party developer integration. | 4 | - | 30 | - | - |
| Task 3 | Platform Login and Account Management Optimization | Enhance account management features, allowing users to customize personal settings such as time zone, list display count, and fiat currency units. Strengthen account security by adding multi-factor authentication options. Optimize the free API Key acquisition process to make it easier for new users to start using our services. | 4 | 4 | 20 | 8 | - |
| Task 4 | TokenCurrencyAmountTooltip Component Development | Develop a new frontend component that displays detailed information such as token name, current exchange rate, and value conversion when hovering over asset amounts. Ensure the component is universal, configurable, and reusable across multiple pages and scenarios. Perform cross-browser and multi-device testing to ensure compatibility and responsiveness. | 4 | 4 | 5 | 8 | - |
| Task 5 | System Security Enhancement | Implement strict real-time and post-event security audits on requests in gateways and backend services to detect and prevent potential security risks, ensuring system security and stability. Introduce detailed log recording as a basis for subsequent problem diagnosis, performance analysis, and security audits, allowing developers and operations personnel to clearly understand the system's operating status and request handling. | 4 | - | 6 | 4 | 4 |
| Task 6 | Account Snapshot Function | Develop an account snapshot function to periodically record users' asset status, supporting historical data queries and comparisons. Optimize data storage methods and indexing structures to ensure high query performance even with large amounts of snapshot data. Provide intuitive frontend snapshot browsing and comparison tools to help users clearly understand changes in their account assets. | 4 | 4 | 10 | 8 | 2 |
| Task 7 | Account Search Function Field Expansion | Expand the account search function to support searching accounts by On-chain Identity. Optimize search algorithms to improve the efficiency and accuracy of new field searches, meeting users' diverse search needs. Improve the display of search results to provide richer account information. | 4 | - | 8 | 4 | - |
| Task 8 | Circulation Chart Optimization | Adjust the data classification and display of the "Other" category in the circulation pie chart to improve data readability and accuracy. Add interactive features to the chart, such as detailed information on mouse hover, data filtering, and dynamic updates, providing a better user experience. | 4 | 4 | 2 | 8 | - |
| Task 9 | Multi-Chain Token Pricing Configuration | Configure pricing logic for XCM and cross-chain assets to support price display for more tokens. Integrate multiple third-party price sources to ensure the accuracy and timeliness of price data. Provide interfaces that allow token owners or project teams to submit token information and price data via GitHub, enabling community co-construction. | 2 | - | 3 | 4 | 2 |
| Task 10 | Cross-Chain Data Time Filtering Logic Fix | Fix the logic error in the time filter for cross-chain transfer data, ensuring that users can accurately filter and query data within a specified time range. Add unit tests and integration tests to prevent similar issues from recurring and improve system reliability. | - | - | 8 | 4 | - |
| Task 11 | Fix Slow NFT Pagination Loading Issue | Optimize backend query performance by improving database indexes and query statements to reduce data retrieval time. Adjust frontend loading logic with lazy loading and asynchronous loading to reduce initial load time, improve page response speed, and provide a smoother browsing experience. | - | 2 | 8 | 6 | 2 |
| Task 12 | IPFS Gateway Update | Update the public IPFS gateway address to resolve resource loading failures and slow access. Add support for multiple IPFS gateways, automatically selecting the fastest gateway for resource loading to improve data availability and stability. | - | - | 4 | 4 | 1 |
| Task 13 | Optimize Voting Module Filter Function | Fix display and functionality issues of the "Origins", "Action", "Proposed by", and "Status" filters in the voting module. Improve filter response speed and frontend interaction, making it easier for users to find and filter voting information. | 2 | - | 4 | 6 | - |
| Task 14 | Fix Balance Loading Delay Issue | Correct the issue where account balances are not updated promptly after unlocking or changes by optimizing the data refresh mechanism. Add real-time detection to ensure users see the latest balances, improving system reliability and user trust. | - | - | 8 | 4 | - |
| Task 15 | Fix XCM Multi-Currency Price Parsing Error | Fix price parsing errors for XCM cross-chain assets by updating the price retrieval and calculation logic. Ensure prices of different currencies are displayed accurately, supporting more cross-chain assets and strengthening user confidence in cross-chain transactions. | - | - | 10 | 4 | - |
| Task 16 | Account Snapshot Index Reconstruction | Reconstruct the Elasticsearch indexing logic of account snapshots, optimizing data structures to reduce index size and improve query performance. Update index creation and update strategies to reduce system load and enhance data processing efficiency. | - | - | 20 | - | 4 |
| Task 17 | Data Cleaning Mechanism Optimization | Implement an automated data cleaning mechanism to regularly delete expired or invalid historical data, freeing up storage space. Add monitoring and logging for cleaning tasks to ensure the safety and traceability of the cleaning process and prevent accidental deletion of important data. | - | - | 14 | - | 1 |
| Task 18 | Migration Task Optimization | Optimize large-scale data migration by adopting batch and parallel processing to accelerate migration speed. Add failure retry mechanisms to ensure data integrity and consistency, reducing manual intervention. | - | - | 14 | - | 2 |
| Task 19 | Performance Monitoring Metrics Enhancement | Add system performance monitoring metrics, including request response time, error rate, and resource usage. Implement real-time monitoring and alerting to promptly discover and handle performance issues, ensuring system stability. Improve the monitoring dashboard to provide intuitive data display and analysis tools. | - | 2 | 8 | - | 1 |
| Task 20 | Asset Page Pagination Optimization | Optimize the pagination loading logic of the asset detail page to reduce server load and improve page loading speed. Coordinate frontend and backend pagination to reduce data transfer volume and enhance user experience. | - | 2 | 10 | 4 | - |
| Task 21 | Voting Module Code Refactoring | Refactor the voting module code, optimizing data structures and algorithms to improve efficiency. Enhance code readability and maintainability, laying the foundation for subsequent feature expansion. Add code comments and documentation to facilitate team collaboration. | - | - | 8 | 8 | - |
| Task 22 | System Resource Optimization | Optimize the system's caching mechanism to reduce disk and memory usage. Optimize database connection pools and query strategies to reduce resource consumption and improve stability and scalability. | - | - | 14 | - | 1 |
| Task 23 | Error Log Recording Optimization | Standardize the format and content of error logs by adding error codes, timestamps, and context information. Implement log classification and filtering to help developers quickly locate and resolve issues. Configure log alerting to promptly notify the relevant personnel of severe errors. | - | - | 6 | 4 | 2 |
| Task 24 | API Performance Optimization | Optimize API response speed and reduce data processing time. Apply targeted optimizations to frequently used interfaces, including improved query statements, additional indexes, and caching. Perform stress testing to ensure stability under high-concurrency scenarios. | - | - | 18 | - | 2 |
| Task 25 | API Documentation Update | Update the Subscan Open API documentation with explanations and usage examples for new interfaces. Improve the documentation structure and format for readability and usability. Collect developer feedback to continuously improve the content, helping third parties integrate our API. | 4 | 2 | 2 | - | - |
| Task 26 | People Polkadot Chain Support | Add explorer support for the People-Polkadot chain. | - | - | 4 | 2 | - |
| Task 27 | Page High-Concurrency Request Caching and Optimization | Cache requests with differentiated cache durations configured per chain. Intercept identical high-concurrency requests and return unified results to reduce duplicate requests. This relieves server pressure, speeds up page loading, and improves user experience. | - | - | 8 | - | - |

Our Compensation Rates and Total Costs:

  • DevOps Engineer: $75/hour
  • Front-End Engineer: $75/hour
  • Back-End Engineer: $75/hour
  • Product Manager/Test: $70/hour
  • Designer: $70/hour

With a total of 466 hours worked, the calculated cost based on the above rates amounts to $34,630.
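For reference, summing the hours in the task table gives 64 hours of Product Manager/Test and Designer work and 402 hours of backend, frontend, and DevOps engineering work: 64 × $70 + 402 × $75 = $4,480 + $30,150 = $34,630.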

Advertising Revenue & Offset

From August 6 to December 18, 2024, we tested advertising as a revenue source, generating USD 2,746.12. We will deduct this from our total requested amount to reduce the Treasury’s burden.

Ads remain optional and user-controlled, and based on community feedback, we may refine or discontinue this approach. Users who opt in may receive future benefits or premium features.

Risk Assessment & Mitigation

  • Rising Storage Costs: We employ automated data cleaning and indexing optimizations to prevent runaway growth.
  • Security & Reliability: Regular audits, real-time monitoring, and DevOps best practices minimize downtime and security threats. We remain responsive to any chain or protocol updates, ensuring compatibility and smooth operation.

Long-Term Sustainability & Vision
Our long-term goal is to reduce reliance on treasury funding through multiple channels:

  • Continuous optimization to limit infrastructure costs as chain data grows.
  • Exploring responsible revenue streams (e.g., optional premium features or partnerships).
  • Actively incorporating community suggestions to ensure value delivery aligns with user needs, increasing platform utility and potential monetization avenues without compromising openness or neutrality.

Total Requested Amount

  • Q3 & Q4 Operations: USD 131,624
  • Q3 Development: USD 34,630
  • Less Advertising Revenue: -USD 2,746.12
  • Total: USD 163,507.88 (to be paid in USDT)

We request that this payment be made in USDT, for a total of 163,507.88 USDT.

Conclusion
This proposal reflects our commitment to delivering stable, high-quality infrastructure and valuable features to the Polkadot ecosystem. Through cost optimization, transparent reporting, continuous improvement, and a clear roadmap for future development, we strive to maintain the community’s trust and support. Our approach balances technical rigor, community engagement, and long-term sustainability—ensuring Subscan remains a cornerstone for the Polkadot ecosystem’s data needs.

A heartfelt thank-you to every one of our community members for the time, patience, and continuous feedback you’ve given. With your support, we can keep improving and make our community an even better place—together!
