Problem
Multiple validators report inconsistent application of DN rules (e.g., “discouraged/unique” locations and contributions), a vague “holistic approach,” perceived favoritism, and limited, ad-hoc feedback. Some also question validator identity continuity (new wallets) and the practical meaning of operator financial independence.
Because DN is a Web3 Foundation–run program (not protocol/OpenGov), a Wish-For-Change (WFC) referendum is the right tool to register community expectations without an executable call.
Rationale
DN’s stated goals include diversification (location/provider/hardware) and strong operational performance.
Clear, measurable selection criteria and transparent committee actions directly support these goals and build trust.
The WFC track exists precisely to capture on-chain community sentiment via a system.remark - signaling desired changes without altering state or occupying the Root track.
Committee Accountability & Governance Responsibility
The DN Committee is composed of Web3 Foundation employees. This is a critical distinction: their actions and decisions are not merely individual but institutional.
When a paid committee manages access to DN slots - which provide real economic opportunities - the community has a legitimate expectation of accountability, transparency, and procedural fairness.
If irregularities or rule violations are identified in the DN selection process, these incidents should not be dismissed as personal mistakes.
They should be treated as governance failures that require:
* Transparent and consistent rule application
* Clear decision accountability and traceability
* An appeals and review mechanism with public reporting
* Oversight beyond closed internal processes
Trust in DN cannot rely on closed-door decisions made by a small group of employees. For a program funded and operated under the Web3 Foundation, accountability to the community is not optional - it is a fundamental requirement.
Deliverables
1 - Publish measurable selection criteria & methodology
* A public “criteria & examples” guide showing how discouraged/unique locations, independence, uptime/telemetry, and contributions are evaluated (see the rubric sketch after this list).
* After each cohort: a short template summary of how criteria were applied.
* Provide cohort applicants with access to a comparative table of submitted applications (excluding personal data) during the selection process, ensuring the data is verifiable and trustworthy.
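To make “measurable” concrete, below is a minimal sketch (TypeScript) of what a published per-applicant scoring record could look like. All criterion names, weights, and scales here are hypothetical illustrations, not the committee’s actual rubric.

```ts
// Hypothetical scoring record for one applicant. Criterion names, weights,
// and scales are illustrative only - the actual rubric is for the committee
// to define and publish.
interface CriterionScore {
  criterion: 'location' | 'provider' | 'hardware' | 'uptime' | 'contributions' | 'independence';
  weight: number; // relative weight in [0, 1]; weights sum to 1 across criteria
  score: number;  // normalized 0-100
  note: string;   // short, non-private justification, e.g. "unique region bonus"
}

interface ApplicantEvaluation {
  applicantId: string; // pseudonymous ID, not a legal identity
  cohort: string;      // e.g. "cohort-3"
  scores: CriterionScore[];
}

// Weighted total - with records like these published, anyone can reproduce the ranking.
function totalScore(e: ApplicantEvaluation): number {
  return e.scores.reduce((sum, s) => sum + s.weight * s.score, 0);
}
```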
2 - Committee transparency & conflicts policy
* Disclose the composition of the selection body, at minimum to KYC/KYB’d applicants (under NDA if needed).
* Publish a brief conflicts-of-interest handling process.
3 - Unified policy on operator financial independence
* Clarify how transfers/loans/gifts between DN participants (including via DAOs) are treated; provide compliant vs. non-compliant examples and verification steps.
4 - Validator identity continuity
* If stash addresses change, require explicit linkage to the prior identity and track record (on-chain identity/sub-identities plus relevant off-chain fields); a minimal on-chain check is sketched below.
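One possible verification mechanism is the identity pallet’s sub-identity feature: accept a new stash as a continuation of an old one only if both are registered as sub-identities of the same parent account. The sketch below uses @polkadot/api; the People-chain endpoint and the “same super-identity” rule are assumptions of this sketch, not current DN policy.

```ts
import { ApiPromise, WsProvider } from '@polkadot/api';

// Sketch: treat a new stash as a continuation of a prior one only if both
// are sub-identities of the same parent account. This rule is a proposal,
// not existing DN policy; the endpoint is one public People-chain RPC.
async function sharesSuperIdentity(newStash: string, priorStash: string): Promise<boolean> {
  const api = await ApiPromise.create({
    provider: new WsProvider('wss://polkadot-people-rpc.polkadot.io'),
  });
  try {
    // identity.superOf(account) -> Option<(parent AccountId, sub-name)>
    const [a, b] = await Promise.all([
      api.query.identity.superOf(newStash),
      api.query.identity.superOf(priorStash),
    ]);
    if (a.isNone || b.isNone) return false;
    return a.unwrap()[0].eq(b.unwrap()[0]);
  } finally {
    await api.disconnect();
  }
}
```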
5 - Reasonable cap on per-operator DN slots per cohort
* Exclude operators who already run active self-funded validators from receiving additional DN slots, prioritizing new and smaller operators.
* DN slots should be reserved primarily for participants who do not already have active validators.
* Any exceptions must be clearly justified and documented, and the final DN slot distribution per operator must be published after each selection round to ensure full transparency and fairness.
6 - Structured appeals & feedback
* A standard form, a scope that avoids leaking private data, and an SLA for responses (e.g., 14 days).
7 - Cohort 3 spot review once rules are public
* Reassess disputed cases under the clarified framework and publish a short post-mortem.
8 - Per-cohort public metrics (aggregate, non-private)
* Regional/provider/hardware diversity and per-operator slot counts (a minimal aggregation sketch follows below).
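A minimal sketch of the aggregation behind such a report, assuming each awarded slot is published as a record with operator, region, and provider fields (the field names are hypothetical):

```ts
// Hypothetical per-slot record published after a cohort.
interface SlotRecord {
  operatorId: string; // pseudonymous operator ID
  region: string;     // e.g. "eu-central"
  provider: string;   // e.g. "Hetzner", "self-hosted"
}

// Count slots per key; reused for per-operator, per-region, and per-provider tallies.
function tally(records: SlotRecord[], key: keyof SlotRecord): Map<string, number> {
  const counts = new Map<string, number>();
  for (const r of records) counts.set(r[key], (counts.get(r[key]) ?? 0) + 1);
  return counts;
}

// Usage: tally(cohortSlots, 'operatorId') exposes any operator holding many slots.
```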
9 - Comparative application table (read-only access during cohort selection)
* Provide applicants with view-only access to an anonymized comparative table of key parameters (e.g., location, provider type, hardware, contribution metrics, uptime, status of discouraged/unique factors).
* All personal or sensitive applicant data must be excluded.
* This will increase transparency, help participants understand the competitive landscape, and reduce perceptions of favoritism; a hypothetical schema sketch follows below.
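As a sketch of how personal data could be excluded by construction, the anonymized row can be derived from the full application at the type level. All field names below are illustrative, not the actual DN application schema.

```ts
// Full application as the committee sees it. Personal fields are illustrative.
interface Application {
  legalName: string; // personal - never published
  email: string;     // personal - never published
  applicantId: string; // pseudonymous ID shown to other applicants
  location: string;
  providerType: 'cloud' | 'baremetal' | 'self-hosted';
  hardware: string;
  uptimePct: number;
  discouragedLocation: boolean;
  uniqueLocation: boolean;
}

// The comparative row type contains no personal fields by construction,
// so the compiler rejects any attempt to leak them into the shared table.
type ComparativeRow = Omit<Application, 'legalName' | 'email'>;

function toComparativeRow({ legalName, email, ...row }: Application): ComparativeRow {
  return row; // destructuring strips the personal fields
}
```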
Timeline
* Within 30 days of WFC passing: Publish updated rules/criteria, examples guide, and committee/conflict policy.
* Within 45 days: Launch standardized appeals workflow with public SLA and response template.
* Each cohort: Publish aggregate diversity & slot-distribution metrics.
* Within 60 days: Brief report on any Cohort 3 re-evaluations using the new framework.
Links / References
DN site - Processes, Selection Criteria & Rules (program goals & criteria)
DN Terms & Conditions (operator obligations/independence language)
DN Cohort 1 (announcement/results reference)
DN Cohort 2 (announcement/results reference)
DN Cohort 2.1 (announcement/results reference)
DN Cohort 3 (announcement/results reference)