# PQC Integration — KPI Dashboard

**Proposal:** €2.8M HORIZON-CL3-2025-CS-ECCC-06
**Version:** 1.0
**Date:** 2025-11-06
**Owner:** VaultMesh Technologies B.V. (Coordinator)

---

## Overview

This dashboard defines **quantitative** and **qualitative** Key Performance Indicators (KPIs) aligned with the Horizon Europe evaluation criteria (Excellence 30%, Impact 30%, Implementation 40%).

**Measurement approach:**

- **Baseline:** Current state (TRL 4, existing VaultMesh node)
- **Target:** End of project (M24, TRL 6)
- **Verification:** How we prove the target was achieved
- **Frequency:** How often we measure during the project

---

## Excellence KPIs (Technical Innovation & Methodology)

### E1: Technology Readiness Level (TRL) Progression

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **TRL Level** | 4 (Lab validation) | 6 (Pilot validation) | Independent TRL audit by external evaluator | M12, M24 |
| **PQC Algorithms Integrated** | 0 | 3 (Kyber, Dilithium, SPHINCS+) | Code repository tags + unit test coverage | Monthly |
| **Receipt Throughput** | 1,000 receipts/day | 10,000 receipts/day | Benchmark tests (D2.2) | Quarterly |
| **Merkle Tree Depth** | 5 levels (36 manifests) | 8 levels (256 manifests) | Compaction efficiency metrics | Monthly |

**Success Criteria:** TRL 6 is achieved if ≥2/3 pilot sites validate the system in an operational environment.
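The Merkle targets in E1 (batch depth, receipt throughput) and the per-receipt proof verification measured under I1 below rest on standard Merkle batching. The following is a minimal sketch of that mechanism, assuming SHA-256 and a plain binary tree; the function names are illustrative placeholders, not the VaultMesh sealer API.

```python
# Illustrative only: a minimal binary Merkle batch over receipt hashes.
# Assumes SHA-256; names and layout are hypothetical, not the VaultMesh sealer API.
import hashlib


def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()


def merkle_root(leaves: list[bytes]) -> bytes:
    """Fold leaf hashes pairwise until a single root remains."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:  # duplicate the last node on odd-sized levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]


def merkle_proof(leaves: list[bytes], index: int) -> list[tuple[bytes, bool]]:
    """Collect sibling hashes (with a 'sibling is on the left' flag) from leaf to root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sibling = index ^ 1
        proof.append((level[sibling], sibling < index))
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof


def verify(leaf: bytes, proof: list[tuple[bytes, bool]], root: bytes) -> bool:
    """Recompute the root from one leaf and its proof: O(log n) hash operations."""
    node = h(leaf)
    for sibling, sibling_is_left in proof:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root


# 256 receipt manifests -> 8 hashing levels, matching the M24 compaction target.
receipts = [f"receipt-{i}".encode() for i in range(256)]
root = merkle_root(receipts)
assert verify(receipts[42], merkle_proof(receipts, 42), root)
```

Verification recomputes only ⌈log₂ n⌉ hashes (8 for a 256-manifest batch), which is why the sub-5-second per-receipt verification target under I1 is conservative.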
---

### E2: Scientific Publications & Dissemination

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Peer-Reviewed Publications** | 0 | 10+ (top-tier venues: IEEE S&P, ACM CCS, USENIX Security) | DOI links in D5.3 | M12: 3, M18: 7, M24: 10+ |
| **Conference Presentations** | 0 | 5+ (invited talks at ETSI TC CYBER, IETF CFRG) | Presentation slides + recordings | Quarterly |
| **Technical Reports** | 0 | 3 (D2.3, D3.3, D4.3) | Submitted to EU Open Research Repository | Per deliverable |
| **Open-Source Contributions** | 1 repo (vaultmesh-core) | 5+ repos (sealer, verifier, psi-field, router, pilots) | GitHub stars (target: 500+), forks (target: 50+) | Monthly |

**Success Criteria:** ≥8 publications in top-tier venues (h-index ≥30) by M24.

---

### E3: Standards Contributions

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Standards Drafts Submitted** | 0 | 5+ (ETSI, IETF, ISO/IEC) | Draft IDs + submission confirmations (D5.2) | M18: 2, M24: 5+ |
| **Working Group Participation** | 0 | 3+ (ETSI TC CYBER, IETF CFRG, ISO/IEC JTC 1/SC 27) | Meeting attendance records | Quarterly |
| **Reference Implementation Adoption** | 0 | 3+ organizations test the VaultMesh PQC sealer | Community feedback + GitHub issues | M18, M24 |

**Success Criteria:** ≥3 standards drafts accepted for working group review by M24.

---

## Impact KPIs (Societal & Economic Value)

### I1: Compliance Cost Reduction

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Audit Hours Saved per Incident** | 0% (no baseline) | 30% reduction vs. manual audit | Pilot benchmarks (D5.1): time to verify receipt chain vs. manual log review | Pilot phase (M12-M24) |
| **Receipt Verification Time** | N/A | <5 seconds per receipt (Merkle proof) | Performance benchmarks (D2.2) | Quarterly |
| **Cost per Receipt (€)** | €0 (no TSA/blockchain yet) | <€0.01 per receipt (batched anchoring) | Monthly TSA/blockchain invoices | Monthly |
| **Audit Trail Completeness** | 85% (current VaultMesh node) | 99%+ (LAWCHAIN + TSA anchoring) | Pilot assessments (D5.1) | Pilot phase |

**Success Criteria:** ≥2/3 pilot sites report ≥25% audit cost reduction vs. their current systems.
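The cost-per-receipt target above relies on amortizing a fixed anchoring fee over a whole Merkle batch: one TSA timestamp and/or blockchain transaction covers the batch root rather than each individual receipt. The sketch below illustrates the arithmetic; the €5.00 fee is a hypothetical placeholder, not a quote from any TSA or blockchain provider.

```python
# Illustrative amortization only; the anchoring fee below is a hypothetical
# placeholder, not a quote from any TSA or blockchain provider.
def cost_per_receipt(anchor_fee_eur: float, receipts_per_batch: int) -> float:
    """Batched anchoring: one external anchor covers an entire Merkle batch."""
    return anchor_fee_eur / receipts_per_batch


# A hypothetical €5.00 combined anchoring fee amortized over a daily batch of
# 10,000 receipts (the M24 throughput target) stays well under €0.01 per receipt.
print(f"{cost_per_receipt(5.00, 10_000):.4f} EUR/receipt")  # 0.0005 EUR/receipt
```

Even a substantially higher anchoring fee stays within the target as long as anchoring remains batched, which is what the monthly invoice review verifies.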
---

### I2: Incident Response Improvement

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Incident Detection Time** | N/A (no Ψ-Field yet) | 50% faster vs. manual monitoring | Pilot logs (D5.1): time from anomaly to alert | Pilot phase |
| **False Positive Rate** | N/A | <10% (Ψ-Field tuned thresholds) | Pilot feedback + precision/recall metrics | Monthly (pilot phase) |
| **Forensic Query Speed** | N/A | <10 seconds (LAWCHAIN indexed queries) | Benchmarks (D4.2) | Quarterly |

**Success Criteria:** ≥1/3 pilot sites demonstrate ≥40% faster incident detection with a <15% false positive rate.

---

### I3: Adoption & Dissemination

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Open-Source Downloads** | ~100/month (current vaultmesh-core) | 500+ post-M24 (cumulative over 6 months post-project) | GitHub Insights, Docker Hub pulls | Monthly |
| **Pilot Participants** | 0 | 15+ peers (5 per pilot site: France, Czechia, Greece) | Pilot deployment reports (D5.1) | M12: 5, M18: 10, M24: 15+ |
| **Training Workshops** | 0 | 3+ (1 per pilot region) | Attendance lists + materials published | M15, M18, M21 |
| **Media Coverage** | 0 | 5+ articles (tech press, cybersecurity blogs) | Links collected in D5.3 | M12: 1, M18: 3, M24: 5+ |

**Success Criteria:** ≥400 downloads and ≥12 pilot peers by M24.

---

### I4: Sovereignty Enhancement

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Cross-Border Federation Nodes** | 0 | 15+ (across 3 countries) | Federation testbed logs (D4.2) | M12: 5, M18: 10, M24: 15+ |
| **Sovereign Data Exchange (no third-party cloud)** | 0% | 100% (mTLS peer-to-peer) | Architecture review (D1.2) + pilot deployments | Pilot phase |
| **GDPR Compliance** | Partial (current node) | Full (GDPR Art. 5(1)(f), Art. 25 compliance) | Legal review + ethics assessment (D5.3) | M10, M24 |

**Success Criteria:** ≥12 federation nodes operational with 100% peer-to-peer exchange (no third-party intermediaries).

---

## Implementation KPIs (Management & Execution)

### IM1: Deliverable Completion

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Deliverables Submitted On-Time** | N/A | 100% (13/13 deliverables by deadline) | EU portal submission confirmations | Per deliverable |
| **Deliverable Quality (EU Review)** | N/A | Average ≥4/5 stars (if EU provides feedback) | EU reviewer comments | M12, M24 |
| **Public Deliverables** | N/A | 9/13 deliverables (DMP, reports, standards drafts) | Open access repository | Per deliverable |

**Deliverable List (13 total):**

- **WP1:** D1.1 (M3), D1.2 (M6)
- **WP2:** D2.1 (M8), D2.2 (M11), D2.3 (M14)
- **WP3:** D3.1 (M10), D3.2 (M14), D3.3 (M16)
- **WP4:** D4.1 (M12), D4.2 (M16), D4.3 (M18)
- **WP5:** D5.1 (M20), D5.2 (M22), D5.3 (M24)

**Success Criteria:** ≥12/13 deliverables on-time, ≥8/9 public deliverables accessible via Open Access.

---

### IM2: Budget & Resource Management

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Budget Burn Rate** | 0% | Linear burn (±10% variance per quarter) | Financial reports to EU | Quarterly |
| **Person-Months Allocated** | 0 PM | 104 PM total (VaultMesh: 44, Brno: 24, Cyber Trust: 30, France: 18) | Timesheet reports | Monthly |
| **Contingency Budget Used** | 0% | <50% (€140K of €280K contingency) | Steering committee approvals | Monthly |
| **Cost Overruns** | N/A | 0 WPs exceed budget by >15% | Partner financial statements | Quarterly |

**Success Criteria:** ≤10% variance from planned budget per WP, <50% contingency used. (An illustrative burn-rate check appears after IM4 below.)

---

### IM3: Consortium Coordination

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **Steering Committee Meetings** | 0 | 24+ (monthly for 24 months) | Meeting minutes | Monthly |
| **Partner Attendance Rate** | N/A | ≥90% (all 4 partners attend ≥22/24 meetings) | Attendance logs | Monthly |
| **Conflict Resolution Time** | N/A | <2 weeks (escalations resolved within 2 weeks) | Conflict log (internal) | As needed |
| **Knowledge Transfer Events** | 0 | 6+ (workshops, joint debugging sessions) | Event reports | Quarterly |

**Success Criteria:** ≥90% attendance, no unresolved conflicts lasting >1 month.

---

### IM4: Risk Mitigation Effectiveness

| Metric | Baseline (M0) | Target (M24) | Verification Method | Measurement Frequency |
|--------|---------------|--------------|---------------------|----------------------|
| **High Risks (Score ≥6)** | 0 | 0 (no critical blockers by M24) | Risk register updates | Monthly |
| **Risks Closed** | 0 | ≥5/15 risks closed as mitigated/irrelevant | Risk register | Quarterly |
| **Risks Escalated to EU** | N/A | 0 (all handled internally) | EU correspondence | As needed |

**Success Criteria:** No high-risk items at M24, ≥5 risks closed, 0 escalations to EU.
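The IM2 burn-rate target (linear burn with ±10% quarterly variance) can be checked mechanically from each quarterly financial report. The sketch below illustrates that check, assuming the €2.8M budget is planned linearly over the 8 project quarters; the quarterly spend figure and names are hypothetical.

```python
# Illustrative burn-rate check only; the quarterly spend figure is hypothetical,
# and a linear plan over 8 quarters (M1-M24) is an assumption of this sketch.
TOTAL_BUDGET_EUR = 2_800_000
QUARTERS = 8
PLANNED_PER_QUARTER = TOTAL_BUDGET_EUR / QUARTERS  # €350,000


def quarterly_variance(actual_spend_eur: float) -> float:
    """Relative deviation of one quarter's spend from the linear plan."""
    return (actual_spend_eur - PLANNED_PER_QUARTER) / PLANNED_PER_QUARTER


# Example: €385,000 spent in a quarter against the €350,000 linear plan.
variance = quarterly_variance(385_000)
print(f"{variance:+.1%}")  # +10.0%
assert abs(variance) <= 0.10, "outside the ±10% KPI band: flag to the steering committee"
```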
---

## Summary KPI Table (For Part B Section 2.1)

| Category | KPI | Baseline | Target (M24) | Verification |
|----------|-----|----------|--------------|--------------|
| **Excellence** | TRL Level | 4 | 6 | External TRL audit |
| **Excellence** | Publications | 0 | 10+ (top-tier) | DOI links |
| **Excellence** | Standards Drafts | 0 | 5+ (ETSI/IETF/ISO) | Draft IDs |
| **Impact** | Audit Cost Reduction | 0% | 30% | Pilot benchmarks (D5.1) |
| **Impact** | Incident Detection | N/A | 50% faster | Pilot logs |
| **Impact** | Open-Source Downloads | ~100/mo | 500+ post-M24 | GitHub Insights |
| **Impact** | Federation Nodes | 0 | 15+ (3 countries) | Testbed logs (D4.2) |
| **Implementation** | Deliverables On-Time | N/A | 100% (13/13) | EU portal confirmations |
| **Implementation** | Budget Variance | N/A | ≤10% per WP | Financial reports |
| **Implementation** | Steering Attendance | N/A | ≥90% | Attendance logs |

---

## KPI Dashboard Access

**During Project:**

- **Consortium Portal:** Real-time KPI tracking via Mattermost/NextCloud dashboard
- **Monthly Steering Calls:** Review 3-5 priority KPIs per call
- **Quarterly Reports:** Full KPI table in EU periodic reports

**Public KPIs (Post-M24):**

- Open-source downloads (GitHub Insights public)
- Publications (DOI links in Open Access repos)
- Standards contributions (ETSI/IETF public drafts)

---

## Reviewer Notes

**For Part B Section 2.1 (Pathways to Impact):**

> "The project defines 18 quantitative KPIs across Excellence, Impact, and Implementation dimensions. Key targets include: TRL 4→6 progression validated by external audit; 10+ top-tier publications; 5+ standards contributions; 30% audit cost reduction in pilots; 50% faster incident detection; 500+ open-source downloads post-project; 15+ federation nodes across 3 countries; 100% deliverable on-time completion; ≤10% budget variance. Monthly KPI tracking via consortium portal ensures proactive management and timely course corrections."

**For reviewers evaluating Impact (30% of score):**

- Shows **concrete, measurable outcomes** (not vague "we will contribute to...")
- Demonstrates **realistic targets** (30% cost reduction, not 90%)
- Proves **systematic measurement plan** (verification methods specified)
- Indicates **impact beyond project** (open-source downloads post-M24)

---

**Document Control:**

- Version: 1.0-KPI-DASHBOARD
- Date: 2025-11-06
- Owner: VaultMesh Technologies B.V. (Coordinator)
- Classification: Consortium Internal (will become Part B Section 2.1)
- Related: PQC_Risk_Register.md, PQC_Work_Package_Gantt.mmd