Stakeholders & Concerns
Understanding who your stakeholders are and what they care about is the starting point for any architectural work. This article frames stakeholders, their typical concerns, and how to elicit and prioritize them so architectural decisions align with the right outcomes. Scope: we focus on architecture-level concerns (mostly non-functional qualities and cross-cutting constraints), not detailed feature design. For how this topic fits alongside siblings, see Architecture vs. Design vs. Implementation and Architectural Decision Impact & Cost of Change.
Core ideas
- Stakeholder: anyone who has a vested interest in the system or is affected by it (customers, internal users, business/product, engineers, operations/SRE, platform/infra, security, compliance/legal, data, support, partners, regulators).
- Concern: a matter of interest to a stakeholder that architecture should address, often expressed as desired quality attributes (e.g., availability, performance, security) or constraints (e.g., regulatory, cost caps, tech choices).
- Viewpoint: a template for describing the system from a perspective that addresses a set of concerns. Views applying these viewpoints help communicate how the architecture satisfies concerns. See Views & Viewpoints.
- Quality attributes provide the language to express and test concerns. See Quality Attributes.
Typical stakeholders and their concerns
The list is illustrative; your context may include more (e.g., open‑source community, auditors) or fewer.
Stakeholder | Typical concerns | Example measures/signals |
---|---|---|
Business/Product | Time‑to‑market, differentiation, roadmap feasibility, cost, risk | Cycle time, lead time, burn rate, OKRs |
End Users/Customers | Usability, performance, reliability, accessibility, privacy | Core Web Vitals, app latency, uptime/SLA, a11y checks |
Engineering/Teams | Modularity, testability, maintainability, developer experience (DevEx), tooling | Change failure rate, MTTR, code health metrics |
Operations/SRE | Availability, resilience, observability, capacity, run cost | SLO/SLI, error budgets, saturation, cloud spend |
Security | Threat surfaces, authn/authz, data protection, secrets, supply chain | Security posture, vuln MTTR, mTLS coverage, SBOM |
Compliance/Legal | Data residency, PII handling, auditability, retention | Evidence artifacts, controls mapping, retention policy |
Data/Analytics | Data quality, lineage, access patterns, schema evolution | Freshness, completeness, lineage trace, CDC stability |
Platform/Infra | Standardization, operability, portability, quota/cost | Golden path adoption, image provenance, quotas |
Partners/Integrators | Stable contracts, SLAs, versioning, deprecation policy | API error rates, version lifecycle, partner satisfaction |
Support/CS | Diagnostics, feature flags, error clarity, rollback paths | Ticket volume, first‑response time, rollback MTTR |
Related topics for deeper dives: Observability & Operations, Security Architecture, and Architecture Governance & Organization.
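A register like the table above can also be kept machine-readable so reviews can query it (e.g., "who cares about availability?"). A minimal sketch, assuming a simple in-memory structure; the names, fields, and entries are illustrative, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class Stakeholder:
    """One row of a stakeholder register (entries are illustrative)."""
    name: str
    concerns: list[str]
    signals: list[str] = field(default_factory=list)

register = [
    Stakeholder("Operations/SRE",
                concerns=["availability", "observability", "run cost"],
                signals=["SLO/SLI", "error budgets", "cloud spend"]),
    Stakeholder("Security",
                concerns=["data protection", "supply chain"],
                signals=["vuln MTTR", "SBOM"]),
]

def who_cares(concern: str) -> list[str]:
    """Names of all registered stakeholders holding the given concern."""
    return [s.name for s in register if concern in s.concerns]
```

Even this small amount of structure makes gaps visible: a concern nobody owns, or a stakeholder with no measurable signals, is a prompt for the elicitation work below.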
Eliciting and prioritizing concerns
- Identify stakeholders — start from the value stream: who builds, runs, uses, sells, supports, audits, or integrates with the system?
- Elicit concerns — interviews/workshops, review of incidents/postmortems, contracts/SLAs, regulatory commitments; convert concerns into testable quality attribute scenarios (stimulus → environment → response → measure). See Quality Attributes.
- Prioritize — use impact vs. likelihood, business value, and risk exposure. Establish explicit trade‑offs (e.g., latency vs. consistency, speed vs. safety).
- Trace to views and decisions — choose appropriate viewpoints and produce views that address the concerns. See Views & Viewpoints. Capture decisions as ADRs with rationale and consequences. See Architecture Decision Records (ADR).
- Validate — align with governance/review practices. See Review Boards & Design Reviews. Define acceptance criteria and, where possible, executable checks (tests, budgets, policy-as-code).
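The stimulus → environment → response → measure shape lends itself to a plain record, which keeps scenarios testable and diff-able. A hypothetical sketch (the scenario content is invented for illustration):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QAScenario:
    """Quality attribute scenario: stimulus -> environment -> response -> measure."""
    stakeholder: str
    stimulus: str
    environment: str
    response: str
    measure: str  # the testable acceptance criterion

checkout_latency = QAScenario(
    stakeholder="End Users",
    stimulus="User submits checkout",
    environment="Normal load (p95 traffic)",
    response="Order confirmed",
    measure="p95 latency <= 800 ms",
)
```

Because the measure is explicit, each scenario can later be wired to a test, an SLO, or a policy check rather than remaining a prose aspiration.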
Implementation notes and pitfalls
- Use a lightweight RACI for major decisions to clarify who approves vs. who is consulted.
- Maintain a traceability matrix from concern → quality attribute scenario → view(s) → ADR(s) → tests/monitors.
- Turn critical concerns into budgets and guardrails (latency/error budgets, cost/bandwidth budgets, policy-as-code for security/compliance).
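The traceability matrix above can start as something as simple as a dictionary plus a gap check; all identifiers here are hypothetical placeholders for your own scenario, view, and ADR ids:

```python
# Traceability: concern -> scenario -> view(s) -> ADR(s) -> executable checks.
# All ids below are hypothetical examples.
traceability = {
    "checkout latency": {
        "scenario": "QAS-003",
        "views": ["runtime-view"],
        "adrs": ["ADR-012"],
        "checks": ["perf-test-checkout", "latency-slo-alert"],
    },
}

def untraced(concerns: dict) -> list[str]:
    """Concerns with no executable check are governance gaps."""
    return [name for name, trace in concerns.items() if not trace.get("checks")]
```

Running the gap check in CI (or during design reviews) turns "maintain a traceability matrix" from a documentation chore into a guardrail.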
Operational and observability considerations
- Capture top concerns as SLIs/SLOs and wire dashboards early; align alerting to error budgets, not just static thresholds.
- Ensure trace context and correlation IDs flow across all critical paths to tie concerns to actual runtime behavior. See Observability & Operations.
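Error-budget alerting compares the observed error ratio to the budget implied by the SLO target, rather than a static threshold. A minimal burn-rate sketch; the alert threshold is illustrative:

```python
def burn_rate(error_ratio: float, slo_target: float) -> float:
    """Multiple of the error budget currently being consumed.

    slo_target=0.999 implies a 0.1% error budget; a burn rate of 1.0
    spends the budget exactly over the SLO window, higher spends it faster.
    """
    budget = 1.0 - slo_target
    if budget <= 0.0:
        raise ValueError("slo_target must be < 1.0")
    return error_ratio / budget

# Example: 0.5% errors against a 99.9% SLO burns the budget ~5x too fast,
# so a burn-rate alert fires even though 0.5% may look "small" in isolation.
rate = burn_rate(error_ratio=0.005, slo_target=0.999)
alert = rate > 2.0  # illustrative fast-burn threshold
```

Multi-window variants (e.g., pairing a short and a long lookback window) reduce flapping; the key point is that the threshold derives from the stakeholder's stated SLO, not from an arbitrary number.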
Common pitfalls
- Solutionizing too early: jumping to technology choices before clarifying concerns and trade‑offs.
- Ignoring “quiet” stakeholders: compliance, support, or downstream integrators not present in early meetings.
- Treating security and operability as afterthoughts—these are primary concerns, not add‑ons.
- Design by committee: lack of a clear decision owner stalls progress; use ADRs and RACI.
When to use
- At project inception, when scoping an initiative or new architecture.
- Before significant changes (e.g., major dependency, new region, multi‑tenant shift).
- After incidents or major SLO breaches to re‑validate priorities.
When not to use
- Tiny prototypes or throwaway spikes where architecture decisions are intentionally deferred.
- When concerns are already well understood and validated for a very similar context—avoid re‑running heavy workshops; do a light refresh instead.
Design Review Checklist
- Have all key stakeholders been identified?
- Are the top 3-5 quality attribute scenarios defined and prioritized?
- Have conflicting concerns been acknowledged and trade-offs documented?
- Is there a clear mapping from concerns to architectural decisions?
- Are there views that address the primary concerns of key stakeholders?
- Have security, operational, and compliance concerns been treated as first-class requirements?
Related topics
- Architecture vs. Design vs. Implementation
- Architectural Decision Impact & Cost of Change
- Quality Attributes
- Views & Viewpoints
- Architecture Decision Records (ADR)
- Architecture Governance & Organization
References
- ISO/IEC/IEEE 42010:2022 — Systems and software engineering — Architecture description ↗️
- Rozanski & Woods, Software Systems Architecture: Viewpoints and Perspectives ↗️
- SEI, Architecture Tradeoff Analysis Method (ATAM) — collection/overview ↗️
- Kazman, Klein, Clements. ATAM: Method for Architecture Evaluation (SEI Technical Report) ↗️