Earlier this year, I wrote about how standards bodies move from “interesting idea” to something you can reasonably expect engineers to implement without crying.
That post was intentionally process-focused: drafts, working groups, intellectual property rules, errata, and all the other unglamorous steps that sit between a whiteboard sketch and a stable specification.
This post is the next step in that series. Instead of talking about how standards get made in general, I want to look at what is happening inside specific standards development organizations (SDOs) that matter right now.
Let’s start with one that comes up in almost every serious conversation about digital credentials: the Digital Credentials Protocols (DCP) Working Group at the OpenID Foundation.
You can subscribe and listen to the podcast on Apple Podcasts, or wherever you listen to podcasts.
And be sure to leave me a rating and review!
What problem DCP is trying to solve (and why it’s harder than it sounds)
The DCP Working Group exists to define OpenID specifications for issuer–holder–verifier flows. In other words, it’s about making sure credentials can be issued and presented in a way that works across ecosystems, formats, and deployment models.
That includes support for:
- W3C Verifiable Credentials
- IETF SD-JWT-based credentials
- ISO/IEC 18013-5 mobile driving licence–style credentials (mdocs)
- Pseudonymous authentication from end users to relying parties
That last bullet is doing more work than it might appear. DCP isn’t just about moving bits around; it’s about enabling selective disclosure, privacy-preserving presentation, and interoperability without assuming a single credential format or a single trust model.
This is one of the plumbing layers that has to work even when everyone else disagrees about the furniture.
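To make “selective disclosure” concrete, here is a minimal sketch of the mechanism SD-JWT-based credentials use: each disclosable claim is encoded as a salted “disclosure,” and only its hash appears in the signed credential. This is an illustrative sketch, not a conformant implementation; real SD-JWTs add issuer signatures, key binding, and strict serialization rules, and the claim name and value below are placeholders.

```python
import base64
import hashlib
import json
import secrets

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as SD-JWT requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def make_disclosure(claim_name: str, claim_value) -> tuple[str, str]:
    """Build one SD-JWT disclosure and the digest the issuer signs over.

    A disclosure is the base64url encoding of a JSON array
    [salt, claim_name, claim_value]. The signed credential carries only
    the SHA-256 digest of the disclosure (in its "_sd" array); the
    holder chooses which disclosures to hand over at presentation time.
    """
    salt = b64url(secrets.token_bytes(16))
    disclosure = b64url(json.dumps([salt, claim_name, claim_value]).encode("utf-8"))
    digest = b64url(hashlib.sha256(disclosure.encode("ascii")).digest())
    return disclosure, digest

if __name__ == "__main__":
    disclosure, digest = make_disclosure("given_name", "Erika")
    print("disclosure:", disclosure)
    print("digest in _sd:", digest)
```

The verifier recomputes each digest from the disclosures it actually receives and checks them against the signed credential, which is what lets a holder reveal a birth date without also revealing an address.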
What’s actually finished (and usable today)
The DCP Working Group has already delivered two final, version-1.0 specifications:
- OpenID for Verifiable Credential Issuance 1.0 (OID4VCI)
- OpenID for Verifiable Presentations 1.0 (OID4VP)
These define the core issuance and presentation flows and are considered stable specs that implementers are already using in production pilots and early deployments. Work is already in progress on the 1.1 versions of these specifications; updates happen as we learn more from actual implementations.
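For a flavor of what OID4VCI looks like on the wire, here is a sketch of a credential offer, the JSON object an issuer hands to a wallet to kick off issuance. The field names follow the structure of recent OID4VCI drafts (`credential_issuer`, `credential_configuration_ids`, a pre-authorized code grant); check the published 1.0 text for the normative details, and treat the issuer URL, credential ID, and code values as placeholders.

```python
import json
from urllib.parse import urlencode

# Hypothetical issuer, credential configuration ID, and code, for
# illustration only.
credential_offer = {
    "credential_issuer": "https://issuer.example.com",
    "credential_configuration_ids": ["org.example.UniversityDegree"],
    "grants": {
        "urn:ietf:params:oauth:grant-type:pre-authorized_code": {
            "pre-authorized_code": "placeholder-code"
        }
    },
}

# The offer is typically delivered by value or by reference, e.g. inside
# a QR code or deep link that the wallet knows how to open.
offer_url = "openid-credential-offer://?" + urlencode(
    {"credential_offer": json.dumps(credential_offer)}
)
print(offer_url)
```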
The existence of these specifications matters because we’ve now crossed an important line: we’re no longer arguing about whether these flows should exist. We’re arguing about how well they interoperate, how they get deployed at scale, and what assurance levels they can realistically support.
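On the presentation side, an OID4VP request is carried as OAuth-style authorization request parameters. Here is a minimal sketch with hypothetical endpoint and nonce values; the query language the verifier uses to describe which credential it wants has changed across draft versions, so only the core parameters are shown.

```python
from urllib.parse import parse_qs, urlencode, urlparse

# Core OID4VP request parameters; the client_id, endpoints, nonce, and
# state values below are placeholders for illustration.
params = {
    "response_type": "vp_token",     # ask the wallet for a verifiable presentation
    "client_id": "https://verifier.example.com",
    "response_mode": "direct_post",  # wallet POSTs the response back to the verifier
    "response_uri": "https://verifier.example.com/callback",
    "nonce": "n-0S6_WzA2Mj",         # binds the presentation to this request
    "state": "af0ifjsldkj",
}

request_url = "https://wallet.example.org/authorize?" + urlencode(params)
print(request_url)
```

The `nonce` is what prevents a captured presentation from being replayed against a different request, which is why profiles like HAIP care so much about how it is generated and checked.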
Which brings us neatly to HAIP (usually pronounced as “hype”).
HAIP: where “interoperable” starts to mean something specific
One of the more practical pieces of work coming out of DCP is the High Assurance Interoperability Profile (HAIP).
HAIP defines a constrained profile of OpenID4VC designed to meet high-assurance requirements when used with SD-JWT-based credentials or ISO mdoc formats. Profiles like this are where standards stop being abstract and start colliding with regulatory and policy expectations.
A few notable points:
- The vote on HAIP 1.0 concluded on December 23–24, 2025
- The vote passed, and HAIP 1.0 is now officially published
- HAIP 1.0 is expected to be referenced in upcoming updates to the EU Digital Identity (EUDI) Wallet implementing acts
This is an important signal. Profiles don’t exist for fun; they exist because regulators and large-scale implementers need tighter guardrails than base specifications provide. Profiles are always a subset of an existing specification; they can tighten up the scope from the original, but they cannot broaden it. HAIP is an explicit acknowledgement that “just follow the core spec” isn’t enough when assurance levels matter.
What’s still in draft (and why you should care anyway)
The DCP Working Group isn’t done. In addition to updates for published specifications, two active drafts are worth paying attention to, even if you’re not planning to implement them tomorrow.
Security and Trust in OpenID for Verifiable Credentials (important, but currently dormant)
This working group draft describes a trust architecture for OpenID4VC. It lays out security requirements for ecosystem components and provides an informal security analysis of the OpenID4VC protocols.
It’s also not receiving much active attention from the working group at the moment. The document exists as a place the group can return to when there are sufficient resources (read: sustained interest and time), but for now it’s largely dormant.
That’s unfortunate, but not unusual. Documents like this tend to stall not because they lack value, but because they are overtaken by more urgent efforts.
OpenID for Verifiable Presentations over BLE
One draft that occasionally comes up in conversations about “in-person” or proximity use cases is OpenID for Verifiable Presentations over BLE, which defines how Bluetooth Low Energy can be used to request the presentation of verifiable credentials.
On paper, this fills an obvious gap. In practice, it’s received very little attention from the working group since it was adopted. The short answer to “is this widely implemented?” is probably no.
There is at least one notable exception: MOSIP has deployed BLE-based flows in production. Beyond that, however, there has been limited interest from other implementers, and the draft has not seen sustained iteration or uptake within the OpenID ecosystem.
That lack of momentum is probably why the medium-term thinking has shifted. Rather than further evolving the BLE draft in isolation, attention is now focused on the joint work between the DCP Working Group and ISO/IEC JTC 1/SC 17 Working Group 10. The expectation is that harmonisation with ISO/IEC 18013-5—particularly around proximity and device-to-device interactions—will result in a protocol that is clearer, better aligned with existing deployments, and ultimately more attractive to implementers.
In other words, the BLE work hasn’t failed so much as it has been overtaken by a broader convergence effort. That’s a fairly normal outcome in standards work, and should not be considered a waste of time.
The unglamorous but essential bit: conformance
One of the responsibilities of the DCP Working Group is conformance testing, and this is where standards efforts often differ: developing formal conformance tests (and, as a bonus prize, certification) is definitely not something all SDOs do.
In this case, the news is actually good:
- Conformance tests are in solid shape
- They support the 1.0 final versions of OID4VCI and OID4VP
- HAIP 1.0 is included
Even more practically, the OpenID Foundation is planning to launch self-certification programs for Verifiable Credential Issuance, Verifiable Presentations, and HAIP 1.0 at the end of February 2026.
I think this is incredibly useful. Self-certification doesn’t guarantee perfect interoperability, but it does create a shared baseline and a way to tell the difference between “implements the spec” and “implements something spec-adjacent”.
January 2026: harmonization conversations get real
One of the hotter topics right now is the set of joint meetings between the DCP Working Group and ISO/IEC JTC 1/SC 17 Working Group 10.
The question on the table is whether—and how—to harmonize OpenID4VC with ISO/IEC 18013-5 Annex C.
This isn’t about turf (ok, maybe it’s a little bit about turf). It’s about recognizing that the same ecosystems are trying to deploy both web-based and proximity-based credential flows, often under regulatory pressure, and fragmentation helps no one. There are several options on the table for how that harmonization could happen.
Harmonization work is slow, political, and occasionally painful. It’s also one of the clearest signals that a technology has moved out of the lab and into the real world.
Why DCP is worth watching (even if you’re tired of credential debates)
If you strip away the acronyms, the DCP Working Group is doing something important: it’s trying to support multiple credential formats, multiple assurance levels, and multiple deployment models without forcing everyone into a single architecture.
That’s messy. It’s also realistic.
If you’re building wallets, issuing credentials, verifying presentations, or trying to understand what regulators are likely to reference next, this is one of the rooms where those decisions are being shaped.
And if nothing else, it’s a reminder that the hardest part of digital credentials isn’t cryptography. It’s agreeing on how we use it—together—without breaking everything else.
If you want to follow along more closely, the DCP mailing list archives are public, and the GitHub repositories are where most of the real debates are documented. Bring coffee. And patience. And consider joining the OpenID Foundation (individual membership is US$50, last time I checked) if you’d like to participate more actively and vote on the specifications as they move through the process.
📩 If you’d rather track the blog than the podcast, I have an option for you! Subscribe to get a notification when new blog posts go live. No spam, just announcements of new posts. [Subscribe here]
Transcript
[00:00:04]
Welcome to the Digital Identity Digest, the audio companion to the blog at Spherical Cow Consulting. I’m Heather Flanagan, and every week I break down interesting topics in the field of digital identity—from credentials and standards to browser weirdness and policy twists.
If you work with digital identity but don’t have time to follow every specification or hype cycle, you’re in the right place.
[00:00:26]
So let’s get into it.
Why Look Inside the OpenID DCP Working Group
[00:00:30]
I promised to spend more time talking about what’s actually happening inside different standards working groups—especially for people who aren’t participating directly.
This episode focuses on what’s happening inside the OpenID Foundation’s Digital Credentials Protocols (DCP) Working Group, including:
- What they’re shipping
- What’s stalled
- And why it matters
If you’ve spent any time around digital credentials, you’ve probably encountered the familiar acronym soup:
- OpenID for VC
- OpenID for VP
- OpenID for VCI
- CAEP
- mDocs
- SD-JWTs
- Wallets everywhere
It’s tempting to treat all of this as one big blob labeled “credential stuff” and move on.
However, beneath that blob is careful—and sometimes frustrating—standards work that reveals where the ecosystem is actually willing to go.
What the DCP Working Group Is Trying to Accomplish
[00:02:00]
Rather than diving into syntax or message flows, this episode looks at:
- The intent behind the work
- Where momentum exists
- And the tradeoffs shaping real deployments
At a high level, the DCP Working Group’s goal is straightforward:
- Develop OpenID specifications supporting issuance, presentation, and authentication for digital credentials
- Support multiple credential formats, not just one
That includes:
- W3C Verifiable Credentials
- IETF SD-JWT–based credentials
- ISO/IEC 18013-5–style mDocs
Importantly, this group is not trying to declare a single winning credential format. Instead, it’s focused on ensuring that different formats can participate in interoperable web-based flows—which turns out to be much harder than it sounds.
Pseudonymous Authentication and Architectural Choices
[00:04:10]
One underappreciated aspect of the DCP charter is its emphasis on pseudonymous authentication from end users to verifiers.
This signals that DCP isn’t just about moving credential data from point A to point B.
It’s about enabling authentication without:
- Long-lived accounts
- Globally stable identifiers
In other words, the work is intentionally trying to avoid recreating traditional login systems—just with wallets instead of passwords.
That architectural choice explains why some design decisions feel conservative or even stubborn to implementers who want faster shortcuts.
What’s Completed and Already in Use
[00:05:45]
So what’s actually done?
Two core specifications are finalized and published as version 1.0:
- OpenID for Verifiable Credential Issuance (OpenID for VCI)
- OpenID for Verifiable Presentations (OpenID for VP)
These define the baseline issuance and presentation flows.
More importantly:
- They’re stable
- They’re testable
- They’re already being used in pilots and early deployments worldwide
That means the conversation has shifted. We’re no longer debating whether these flows should exist. Instead, the focus is now on interoperability, scalability, and realistic assurance levels.
HAIP and High Assurance Profiles
[00:07:20]
This brings us to HAIP, the High Assurance Interoperability Profile.
Profiles exist because base specifications are intentionally flexible. Regulators and large implementers, however, tend not to love that flexibility.
HAIP defines a constrained profile of OpenID for VC designed to meet higher assurance requirements, particularly when used with:
- SD-JWT–based credentials
- ISO mDocs
[00:08:15]
HAIP 1.0 was formally approved at the end of December 2025 and is now published.
It’s also expected to be referenced in upcoming updates to the EU Digital Identity Wallet implementing acts.
Once profiles show up in regulatory text, the standards conversation shifts from interesting to required attention.
Conformance Testing and Why It Matters
[00:09:40]
Another meaningful area of progress is conformance testing.
Conformance testing isn’t glamorous, but it’s often where standards efforts succeed or quietly fall apart.
In this case:
- DCP conformance tests support OpenID for VCI 1.0
- OpenID for VP 1.0
- And HAIP 1.0
The OpenID Foundation plans to launch self-certification programs for these specifications at the end of February 2026.
While self-certification doesn’t guarantee perfect interoperability, it does:
- Establish a shared baseline
- Help distinguish real implementations from “spec-adjacent” ones
For many organizations, that validation is a meaningful step forward.
Closing Thoughts
[00:12:10]
If you’re building wallets, issuing credentials, verifying presentations, or trying to understand what regulators may reference next, the DCP Working Group is one of the rooms where those decisions are being shaped.
And if nothing else, it’s a reminder that the hardest part of digital credentials isn’t cryptography.
It’s getting agreement on how we use it—without breaking everything else.

