Thoughts on Diversity, Equity and Inclusion (DEI) as a Design Framework

“Tools for Accessibility” by Noreen Whysel. AI-generated art produced at NightCafe Studio.

I was on a call the other day where we were discussing identity services for underserved populations. Someone brought up Diversity, Equity, and Inclusion (DEI) as a framework for ensuring accessible services for all.

DEI, as applied to product and service design, is a three-pronged philosophy. It asks whether diverse perspectives and lived experiences are being considered in the design of a service; whether access to the design or service is fair to all categories of people; and whether the people whose diverse experiences are considered feel safe, welcome, and included in the service and its outcome.

We discussed DEI in our group, but one person became uncomfortable, insisting that it doesn’t matter who is using a service as long as everyone can use it. He was concerned that focusing on DEI might mean that the unique needs of people, like the parent of a disabled person, would be excluded from consideration in the design of a product or service.

I thought this was an odd framing. He isn’t wrong to worry that caregivers may not have the best-designed experiences, which is why Universal Design, or design that everyone can use without impediment, is so important as a framework.

But rejecting conversations about DEI outright seems short-sighted.

As a framework, I like DEI because it offers a reminder that there are people who get forgotten in the design process. It asks questions like “Who are we including?” and “Who are we leaving out?” So, my colleague’s concern about addressing the needs of the parent of a disabled person is exactly the type of inclusion issue that a DEI framework can help to identify.

It is also an area I have been focusing on at IA Gateway with Shari Thurow and Bev Corwin. We are working on a model for a group persona that addresses the search needs of caregivers and of people with a medical concern, whether the person in care is a family member, an acquaintance, or someone in guardianship care.

Resilient Identifiers for Underserved Populations WG Charter Approved

Earlier this week, the Kantara Initiative Leadership Council approved a new Charter for the Resilient Identifiers for Underserved Populations work group (RIUP WG). This work group combines two legacy work groups (WGs) from the Identity Ecosystem Steering Group (IDESG), which formed in 2011 to provide a trust registry under the White House’s National Strategy for Trusted Identities in Cyberspace and was absorbed by Kantara in 2018. As a member of the IDESG UX Committee, I wrote the User Experience Guidelines and Metrics document for the ID Ecosystem Framework Registry.

Under the new charter, two work groups, Federated Identifiers for a Resilient Ecosystem (FIRE WG) and Healthcare ID Assurance (HIAWG), will combine to form the RIUP WG. This group will address identity assurance concerns for underserved people, often referred to as “vulnerable populations” by the healthcare sector.

(1) WG NAME (and any acronym or abbreviation of the name): Resilient Identifiers for Underserved Populations Work Group (RIUP WG)

(2) PURPOSE: The purpose of the Work Group is to support vulnerable and underserved populations in America. At a high level, these populations include people with physical and cognitive disabilities; people who are homeless, impoverished, senior citizens, immigrants, incarcerated, or institutionalized; and otherwise underserved minority groups that need digital credentials to access online resources, particularly online healthcare and financial resources. Without an easily reusable identifier, it is nearly impossible for these individuals to gain secure access to the resources and services that may be available to them.

We will work, in collaboration with other private-sector and public agencies, toward establishing identifier and access management (IAM) solutions that respect privacy, promote efficiency, limit redundancy, reduce barriers to use and adoption, increase interoperability, improve security, enhance safety and trust, eliminate identification errors, support resiliency, and achieve greater empowerment across the entire spectrum of online transactions. The RIUP WG will identify, coordinate, innovate, and harmonize with ongoing and emerging identity initiatives, standards, and technologies, and communicate our findings to all relevant stakeholders, both in the US and, selectively, in other countries, under the leadership of the Kantara Initiative.

(3) SCOPE: Guidelines for Cultivating a User-Centric Trust Registry and Promoting Adoption within Underserved Communities

About “Underserved Populations”

Why does the RIUP WG use “underserved” rather than “vulnerable” when discussing the needs of healthcare populations? The US Department of Health and Human Services (HHS) tends to use “vulnerable” or “vulnerable and/or underserved” when discussing the needs of people who require healthcare services but do not reflect the typical healthcare technology user.

In human subject testing, the category generally includes the elderly, the poor, pregnant women, children and infants, and, more recently, incarcerated people. But for the purposes of access to healthcare services, it also includes rural populations, people with permanent or temporary disabilities, indigenous peoples, and others who may object to being described as vulnerable. In these cases, people need services that may be difficult to find, which renders them “underserved.”

I had a conversation with Dana Chisnell, a founding member of the US Digital Service now serving as Deputy Design Director at US DHS, who convinced me to use “underserved” as a descriptor for identifiers. While there will still be “vulnerable populations” requiring special services, “underserved” puts the onus of care on the service provider rather than on the traits of an individual, which may or may not reflect their needs, abilities, or level of personal agency. This work follows my research interest at Internet Safety Labs, where we are changing the conversation around digital harms: the outcome of a service, or the lack of one, can itself be harmful.

What’s Next?

The RIUP WG will begin by creating guidelines for cultivating a user-centric trust registry and promoting adoption within underserved communities. We will publish a Use Case for Trusted Identifiers for underserved populations. Following a universal design strategy, we will emphasize, highlight, and prioritize user scenarios and stories from vulnerable and underserved populations to improve services for all users. We will test the use case and user stories across different verticals and with people of varying backgrounds and cultures, and we will create a dictionary that is harmonized with industry terminology.

There are a lot of initiatives that we will be watching. NIST is drafting SP 800-63-4, Digital Identity Guidelines, so we will prepare comments on how to incorporate the needs of underserved people. The HHS Office of the National Coordinator (ONC) referenced trust registries in its work on Social Determinants of Health for Medicaid, and we are participating in its information forums. We also plan to update the MAAS draft to incorporate recommendations from these efforts.

Lots to do and a great time to get involved.

Great teamwork!

See more Digital Identity research posts:

CPPA Stakeholder Meeting Discusses “Dark Patterns”

On May 5, 2022, I participated in the California Privacy Protection Agency’s (CPPA) stakeholder meeting, making a public statement about “dark patterns,” which I urged the agency to redefine as “harmful patterns,” and suggesting changes to its definitions of “Consent” and “Intentional Action.”

As Jared Spool says, we should be looking at the UX outcome of design decisions, not just the intent, because many designers adopt strategies or work with underlying technologies whose outcomes can be harmful to the technology user and other stakeholders. These UI patterns may not be intended to do harm; often the designer’s intent is to provide convenience or a useful service.

Take accessibility overlays, which are intended to provide a better experience for people with visual or cognitive disabilities but can have the effect of overriding necessary controls. Even patterns that shape user behavior, such as keeping users on a page longer, prompting a click on a link, or accepting default cookie settings, may be intended as conveniences; yet, unknown to both the designer and the user, processes underlying many of these tools share data and information about the transaction in ways that can be harmful.

The CPRA defines what it means to consent to data collection and what counts as an intentional user action. It addresses “dark patterns” as intentional deception, when often the digital harm is not intentional, yet is deep-rooted. We hope to make these harms clearer and to provide guidelines for addressing them through our ISL Safe Software Specification.

Read more about the CPPA stakeholder meeting and my statement on behalf of the Internet Safety Labs (formerly the Me2B Alliance):