Resilient Identifiers for Underserved Populations WG Charter Approved

I’m pleased to announce that the Charter for the Resilient Identifiers for Underserved Populations Work Group (RIUP WG) was approved by the Kantara Initiative Leadership Council earlier this week. This work group combines the legacy work groups (WGs) from the Identity Ecosystem Steering Group, which was formed in 2011 to provide a trust registry under the White House’s National Strategy for Trusted Identities in Cyberspace and was absorbed by Kantara in 2018. I was a member of the UX Committee and wrote the User Experience Guidelines and Metrics document for the ID Ecosystem Framework Registry.

For the RIUP WG, two groups, the Federated Identifiers for a Resilient Ecosystem WG (FIRE WG) and the Healthcare ID Assurance WG (HIAWG), were combined to address identity assurance concerns for underserved people, who are often referred to as “vulnerable populations” by the healthcare sector.

(1) WG NAME (and any acronym or abbreviation of the name): Resilient Identifiers for Underserved Populations Work Group (RIUP WG)

(2) PURPOSE: The purpose of the Work Group is to support vulnerable and underserved populations in America. At a high level, these populations include people with physical and cognitive disabilities; people who are homeless, impoverished, senior citizens, immigrants, incarcerated, or institutionalized; and otherwise underserved minority groups that need digital credentials to access online resources, particularly online healthcare and financial resources. Without an easily reusable identifier, it is nearly impossible for these individuals to gain secure access to the resources and services that may be available to them.

We will work in collaboration with other private-sector and public agencies toward establishing identifiers and access management (IAM) solutions that respect privacy, promote efficiency, limit redundancy, reduce barriers to use and adoption, increase interoperability, improve security, enhance safety and trust, eliminate identification errors, support resiliency, and achieve greater empowerment across the entire spectrum of online transactions. The RIUP WG will identify, coordinate, innovate, and harmonize with ongoing and emerging identity initiatives, standards, and technologies, and will communicate our findings to all relevant stakeholders, both in the US and, selectively, in other countries, under the leadership of the Kantara Initiative.

(3) SCOPE: Guidelines for Cultivating a User-Centric Trust Registry and Promoting Adoption within Underserved Communities

About “Underserved Populations”

Why does the RIUP WG use “underserved” rather than “vulnerable” when discussing the needs of healthcare populations? The US Department of Health and Human Services (HHS) tends to use “vulnerable” or “vulnerable and/or underserved” when discussing the needs of people who require healthcare services but do not reflect the typical healthcare technology user.

In human subjects research, the category generally includes the elderly, the poor, pregnant women, children, and infants; more recently, incarcerated people have been included in this description. But for the purposes of access to healthcare services, it also includes rural populations, people with permanent or temporary disabilities, indigenous peoples, and others who may object to being described as vulnerable yet need services that may be difficult to find, rendering them “underserved.”

I had a conversation with Dana Chisnell, a founding member of the US Digital Service who now serves as Deputy Design Director at US DHS, and she convinced me to use “underserved” as a descriptor for identifiers. While there will still be “vulnerable populations” requiring special services, “underserved” puts the onus of care on the service provider rather than on the traits of an individual, which may or may not reflect their needs, abilities, or level of personal agency. This work follows my research interest at the Internet Safety Lab, where we are changing the conversation around digital harms: the outcome of a service, or the lack of a service, can itself be harmful.

What’s Next?

The RIUP WG will begin by creating guidelines for cultivating a user-centric trust registry and promoting adoption within underserved communities. We will publish a Use Case for Trusted Identifiers for underserved populations. Using a universal design strategy, we will emphasize and prioritize user scenarios and stories from vulnerable and underserved populations in order to improve services for all users. We will test the use case and user stories across different verticals and with people of varying backgrounds and cultures. And we will create a dictionary that is harmonized with industry terminology.

There are a lot of initiatives that we will be watching. NIST is drafting SP 800-63-4, the Digital Identity Guidelines, so we will prepare comments on how to incorporate the needs of underserved people. The HHS Office of the National Coordinator for Health Information Technology (ONC) referenced trust registries in its work on Social Determinants of Health for Medicaid, and we are participating in its information forums. We also plan to update the MAAS draft to incorporate recommendations from these efforts.

Lots to do and a great time to get involved.

Great teamwork!

Crypto, NFTs and Dadaism

POGs (Source: File:Pogslam.jpg – Wikimedia Commons)

Those of you interested in artists’ collaborative spaces may find the DADA.art platform unique. I found it while pondering the connection between NFTs (non-fungible tokens) and Dadaism, an early-20th-century, anti-capitalist art movement “expressing nonsense, irrationality, and anti-bourgeois protest in their works.” (I wondered to myself, half seriously, whether anyone had made NFTs from POGs, the 1990s collector’s item. It turns out someone has.)

Of course the NFT platform is called DADA.art, and they recently sold a collection of collaborative works as an NFT to Metapurse for 500 ETH (Ethereum cryptocurrency). All proceeds were donated back to the community to provide a basic income (in ETH) to artists on the platform. Fascinating.

Safe Tech Audit Sketchnotes – IAC22

Zsofi Lang’s sketchnotes from my talk “Safe Tech Audit: IA as a Framework for Respectful Design” at The Information Architecture Conference 2022.


Designing Respectful Technology

Note: this article was originally published as “Designing Respectful Tech: What is your relationship with technology?” at Boxes and Arrows on February 24, 2022.

You’ve been there before. You thought you could trust someone with a secret. You thought it would be safe, but found out later that they blabbed to everyone. Or, maybe they didn’t share it, but the way they used it felt manipulative. You gave more than you got and it didn’t feel fair. But now that it’s out there, do you even have control anymore?

Ok. Now imagine that person was your supermarket. 

Or your doctor. 

Or your boss.

Do you have a personal relationship with technology?

According to research at the Me2B Alliance, people do feel they have a relationship with technology. It’s emotional. It’s embodied. And it’s very personal.

How personal is it? Think about what it would be like if you placed an order at a cafe and they already knew your name, your email, your gender, your physical location, what you read, who you are dating, and that, maybe, you’ve been thinking of breaking up.

Source: “If your shop assistant was an app (hidden camera),” Forbrugerrådet Tænk (Danish Consumer Council), December 2014 (YouTube).

We don’t approve of gossipy behavior in our human relationships. So why do we accept it with technology? Sure, we get back some time and convenience, but in many ways the relationship can feel locked-in and unequal.

The Me2B Relationship Model

At the Me2B Alliance, we are studying digital relationships to answer questions like “Do people have a relationship with technology?” (They feel that they do). “What does that relationship feel like?” (It’s complicated). And “Do people understand the commitments that they are making when they explore, enter into and dissolve these relationships?” (They really don’t).

It may seem silly or awkward to think about our dealings with technology as a relationship, but the parallels with messy human relationships are there. The Me2BA commitment arc with a digital technology resembles German psychologist George Levinger’s ABCDE relationship model,1 shown by the orange icons in the image below. As with human relationships, we move through states of discovery, commitment, and breakup with digital applications, too.

Source: Me2B Alliance, 2021

Our assumptions about our technology relationships are similar to the ones we have about our human ones. We assume when we first meet someone there is a clean slate, but this isn’t always true. There may be gossip about you ahead of your meeting. The other person may have looked you up on LinkedIn. With any technology, information about you may be known already, and sharing that data starts well before you sign up for an account.

The Invisible Parallel Dataverse

Today’s news frequently covers stories of personal and societal harm caused by digital media manipulation, dark patterns, and personal data mapping. Last year, Facebook whistleblower Frances Haugen exposed how the platform promotes content that the company knows, from its own research, causes depression and self-harm in teenage girls. They know this because they know what teenage girls click, post, and share.

Technology enables data sharing at every point of the relationship arc, including after you stop using it. Worryingly, even our more trusted digital relationships may not be safe. The Me2B Alliance uncovered privacy violations in K-12 software, and described how abandoned website domains put children and families at risk when their schools forget to renew them. 

Most of the technologies that you (and your children) use have relationships with third-party data brokers and others with whom they share your data. Each privacy policy, cookie consent, and terms-of-use document on every website or mobile app you use defines a legal relationship, whether you choose to opt in or are locked in by some other process. That means you have a legal relationship with each of these entities from the moment you access the app or website, and in most cases it’s one that you initiated and agreed to.

All the little bits of our digital experiences are floating out there and will stay out there unless we have the agency to set how that data can be used or shared and when it should be deleted. The Me2B Alliance has developed Rules of Engagement for respectful technology relationships and a Digital Harms Dictionary outlining types of violations, such as:

  • Collecting information without the user’s awareness or consent;
  • Contracts of adhesion, where users are forced to agree to terms of use (often implicitly) when they engage with the content;
  • Loss or misuse of personally identifiable information; and
  • Unclear or non-transparent information describing the technology’s policies or even what Me2B Deal users are getting.
Respectful relationships: data minimization (no gossip, no eavesdropping, no stalking); individual control and autonomy (no manipulation, no coercion); and respectful defaults (progressive consent).
Source: Noreen Whysel, Me2B Alliance 2021. Image (right): Pixabay
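
As a rough illustration of how such a taxonomy might be put to work, here is a minimal Python sketch that encodes the violation types listed above as categories an auditor could flag during a review. It is not the actual Digital Harms Dictionary, which is far more detailed; the category names and the flag helper are my own assumptions.

```python
# A minimal sketch, assuming hypothetical names, of the violation types listed
# above encoded as categories an auditor could flag. This is an illustration,
# not the Me2B Alliance's actual Digital Harms Dictionary.

from enum import Enum


class HarmCategory(Enum):
    COLLECTION_WITHOUT_CONSENT = "Collecting information without awareness or consent"
    CONTRACT_OF_ADHESION = "Terms of use imposed implicitly, with no real choice"
    PII_LOSS_OR_MISUSE = "Loss or misuse of personally identifiable information"
    OPAQUE_POLICIES = "Unclear or non-transparent policies or Me2B Deal"


def flag(observed):
    """Return human-readable findings for harms observed during a review."""
    return [f"Violation: {harm.value}" for harm in sorted(observed, key=lambda h: h.name)]


if __name__ == "__main__":
    # Example: a review that found only opaque policy language.
    print("\n".join(flag({HarmCategory.OPAQUE_POLICIES})))
```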

Respectful technology relationships begin with minimizing the amount of data that is collected in the first place. Data minimization reduces the harmful effects of sensitive data getting into the wrong hands. 

Next, we should give people agency and control. Individual control over one’s data is a key part of local and international privacy laws like the GDPR in Europe and similar laws in California, Colorado, and Virginia, which give consumers the right to consent to data collection, to know what data of theirs is collected, and to request to view the data, correct it, or have it permanently deleted.

Three Laws of Safe and Respectful Design

In I, Robot, his collection of robot short stories, Isaac Asimov introduced the famous “Three Laws of Robotics,” an ethical framework to avoid harmful consequences of machine activity. Today, IAs, programmers, and other digital creators make what are essentially robots that help users do work and share information. Much of this activity is out of sight and out of mind, which is in fact how we, the digital technology users, like it.

But what of the risks? It is important as designers of these machines to consider the consequences of the work we put into the world. I have proposed the following corollary to Asimov’s robotics laws:

  • First Law: A Digital Creator may not injure a human being or, through inaction, allow a human being to come to harm.
  • Second Law: A Digital Creator must obey the orders given by other designers, clients, product managers, etc., except where such orders would conflict with the First Law.
  • Third Law: A Digital Creator must protect its own existence as long as such protection does not conflict with the First or Second Law.2

Mike Monteiro, in his well-known 2014 talk at An Event Apart, How Designers are Destroying the World, dwells on the second and third laws. While we take orders from the stakeholders of our work (the clients, marketers, and shareholders we design for), we have an equal, if not greater, responsibility to understand and mitigate design decisions that have negative effects.

A Specification for Safe and Respectful Technology

The Me2B Alliance is working on a specification for safe and respectfully designed digital technologies: technologies that do no harm. These product integrity tests are conducted by a UX expert and applied to each commitment stage a person enters, from first open, location awareness, cookie consent, promotional and loyalty commitments, and account creation through to the termination of the relationship.

Abby Covert’s IA Principles—particularly Findable, Accessible, Clear, Communicative and Controllable—are remarkably appropriate tests for ensuring that the people who use digital technologies have agency and control over the data they entrust to these products:

Findable: Are the legal documents that govern the technology relationship easy to find? What about support services for when I believe my data is incorrect, or being used inappropriately? Can I find a way to delete my account or delete my data?

Accessible: Are these resources easy to access by both human and machine readers and assistive devices? Are they hidden behind some “data paywall” such as a process that requires a change of commitment state, i.e. a data toll, to access?

Clear: Can the average user read and understand the information that explains what data is required for what purpose? Is this information visible or accessible when it is relevant?

Communicative: Does the technology inform the user when the commitment status changes? For example, does it communicate when it needs to access my location or other personal information like age, gender, medical conditions? Does it explain why it needs my data and how to revoke data access when it is no longer necessary?

Controllable: How much control do I have as a user? Can I freely enter into a Me2B Commitment or am I forced to give up some data just to find out what the Me2B Deal is in the first place? 
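
To make the tests above more concrete, here is a minimal Python sketch of how they might be recorded as a checklist applied at each commitment stage. It is my own illustration rather than the Me2B Alliance’s specification; the stage names, example questions, and the summarize helper are all assumptions for demonstration.

```python
# A minimal sketch, assuming hypothetical names, of how the five principle-based
# questions above could be recorded as an audit checklist applied per commitment
# stage. This is an illustration, not the Me2B Alliance's actual specification.

from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Stage(Enum):
    """Commitment stages named in the article (labels here are assumptions)."""
    FIRST_OPEN = auto()
    LOCATION_AWARENESS = auto()
    COOKIE_CONSENT = auto()
    LOYALTY_COMMITMENT = auto()
    ACCOUNT_CREATION = auto()
    TERMINATION = auto()


@dataclass
class Check:
    principle: str                  # IA principle under test (Findable, Clear, ...)
    stage: Stage                    # commitment stage where the question is asked
    question: str                   # the question a UX expert answers
    passed: Optional[bool] = None   # None until the auditor records a result


# Illustrative checklist items drawn from the questions above.
CHECKLIST = [
    Check("Findable", Stage.FIRST_OPEN,
          "Are the legal documents governing the relationship easy to find?"),
    Check("Clear", Stage.COOKIE_CONSENT,
          "Can an average user understand what data is required, and why?"),
    Check("Communicative", Stage.LOCATION_AWARENESS,
          "Does the product explain why it needs location data and how to revoke access?"),
    Check("Controllable", Stage.ACCOUNT_CREATION,
          "Can the user opt in freely, without surrendering data just to see the deal?"),
    Check("Findable", Stage.TERMINATION,
          "Can the user find a way to delete their account and their data?"),
]


def summarize(checks):
    """Roll up recorded results into a simple report keyed by principle and stage."""
    labels = {True: "pass", False: "fail", None: "not yet assessed"}
    return {f"{c.principle} @ {c.stage.name}": labels[c.passed] for c in checks}


if __name__ == "__main__":
    CHECKLIST[0].passed = True  # example: the auditor found the terms easily
    for item, status in summarize(CHECKLIST).items():
        print(f"{item}: {status}")
```

Structuring the audit this way makes it easy to roll results up by principle or by commitment stage, which mirrors how the questions above are organized.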

Abby’s other IA principles flow from the above considerations. A Useful product is one that does what it claims to do and communicates the deal you get clearly and accessibly. A Credible product is one that treats the user with respect and communicates its value. With user Control over data sharing and a clear understanding of the service being offered, the true Value of the service is apparent.

Over time the user will come to expect notice of potential changes to commitment states and will have agency over making that choice. These “Helpful Patterns”—clear and discoverable notice of state changes and opt-in commitments—build trust and loyalty, leading to a Delightful, or at least a reassuring, experience for your users.

What I’ve learned from working in the standards world is that Information Architecture Principles provide a solid framework for understanding digital relationships as well as structuring meaning. Because we aren’t just designing information spaces. We’re designing healthy relationships.


1 Levinger, G. (1983). “Development and change.” In H.H. Kelley et al. (Eds.), Close relationships (315–359). New York: W. H. Freeman and Company. https://www.worldcat.org/title/close-relationships/oclc/470636389

2  Asimov, I. (1950). I, Robot. Gnome Press.

Keep On Trackin’

Me2B Research: Consumer Views on Respectful Technology

In the research I’ve been doing on respectful technology relationships at the Me2B Alliance, what I hear is a combination of “I’ve got nothing to hide” and “I’ve got no other option.” People are deeply entangled in their technology relationships. Even when presented with overwhelmingly bad scores on Terms of Service and Privacy Policies, they will continue to use products they depend on or that give them access to their family, their community, and, in the case of Amazon, an abundance of choice, entertainment, and low prices. Even when they abandon a digital product or service, they are unlikely to delete their accounts. And the adtech SDKs they’ve agreed to let track them keep on tracking.

IA Conference in Quarantine: On-Site to Online in 30 Days

The IA Conference ended its four-week run, which, as some of you may recall, was originally planned as a five-day event in New Orleans with 12 preconference workshops and 60 talks in three tracks. The format changed to all prerecorded talks, released in three tracks daily over a period of three weeks. We put the plenaries on Mondays and Fridays and special programming, like panel talks and poster sessions, on Wednesdays. We used Slack for daily AMAs and Zoom for weekend watch parties and Q&A sessions with the plenary speakers. Other social and mentoring activities took place in the mornings, in the evenings, and on weekends.

The workshops, which usually come first, were all moved to the fourth week except for Jorge Arango’s IA Essentials. We had a lot of student scholarship attendees and didn’t want to make them wait until after the main conference.

We have a lot of amazing people to thank for pulling it off, starting with dozens of volunteers whose stamina is inspiring. I honestly wasn’t sure we could hold people’s attention that long. But Jared Spool thought we could do it, and Cheryl at Rosenfeld Media gave us some valuable advice about connecting through online platforms.

So, what did we do? Check out the presentation “Rapid Switch: How we turned a five-day onsite event into a monthlong, online celebration,” presented at the 500 Members Celebration of the Digital Collaboration Practitioners.

New IDESG Service Empowers Organizations to Better Protect Digital Identities

Registry Is Key Step in Growing Healthy and Secure Online Identity Ecosystem
Marketwired / Identity Ecosystem Steering Group (IDESG)

Jun 6, 2016 8:00 AM

WASHINGTON, DC–(Marketwired – Jun 6, 2016) – The Identity Ecosystem Steering Group (IDESG) — an independent, non-profit organization dedicated to creating the future of trusted digital identities — today announced a new service that empowers organizations to improve the way they handle identities. The Identity Ecosystem Framework (IDEF) Registry brings the digital identity community closer to realizing the White House’s vision for trusted identities in cyberspace.

Every organization involved in online identity transactions plays a key role in creating and sustaining a healthy online identity ecosystem. The IDEF Registry allows companies to independently assess their own identity management methods against common industry practices. Using the IDESG’s Identity Ecosystem Framework as a model, organizations can now master and build on commonly accepted criteria for interoperability, privacy, security and usability. Meeting milestones in these subject areas is essential to ensuring that digital identities are protected and trustworthy online.

“This is an essential step in creating a safer environment for online transactions,” said Salvatore D’Agostino, President of the IDESG and CEO of IDmachines, LLC. “By equipping organizations involved in online transactions with a tool to measure where they stand relative to accepted policies and best practices, we’re promoting a safer internet on two levels. We’re making industry-accepted best practices more accessible to organizations who want to meet them, and providing a structured benchmark to organizations and individuals that want to use safer protocols for their digital transactions and information management.”

The Registry is an actionable step in the Identity Revolution, and the first opportunity of its kind for online identity service providers and owners and operators of applications that register, issue, authenticate, authorize and use identity credentials to prove that they operate secure platforms for their customers. Those that voluntarily register with the Registry publicly demonstrate their dedication to best practices in identity management. In addition to increasing participating organizations’ value and trust in the marketplace, the Registry gives them access to their industry’s most cutting-edge methods for identity management.

Initial listers include some of the preeminent companies in the identity space, such as MorphoTrust and PRIVO.

“As a founding member of the IDESG, PRIVO understands the level of commitment, subject matter expertise and vision required to bring the Registry to life,” said PRIVO Co-founder and CEO Denise Tayloe. “We are very proud to be one of the first services to hold ourselves accountable to the IDEF requirements that support a privacy-preserving, interoperable, secure, easy-to-use credential for families we serve, in order to protect and enable young users to engage and transact online.”

The IDESG has a pipeline of applicants and anticipates significant demand to join these early adopters in completing the process. Listing in the IDEF Registry is currently free for those who self-attest.

“An Internet built around the identity principles of the Identity Ecosystem Framework is in the best interest of us all as individuals,” said Mark DiFraia, Senior Director of Market Development at MorphoTrust. “MorphoTrust is proud to be one of the first organizations to join the IDEF Registry because we made the investment to build our online identity solution from the ground up to deliver on the National Strategy for Trusted Identities in Cyberspace (NSTIC) Principles. It is our sincere hope that the combination of NSTIC principles, the IDEF and now the IDEF Registry apply the right amount of pressure to shape the behavior of online players for the benefit of us all.”

For more information on The Identity Ecosystem Framework Registry, visit IDEFRegistry.org.

About the Identity Ecosystem Steering Group (IDESG)

IDESG is a voluntary, public-private partnership dedicated to developing an Identity Ecosystem Framework (IDEF) and services to improve online digital identity. The IDESG looks to advance the Identity Ecosystem called for in the National Strategy for Trusted Identities in Cyberspace (NSTIC). The NSTIC, signed by President Obama in 2011, envisions the identity ecosystem as an online environment where individuals and organizations will be able to trust each other because they follow agreed-upon standards and policies to obtain and authenticate their digital identities. Come see how the IDESG is making this happen by joining us in the effort and taking advantage of our services at IDESG.org.

Contact:

Media Contact
Donna Armstrong
ConnellyWorks, Inc.
571-323-2585 x6140
donna@connellyworks.com

State of the Map Bonus: Satellite Selfie

SOTM Satellite Selfie, CartoDB

June 6-8 was OpenStreetMap’s State of the Map Conference at the United Nations. I volunteered at registration and during morning sessions, and I was able to attend interesting sessions: a talk on OSM data in Wikipedia, the Red Cross presentation on OSM in disaster response, and a discussion of developing a GIS curriculum in higher education.

One of the highlights was a satellite selfie. Led by a team from DigitalGlobe, a group of about 20 attendees created a large UN-blue circle on the ground and waited for the WorldView-3 and GOI1 satellites to fly over for a routine scan. Orbiting at 15,000 miles per hour about 400 kilometers above Manhattan, WorldView-3 was expected to take images that would include UN Plaza. The resulting satellite image, collected at 11:44 a.m., is available on the CartoDB blog (image above), courtesy of CartoDB CEO Javier de la Torre. Huge thanks to Josh Winer of DigitalGlobe, who took the time to explain the physics of satellite imagery and kept us entertained while we waited for our not-so-closeup.