Me2B Alliance

Background

The Me2B Alliance is a standards development organization of software engineers, policy analysts, UX experts, and business and philanthropic leaders committed to giving individuals more say in how technology treats them. We are establishing a rigorous, independent testing and certification program for websites, apps, and connected devices. The Alliance comprises working groups for Me-s (consumers) and B-s (businesses), as well as Policy and Legal and Certification working groups. Together, we are setting the standard for Respectful Technology.

My Role

My role at the Me2B Alliance is twofold. First, I lead the Research and Validation practice, providing user experience and other research services to the various working groups and exploring questions about how consumers experience their relationships with digital technology.

Second, I am developing the product integrity testing framework for digital technologies, in particular mobile apps and websites. This framework, coupled with data integrity and security testing, makes up the requirements for Me2BA certification.

User Research Methods

Ethnographic Research

I am engaging consumers in one-on-one conversations about their relationships with the technologies they use in their day-to-day lives. Research questions explore their understanding of privacy policies, terms-of-use agreements, and other agreements they make implicitly by using a technology. For example, do users change how they interact with a website when they are familiar with its legal terms? And would a score make a difference?

Preference Testing

I performed a series of tests of the certification mark to be used as a symbol of trust in connected digital technologies. These included interviews, focus groups, unmoderated five-second preference tests, and surveys.

Product Integrity Testing

I developed a UX Integrity framework for the Me2B Safe and Respectful Technology Framework (now published as the Me2B Safe Specification). The framework applies IA heuristics to ensure that notices of data collection, use, and sharing are Clear, Findable, Accessible, Credible, and Communicative, that is, understandable to a wide audience of human readers and accessible to machine-readable and assistive devices.
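As an illustration only, here is a minimal Python sketch of how an auditor might record scores against these five heuristics for a single notice. The 0-2 scale, field names, and scoring function are my own assumptions for this example, not the published Me2B Safe Specification.

```python
# Hypothetical example: recording an auditor's scores for one notice against
# the five UX Integrity heuristics. The 0-2 scale (0 = fails, 1 = partial,
# 2 = meets) is an illustrative assumption, not the Me2B Safe Specification.

HEURISTICS = ("clear", "findable", "accessible", "credible", "communicative")

def score_notice(scores):
    """Return the mean heuristic score for a notice, failing loudly if any
    heuristic was left unscored."""
    missing = set(HEURISTICS) - scores.keys()
    if missing:
        raise ValueError(f"missing heuristic scores: {sorted(missing)}")
    return sum(scores[h] for h in HEURISTICS) / len(HEURISTICS)

# Invented scores for a site's data-collection notice.
privacy_notice = {"clear": 2, "findable": 1, "accessible": 2,
                  "credible": 2, "communicative": 1}
print(f"UX integrity score: {score_notice(privacy_notice):.1f} / 2")  # 1.6 / 2
```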

Tools

Interviews and Focus Groups: Zoom, UserInterviews.com, SurveyMonkey

Preference Tests and 5-Second Tests: UserInterviews.com

Collaboration: Microsoft Teams, Zoom, Microsoft 365, Trello, Monday

Artifacts

Safe Tech Audit: IA as a Framework for Respectful Design (April 23, 2022)
Conference Presentation: Information Architecture Conference 2022

Spotlight Report #5: Me2B Alliance Validation Testing Report: Consumer Perception of Legal Policies in Digital Technology (January 18, 2022)

Spotlight Report #3: Me2B Alliance Validation Research: Consumer Sensitivity to Location Tracking by Websites and Mobile Apps (November 5, 2021)

Shedding Light on Dark Patterns: A Case Study on Digital Harms (April 28, 2021)
Conference Presentation: Information Architecture Conference 2021

Webinar: Me2B Research: Consumer Views on Respectful Technology

Future Plans

We are planning to conduct three focus groups per month with consumers and digital product designers and managers. This research will continue to deepen our understanding of how consumers experience their relationships with, and the risks of, digital technologies.

Ethical Design: Evaluating Digital and IRL Experiences (and how one might support or hinder the other)

During the Winter 2022 intersession, I took part in the Living Lab General Education Seminar, which is described on its website as follows:

The seminar is designed to make General Education more visible in our classrooms and courses. We will build an engaging environment for learning through exploration, implementation, and assessment of a variety of proven teaching practices using Oral Communication, Quantitative Literacy, Reading, Writing as our focus this year.

Living Lab General Education Seminar

In this course we learned how to apply High Impact Educational Principles and strategies for Place-Based Learning to create rewarding learning experiences for our students and colleagues.

The culmination of the course was to create a new or redesigned course assignment or project targeting one of the General Education Learning Goals and to implement it during the Spring 2022 semester. A complete description of my activity is located in the L4: Living Lab Learning Library.

The Exercise 

In a Data for Good lecture at Columbia’s Data Science Institute, danah boyd of Data & Society told the audience that her proudest achievements are often when she convinces a client not to create something that could potentially do harm.

When does it make sense to NOT make a digital version of something that would be better designed IRL? Are there activities that are better suited to online than IRL? Or are there cases where a combination of both is appropriate?

In this exercise, I shared a few articles about online activities that have had an impact on real life. We discussed both positive and negative reviews of online activities, from Pokémon Go (The Guardian), which is often praised for getting gamers to be more social and active, to Instagram (Wall Street Journal), which has been shown to have a negative effect on the self-esteem of teenage girls. A third example examined how social media use during the pandemic is exacerbating the political polarization of America (Harvard Berkman Klein Center) by moving the public commons from physical space to largely anonymous forums.

After discussing these articles, students formed breakout groups to find a news article about an online activity and discuss the pros and cons of doing that activity online, as well as how it could be replaced by or combined with an “In Real Life” (IRL) activity to improve the experience. Finally, they posted a reflection on the exercise to the class Slack group.

High Impact Learning

This activity focuses on three learning outcomes: Reading, Information Literacy, and Ethical Thinking. Students are asked to read an assigned text describing digital versus IRL spaces and then select an example from the reading of a digital experience that might be better in real life or paired with an IRL experience. After discussing in groups, they share back to the class what they discussed and finally post a reflection on the course discussion board about their understanding of the pros and cons of digital versus IRL for the chosen scenario.

To address Information Literacy, students must find one additional example of a digital application where the IRL experience takes precedence over the digital one. What might someone gain from a physical experience that they can’t get from a digital one? When might a digital application enhance the IRL experience?

And to expand their understanding of who is impacted by their design decisions, they then work in groups to make a stakeholder map (Giordano et al., 2018) showing who is affected by the designed experience of their chosen example. Who is participating in the experience? Who else might be affected by it? Or harmed? Who might be left out?

In addition to the Reading, Information Literacy, and Ethical Thinking student learning outcomes, students benefit from two High-Impact Educational Practices: Place-Based Learning and Collaborative Assignments.

Place-Based Learning: Students consider the physical and embodied experiences of IRL versus digital experiences  

Collaborative Assignments: Students participate in discussion of the pros and cons of selected digital experiences 

Outcomes and Future Development

This activity was part of the Ethics and Accessibility lecture in Week 7 of the Spring 2022 semester. It took a little over half an hour to complete. There was no out-of-class time required, except if a student wished to post their reflection after class. If we had had more time (and had not been meeting online that week), we might have been able to go outside and play Pokémon Go or survey people on campus grounds about their online and offline political activity. We may still try to create an online/IRL activity during a later session and follow up with a stakeholder map, which we did not have time to do.

The activity was low stakes and ungraded. The only preparation was to find three articles to discuss as examples of online activities that either replace or compromise IRL experiences. Although the assignment itself is ungraded, I plan to count it toward the participation grade. I do not believe my course is part of the college-wide general education assessment initiative, as it is an elective.

Students very much enjoyed discussing online versus “In Real Life” experiences. They are keenly aware of online activities that create unrealistic expectations for their real-life relationships and are concerned about exacerbating these effects through their design careers. I would like to refine the activity and possibly have it replace a duller accessibility study that they currently do for credit, which could instead be done in class in groups or as a demonstration. Not being able to go outside or actually be IRL was a limitation of this activity, though some students mentioned that being online made it easier for everyone in their group to search for articles, since they were all sitting at a computer anyway.

Resources and Reflections 

My activity presentation for the Living Lab course is openly available at https://cuny907-my.sharepoint.com/:p:/g/personal/noreen_whysel27_login_cuny_edu/EeDP7sDKTAROh1Nle8uKlagB5bMXEem7EM4k6Lvh7nagBA?e=FgnQIx 

You can also read about this and other OER activities on the course blog at https://openlab.citytech.cuny.edu/l4/2022/03/28/ethical-design-evaluating-digital-and-irl-experiences-and-how-one-might-support-or-hinder-the-other/

The Slack channel where students posted their reflections is a private discussion space for students of the HE93 section of COMD3562. I will post an image with student names anonymized to show an example of the written output from this assignment. Students who wish to post their reflections publicly will be able to reply to the post on Open Lab. 

This article was originally posted on March 28, 2022 on CUNY Open Lab.

IDEF Registry

Client: OASIS/Identity Ecosystem Steering Group
Visit Website

My Role

I led user testing for the Identity Ecosystem Framework (IDEF) Registry as part of the National Strategy for Trusted Identities in Cyberspace (NSTIC), a White House initiative. The IDEF Registry, a digital identity standard assessment tool, launched its alpha version on June 6, 2016. Because development of the alpha version of the attestation form was ongoing, I was brought into an agile process with the goal of iterating improvements after the public launch. I worked directly with a contracted project manager, third-party marketing and design companies, the Chair of the IDESG User Experience Committee, and members of the IDEF Registry working group.

User Research

The goal of the user study was twofold: first, to ensure that the assessment form was understandable to users who wish to list their products and that it included the sufficient and expected information needed to complete the form accurately; and second, to ensure that the registry listing itself was usable and understandable to users seeking identity solutions.

Test participants for the first goal included IDESG members and observers who provide identity services, including certification, authentication, authorization, registration and transaction intermediation, or who rely on identity services in their own internal systems and commercial products. We selected expert users because we expect that those who will be completing the attestation form have a high level of understanding of the privacy, security, interoperability and usability of their own products.

The study began with needs assessment interviews of 12 prospective users, followed by user tests with seven users. For the needs assessment, I interviewed the 12 prospective study participants about their needs for identity standards assessment and how the current IDEF Registry assessment tool compares to similar industry and government standards. I wanted to understand whether the IDEF tool addressed all of their concerns about privacy, security, interoperability, and usability, and to get a sense of whether the planned registry served their needs. General findings were presented in a Google Slides presentation showing typical responses to eleven study questions, suggested improvements, and the impact on the user experience. These were discussed over two two-hour meetings of the IDEF Registry Working Group.

Usability Tests

After delivering my findings to the development team, I began to design usability tests. I employed an observational walkthrough of proposed and completed designs, an expert heuristic review, user surveys, and follow-up interviews with seven registry users. To develop recommendations for improvements, I used card sorts, preference tests, and cognitive walkthroughs of wireframes and the live website, as well as observations and survey feedback from the seven alpha site users as they completed the attestation form on the alpha website.

I engaged four members of the User Experience Committee, all usability experts, to participate in a heuristic analysis using Nielsen Norman Group’s 10 usability heuristics and Abby Covert’s IA heuristics. These expert users primarily evaluated the assessment form, but also provided input on the usability of the registry listings themselves, as a proxy for typical registry listing users. Due to the early stage of development, the client did not wish to
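For context, findings from a heuristic analysis like this are often consolidated by having each evaluator assign a severity rating to every issue (Nielsen’s 0-4 scale) and averaging across evaluators to prioritize fixes. The sketch below uses invented issues and ratings and is not the committee’s actual worksheet.

```python
# Hypothetical sketch: ranking heuristic-analysis issues by mean severity on
# Nielsen's 0-4 scale (0 = not a problem ... 4 = usability catastrophe).
# The issues and ratings below are invented for illustration.
from statistics import mean

ratings = {
    "attestation form lacks a progress indicator": [3, 4, 3, 3],
    "registry listing uses unexplained jargon": [2, 3, 2, 2],
}

# Print issues from most to least severe.
for issue, scores in sorted(ratings.items(), key=lambda kv: -mean(kv[1])):
    print(f"{mean(scores):.2f}  {issue}")
```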

Results

The results showed that while the IDEF was rigorous, the implementation of the assessment and registry listings needed improvement, particularly for situations where more than one person or company department might need to be involved. There were also a number of interface issues, including layout and data visualizations that needed refinement. Since usability was a major component of the assessment, I also developed a set of user experience guidelines and metrics for service providers to use in evaluating the usability requirements of the attestation. These will be incorporated into the Usability section of the assessment guidance documents.

UPDATE (5/22/2017): As of late Spring 2017, nine companies have completed assessments. The website remains in alpha with my recommendations set for implementation when the next round of grant funding is approved. Should I be reengaged, the next studies will include user tests of participants seeking identity services.

UPDATE (6/15/2019): The Registry is currently 65% complete and was transferred to the Kantara Initiative’s Education Foundation in December 2018. I am continuing to serve on an agile advisory team and am working on use cases for health care. I presented the registry and participated in roundtable discussions at the 2019 Health Information Summit in Washington, DC on June 4, 2019.

Note: I signed a Non-Disclosure Agreement and am unable to share any images aside from those made public at idecosystem.org and idefregistry.org. Detailed information about the project, the assessment and the User Experience Committee is available on the public IDESG Wiki. Some of the documents including a draft rewrite of the Usability Guidelines and Metrics have been made public at: https://wiki.idesg.org/wiki/index.php?title=Talk%3AUser_Experience_Guidelines_Metrics

Announcement:
The IDEF Registry: an open invite to commit to trusted digital identity solutions

Resources:
Identity Ecosystem Steering Group (IDESG)
IDEF Registry
Identity Ecosystem Framework – Baseline Functional Requirements
