Those of you interested in artists’ collaborative spaces may find the Dada.art platform unique. I found it while pondering the connection between NFTs (Non-Fungible Tokens) and Dadaism, an early 20th century, anti-capitalist art movement “expressing nonsense, irrationality, and anti-bourgeois protest in their works.” (I wondered to myself, half seriously, whether anyone had made NFTs from POGs, the 1990s collector’s item. It turns out someone has.)
Of course, the platform is called DADA.art, and it recently sold a collection of collaborative works as an NFT to Metapurse for 500 ETH (the Ethereum cryptocurrency). All proceeds were donated back to the community to provide a basic income (in ETH) to artists on the platform. Fascinating.
THE OCCASIONAL MENTOR is a semi-regular column based on questions I’ve answered online in forums and other career advice I’ve given over the prior month. Feel free to add your experience or challenge me in the comments if you have had a different experience.
Asking a Question on Zoom
A great way to engage with a conference presentation is to ask a question. But how do you get a question answered on a Zoom webinar?
A strategy I like for getting noticed was introduced to me by Rachel Patterson at a recent Technology Transfer Days mentoring session on applying for a Small Business Innovation Research (SBIR) grant. For SBIRs, Rachel says that if you want to get noticed by the selection team, you should always submit a question ahead of time, ask it again during the live Q&A call, then follow up after the call to thank the speaker and ask a related question or continue the conversation. You can use this strategy for videoconferences on any topic.
Before the Event
Often organizers will forward the list of questions from a prospective audience to the speakers ahead of the session so they can address the topic in their talk. If the event you plan to attend offers a way to submit questions ahead of time, do that and make sure to include your name, contact information and a few words about your company or project/program, so they understand your needs. Otherwise think about what you want to learn from the speaker and make a list of questions you might want to ask during the talk.
Personalize Your Presence
When the event is virtual, such as a Zoom call or similar, edit your display name to show your full name. It may or may not be visible to the whole audience, but the hosts and speakers (usually co-hosts on Zoom) will see it. You can also add your company name, location, or a brief phrase or emoji, but keep in mind only a small part of it will be visible in the gallery view.
If the hosts are using the Zoom Q&A feature to collect questions, post your question and let your custom name speak for itself. The session host may be the only person who can see the question, but usually anyone can, so treat it as if it were public. (Be careful about posting personal information in a “public” forum.)
Submit Your Question
Ideally, the talk hosts will invite people to ask the question or summarize the context of a previously submitted question at some point during the call. Zoom has a feature called “Questions and Answers” that hosts can activate to take questions as they come up during the call. They may alternatively ask attendees to post questions to the chat feed. Be sure to submit questions in the way the host requires or your question could be lost in a long scrolling chat feed.
If you submitted a question prior to the talk, you should also post your question to chat or Questions and Answers, just in case your question is addressed during the talk without inviting you to have the floor or without giving you credit. If they do give you credit, you may get an additional chance to ask that question or a related one during the Q&A session.
When They Call on You
If you are lucky enough to get called on to ask your question live, introduce yourself, add 5–10 words about your organization or work, and then ask your question. Make sure your question aligns with something the speaker said in their presentation. I learned this technique at in-person entrepreneur events from Andrea Madho, founder of Lab141, an online, small-batch garment platform, who was in my cohort at the Startup Leadership Program. You are not only giving the speaker background on who you are and what your context is, but also giving audience members a chance to get to know you and perhaps reach out to connect.
You may also be able to post a chat message to just the presenters, if the host allows that setting. Depending on the settings, audience members may be able to chat with all presenters, just the host, presenters plus audience, or directly with any individual person (I try to avoid direct messages to people I don’t know, since it can be distracting and possibly creepy).
Read the Room
Notice the reaction to others who are asking questions about their own companies or who seem to be overtly selling. Are salesy comments and chat posts tolerated and built on or ignored? Are the speaker and organizers friendly to questions that are narrowly concerned with a specific company’s problems or are they brushing them off? What topics are getting brushed off?
After the Talk is Over
In an in-person setting you usually have the chance to ask a question after the session if the speaker sticks around or if there is a social hour. Online venues don’t usually stay open for long afterward, so the opportunity to chat informally is limited unless time is explicitly given for it. If they do extend the session, use the time to add to the conversation, show your interest, and ask more questions.
If you try the above and still don’t feel like you were heard or acknowledged you can contact the event organizer to find out the best way to get in touch with a speaker after the event. Often, the speaker will provide contact details. Capture those details and follow up. And don’t feel weird about it. They expect it. That’s why they put their contact details on the first and last slide.
When you do follow up, whether it’s direct contact, a LinkedIn request, or an intro from the organizer or another party, be sure to mention something specific about the talk. If you got to ask a question, remind them of it. I don’t have a good rule for how long to wait. I usually give a day or two for the inbox to clear, but you can join (or start) a Twitter conversation immediately.
These strategies are helpful for getting you noticed, and they also help others on the call follow your lead and engage with you, making an otherwise cold and impersonal event feel more social.
On May 5, 2022, I participated in the California Privacy Protection Agency’s (CPPA) stakeholder meeting, making a public statement about “dark patterns” which I urged them to redefine as “harmful patterns,” and suggested changes to their definitions of “Consent” and “Intentional Action.”
As Jared Spool says, we should be looking at the UX outcome of design decisions, not just the intent, as many designers adopt strategies or work with underlying technologies whose outcomes can be harmful to the technology user and other stakeholders. These UI patterns may not have the intent to do harm. Often the designers’ intent is to provide convenience or a useful service.
Take accessibility overlays that intend to provide a better experience for people with visual or cognitive disabilities but have the effect of overriding necessary controls. Even patterns that affect user behavior, like staying on a page longer, clicking on a link, or accepting default cookie settings, may be intended as conveniences, but underlying many of these tools are processes, unknown to both designer and user, that share data and information about the transaction in ways that can be harmful.
CPRA is defining what it means to consent to data collection and what an intentional user action is. It addresses “dark patterns” as an intentional deception, when often the digital harm is not intentional, yet is deep-rooted. We are hoping to make these harms clearer and provide guidelines for addressing them through our ISL Safe Software Specification.
Read more about the CPPA stakeholder meeting and my statement on behalf of the Internet Safety Labs (formerly the Me2B Alliance):
The Me2B Alliance is a standards development organization composed of software engineers, policy analysts, UX experts, and business and philanthropic leaders who are committed to giving individuals more say in how technology treats people. We are setting up a rigorous independent testing and certification program for websites, apps, and connected devices. The Alliance includes working groups for Me-s (the consumer) and B-s (the business), as well as Policy and Legal and Certification working groups. Together, we are setting the standard for Respectful Technology.
My Role
My role at the Me2B Alliance is twofold. First, I am leading the Research and Validation practice to provide user experience and other research services to the various working groups, exploring questions around the consumer experience of their relationship with digital technology.
Secondly, I am developing the product integrity testing framework for digital technologies, in particular mobile apps and websites. This framework, coupled with data integrity and security testing, makes up the requirements for Me2BA certification.
User Research Methods
Ethnographic Research
I am engaging consumers in one-on-one conversations about their relationship with technologies they use in their day-to-day lives. Research questions range from how they use these technologies to their understanding of privacy policies, terms-of-use agreements, and other agreements they make implicitly by using a technology. For example, do users change how they interact with a website when they are familiar with its legal terms? And would a score make a difference?
Preference Testing
I performed a series of tests of the certification mark to be used as a symbol of trust in connected digital technologies. This included interviews, focus groups, unmoderated 5-Second preference tests and surveys.
Product Integrity Testing
I developed a UX Integrity framework for the Me2B Safe and Respectful Technology Framework (now published as the Me2B Safe Specification). This framework was based on an application of IA heuristics to ensure that notices of data collection, use, and sharing are Clear, Findable, Accessible, Credible, and Communicative, i.e., understandable by a wide audience of human readers, machine readers, and assistive devices.
Tools
Interviews and Focus Groups: Zoom, UserInterviews.com, SurveyMonkey
Preference Tests and 5-Second Tests: UserInterviews.com
Collaboration: Microsoft Teams, Zoom, Microsoft 365, Trello, Monday
We are planning to conduct three focus groups per month with consumers and digital product designers/managers. The research will continue to evolve our understanding of how consumers experience their relationships with, and the risks of, digital technologies.
The seminar is designed to make General Education more visible in our classrooms and courses. We will build an engaging environment for learning through exploration, implementation, and assessment of a variety of proven teaching practices, using Oral Communication, Quantitative Literacy, Reading, and Writing as our focus this year.
Living Lab General Education Seminar
In this course we learned how to apply High Impact Educational Principles and strategies for Place-Based Learning to create rewarding learning experiences for our students and colleagues.
The culmination of the course was to create a new or redesigned course assignment or project targeting one of the General Education Learning Goals and to implement it during the Spring 2022 semester. A complete description of my activity is located in the L4: Living Lab Learning Library.
The Exercise
In a Data for Good lecture at Columbia’s Data Science Institute, danah boyd of Data & Society told the audience that her proudest achievements often come from convincing a client not to create something that can potentially do harm.
When does it make sense to NOT make a digital version of something that would be better designed IRL? Are there activities that are more suited to online than IRL? Or are there cases where a combination of both are appropriate?
In this exercise, I shared a few articles about online activities that have had an impact on real life. We discussed both positive and negative reviews of online activities, ranging from Pokemon Go (The Guardian), which is often discussed in terms of getting gamers to be more social and active, to Instagram (Wall Street Journal), which has been shown to have a negative effect on the self-esteem of teenaged girls. A third example examined how social media use during the pandemic is exacerbating the political polarization of America (Harvard Berkman Klein Center) by removing the public commons from physical space to largely anonymous forums.
After discussing these articles, students formed breakout groups to find a news article about an online activity and discussed the pros and cons of doing that activity online, as well as how the activity could be replaced by or combined with an “In Real Life” (IRL) activity to improve the experience. Finally, they posted a reflection on the exercise to the class Slack group.
High Impact Learning
This activity focuses on three learning outcomes: Reading, Information Literacy, and Ethical Thinking. Students are asked to read an assigned text describing digital vs. IRL spaces and then select an example from the reading of a digital experience that might be better In Real Life or paired with an IRL experience. After discussing in groups, they share back to the class what they discussed and finally post a reflection on the course discussion board about their understanding of the pros and cons of digital vs. IRL for the chosen scenario.
To address Information Literacy, students must find one additional example of digital applications where the IRL experience takes precedence over digital. What might someone gain from a physical experience that they can’t get from digital? When might a digital application enhance the IRL experience?
And to expand their understanding of who is impacted by their design decisions, they then work in groups to make a stakeholder map (Giordano et al, 2018) showing who is affected by the designed experience of their chosen example. Who is participating in the experience? Who else might be affected by the experience? Or harmed? Who might be left out?
In addition to the reading, information literacy, and ethical thinking student learning outcomes, students benefit from two High Impact Education Practices: Place-Based Learning and Collaborative Assignments.
Place-Based Learning: Students consider the physical and embodied experiences of IRL versus digital experiences
Collaborative Assignments: Students participate in discussion of the pros and cons of selected digital experiences
Outcomes and Future Development
This activity was part of the Ethics and Accessibility lecture in Week 7 of the Spring 2022 semester. It took a little over a half hour to complete. There was no out-of-class time, except for students who wished to post their reflection after class. If we had more time (and had not been meeting online that week) we might have been able to go outside and play Pokemon Go or survey people about their online and offline political activity on campus grounds. We may still try to create an online/IRL activity during a later session and follow up with a stakeholder map, which we did not have time to do.
The activity was low stakes and ungraded. The only preparation was to find three articles to discuss as examples of online activities that either replace or compromise IRL experiences. Although the assignment itself is ungraded, I plan to count it toward the participation grade. I do not believe my course is part of the college-wide general education assessment initiative. It is an elective.
Students enjoyed discussing online versus “In Real Life” very much. They are very aware of online activities that are creating unrealistic expectations for their real-life relationships and are concerned about exacerbating these experiences through their design careers. I would like to refine the activity and possibly replace a duller accessibility study that they do for credit and that could be done in class in groups or as a demonstration. Not being able to go outside or actually be IRL was an issue with this activity, though some students mentioned that it made it easier for everyone in their group to search for articles since they were all sitting at a computer anyway.
The Slack channel where students posted their reflections is a private discussion space for students of the HE93 section of COMD3562. I will post an image with student names anonymized to show an example of the written output from this assignment. Students who wish to post their reflections publicly will be able to reply to the post on Open Lab.
This article was originally posted on March 28, 2022 on CUNY Open Lab.
You’ve been there before. You thought you could trust someone with a secret. You thought it would be safe, but found out later that they blabbed to everyone. Or, maybe they didn’t share it, but the way they used it felt manipulative. You gave more than you got and it didn’t feel fair. But now that it’s out there, do you even have control anymore?
Ok. Now imagine that person was your supermarket.
Or your doctor.
Or your boss.
Do you have a personal relationship with technology?
According to research at the Me2B Alliance, people do feel they have a relationship with technology. It’s emotional. It’s embodied. And it’s very personal.
How personal is it? Think about what it would be like if you placed an order at a cafe and they already knew your name, your email, your gender, your physical location, what you read, who you are dating, and that, maybe, you’ve been thinking of breaking up.
We don’t approve of gossipy behavior in our human relationships. So why do we accept it with technology? Sure, we get back some time and convenience, but in many ways it can feel locked in and unequal.
The Me2B Relationship Model
At the Me2B Alliance, we are studying digital relationships to answer questions like “Do people have a relationship with technology?” (They feel that they do). “What does that relationship feel like?” (It’s complicated). And “Do people understand the commitments that they are making when they explore, enter into and dissolve these relationships?” (They really don’t).
It may seem silly or awkward to think about our dealings with technology as a relationship, but there are parallels with messy human relationships. The Me2BA commitment arc with a digital technology resembles German psychologist George Levinger’s ABCDE relationship model 1, shown by the orange icons in the image below. As with human relationships, we move through states of discovery, commitment, and breakup with digital applications, too.
Our assumptions about our technology relationships are similar to the ones we have about our human ones. We assume when we first meet someone there is a clean slate, but this isn’t always true. There may be gossip about you ahead of your meeting. The other person may have looked you up on LinkedIn. With any technology, information about you may be known already, and sharing that data starts well before you sign up for an account.
The Invisible Parallel Dataverse
Today’s news frequently covers stories of personal and societal harm caused by digital media manipulation, dark patterns, and personal data mapping. Last year, Facebook whistleblower Frances Haugen exposed how the platform promotes content that they know from their own research causes depression and self-harm in teenage girls. They know this because they know what teenage girls click, post, and share.
Technology enables data sharing at every point of the relationship arc, including after you stop using it. Worryingly, even our more trusted digital relationships may not be safe. The Me2B Alliance uncovered privacy violations in K-12 software, and described how abandoned website domains put children and families at risk when their schools forget to renew them.
Most of the technologies that you (and your children) use have relationships with third party data brokers and others with whom they share your data. Each privacy policy, cookie consent and terms of use document on every website or mobile app you use defines a legal relationship, whether you choose to opt in or are locked in by some other process. That means you have a legal relationship with each of these entities from the moment you accessed the app or website, and in most cases, it’s one that you initiated and agreed to.
All the little bits of our digital experiences are floating out there and will stay out there unless we have the agency to set how that data can be used or shared and when it should be deleted. The Me2B Alliance has developed Rules of Engagement for respectful technology relationships and a Digital Harms Dictionary outlining types of violations, such as:
Collecting information without the user’s awareness or consent;
Contracts of adhesion, where users are forced to agree with terms of use (often implicitly) when they engage with the content;
Loss or misuse of personally identifiable information; and
Unclear or non-transparent information describing the technology’s policies or even what Me2B Deal they are getting.
Respectful technology relationships begin with minimizing the amount of data that is collected in the first place. Data minimization reduces the harmful effects of sensitive data getting into the wrong hands.
Next, we should give people agency and control. Individual control over one’s data is a key part of local and international privacy laws like GDPR in Europe, and similar laws in California, Colorado, and Virginia, which give consumers the right to consent to data collection, to know what data of theirs is collected, and to request to view the data that was collected, correct it, or have it permanently deleted.
Three Laws of Safe and Respectful Design
In his short-story collection I, Robot, Isaac Asimov introduced the famous “Three Laws of Robotics,” an ethical framework to avoid harmful consequences of machine activity. Today, IAs, programmers, and other digital creators make what are essentially robots that help users do work and share information. Much of this activity is out of sight and mind, which is in fact how we, the digital technology users, like it.
But what of the risks? It is important as designers of these machines to consider the consequences of the work we put into the world. I have proposed the following corollary to Asimov’s robotics laws:
First Law: A Digital Creator may not injure a human being or, through inaction, allow a human being to come to harm.
Second Law: A Digital Creator must obey the orders given by other designers, clients, product managers, etc. except where such orders would conflict with the First Law.
Third Law: A Digital Creator must protect its own existence as long as such protection does not conflict with the First or Second Law.
Mike Monteiro, in his well-known 2014 talk at An Event Apart on how designers are destroying the world, speaks directly to the second and third laws. While we take orders from the stakeholders of our work—the client, the marketers, and the shareholders we design for—we have an even greater responsibility to understand and mitigate design decisions that have negative effects.
A Specification for Safe and Respectful Technology
The Me2B Alliance is working on a specification for safe and respectfully designed digital technologies—technologies that Do No Harm. These product integrity tests are conducted by a UX expert and applied to each commitment stage that a person enters, from first open, location awareness, cookie consent, promotional and loyalty commitments, and account creation through to the termination of the relationship.
Abby Covert’s IA Principles—particularly Findable, Accessible, Clear, Communicative and Controllable—are remarkably appropriate tests for ensuring that the people who use digital technologies have agency and control over the data they entrust to these products:
Findable: Are the legal documents that govern the technology relationship easy to find? What about support services for when I believe my data is incorrect, or being used inappropriately? Can I find a way to delete my account or delete my data?
Accessible: Are these resources easy to access by both human and machine readers and assistive devices? Are they hidden behind some “data paywall” such as a process that requires a change of commitment state, i.e. a data toll, to access?
Clear: Can the average user read and understand the information that explains what data is required for what purpose? Is this information visible or accessible when it is relevant?
Communicative: Does the technology inform the user when the commitment status changes? For example, does it communicate when it needs to access my location or other personal information like age, gender, medical conditions? Does it explain why it needs my data and how to revoke data access when it is no longer necessary?
Controllable: How much control do I have as a user? Can I freely enter into a Me2B Commitment or am I forced to give up some data just to find out what the Me2B Deal is in the first place?
Abby’s other IA principles flow from the above considerations. A Useful product is one that does what it claims to do and communicates the deal you get clearly and accessibly. A Credible product is one that treats the user with respect and communicates its value. With user Control over data sharing and a clear understanding of the service being offered, the true Value of the service is apparent.
Over time the user will come to expect notice of potential changes to commitment states and will have agency over making that choice. These “Helpful Patterns”—clear and discoverable notice of state changes and opt-in commitments—build trust and loyalty, leading to a Delightful, or at least a reassuring, experience for your users.
What I’ve learned from working in the standards world is that Information Architecture Principles provide a solid framework for understanding digital relationships as well as structuring meaning. Because we aren’t just designing information spaces. We’re designing healthy relationships.
1 Levinger, G. (1983). “Development and change.” In H. H. Kelley et al. (Eds.), Close Relationships (pp. 315–359). New York: W. H. Freeman and Company. https://www.worldcat.org/title/close-relationships/oclc/470636389
THE OCCASIONAL MENTOR is a semi-regular column based on questions I’ve answered on Quora, heard on Slack groups, and other career advice I’ve given over the prior month. Feel free to challenge me in the comments, if you have a different experience.
Kill Your Darlings
I am working on a project with a friend who is acting as a client for a capstone project with an agile development class. She complained to me that the students were unable to create a simple one-sheet deliverable featuring a proposed design. The problem: WordPress hasn’t been set up yet. It didn’t occur to them that they could mock it up in a drawing program or simply sketch it by hand.
When I do in-class studios, I will often make the design students work entirely on paper and whiteboard, no computers allowed, to ideate and create a paper prototype. It can be done in two hours end to end. Is the final deliverable App Store ready? Of course not. But it is enough to move quite a bit toward a testable idea.
Students today, and especially developers, don’t understand the power of a piece of paper that you can throw away. When you start coding (or drafting in WordPress) too soon, you get too married to the code, making it hard later on to incorporate new learnings from your user research. It’s better practice to stay as lo-fi as possible for as long as possible. That’s at least one day of a five-day sprint. Sometimes two (testing the paper artifacts with users). Then “kill your darlings” before they become too dear.
Note: The phrase “Kill your darlings” (or “murder your babies”) is often attributed to William Faulkner and is a feature of many descriptions of the Beat poets: William Burroughs, Allen Ginsberg, Jack Kerouac et al. In fact, the concept “murder your babies” can be traced to Sir Arthur Quiller-Couch, a British writer and literary critic, in a lecture series delivered at Cambridge and published in 1916. (Quiller-Couch, Sir Arthur (2000) [1916]. “XII. On Style”. On the Art of Writing: Lectures Delivered in the University of Cambridge, 1913–1914 (Online ed.). Bartleby.com.)
In the research I’ve been doing on respectful technology relationships at the Me2B Alliance, it’s a combination of “I’ve got nothing to hide” and “I’ve got no other option.” People are deeply entangled in their technology relationships. Even when presented with overwhelmingly bad scores on Terms of Service and Privacy Policies, they will continue to use products they depend on or that give them access to their family, their community, and, in the case of Amazon, an abundance of choice, entertainment, and low prices. Even when they abandon a digital product or service, they are unlikely to delete their accounts. And the adtech SDKs they’ve agreed to let track them keep on tracking.