How do we build healthy digital communities?

Design to prevent harm from the foundations. Ask yourself: “How can this be misused to harm others?”
There is no silver bullet. Create a suite of tools.

Overview of Health Projects

Periscope Chat Jury · 2016  
Designer
A randomly-selected jury of viewers votes on comments that have been reported by other viewers or flagged by our system. I collaborated on the design with our design lead Tyler Hansen. I presented and demoed the project with the Android lead Lien Mamitsuka at Twitter's All Hands.
Learn more below
Periscope Chat Moderators · 2018
Design Manager
Broadcasters can choose their own moderators. Moderators can quickly mute anyone in the chat. I supported the work of designers Asli Kimya and Sherly Liu.
Twitter Conversational Incentives · 2019
Project Lead & Designer
I led a cross-functional team tasked with strategy & conceptual design for rethinking incentives for healthy conversation on Twitter. I facilitated remote design sprints that culminated in presentations to the Head of Product. We staffed a new team to pursue the ideas generated during our explorations.
Twitter Hack Week Winner: Reporting Blockchain · 2019
Designer
One of my last hurrahs at Twitter entailed winning Hack Week with Arnold Jun. We proposed a virtual currency that rewarded accurate reporting of tweets violating Twitter's Terms of Service and created the basis for a public record of violations.
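The core mechanic of that proposal can be sketched in a few lines: reporters earn currency when their reports are upheld and lose a little when they are not, which also yields an accuracy score per reporter. Everything here, including names, reward values, and the settlement function, is an illustrative assumption, not the actual Hack Week implementation.

```python
from dataclasses import dataclass, field

REWARD_UPHELD = 10    # assumed payout for an accurate report
PENALTY_REJECTED = 2  # assumed cost of an inaccurate report

@dataclass
class Reporter:
    user_id: str
    balance: int = 0                              # virtual-currency balance
    history: list = field(default_factory=list)   # True = report upheld

def settle_report(reporter: Reporter, upheld: bool) -> None:
    """Credit or debit a reporter once a ruling is made on their report."""
    if upheld:
        reporter.balance += REWARD_UPHELD
    else:
        reporter.balance = max(0, reporter.balance - PENALTY_REJECTED)
    reporter.history.append(upheld)

def accuracy(reporter: Reporter) -> float:
    """Fraction of this reporter's reports that were upheld."""
    if not reporter.history:
        return 0.0
    return sum(reporter.history) / len(reporter.history)
```

In this sketch, an accuracy score derived from the settlement history could serve as the public record of how reliable a given reporter has been.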

How do we reduce the number of abusive and spam comments on Periscope?

Chat Jury

01. Quick Overview

Voting
A group of viewers is randomly selected to vote on a comment flagged by an algorithm or reported by a viewer. The experience is optimized to be fast, intuitive, and clear, so jurors can pass good judgment even under time pressure.
Consequences
If your comment is voted spam or abusive, your commenting is temporarily disabled. Offenses stack and can result in commenting limits across all broadcasts. You can still send hearts to inspire you to do better next time.
Reporting
Tap on a comment to report it. Categorizing a comment as spam or abusive will trigger the jury system.
Chat carousel
We added a chat carousel so you can easily scroll back through displayed comments if you accidentally missed one.
Voting result transparency
If you were on the jury, we'll let you know the verdict, and the outcome for the commenter (if any).
Moderation settings
Whenever possible, we wanted to give people control over whether to use our moderation system. Default is ON.
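The mechanics above (random jury selection, a verdict tallied from votes, and stacking consequences) can be sketched roughly as follows. Jury size, thresholds, and the timeout schedule are all illustrative assumptions, not the shipped Periscope system.

```python
import random

JURY_SIZE = 5          # assumed number of randomly selected jurors
ABUSE_THRESHOLD = 0.5  # assumed fraction of bad votes needed to convict

def select_jury(viewers, rng=random):
    """Randomly pick jurors from the current viewers."""
    return rng.sample(viewers, min(JURY_SIZE, len(viewers)))

def tally(votes):
    """votes: list of 'abusive', 'spam', or 'ok'. Returns the verdict."""
    if not votes:
        return "no_verdict"  # jury timed out without voting
    bad = sum(v in ("abusive", "spam") for v in votes)
    return "guilty" if bad / len(votes) > ABUSE_THRESHOLD else "not_guilty"

def timeout_seconds(offense_count):
    """Offenses stack: repeat offenders get longer commenting timeouts."""
    schedule = {1: 60, 2: 300, 3: 3600}  # assumed escalation schedule
    return schedule.get(offense_count, 86400)
```

A guilty verdict would increment the commenter's offense count and apply the corresponding timeout, which is how limits could eventually extend across all broadcasts.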

02. Goal · Context · Constraints

Goal
Reduce the number of abusive and spam comments seen by broadcasters and viewers.
Context
Experiencing abuse on a social platform, whether as a target or as a bystander, is systemically harmful. It harms individuals, communities, and the platform as a whole. People don't want to express themselves on, or contribute to, a platform that has unhealthy dynamics and feels unsafe.
Constraints
Live-streaming is live: any moderation solution has to be as fast as possible, and every second counts. People love Periscope because our very low chat latency allows natural, flowing conversations. This was our competitive advantage.

03. Principles

Creating guiding principles is always a good idea for any project. When a project has sensitive ethical implications, it is a necessity.

04. Explorations & Research

Voting with icons

Our hypothesis for this design was that icons would be faster to vote with (because people would need to read less) and that the bigger buttons would make voting less error-prone.

Research

While people found the buttons attractive, it took them more time to understand what each one represented, relying on the subtle labels for clarification. The timer at the bottom was also easy to miss.

Safe Chat

Initially, we considered this a positive, albeit very simple, step toward moderation tools. After hearing feedback, we pivoted to the jury system, which better balanced the burden of safety management with control and transparency.

Research

People were uncomfortable looking at a list of offensive words (obscured here for this very reason). Adding a word you dislike to the list was even worse.

05. Final

Voting & Reporting Architecture

06. Reflections

Chat jury is an innovative approach to creating healthy digital communities: it married human intelligence and ethics with technology that provided immediate, on-demand content moderation at scale. Its creation reflected our best values and highest aspirations for empowering communities to protect themselves and determine their own values. It eased the enormous burden of managing bad actors on broadcasters, so they could focus on what they do best: creating the best broadcasts for their audience.

As time went on, we learned that a randomly selected jury was not always aligned with the values of the broadcaster. We eventually addressed this gap by introducing broadcaster-appointed chat moderators who could mute commenters. We also upgraded the basic phrase-matching algorithm by incorporating a machine learning model trained on data that reflected our terms of service and community guidelines.