CHI 2019 Annotated Bibliography (Part 1)

After the 2019 CHI conference (technically the ACM CHI Conference on Human Factors in Computing Systems) and blogging about our own paper on design approaches to privacy, I wanted to highlight other work that I found interesting or thought-provoking in a sort of annotated bibliography. Listed in no particular order, though most relate to one or more themes that I’m interested in (privacy, design research, values in design practice, critical approaches, and speculative design).

(I’m still working through the stack of CHI papers that I downloaded to read, so hopefully this is part 1 of two or three posts).

  • James Pierce. 2019. Smart Home Security Cameras and Shifting Lines of Creepiness: A Design-Led Inquiry. Paper 45, 14 pages. https://doi.org/10.1145/3290605.3300275 — Pierce uses a design-led inquiry to illustrate and investigate three data practices of IoT products and services (digital leakage, hole-and-corner applications, and foot-in-the-door devices), providing some conceptual scaffolding for thinking about how privacy emerges differently in relation to varying technical (and social) configurations. Importantly, I like that Pierce is pushing design researchers to go beyond conceptualizing privacy as “creepiness”, through his exploration of three tropes of data practices.
  • Renee Noortman, Britta F. Schulte, Paul Marshall, Saskia Bakker, and Anna L. Cox. 2019. HawkEye – Deploying a Design Fiction Probe. Paper 422, 14 pages. https://doi.org/10.1145/3290605.3300652 — Building on Schulte’s concept of a “design probe,” Noortman et al. had participants interact with a (beautifully designed!) control panel in the home over three weeks, acting in the role of a caregiver in a design fiction about dementia care. The paper furthers the use of design fiction as a participatory and embodied experience, and as a data collection tool for research. The authors provide some useful reflections on the ways participants imagined and helped build out the fictional world in which they were participating.
  • Yaxing Yao, Justin Reed Basdeo, Smirity Kaushik, and Yang Wang. 2019. Defending My Castle: A Co-Design Study of Privacy Mechanisms for Smart Homes. Paper 198, 12 pages. https://doi.org/10.1145/3290605.3300428 — Yao et al. use co-design techniques to explore privacy concerns and potential privacy mechanisms with a range of participants (including diversity in age). Some interesting ideas arise from participants, such as creating an IoT “incognito mode,” as well as concerns about the accessibility of these systems. Tensions sometimes arise, with participants wanting to trust IoT agents like Alexa as a ‘true friend’ who won’t spy on them, yet harboring some distrust of the companies creating these systems. I like that the authors point to a range of modalities for where we might place responsibility for IoT privacy – in the hardware, apps, platform policy, or operating modes. It ties nicely into questions others have asked about how responsibility for privacy is distributed, or what happens when we “hand off” responsibility for protecting values from one part of a sociotechnical system to another.
  • Kristina Andersen and Ron Wakkary. 2019. The Magic Machine Workshops: Making Personal Design Knowledge. Paper 112, 13 pages. https://doi.org/10.1145/3290605.3300342 — Andersen and Wakkary outline a set of workshop techniques to help participants generate personal materials. I appreciate the commitments made in the paper, such as framing workshops as something that should benefit participants themselves, as well as researchers, in part by centering the workshop on the experience of individual participants. They propose a set of workshop elements; it’s nice to see these explicated here, as they help convey a lot of tacit knowledge about running workshops (the details of which are often abbreviated in most papers’ methods sections). I particularly like the “prompt” element to help provide a quick initial goal for participants to engage in while situating the workshop. While the example workshops used in the paper focus on making things out of materials, I’m curious if some of the outlined workshop elements might be useful in other types of workshop-like activities.
  • Laura Devendorf, Kristina Andersen, Daniela K. Rosner, Ron Wakkary, and James Pierce. 2019. From HCI to HCI-Amusement: Strategies for Engaging what New Technology Makes Old. Paper 35, 12 pages. https://doi.org/10.1145/3290605.3300265 – Devendorf et al. start by (somewhat provocatively) asking what it might be like to explore a “non-contribution” in HCI. The paper walks through a set of projects and works its way to a set of reflections about the norms of HCI research focusing on the “technological new,” asking what it might mean instead to take the present or the banal more seriously. The paper also starts to ask what types of epistemologies are seen as legitimate in HCI. The paper calls for “para-research” within HCI as a way to focus attention on what is left out or unseen through dominant HCI practices.
  • Colin M. Gray and Shruthi Sai Chivukula. 2019. Ethical Mediation in UX Practice. Paper 178, 11 pages. https://doi.org/10.1145/3290605.3300408 – Through a set of case study observations and interviews, Gray and Chivukula study how UX designers engage with ethics in practice. The paper provides a lot of good detail about the ways UX designers bring ethics to the forefront and some of the challenges they face. The authors contribute a set of relationships or mediators, connecting individual designers’ practices to organizational practices to applied ethics.
  • Sarah E. Fox, Kiley Sobel, and Daniela K. Rosner. 2019. Managerial Visions: Stories of Upgrading and Maintaining the Public Restroom with IoT. Paper 493, 15 pages. https://doi.org/10.1145/3290605.3300723 – Through interviews, participant observations, and analysis of media materials, Fox et al. investigate managerial labor in regulating access to public bathroom resources. They craft a story of regulation (in a broad sense), about how the bathroom’s management is entangled among local politics and on-the-ground moral beliefs, corporate values, imagined future efficiencies through technology, and strategic uses of interior and technological design. This entanglement allows for particular types of control, granting some people access to resources while making access harder for others.
  • William Gaver, Andy Boucher, Michail Vanis, Andy Sheen, Dean Brown, Liliana Ovalle, Naho Matsuda, Amina Abbas-Nazari, and Robert Phillips. 2019. My Naturewatch Camera: Disseminating Practice Research with a Cheap and Easy DIY Design. Paper 302, 13 pages. https://doi.org/10.1145/3290605.3300532 – Gaver et al. detail a DIY nature camera, shown in partnership with a BBC television series and built by over 1000 people. Interestingly, while similar tools could be used for citizen science efforts, the authors are clear that they are instead trying to create a type of public engagement with research that focuses on creating more intimate types of encounters, and engaging people with less technical expertise in making. The cameras help create intimate “encounters” with local wildlife (plus the paper includes some cute animal photos!).
  • Sandjar Kozubaev, Fernando Rochaix, Carl DiSalvo, and Christopher A. Le Dantec. 2019. Spaces and Traces: Implications of Smart Technology in Public Housing. Paper 439, 13 pages. https://doi.org/10.1145/3290605.3300669 — Kozubaev et al.’s work adds to a growing body of work questioning and reframing what the “home” means in relation to smart home technology. The authors conduct design workshops with residents (and some managers) in US public housing, providing insight into housing situations where (1) the “home” is not a single-family middle class grouping, and (2) the potential end users of smart home technologies may not have control or consent over the technologies used, and are already subject to various forms of state surveillance.
  • Shruthi Sai Chivukula, Chris Watkins, Lucca McKay, and Colin M. Gray. 2019. “Nothing Comes Before Profit”: Asshole Design In the Wild. Paper LBW1314, 6 pages. https://doi.org/10.1145/3290607.3312863 — This late-breaking work by Chivukula et al. investigates the /r/assholedesign subreddit to explore the concept of “asshole design,” particularly in comparison to the concept of “dark patterns.” They find that asshole design uses some dark pattern strategies, but that dark patterns tend to trick users into doing certain things, while asshole design often restricts uses of products and more often includes non-digital artifacts. I think there may be an interesting future regulatory discussion about asshole design (and dark patterns). On one hand, one might consider whether dark pattern or asshole design practices might fit under the FTC’s definition of “unfair and deceptive practices” for possible enforcement action against companies. On the other, as some legislators are introducing bills to ban the use of dark patterns, it becomes very important to think carefully about how dark patterns are defined, and what might get included and excluded in those definitions; the way that this work suggests a set of practices related to, but distinct from, dark patterns could help inform future policy discussions.

Where’s the Rest of Design? Or, Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through HCI [Paper Talk]

This post is based on a talk given at the 2019 ACM CHI Conference on Human Factors in Computing Systems (CHI 2019), in Glasgow, UK. The full research paper by Richmond Wong and Deirdre Mulligan that the talk is based on, “Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI” can be found here: [Official ACM Version] [Open Access Pre-Print Version]

In our paper “Bringing Design to the Privacy Table: Broadening Design in Privacy by Design,” we conduct a curated literature review to make two conceptual arguments:

  1. There is a broad range of design practices used in human-computer interaction (HCI) research that have been underutilized in Privacy By Design efforts.
  2. Broadening privacy by design’s notion of what “design” can do can help us more fully address privacy, particularly in situations where we don’t yet know what concepts or definitions of privacy are at stake.

But let me start with some background and motivation. I’m both a privacy researcher—studying how to develop technologies that respect privacy—and a design researcher, who designs things to learn about the world.

I was excited several years ago to hear about a growing movement called “Privacy By Design”: the idea that privacy protections should be embedded into products and organizational practices during the design of products, rather than trying to address privacy retroactively. Privacy By Design has been put forward in regulatory guidance from the US and other countries, and more recently in the EU’s General Data Protection Regulation. Yet these regulations don’t provide a lot of guidance about what Privacy By Design means in practice.

In interactions with and field observations of the interdisciplinary Privacy By Design community—including lawyers, regulators, academics, practitioners, and technical folks—I’ve found that there is a lot of recognition of the complexity of privacy: that it’s an essentially contested concept; that there are many conceptualizations of privacy; that privacy from companies is different from privacy from governments; that there are different privacy harms; and so forth.


Privacy by Design conceptualizes “design” in a relatively narrow way

But the discussion of “design” seems much less complex. I had assumed Privacy By Design would mean applying HCI’s rich breadth of design approaches toward privacy initiatives – user-centered design, participatory design, value sensitive design, speculative design, and so on.

Instead, design seemed to be used narrowly, as either a way to implement the law via compliance engineering, or to solve specific privacy problems. Design was largely framed as a deductive way to solve a problem, using approaches such as encryption techniques or building systems to comply with fair information practices. These are all important and necessary privacy initiatives, but I kept finding myself asking, “where’s the rest of design?” Not just the deductive problem-solving aspects of design, but also its inductive, exploratory, and forward-looking aspects.

There’s an opportunity for Privacy By Design to make greater use of the breadth of design approaches used in HCI

There’s a gap here between the way Privacy By Design views design and the way the HCI community views design. Since HCI researchers and practitioners are in a position to help support or implement Privacy By Design initiatives, it’s important to try to broaden the notion of design in Privacy By Design to more fully bridge this gap.

So our paper makes two arguments:

  1. Design in HCI is more than just solving problems. We as HCI privacy researchers can more broadly engage the breadth of design approaches in HCI writ large. There are opportunities to build connections among the HCI privacy research community, the HCI design research community, and the research through design community, to use design in relation to privacy in multiple ways.
  2. Privacy By Design efforts risk missing out on the full benefits that design can offer if they stick with a narrower solution and compliance orientation to design. From HCI, we can help build bridges with the interdisciplinary Privacy By Design community, and engage them in understanding a broader view of design.

So how might we characterize the breadth of ways that HCI uses design in relation to privacy? In the paper, we conduct a curated review of HCI research to explore the breadth and richness of how design practices are used in relation to privacy. We searched for HCI papers that use both the terms “privacy” and “design,” curating a corpus of 64 papers. Reading through each paper, we open coded each one by asking a set of questions including: Why is design used; who is design done by; and for whom is design done? Using affinity diagramming on the open codes, we came up with a set of categories, or dimensions, which we used to re-code the corpus. In this post I’m going to focus on the dimensions that emerged when we looked at the “why design?” question, which we call the purposes of design.


We use four purposes to discuss the breadth of reasons why design might be used in relation to privacy

We describe four purposes of design. They are:

  • Design to solve a privacy problem;
  • Design to inform or support privacy;
  • Design to explore people and situations; and
  • Design to critique, speculate, and present critical alternatives.

Note that we use these to talk about how design has been used in privacy research specifically, not about all design writ large (that would be quite a different and broader endeavor!). In practice these categories are not mutually exclusive, nor are they the only way to look at the space, but looking at them separately helps give some analytical clarity. Let’s briefly walk through each of these design purposes.

To Solve a Privacy Problem

First, design is seen as a way to solve a privacy problem – which occurred most often in the papers we looked at. And I think this is often how we think about design colloquially, as a set of practices to solve problems. This is often how design is discussed in Privacy By Design discussions as well.

When viewing design in this way, privacy is presented as a problem that has already been well-defined before the design process, and a solution is designed to address that definition of the problem. A lot of responsibility for protecting privacy is thus placed in the technical system.

For instance, if a problem of privacy is defined as the harms that result from long term data processing and aggregation, we might design a system that limits data retention. If a problem of privacy is defined as not being identified, we might design a system to be anonymous.
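To make that first example concrete, a retention-limiting design might look something like the following minimal sketch (my own illustration, not from any of the papers we reviewed; the 90-day period and record structure are assumptions):

    from datetime import datetime, timedelta, timezone

    # The retention limit bakes the problem definition into the system:
    # if long-term aggregation is the harm, old data is simply not kept.
    RETENTION_PERIOD = timedelta(days=90)

    def purge_expired_records(records):
        """Drop records older than the retention period.

        Assumes each record carries a timezone-aware 'created_at' datetime.
        """
        cutoff = datetime.now(timezone.utc) - RETENTION_PERIOD
        return [r for r in records if r["created_at"] >= cutoff]

Note where responsibility sits in a sketch like this: once the retention limit is designed into the system, protecting privacy requires no further action from the user.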

To Inform or Support Privacy

Second, design is seen as a way to inform or support actors who must make privacy-relevant choices, rather than solving a privacy problem outright. This was also common in our set of papers. Design to inform or support privacy views the problems posed by privacy as problems of information or tools: if users receive information in better ways, or have better tools, then they can make more informed choices about how to act in privacy-preserving ways.

A lot of research has been done on how to design usable privacy policies or privacy notices – but it’s still up to the user to read the notice and make a privacy-relevant decision. Other types of design work in this vein include designing privacy icons, controls, dashboards, and visualizations, as well as educational materials and activities.

In these approaches, a lot of responsibility for protecting privacy is placed in the choices that people make, informed by a design artifact. The protection of privacy doesn’t arise from the design of the system itself, but rather from how a person chooses to use the system. This orientation towards privacy fits well with US privacy regulations that leave individuals to manage and control their own data.
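As a minimal sketch of the contrast with the previous purpose (again my own illustration; the preference name is hypothetical), here privacy protection hinges on a user-made choice rather than on the system’s design:

    # Collection is gated on a user preference: the system merely exposes
    # the choice, and protection depends on what the user decides.
    def collect_telemetry(user_prefs, event, log):
        """Record an analytics event only if the user has opted in."""
        if user_prefs.get("telemetry_opt_in", False):  # default: no collection
            log.append(event)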

To Explore People and Situations (Related to Privacy)

Third is using design to explore people and situations. Design is used as a mode of inquiry, to better understand what privacy or the experience of privacy means to certain people, in certain situations. Design here is not necessarily about solving an immediate problem.

Techniques like design probes or collaborative design workshops are some approaches here. For example, a project I presented at CSCW 2018 involved presenting booklets with conceptual designs of potentially invasive products to technology practitioners in training. We weren’t looking to gather feedback in order to develop these conceptual ideas into usable products. Instead, the goal was to use these conceptual design ideas as provocations to better understand the participants’ worldviews. How are they conceptualizing privacy when they see these designs? How do their reactions help us understand where they place responsibility for addressing privacy?

Here, privacy is understood as a situated experience, which emerges from the practices of particular groups in specific contexts or situations. The goal is less about solving a privacy problem, and more about understanding how privacy gets enacted and experienced.

To Critique, Speculate, or Present Critical Alternatives About Privacy

Fourth is design to critique, speculate, or present critical alternatives. (By critical I don’t mean negative or mean-spirited; I mean critical as in reflexive reflection or careful analysis.) Design here is not about exploring the world as it is, but focuses on how the world could be. Often this consists of creating conceptual designs that provoke, opening a space to surface and discuss social values. These help us discuss worlds we might strive to achieve or ones we want to avoid. Privacy in this case is situated in different possible sociotechnical configurations of the world, thinking about privacy’s social, legal, and technical aspects together.

For example, in a project I presented at DIS 2017, we created advertisements for fictional sensing products, like a bodily implant for workplace employees. This helped us raise questions beyond basic ones about data collection and use. The designs helped us ask how privacy is implicated in the workplace, or through employment law, and whether consent can really occur given these power dynamics. They also helped us ask normative questions, such as: Who gets to have privacy and who doesn’t? Who or what should be responsible for protecting privacy? Might we look to technical design, to regulations, to market mechanisms, or to individual choice to protect privacy?

Design Is a Political, Values-Laden Choice

So in summary, these are the four purposes of design that we identified in this paper: using design to solve, to inform and support, to explore, and to critique and speculate. Again, in practice, they’re not discrete categories. Many design approaches, like user-centered design or participatory design, use design for multiple purposes.


Using design in different ways suggests different starting points for how we might think about privacy

But this variety of purposes for how design relates to privacy is also a reminder that design isn’t a neutral process, but is itself political and values-laden. (Not political in terms of liberal and conservative, but political in the sense that there are power and social implications in the choices we make about how to use design). Each design purpose suggests a different starting place for how we orient ourselves towards conceptualizing and operationalizing privacy. We might think about privacy as:

  • a technical property;
  • a user-made choice;
  • a situated experience; or
  • sociotechnically situated.

Privacy can be many and all of these things at once, but the design methods we choose, and the reasons why we choose to use design, help to suggest or foreclose different orientations toward privacy. These choices also suggest that responsibility for privacy might be placed in different locations — such as in a technical system, in a person’s choices, in a platform’s policies, in the law, in the market, and so forth.


Research using design to solve and design to inform and support appeared more often in the papers that we looked at

Now, I’ve been discussing these four design purposes equally, but they weren’t equal in our corpus. Since each paper could be coded into multiple categories, a little over half of the papers we looked at used design to solve a privacy problem, and a little over half used design to inform or support. Less than a quarter used design to explore; even fewer used design to critique and speculate. We don’t claim that the exact percentages are representative of all the privacy literature, but there’s a qualitative difference here, where most of the work we reviewed uses design to solve privacy problems or to inform and support privacy.

We are arguing for a big tent approach in privacy by design: using design in all of these ways helps us address a broader set of conceptions of privacy.

This suggests that there’s an opportunity for us to build bridges between the HCI privacy research community, which has rich domain expertise, and the HCI design research & research through design communities, which have rich design methods expertise, particularly in using design to explore and to critique and speculate.

So that’s Argument 1, that we have the opportunity to build new bridges among HCI communities to more fully make use of each others’ expertise, and a broader range of design methods and purposes.

Argument 2 is that Privacy By Design has largely (with some exceptions) thought about design as a problem-solving process. Privacy By Design research and practice could expand that thinking to make use of the fuller breadth of design reflected in HCI.

Implications for Design Collaboration

So what might some of these collaborations within and across fields look like, if we want to make use of more of design’s breadth? For example, if we as privacy researchers develop a set of usable privacy tools to inform and support most people’s privacy decision making, that might be complemented with design to explore, so that we can better understand the often marginalized populations for whom those tools don’t work. For instance, Diana Freed et al.’s work shows that social media privacy and security tools can be used against victims of intimate partner violence, violating their privacy and safety. Or consider an emerging set of problems we face in thinking about privacy in physically instrumented spaces: how does consent work, and what conceptions of privacy and privacy risk are at play? We can complement design to solve and design to support efforts with design to critique and speculate, crafting future scenarios that try to understand what concepts of privacy might be at play, and how privacy can surface differently when technical, social, or legal aspects of the world change.

From a design research perspective, I think there’s growing interest in the design research community in creating provocative artifacts to try to surface discussions about privacy, particularly in relation to new and emerging technologies. Critically reflecting on my own design research work, I think it can be tempting to speak only to other designers, resort to conceptions of privacy that say “surveillance is creepy,” and not dig deeper into other approaches to privacy. But by collaborating with privacy researchers, we can bring more domain expertise and theoretical depth to these design explorations and speculations, and engage a broader set of privacy stakeholders.

Industry privacy practitioners working on privacy by design initiatives might consider incorporating more UX researchers and designers from their organizations, as privacy allies and as design experts. Approaches that use design to critique and speculate may also align well with privacy practitioners’ stated desire to find contextual and anticipatory privacy tools to help “think around corners”, as reported by Ken Bamberger and Deirdre Mulligan.

Privacy By Design regulators could incorporate more designers (in addition to engineers and computer scientists) in regulatory discussions about privacy by design, so that this richness of design practice isn’t lost when the words “by design” are written in the law.

Moreover, there’s an opportunity here for us as an HCI community to bring HCI’s rich notions of what design can mean to Privacy By Design, so that beyond being a problem-solving process, design is also seen as a process that makes use of the multi-faceted, inductive, and exploratory approaches that this community engages in.


 

Paper Citation: Richmond Y. Wong and Deirdre K. Mulligan. 2019. Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, Paper 262, 17 pages. DOI: https://doi.org/10.1145/3290605.3300492

Utilizing Design’s Richness in “Privacy by Design”

This post summarizes a research paper, Bringing Design to the Privacy Table, written by Richmond Wong and Deirdre Mulligan. The paper will be presented at the 2019 ACM Conference on Human Factors in Computing Systems (CHI 2019) on Wednesday, May 8 at the 4pm “Help Me, I’m Only Human” paper session.

How might the richness and variety in human-computer interaction (HCI) design practices and approaches be utilized in addressing privacy during the development of technologies?

U.S. policy recommendations and the E.U.’s General Data Protection Regulation have helped the concept of privacy by design (PBD)—embedding privacy protections into products during the initial design phase, rather than retroactively—gain traction. Yet while championing “privacy by design,” these regulatory discussions offer little in the way of concrete guidance about what “by design” means in technical and design practice. Engineering communities have begun developing privacy engineering techniques that use design as a way to find privacy solutions. Many privacy engineering tools focus on design solutions that translate high-level principles into implementable engineering requirements. However, design in HCI has a much richer concept of what “design” might entail: it also includes thinking about design as a way to explore the world and to critique and speculate about the world. Embracing this richness of design approaches can help privacy by design more fully approach the privacy puzzle.

To better understand the richness of ways design practices can relate to privacy, we conducted a curated review of 64 HCI research papers that discuss both privacy and design. One thing we looked at was how each paper viewed the purpose of design in relation to privacy. (Papers could be classified into multiple categories, so the percentages below add up to more than 100%.) We found four main design purposes:

  • To Solve a Privacy Problem (56% of papers) – This aligns with the common perception of design, that design is used to solve problems. This includes creating system architectures and data management systems that collect and use data in privacy-preserving ways. The problems posed by privacy are generally well-defined before the design process; a solution is then designed to address that problem.
  • To Inform or Support Privacy (52%) – Design is also used to inform or support people who must make privacy-relevant choices, rather than solving a privacy problem outright. A lot of these papers use design to increase the usability of privacy notices and controls to allow end users to more easily make choices about their privacy. These approaches generally assume that if people have the “right” types of tools and information, then they will choose to act in more privacy-preserving ways.
  • To Explore People and Situations (22%) – Design can be used as a form of inquiry to understand people and situations. Design activities, probes, or conceptual design artifacts might be shared with users and stakeholders to understand their experiences and concerns about privacy. Privacy is thus viewed here as relating to different social and cultural contexts and practices; design is used as a way to explore what privacy means in these different situations.
  • To Critique, Speculate, or Present Critical Alternatives (11%) – Design can be used to create spaces in which people can discuss values, ethics, and morals—including privacy. Rather than creating immediately deployable design solutions, design here works like good science fiction: creating conceptual designs that try to provoke people into thinking about relationships among technical, social, and legal aspects of privacy, and into asking questions such as who gets (or doesn’t get) to have privacy, or who should be responsible for providing privacy.

One thing we found interesting is how some design purposes tend to narrowly define what privacy means or define privacy before the design process, whereas others view privacy as more socially situated and use the process of design itself to help define privacy.

For those looking towards how these dimensions might be useful in privacy by design practice, we mapped our dimensions onto a range of design approaches and methodologies common in HCI, summarized below (design approach – purposes of design – how design relates to privacy):

  • Software Engineering – Solve a problem; Inform and support – Conceptions of privacy and the problem to be solved are defined in advance. Lends itself well to problems related to data privacy, or privacy issues to be addressed at a system architecture level.
  • User-Centered Design – Solve a problem; Inform and support; Explore – Could have a conception of privacy defined in advance, or it might surface from users. Lends itself well to individual-based conceptions of privacy.
  • Participatory Design; Value Centered Design – Solve a problem; Inform and support; Explore – Surfaces stakeholder conceptions of privacy; involves stakeholders in the design process.
  • Resistance, Re-Design, Re-Appropriation Practices – Solve a problem; Critique – Shows breakdown or contestation in current conceptions of privacy.
  • Speculative and Critical Design – Explore; Critique – Critiques current conceptions of privacy; explores and shows potential ways privacy might emerge in new situations.

These findings can be of use to several communities:

  • HCI privacy researchers and PBD researchers might use this work to reflect on dominant ways in which design has been used thus far (to solve privacy problems, and to inform or support privacy), and begin to explore a broader range of design purposes and approaches in privacy work.
  • HCI design researchers might use this work to see how expertise in research through design methods could be married with privacy domain expertise, suggesting potential new collaborations and engagements.
  • Industry Privacy Practitioners can begin reaching out to UX researchers and designers in their own organizations, both as design experts and as allies in privacy by design initiatives. In particular, the forward-looking aspects of speculative and critical design approaches may also align well with privacy practitioners’ desire to find contextual and anticipatory privacy tools to help “think around corners”.
  • Policymakers should include designers (in addition to engineers and computer scientists) in regulatory discussions about privacy by design (or other “governance by design” initiatives). Many regulators seem to view “design” in “privacy by design” as a way to implement decisions made in law, or as a relatively straightforward way to solve privacy problems. However, this narrow view risks hiding the politics of design; what is left unexamined in these discussions is that different design approaches also suggest different orientations and conceptualizations of privacy. HCI design practices, which have already been used in relation to privacy, suggest a broader set of ways to approach privacy by design.

Our work aims to bridge privacy by design research and practice with HCI’s rich variety of design research. By doing so, we can help encourage more holistic discussions about privacy, drawing connections among privacy’s social, legal, and technical aspects.


Download a pre-print version of the full paper here.

Paper Citation:
Richmond Y. Wong and Deirdre K. Mulligan. 2019. Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland UK. ACM, New York, NY, USA, 17 pages. https://doi.org/10.1145/3290605.3300492

This post is crossposted on Medium

Tensions of a Digitally-Connected World in Cricket Wireless’ Holiday Ad Campaign

In the spirit of taking a break over the holidays, this is more of a fun post with some very rough thoughts (though inspired by some of my prior work on paying attention to and critiquing narratives and futures portrayed by tech advertising). The basic version is that the Cricket Wireless 2018 holiday ad, Four the Holidays (made by ad company Psyop), portrays a narrative that makes a slight critique of an always-connected world and suggests that physical face-to-face interaction is a more enjoyable experience for friends than digital sharing. While perhaps an over-simplistic critique of mobile technology use, the twin messages of “buy a wireless phone plan to connect with friends” and “try to disconnect to spend time with friends” highlight important tensions and contradictions present in everyday digital life.

But let’s look at the ad in a little more detail!

Last month, while streaming Canadian curling matches (it’s more fun than you might think; case in point, I’ve blogged about the sport’s own controversy with broom technology), there was a short Cricket ad playing with a holiday jingle. And I’m generally inclined to pay attention to an ad with a good jingle. Looking it up online brought up a three-minute short film version expanding upon the 15-second commercial (embedded above), which I’ll describe and analyze below.

It starts with Cricket’s animated characters Ramon (the green blob with hair), Dusty (the orange fuzzy ball), Chip (the blue square), and Rose (the green oblong shape) on a Hollywood set, “filming” the aforementioned commercial, singing their jingle:

The four, the merrier! Cricket keeps us share-ier!

Four lines of unlimited data, for a hundred bucks a month!

After their shoot is over, Dusty wants the group to watch fireworks from the Cricket water tower (which is really the Warner Brothers Studio water tower, though maybe we should call it Chekhov’s water tower in this instance) on New Year’s Eve. Alas, the crew has other plans, and everyone flies to their holiday destinations: Ramon to Mexico, Dusty to Canada, Chip to New York, and Rose to Aspen.

The video then shows each character enjoying the holidays in their respective locations with their smartphones. Ramon uses his phone to take pictures of food shared on a family table; Rose uses hers to take selfies on a ski lift.

The first hint that there might be a message critiquing an always-connected world is when the ad shows Dusty in a snowed-in, remote Canadian cabin. Presumably this tells us that he gets a cell signal up there, but in this scene, he is not using his phone. Rather, he’s making cookies with his two (human) nieces (not sure how that works, but I’ll suspend my disbelief), highlighting a face-to-face familial interaction using a traditional holiday group activity.

The second hint that something might not be quite right is the Dutch angle establishing shot of New York City in the next scene. The non-horizontal horizon line (which also evokes the off-balance establishing shot of New York from an Avengers: Infinity War trailer) visually puts the scene off balance. But the moment quickly passes, as we see Chip on the streets of New York taking Instagram selfies.


Dutch angle of New York from Cricket Wireless’ “Four the Holidays” (left) and Marvel’s Avengers Infinity War (right)

Then comes a rapid montage of photos and smiling selfies that the group is sending and sharing with each other, in a sort of digital self-presentation utopia. But as the short film has been hinting at, this utopia is not reflective of the characters’ lived experience.

The video cuts to Dusty, skating alone on a frozen pond; he successfully completes a trick, but then realizes that he has no one to share the moment with. He then sings “The four the merrier, Cricket keeps us share-ier” in a minor key as he re-envisions clouds in the sky as the forms of the four friends. The minor key and Dusty’s singing cast skepticism on the lyrics’ claim that being share-ier is indeed merrier.

The minor key continues, as Ramon sings while envisioning a set of holiday lights as the four friends, and Rose sees a department store window display as the four friends. Chip attends a party where the Cricket commercial (from the start of the video) airs on a TV, but is still lonely. Chip then hails a cab, dramatically stating in a deep voice “Take me home.”

In the last scene, Chip sits atop the Cricket water tower (or, Chekhov’s water tower returns!) at 11:57pm on New Year’s Eve, staring alone at his phone, discontent. This is the clearest signal of the lack of fulfillment he finds in his phone, and by extension, in the digitally mediated connection with his friends.

Immediately this is juxtaposed with Ramon singing with his guitar from the other side of the water tower, still in the minor key. Chip hears him and immediately becomes happier; the music shifts to a major key as Rose and Dusty enter, the tempo picks up, and the drums and an orchestra of instruments join in. The commercial ends with the four of them watching New Year’s fireworks together. It’s worth noting the lyrics at the end:

Ramon: The four the merrier…

Chip [spoken]: Ramon?! You’re here!

Rose: There’s something in the air-ier

All: That helps us connect, all the season through. The four, the merrier

Dusty: One’s a little harrier (So hairy!)

All: The holidays are better, the holidays are better, the holidays are better with your crew.

Nothing here is explicitly about Cricket Wireless, or the value of being digitally connected. It’s also worth noting that the phone Chip was previously staring at is nowhere to be found after he sees Ramon. There is some ambiguous use of the word “connect,” which could refer to both a face-to-face interaction or a digitally mediated one, but the tone of the scene and the emotional storyline bringing the four friends physically together seem to suggest that connect refers to the value of face-to-face interaction.

So what might this all mean (beyond the fact that I’ve watched this commercial too many times and have the music stuck in my head)? Perhaps the larger and more important point is that the commercial/short film is emblematic of a series of tensions around connection and disconnection in today’s society. Being digitally connected is seen as a positive that allows for greater opportunity (and greater work output), but at the same time discontent is reflected in culture and media, ranging from articles on tech addiction, to guides on grayscaling iPhones to combat color stimulation, to disconnection camps. There’s also a moralizing force behind these tensions: to be a good employee/student/friend/family member/etc., we are told that we must be digitally connected and always-on, but at the same time, we are told that we must also be disconnected and interact face-to-face in order to be good subjects.

In many ways, the tensions expressed in this video — an advertisement for a wireless provider trying to encourage customers to sign up for its wireless plans, while presenting a story highlighting the need to digitally disconnect — parallel the tensions that Ellie Harmon and Melissa Mazmanian find in their analysis of media discourses of smartphones: that there is both a push for individuals to integrate the smartphone into everyday life, and to dis-integrate the smartphone from everyday life. What is fascinating to me here is that this video from Cricket exhibits both of those ideas at the same time. As Harmon and Mazmanian write,

The stories that circulate about the smartphone in American culture matter. They matter for how individuals experience the device, the ways that designers envision future technologies, and the ways that researchers frame their questions.

While Four the Holidays doesn’t tell the most complex or nuanced story about connectivity and smartphone use, the narrative that Cricket and Psyop created veers away from a utopian imagining of the world with tech, and instead begins to reflect some of the inherent tensions and contradictions of smartphone use and mobile connectivity that are experienced as a part of everyday life.

Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks [Talk]

This blog post is a version of a talk I gave at the 2018 ACM Computer Supported Cooperative Work and Social Computing (CSCW) Conference based on a paper written with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce, entitled Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, which was honored with a best paper award. Find out more on our project page, our summary blog post, or download the paper: [PDF link] [ACM link]

In the work described in our paper, we created a set of conceptual speculative designs to explore privacy issues around emerging biosensing technologies, technologies that sense human bodies. We then used these designs to help elicit discussions about privacy with students training to be technologists. We argue that this approach can be useful for Values in Design and Privacy by Design research and practice.

Continue reading →

Engaging Technologists to Reflect on Privacy Using Design Workbooks

This post summarizes a research paper, Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, co-authored with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce. The paper will be presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) on Monday November 5th (in the afternoon Privacy in Social Media session). Full paper available here.

Recent wearable and sensing devices, such as Google Glass, Strava, and internet-connected toys, have raised questions about the ways in which privacy and other social values might be implicated by their development, use, and adoption. At the same time, legal, policy, and technical advocates for “privacy by design” have suggested that privacy should be embedded into all aspects of the design process, rather than being addressed after a product is released, or rather than being addressed as just a legal issue. By advocating that privacy be addressed through technical design processes, the ability of technology professionals to surface, discuss, and address privacy and other social values becomes vital.

Companies and technologists already use a range of tools and practices to help address privacy, including privacy engineering practices, or making privacy policies more readable and usable. But many existing privacy mitigation tools are either deductive, or assume that privacy problems are already known and well-defined in advance. However, when creating systems we often don’t have privacy concerns well-conceptualized in advance. Our research shows that design approaches (drawing on a set of techniques called speculative design and design fiction) can help better explore, define, and perhaps even anticipate what we mean by “privacy” in a given situation. Rather than trying to look at a single, abstract, universal definition of privacy, these methods help us think about privacy as relations among people, technologies, and institutions in different types of contexts and situations.

Creating Design Workbooks

We created a set of design workbooks — collections of design proposals or conceptual designs, drawn together to allow designers to investigate, explore, reflect on, and expand a design space. We drew on speculative design practices: in brief, our goal was to create a set of slightly provocative conceptual designs to help engage people in reflections or discussions about privacy (rather than propose specific solutions to problems posed by privacy).

A set of sketches that comprise the design workbook

Inspired by science fiction, technology research, and trends from the technology industry, we created a couple dozen fictional products, interfaces, and webpages of biosensing technologies, or technologies that sense people. These included smart camera enabled neighborhood watch systems, advanced surveillance systems, implantable tracking devices, and non-contact remote sensors that detect people’s heartrates. In earlier design work, we reflected on how putting the same technologies into different types of situations, scenarios, and social contexts would vary the types of privacy concerns that emerged (such as the different privacy concerns that would emerge if advanced miniature cameras were used by the police, by political advocates, or by the general public). However, we wanted to see how non-researchers might react to and discuss the conceptual designs.

How Did Technologists-In-Training View the Designs?

Through a series of interviews, we shared our workbook of designs with masters students in an information technology program who were training to go into the tech industry. We found several ways in which they brought up privacy-related issues while interacting with the workbooks, and highlight three of those ways here.

TruWork — A product webpage for a fictional system that uses an implanted chip allowing employers to keep track of employees’ location, activities, and health, 24/7.

First, our interviewees discussed privacy by taking on multiple user subject positions in relation to the designs. For instance, one participant looked at the fictional TruWork workplace implant design by imagining herself in the positions of an employer using the system and an employee using the system, noting how the product’s claim of creating a “happier, more efficient workplace” was a value proposition aimed at the employer rather than the employee. While the system promises to tell employers whether or not their employees are lying about why they need a sick day, the participant noted that there might be many reasons why an employee might need to take a sick day, and that those reasons should be private from their employer. These reflections are valuable, as prior work has documented how considering the viewpoints of direct and indirect stakeholders is important for considering social values in design practices.

CoupleTrack — an advertising graphic for a fictional system that uses an implanted chip that people in a relationship wear in order to keep track of each other’s location and activities.

A second way privacy reflections emerged was when participants discussed the designs in relation to their professional technical practices. One participant compared the fictional CoupleTrack implant to a wearable device for couples that he was building, in order to discuss different ways in which consent to data collection can be obtained and revoked. CoupleTrack’s embedded nature makes it much more difficult to revoke consent, while a wearable device can be more easily removed. This is useful because we’re looking for ways workbooks of speculative designs can help technologists discuss privacy in ways that they can relate back to their own technical practices.

Airport Tracking System — a sketch of an interface for a fictional system that automatically detects and flags “suspicious people” by color-coding people in surveillance camera footage.

A third theme that we found was that participants discussed and compared multiple ways in which a design could be configured or implemented. Our designs tend to describe products’ functions but do not specify technical implementation details, allowing participants to imagine multiple implementations. For example, a participant looking at the fictional automatic airport tracking and flagging system discussed the privacy implications of two possible implementations: one where the system only identifies and flags people with a prior criminal history (which might create extra burdens for people who have already served their time for a crime and have been released from prison); and one where the system uses behavioral predictors to try to identify “suspicious” behavior (which might go against a notion of “innocent until proven guilty”). The designs were useful at provoking conversations about the privacy and values implications of different design decisions.

Thinking About Privacy and Social Values Implications of Technologies

This work provides a case study showing how design workbooks and speculative design can be useful for thinking about the social values implications of technology, particularly privacy. In the time since we’ve made these designs, some (sometimes eerily) similar technologies have been developed or released, such as workers at a Swedish company embedding RFID chips in their hands, or Logitech’s Circle Camera.

But our design work isn’t meant to predict the future. Instead, what we tried to do is take some technologies that are emerging or on the near horizon, and think seriously about ways in which they might get adopted, or used and misused, or interact with existing social systems — such as the workplace, or government surveillance, or school systems. How might privacy and other values be at stake in those contexts and situations? We aim for these designs to help shed light on the space of possibilities, in an effort to help technologists make more socially informed design decisions in the present.

We find it compelling that our design workbooks helped technologists-in-training discuss emerging technologies in relation to everyday, situated contexts. These workbooks don’t depict far-off speculative science fiction with flying cars and spaceships. Rather, they imagine future uses of technologies by having someone look at a product website, or an amazon.com page, or an interface, and think about the real and diverse ways in which people might experience those technology products. Using these techniques that focus on the potential adoptions and uses of emerging technologies in everyday contexts helps raise issues that might not be immediately obvious if we only think about positive social implications of technologies, and also helps surface issues that we might not see if we only think about social implications of technologies in terms of “worst case scenarios” or dystopias.

Paper Citation:

Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 111 (December 2017), 26 pages. DOI: https://doi.org/10.1145/3134746


This post is crossposted with the ACM CSCW Blog

Exploring Implications of Everyday Brain-Computer Interface Adoption through Design Fiction

This blog post is a version of a talk I gave at the 2018 ACM Designing Interactive Systems (DIS) Conference based on a paper written with Nick Merrill and John Chuang, entitled When BCIs have APIs: Design Fictions of Everyday Brain-Computer Interface Adoption. Find out more on our project page, or download the paper: [PDF link] [ACM link]

In recent years, brain computer interfaces, or BCIs, have shifted from far-off science fiction, to medical research, to the realm of consumer-grade devices that can sense brainwaves and EEG signals. Brain computer interfaces have also featured more prominently in corporate and public imaginations, such as Elon Musk’s project that has been said to create a global shared brain, or fears that BCIs will result in thought control.

Most of these narratives and imaginings about BCIs tend to be utopian or dystopian, imagining radical technological or social change. We instead aim to imagine futures that are not radically different from our own. In our project, we use design fiction to ask: how can we graft brain computer interfaces onto the everyday and mundane worlds we already live in? How can we explore how BCI uses, benefits, and labor practices may not be evenly distributed when BCIs get adopted?

Continue reading →