Where’s the Rest of Design? Or, Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through HCI [Paper Talk]

This post is based on a talk given at the 2019 ACM CHI Conference on Human Factors in Computing Systems (CHI 2019), in Glasgow, UK. The full research paper by Richmond Wong and Deirdre Mulligan that the talk is based on, “Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI” can be found here: [Official ACM Version] [Open Access Pre-Print Version]

In our paper “Bringing Design to the Privacy Table: Broadening Design in Privacy by Design,” we conduct a curated literature review to make two conceptual arguments:

  1. There is a broad range of design practices used in human-computer interaction (HCI) research that have been underutilized in Privacy By Design efforts.
  2. Broadening privacy by design’s notion of what “design” can do can help us more fully address privacy, particularly in situations where we don’t yet know what concepts or definitions of privacy are at stake.

But let me start with some background and motivation. I’m both a privacy researcher—studying how to develop technologies that respect privacy—and a design researcher, who designs things to learn about the world.

I was excited several years ago to hear about a growing movement called “Privacy By Design”: the idea that privacy protections should be embedded into products and organizational practices during design, rather than addressed retroactively. Privacy By Design has been put forward in regulatory guidance from the US and other countries, and more recently in the EU’s General Data Protection Regulation. Yet these regulations don’t provide much guidance about what Privacy By Design means in practice.

In interactions with and field observations of the interdisciplinary Privacy By Design community—including lawyers, regulators, academics, practitioners, and technical folks—I’ve found a lot of recognition of the complexity of privacy: it’s an essentially contested concept; there are many conceptualizations of privacy; privacy from companies is different from privacy from governments; there are different privacy harms; and so forth.


Privacy by Design conceptualizes “design” in a relatively narrow way

But the discussion of “design” seems much less complex. I had assumed Privacy By Design would mean applying HCI’s rich breadth of design approaches toward privacy initiatives – user-centered design, participatory design, value-sensitive design, speculative design, and so on.

Instead, design seemed to be used narrowly, as either a way to implement the law via compliance engineering, or a way to solve specific privacy problems. Design was largely framed as a deductive way to solve a problem, using approaches such as encryption techniques or building systems to comply with fair information practices. These are all important and necessary privacy initiatives, but I kept finding myself asking, “where’s the rest of design?” Not just the deductive, problem-solving aspects of design, but also its inductive, exploratory, and forward-looking aspects.

There’s an opportunity for Privacy By Design to make greater use of the breadth of design approaches used in HCI

There’s a gap here between the way the Privacy By Design community views design and the way the HCI community views design. Since HCI researchers and practitioners are in a position to help support or implement Privacy By Design initiatives, it’s important to broaden the notion of design in Privacy By Design to more fully bridge this gap.

So our paper advances two arguments:

  1. Design in HCI is more than just solving problems. We as HCI privacy researchers can more broadly engage the breadth of design approaches in HCI writ large. There are opportunities to build connections between the HCI privacy research community and the HCI design research and research-through-design communities, to use design in relation to privacy in multiple ways.
  2. Privacy By Design efforts risk missing out on the full benefits that design can offer if they stick with a narrower solution-and-compliance orientation to design. From HCI, we can help build bridges with the interdisciplinary Privacy By Design community, and engage them in understanding a broader view of design.

So how might we characterize the breadth of ways that HCI uses design in relation to privacy? In the paper, we conduct a curated review of HCI research to explore the breadth and richness of how design practices are used in relation to privacy. We searched for HCI papers that use both the terms “privacy” and “design,” curating a corpus of 64 papers. Reading through each paper, we open-coded it by asking a set of questions, including: Why is design used? Who is design done by? And for whom is design done? Using affinity diagramming on the open codes, we came up with a set of categories, or dimensions, which we used to re-code the corpus. In this post I’m going to focus on the dimensions that emerged when we looked at the “why design?” question, which we call the purposes of design.


We use four purposes to discuss the breadth of reasons why design might be used in relation to privacy

We describe four purposes of design. They are:

  • Design to solve a privacy problem;
  • Design to inform or support privacy;
  • Design to explore people and situations; and
  • Design to critique, speculate, and present critical alternatives.

Note that we use these to talk about how design has been used in privacy research specifically, not about all design writ large (that would be quite a different and broader endeavor!). In practice these categories are not mutually exclusive, and they are not the only way to look at the space, but looking at them separately gives some analytical clarity. Let’s briefly walk through each of these design purposes.

To Solve a Privacy Problem

First, design is seen as a way to solve a privacy problem – which occurred most often in the papers we looked at. And I think this is often how we think about design colloquially, as a set of practices to solve problems. This is often how design is discussed in Privacy By Design discussions as well.

When viewing design in this way, privacy is presented as a problem that has already been well-defined before the design process begins, and a solution is designed to address that definition of the problem. A lot of responsibility for protecting privacy is thus placed in the technical system.

For instance, if the problem of privacy is defined as the harms that result from long-term data processing and aggregation, we might design a system that limits data retention. If the problem of privacy is defined as being identified, we might design the system to provide anonymity.

To Inform or Support Privacy

Second, design is seen as a way to inform or support actors who must make privacy-relevant choices, rather than solving a privacy problem outright. This was also common in our set of papers. Design to inform or support privacy views the problem posed by privacy as an information or tools problem: if users receive information in better ways, or have better tools, then they can make more informed choices about how to act in privacy-preserving ways.

A lot of research has been done on how to design usable privacy policies or privacy notices – but it’s still up to the user to read the notice and make a privacy-relevant decision. Other design work in this vein includes privacy icons, controls, dashboards, and visualizations, as well as educational materials and activities.

In these approaches, a lot of responsibility for protecting privacy is placed in the choices that people make, informed by a design artifact. The protection of privacy doesn’t arise from the design of the system itself, but rather from how a person chooses to use the system. This orientation fits well with US privacy regulations that ask individuals to manage and control their own data.

To Explore People and Situations (Related to Privacy)

Third is using design to explore people and situations. Design is used as a mode of inquiry, to better understand what privacy or the experience of privacy means to certain people, in certain situations. Design here is not necessarily about solving an immediate problem.

Techniques like design probes or collaborative design workshops are some approaches here. For example, a project I presented at CSCW 2018 involved sharing booklets of conceptual designs for potentially invasive products with technology practitioners in training. We weren’t looking to gather feedback in order to develop these conceptual ideas into usable products. Instead, the goal was to use these conceptual design ideas as provocations to better understand the participants’ worldviews. How are they conceptualizing privacy when they see these designs? How do their reactions help us understand where they place responsibility for addressing privacy?

Here, privacy is understood as a situated experience, which emerges from the practices of particular groups in specific contexts or situations. The goal is less about solving a privacy problem, and more about understanding how privacy gets enacted and experienced.

To Critique, Speculate, or Present Critical Alternatives About Privacy

Fourth is design to critique, speculate, or present critical alternatives. (By critical I don’t mean negative or mean-spirited; I mean critical in the sense of reflexive reflection or careful analysis.) Design here is not about exploring the world as it is, but focuses on how the world could be. Often this consists of creating conceptual designs that provoke, opening a space to surface and discuss social values. These help us discuss worlds we might strive to achieve or ones we want to avoid. Privacy in this case is situated in different possible sociotechnical configurations of the world, thinking about privacy’s social, legal, and technical aspects together.

For example, in a project I presented at DIS 2017, we created advertisements for fictional sensing products, like a bodily implant for workplace employees. This helped us raise questions beyond basic ones of data collection and use. The designs helped us ask how privacy is implicated in the workplace, or through employment law. Can consent really occur with these power dynamics? They also helped us ask normative questions, such as: Who gets to have privacy and who doesn’t? Who or what should be responsible for protecting privacy? Might we look to technical design, to regulations, to market mechanisms, or to individual choice to protect privacy?

Design Is a Political, Values-Laden Choice

So in summary, these are the four purposes of design that we identified in this paper: using design to solve, to inform and support, to explore, and to critique and speculate. Again, in practice, they’re not discrete categories. Many design approaches, like user-centered design or participatory design, use design for multiple purposes.


Using design in different ways suggests different starting points for how we might think about privacy

But this variety of purposes for how design relates to privacy is also a reminder that design isn’t a neutral process; it is itself political and values-laden. (Not political in terms of liberal and conservative, but political in the sense that there are power and social implications in the choices we make about how to use design.) Each design purpose suggests a different starting place for how we orient ourselves towards conceptualizing and operationalizing privacy. We might think about privacy as:

  • a technical property;
  • a choice made by users;
  • a situated experience; or
  • something sociotechnically situated.

Privacy can be many and all of these things at once, but the design methods we choose, and the reasons why we choose to use design, help to suggest or foreclose different orientations toward privacy. These choices also suggest that responsibility for privacy might be placed in different places — such as in a technical system, in a person’s choices, in a platform’s policies, in the law, in the market, and so forth.


Research using design to solve and design to inform and support appeared more often in the papers that we looked at

Now I’ve been discussing these four design purposes equally, but they weren’t equal in our corpus. Allowing each paper to be coded for multiple categories, a little over half the papers we looked at used design to solve a privacy problem, and a little over half used design to inform or support. Less than a quarter used design to explore; even fewer used design to critique and speculate. We don’t claim that the exact percentages are representative of all the privacy literature, but there’s a qualitative difference here: most of the work we reviewed uses design to solve privacy problems or to inform and support privacy.

We are arguing for a big tent approach in privacy by design: using design in all of these ways helps us address a broader set of conceptions of privacy.

This suggests that there’s an opportunity for us to build bridges between the HCI privacy research community, which has rich domain expertise, and the HCI design research and research-through-design communities, which have rich design-methods expertise, particularly in using design to explore, and to critique and speculate.

So that’s Argument 1: we have the opportunity to build new bridges among HCI communities to more fully make use of each other’s expertise, and of a broader range of design methods and purposes.

Argument 2 is that Privacy By Design has largely (with some exceptions) thought about design as a problem-solving process. Privacy By Design research and practice could expand that thinking to make use of the fuller breadth of design reflected in HCI.

Implications for Design Collaboration

So what might some of these collaborations within and across fields look like, if we want to make use of more of design’s breadth? For example, if we as privacy researchers develop a set of usable privacy tools to inform and support most people’s privacy decision making, that might be complemented with design to explore, so that we can better understand the often marginalized populations for whom those tools don’t work. For instance, Diana Freed et al.’s work shows that social media privacy and security tools can be used against victims of intimate partner violence, violating their privacy and safety. Or consider an emerging set of problems: privacy in physically instrumented spaces. How does consent work there, and what conceptions of privacy and privacy risk are at play? We can complement design-to-solve and design-to-support efforts with design to critique and speculate, crafting future scenarios that help us understand what concepts of privacy might be at play, and how privacy can surface differently when technical, social, or legal aspects of the world change.

From a design research perspective, there’s growing interest in the design research community in creating provocative artifacts to try to surface discussions about privacy, particularly in relation to new and emerging technologies. Critically reflecting on my own design research work, I think it can be tempting to speak only to other designers and resort to conceptions of privacy that say “surveillance is creepy,” without digging deeper into other approaches to privacy. But by collaborating with privacy researchers, we can bring more domain expertise and theoretical depth to these design explorations and speculations, and engage a broader set of privacy stakeholders.

Industry privacy practitioners working on Privacy By Design initiatives might consider incorporating more UX researchers and designers from their organizations, as privacy allies and as design experts. Approaches that use design to critique and speculate may also align well with privacy practitioners’ stated desire for contextual and anticipatory privacy tools that help them “think around corners,” as reported by Ken Bamberger and Deirdre Mulligan.

Privacy By Design regulators could incorporate more designers (in addition to engineers and computer scientists) in regulatory discussions about privacy by design, so that this richness of design practice isn’t lost when the words “by design” are written in the law.

Moreover, there’s an opportunity here for us as an HCI community to bring HCI’s rich notions of what design can mean to Privacy By Design, so that design is seen not only as a problem-solving process, but also as one that makes use of the multi-faceted, inductive, and exploratory practices this community engages in.


 

Paper Citation: Richmond Y. Wong and Deirdre K. Mulligan. 2019. Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI ’19). ACM, New York, NY, USA, Paper 262, 17 pages. DOI: https://doi.org/10.1145/3290605.3300492


Utilizing Design’s Richness in “Privacy by Design”

This post summarizes a research paper, Bringing Design to the Privacy Table, written by Richmond Wong and Deirdre Mulligan. The paper will be presented at the 2019 ACM Conference on Human Factors in Computing Systems (CHI 2019) on Wednesday, May 8 at the 4pm “Help Me, I’m Only Human” paper session.

How might the richness and variety of human-computer interaction (HCI) design practices and approaches be utilized in addressing privacy during the development of technologies?

U.S. policy recommendations and the E.U.’s General Data Protection Regulation have helped the concept of privacy by design (PBD)—embedding privacy protections into products during the initial design phase, rather than retroactively—gain traction. Yet while championing “privacy by design,” these regulatory discussions offer little in the way of concrete guidance about what “by design” means in technical and design practice. Engineering communities have begun developing privacy engineering techniques that use design as a way to find privacy solutions. Many privacy engineering tools focus on design solutions that translate high-level principles into implementable engineering requirements. However, design in HCI has a much richer concept of what “design” might entail: it also includes thinking about design as a way to explore the world, and to critique and speculate about the world. Embracing this richness of design approaches can help privacy by design more fully approach the privacy puzzle.

To better understand the richness of ways design practices can relate to privacy, we conducted a curated review of 64 HCI research papers that discuss both privacy and design. One thing we looked at was how each paper viewed the purpose of design in relation to privacy. (Papers could be classified into multiple categories, so the percentages below add up to over 100%; a short arithmetic sketch after the list illustrates this.) We found four main design purposes:

  • To Solve a Privacy Problem (56% of papers) – This aligns with the common perception that design is used to solve problems. It includes creating system architectures and data management systems that collect and use data in privacy-preserving ways. The problems posed by privacy are generally well-defined before the design process; a solution is then designed to address that problem.
  • To Inform or Support Privacy (52%) – Design is also used to inform or support people who must make privacy-relevant choices, rather than solving a privacy problem outright. A lot of these papers use design to increase the usability of privacy notices and controls to allow end users to more easily make choices about their privacy. These approaches generally assume that if people have the “right” types of tools and information, then they will choose to act in more privacy-preserving ways.
  • To Explore People and Situations (22%) – Design can be used as a form of inquiry to understand people and situations. Design activities, probes, or conceptual design artifacts might be shared with users and stakeholders to understand their experiences and concerns about privacy. Privacy is thus viewed here as relating to different social and cultural contexts and practices; design is used as a way to explore what privacy means in these different situations.
  • To Critique, Speculate, or Present Critical Alternatives (11%) – Design can be used to create spaces in which people can discuss values, ethics, and morals—including privacy. Rather than creating immediately deployable design solutions, design here works like good science fiction: creating conceptual designs that try to provoke people into thinking about relationships among technical, social, and legal aspects of privacy, and to ask questions such as who gets (or doesn’t get) to have privacy, or who should be responsible for providing privacy.
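To make the multi-label arithmetic concrete, here is a minimal Python sketch. The per-category counts are hypothetical, chosen only because they reproduce the rounded percentages reported above for the 64-paper corpus; the point is simply that multi-label coding makes the percentages sum past 100%.

```python
# Hypothetical counts for a 64-paper corpus, chosen to match the rounded
# percentages reported in the paper (multi-label coding: one paper can
# carry several purpose codes).
CORPUS_SIZE = 64

purpose_counts = {
    "solve a privacy problem": 36,            # 36/64 -> 56%
    "inform or support privacy": 33,          # 33/64 -> 52%
    "explore people and situations": 14,      # 14/64 -> 22%
    "critique, speculate, alternatives": 7,   #  7/64 -> 11%
}

for purpose, count in purpose_counts.items():
    print(f"{purpose}: {count / CORPUS_SIZE:.0%}")

# Because papers are multi-labeled, the coded instances exceed the corpus
# size, so the percentages sum to well over 100%.
total_codes = sum(purpose_counts.values())
print(f"{total_codes} codes across {CORPUS_SIZE} papers "
      f"= {total_codes / CORPUS_SIZE:.0%} summed")
```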

One thing we found interesting is how some design purposes tend to narrowly define what privacy means or define privacy before the design process, whereas others view privacy as more socially situated and use the process of design itself to help define privacy.

For those considering how these dimensions might be useful in privacy by design practice, we mapped them onto a range of design approaches and methodologies common in HCI, in the table below.

| Design Approach(es) | Dimensions of Design Purposes | How does design relate to privacy? |
| --- | --- | --- |
| Software Engineering | Solve a problem; Inform and support | Conceptions of privacy and the problem to be solved are defined in advance. Lends itself well to problems related to data privacy, or privacy issues addressed at the system-architecture level. |
| User-Centered Design | Solve a problem; Inform and support; Explore | The conception of privacy may be defined in advance, or it may surface from users. Lends itself well to individual-based conceptions of privacy. |
| Participatory Design; Value-Centered Design | Solve a problem; Inform and support; Explore | Surfaces stakeholder conceptions of privacy; involves stakeholders in the design process. |
| Resistance, Re-Design, Re-Appropriation Practices | Solve a problem; Critique | Shows breakdown or contestation in current conceptions of privacy. |
| Speculative and Critical Design | Explore; Critique | Critiques current conceptions of privacy; explores and shows potential ways privacy might emerge in new situations. |

These findings can be of use to several communities:

  • HCI privacy researchers and PBD researchers might use this work to reflect on dominant ways in which design has been used thus far (to solve privacy problems, and to inform or support privacy), and begin to explore a broader range of design purposes and approaches in privacy work.
  • HCI design researchers might use this work to see how expertise in research through design methods could be married with privacy domain expertise, suggesting potential new collaborations and engagements.
  • Industry Privacy Practitioners can begin reaching out to UX researchers and designers in their own organizations, both as design experts and as allies in privacy by design initiatives. In particular, the forward-looking aspects of speculative and critical design approaches may align well with privacy practitioners’ desire to find contextual and anticipatory privacy tools to help “think around corners”.
  • Policymakers should include designers (in addition to engineers and computer scientists) in regulatory discussions about privacy by design (or other “governance by design” initiatives). Many regulators seem to view “design” in “privacy by design” as a way to implement decisions made in law, or as a relatively straightforward way to solve privacy problems. However, this narrow view risks hiding the politics of design; what is left unexamined in these discussions is that different design approaches also suggest different orientations and conceptualizations of privacy. HCI design practices, which have already been used in relation to privacy, suggest a broader set of ways to approach privacy by design.

Our work aims to bridge privacy by design research and practice with HCI’s rich variety of design research. By doing so, we can help encourage more holistic discussions about privacy, drawing connections among privacy’s social, legal, and technical aspects.


Download a pre-print version of the full paper here.

Paper Citation:
Richmond Y. Wong and Deirdre K. Mulligan. 2019. Bringing Design to the Privacy Table: Broadening “Design” in “Privacy by Design” Through the Lens of HCI. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland UK. ACM, New York, NY, USA, 17 pages. https://doi.org/10.1145/3290605.3300492

This post is crossposted on Medium

Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks [Talk]

This blog post is a version of a talk I gave at the 2018 ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW), based on a paper written with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce, entitled Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, which was honored with a best paper award. Find out more on our project page, our summary blog post, or download the paper: [PDF link] [ACM link]

In the work described in our paper, we created a set of conceptual speculative designs to explore privacy issues around emerging biosensing technologies, technologies that sense human bodies. We then used these designs to help elicit discussions about privacy with students training to be technologists. We argue that this approach can be useful for Values in Design and Privacy by Design research and practice.

Continue reading →

Engaging Technologists to Reflect on Privacy Using Design Workbooks

This post summarizes a research paper, Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks, co-authored with Deirdre Mulligan, Ellen Van Wyk, John Chuang, and James Pierce. The paper will be presented at the ACM Conference on Computer-Supported Cooperative Work and Social Computing (CSCW) on Monday November 5th (in the afternoon Privacy in Social Media session). Full paper available here.

Recent wearable and sensing devices, such as Google Glass, Strava, and internet-connected toys, have raised questions about the ways in which privacy and other social values might be implicated by their development, use, and adoption. At the same time, legal, policy, and technical advocates for “privacy by design” have suggested that privacy should be embedded into all aspects of the design process, rather than being addressed after a product is released, or treated as just a legal issue. If privacy is to be addressed through technical design processes, the ability of technology professionals to surface, discuss, and address privacy and other social values becomes vital.

Companies and technologists already use a range of tools and practices to help address privacy, including privacy engineering practices and making privacy policies more readable and usable. But many existing privacy mitigation tools are either deductive, or assume that privacy problems are already known and well-defined in advance. However, when creating systems, we often don’t have privacy concerns well-conceptualized in advance. Our research shows that design approaches (drawing on a set of techniques called speculative design and design fiction) can help better explore, define, and perhaps even anticipate what we mean by “privacy” in a given situation. Rather than looking for a single, abstract, universal definition of privacy, these methods help us think about privacy as relations among people, technologies, and institutions in different types of contexts and situations.

Creating Design Workbooks

We created a set of design workbooks — collections of design proposals or conceptual designs, drawn together to allow designers to investigate, explore, reflect on, and expand a design space. We drew on speculative design practices: in brief, our goal was to create a set of slightly provocative conceptual designs to help engage people in reflections or discussions about privacy (rather than propose specific solutions to problems posed by privacy).

A set of sketches that comprise the design workbook

Inspired by science fiction, technology research, and trends from the technology industry, we created a couple dozen fictional products, interfaces, and webpages depicting biosensing technologies, or technologies that sense people. These included smart-camera-enabled neighborhood watch systems, advanced surveillance systems, implantable tracking devices, and non-contact remote sensors that detect people’s heart rates. In earlier design work, we reflected on how putting the same technologies into different types of situations, scenarios, and social contexts would vary the types of privacy concerns that emerged (such as the different concerns that would emerge if advanced miniature cameras were used by the police, by political advocates, or by the general public). Here, however, we wanted to see how non-researchers might react to and discuss the conceptual designs.

How Did Technologists-In-Training View the Designs?

Through a series of interviews, we shared our workbook of designs with master’s students in an information technology program who were training to go into the tech industry. We found several ways in which they brought up privacy-related issues while interacting with the workbooks, and highlight three of those ways here.

TruWork — A product webpage for a fictional system that uses an implanted chip allowing employers to keep track of employees’ location, activities, and health, 24/7.

First, our interviewees discussed privacy by taking on multiple user subject positions in relation to the designs. For instance, one participant looked at the fictional TruWork workplace implant design by imagining herself in the positions of both an employer and an employee using the system, noting how the product’s claim of creating a “happier, more efficient workplace” was a value proposition aimed at the employer rather than the employee. While the system promises to tell employers whether or not their employees are lying about why they need a sick day, the participant noted that there might be many reasons why an employee might need to take a sick day, and those reasons should be private from their employer. These reflections are valuable, as prior work has documented how considering the viewpoints of direct and indirect stakeholders is important for addressing social values in design practice.

CoupleTrack — an advertising graphic for a fictional system that uses an implanted chip, which people in a relationship wear in order to keep track of each other’s location and activities.

A second way privacy reflections emerged was when participants discussed the designs in relation to their professional technical practices. One participant compared the fictional CoupleTrack implant to a wearable device for couples that he was building, in order to discuss different ways in which consent to data collection can be obtained and revoked. CoupleTrack’s embedded nature makes it much more difficult to revoke consent, while a wearable device can be more easily removed. This is useful because we’re looking for ways workbooks of speculative designs can help technologists discuss privacy in ways that they can relate back to their own technical practices.

Airport Tracking System — a sketch of an interface for a fictional system that automatically detects and flags “suspicious people” by color-coding people in surveillance camera footage.

A third theme that we found was that participants discussed and compared multiple ways in which a design could be configured or implemented. Our designs tend to describe products’ functions but do not specify technical implementation details, allowing participants to imagine multiple implementations. For example, a participant looking at the fictional automatic airport tracking and flagging system discussed the privacy implications of two possible implementations: one where the system only identifies and flags people with a prior criminal history (which might create extra burdens for people who have already served their time for a crime and have been released from prison); and one where the system uses behavioral predictors to try to identify “suspicious” behavior (which might go against the notion of “innocent until proven guilty”). The designs were useful at provoking conversations about the privacy and values implications of different design decisions.

Thinking About Privacy and Social Values Implications of Technologies

This work provides a case study showing how design workbooks and speculative design can be useful for thinking about the social values implications of technology, particularly privacy. In the time since we’ve made these designs, some (sometimes eerily) similar technologies have been developed or released, such as workers at a Swedish company embedding RFID chips in their hands, or Logitech’s Circle Camera.

But our design work isn’t meant to predict the future. Instead, what we tried to do is take some technologies that are emerging or on the near horizon, and think seriously about ways in which they might get adopted, or used and misused, or interact with existing social systems — such as the workplace, or government surveillance, or school systems. How might privacy and other values be at stake in those contexts and situations? We aim for these designs to help shed light on the space of possibilities, in an effort to help technologists make more socially informed design decisions in the present.

We find it compelling that our design workbooks helped technologists-in-training discuss emerging technologies in relation to everyday, situated contexts. These workbooks don’t depict far-off speculative science fiction with flying cars and spaceships. Rather, they imagine future uses of technologies by having someone look at a product website, an amazon.com page, or an interface, and think about the real and diverse ways in which people might experience those technology products. Focusing on the potential adoptions and uses of emerging technologies in everyday contexts helps raise issues that might not be immediately obvious if we only think about the positive social implications of technologies, and it also helps surface issues that we might miss if we only think about social implications in terms of “worst case scenarios” or dystopias.

Paper Citation:

Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce, and John Chuang. 2017. Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 111 (December 2017), 26 pages. DOI: https://doi.org/10.1145/3134746


This post is crossposted with the ACM CSCW Blog

Exploring Implications of Everyday Brain-Computer Interface Adoption through Design Fiction

This blog post is a version of a talk I gave at the 2018 ACM Designing Interactive Systems (DIS) Conference based on a paper written with Nick Merrill and John Chuang, entitled When BCIs have APIs: Design Fictions of Everyday Brain-Computer Interface Adoption. Find out more on our project page, or download the paper: [PDF link] [ACM link]

In recent years, brain-computer interfaces, or BCIs, have shifted from far-off science fiction, to medical research, to the realm of consumer-grade devices that can sense brainwaves and EEG signals. Brain-computer interfaces have also featured more prominently in corporate and public imaginations, such as Elon Musk’s project that has been said to create a global shared brain, or fears that BCIs will result in thought control.

Most of these narratives and imaginings about BCIs tend to be utopian or dystopian, imagining radical technological or social change. We instead aim to imagine futures that are not radically different from our own. In our project, we use design fiction to ask: how can we graft brain-computer interfaces onto the everyday and mundane worlds we already live in? How can we explore the ways BCI uses, benefits, and labor practices may not be evenly distributed when they get adopted?

Continue reading →

Assembling Critical Practices Reading List Posted

At the Berkeley School of Information, a group of researchers interested in critically-oriented design practices, critical social theory, and STS has hosted a reading group called “Assembling Critical Practices,” bringing together literature from these fields, in part to track their historical continuities and discontinuities, as well as to see new opportunities for design and research when putting them in conversation.
I’ve posted our reading list from our first iterations of this group. Sections 1-3 focus on critically-oriented HCI, early critiques of AI, and an introduction to critical theory through the Frankfurt School. This list comes from an I School reading group put together in collaboration with Anne Jonas and Jenna Burrell.

Section 4 covers a broader range of social theories. This comes from a reading group sponsored by the Berkeley Social Science Matrix organized by myself and Anne Jonas with topic contributions from Nick Merrill, Noura Howell, Anna Lauren Hoffman, Paul Duguid, and Morgan Ames (Feedback and suggestions are welcome! Send an email to richmond@ischool.berkeley.edu).


See the whole reading list on this page.

Interrogating Biosensing Privacy Futures with Design Fiction (video)

 

I presented this talk in November 2017 at the Berkeley I School PhD Research Reception. The talk discusses findings from two of our papers:

Richmond Y. Wong, Ellen Van Wyk and James Pierce. (2017). Real-Fictional Entanglements: Using Science Fiction and Design Fiction to Interrogate Sensing Technologies. In Proceedings of the ACM Conference on Designing Interactive Systems (DIS ’17). https://escholarship.org/uc/item/7r229796

Richmond Y. Wong, Deirdre K. Mulligan, Ellen Van Wyk, James Pierce and John Chuang. (2017). Eliciting Values Reflections by Engaging Privacy Futures Using Design Workbooks. Proc. ACM Hum.-Comput. Interact. 1, CSCW, Article 111 (December 2017), 26 pages. https://escholarship.org/uc/item/78c2802k

More about this project and some of the designs can be found here: biosense.berkeley.edu/projects/sci-fi-design-fiction/