Values in Design (VID), or more fully, Human Values in the Design of Information Systems and Technology, is a movement that goes beyond traditional requirements engineering to consider individual and social values as equally important inputs to the technology design process.
Beginning as an effort led by Helen Nissenbaum, Geoffrey C. Bowker, and Susan Leigh Star, the VID program recognized that technology is far from neutral, and the design process is predicated on a series of human decisions – decisions that are steeped in the personal and professional values held by designers. In this way, values become deeply embedded into our technologies, from the level of bits and algorithms up to the large systems and infrastructures that enable and constrain our daily activities. Drawing attention to the values embedded by system and technology designers, as well as the (usually different) sets of values held by those who use the technology, the VID approach informs and shapes the design process to result in devices, experiences, and information policies that resonate with their uses in context.
For example, the rise of location-based applications on smartphones has raised issues of privacy and security. Initially positioned as a positive way of knowing where your friends, loved ones, and colleagues are located, this level of geographic accountability soon became problematic for some. As detailed in an ACM article on Values in Design:
Such technologies can cause tension in social values as the benefit of potential meetings with friends causes problems of attention and interrogation, as when a paramour says, “You said you were going to the store, then the library, and then home, but you never checked in. Where were you?” A GPS-based network application may increase locational accountability because, unlike a phone call that might originate anywhere, a GPS-enabled application carries information about a specific geographic location. In principle, a user can work around “stalking” and other problematic situations with some mobile apps such as Tall Tales and Google Latitude that allow a user to lie about location, but equating privacy with lying creates its own values-centric problems. An “open hand” of location-based transparency can easily become a “backhand” when geographic privacy and autonomy are compromised. (Knobel & Bowker, 2011)
Over the past ten years, we have held a number of week-long VID Doctoral Workshops at Santa Clara University, NYU, UC-Irvine, Aalto University (Finland), and University of Rotterdam (Netherlands), as well as many shorter workshops delivered by VID Fellows at conferences and seminars. Our community of VID Fellows has grown to over 100 doctoral students and faculty who integrate this perspective of values as critical to designing technology that better aligns with people’s needs, goals, and preferences. Our next planned doctoral workshop will take place in Summer 2015, hosted by Georgia Tech, co-directed by Carl DiSalvo and Cory Knobel.
The EVOKE Lab & Studio continues to pursue a number of projects that use and further the VID perspective, from both theoretical and methodological directions. Stop by the Lab to get involved with projects that look to create better, more socially situated, and values-driven technologies!