Call For Participation


Privacy is a scarce commodity in the age of digital connectedness, where the exchange of personal data is almost a prerequisite for participation. While this is an issue for all, there are particular groups whose digital privacy is disproportionately endangered and ignored. We identify these users as “vulnerable,” recognizing that the term reflects a diverse range of populations, conditions and challenges, including people with disabilities, older adults, children, those with mental health conditions, people living with stigmatised conditions, survivors of domestic abuse, and people living in countries where freedom of speech is restricted. In this workshop, we explore the ways in which the privacy and security of vulnerable user groups are at risk when using digital platforms, and endeavour to extract requirements to facilitate the design of technologies that are inclusive of, or personalised to, a wider range of privacy and security needs.


This workshop is a continuation of two previous workshops that focused on disability and access, and that began to explore the wider issue of vulnerable groups and situations. It will be of interest to researchers and practitioners who want to build privacy/security solutions that are inclusive of people described as “vulnerable.” While inclusive design has traditionally addressed physical accessibility needs arising from age, disability or the environment, there is also a need to design systems that fulfil the psychological needs of vulnerable users or marginalized groups. The workshop deliberately avoids any concrete definition of what “vulnerable” means in this context; we encourage a diverse discussion of any group that could be deemed vulnerable, without prejudice. Such groups may include, but are not limited to, domestic abuse survivors, young children, older adults, individuals with learning difficulties, individuals living with long-term stigmatised or mental health conditions (e.g. HIV, dementia, addiction, depression), individuals with short-term physical or mental health conditions (e.g. recovering from injury), asylum seekers and refugees, individuals who self-harm, and individuals with limited technology experience. Although this list is by no means exhaustive, we seek to explore technology-related privacy and security challenges for some of these user groups.

The HCI community has recognised that designing for the “average” user may not reflect the needs of vulnerable individuals. This has led to the creation of inclusive, value-sensitive design methodologies, which have yet to be fully utilised in the design of privacy and security solutions. Many privacy and security solutions are designed for and evaluated with a narrow range of users (e.g., technology literate, young, without any disability), leaving vulnerable users’ needs unfulfilled. This is despite privacy being a human right recognized by the United Nations, alongside the right of vulnerable citizens to be given the chance to participate fully in the ICT revolution.

While security and privacy are universal concerns, vulnerable users may face unique challenges given their different capabilities, needs and considerations. For instance, location sharing is a feature of a number of social media platforms, and for many individuals it is a way of adding context to posts or pictures. However, for survivors of domestic violence, inadvertently broadcasting this information may endanger the individual. Such privacy and security concerns are important and may vary considerably between different populations and across different conceptualizations of vulnerability. For example, the medical records of HIV patients are especially sensitive, and therefore require a greater degree of care than those of others, creating challenges for the systems and individuals responsible for keeping records secure.

Exploring the privacy and security challenges of vulnerable users provides an ideal opportunity to discuss the delicate balance between keeping people safe and undermining their privacy. Moreover, considering the specific requirements of particular populations has the potential to improve technology not only for that group but also for users beyond it, making this an important area of discussion.

To make these solutions more inclusive, we need to consider a wider range of potential users and contexts of use, and to consider not only the individual but also the cultural, social and physical environment in which these solutions are used.


i) To discuss the experiences, challenges and requirements of vulnerable or marginalized groups

ii) To share studies of privacy and security solutions or contexts for more vulnerable groups of users and whether their needs are being met.

iii) To identify the specific needs of these populations, any common concerns and start to explore how they may be met.

Important Dates

Workshop paper submission deadline: Friday, May 25, 2018

Workshop paper acceptance notification to authors: Saturday, June 9, 2018

Workshop camera-ready papers due: Wednesday, June 20, 2018

Workshop date: August 12th, 2018

Anonymization: Papers are NOT to be anonymized

Formatting: Use SOUPS Word or LaTeX templates

Scope and Focus

Many privacy and security solutions are designed for and evaluated with a narrow range of users (e.g., technology literate, physically capable, young), and these solutions make assumptions about the environment and about the user’s interaction capabilities and methods (e.g., keyboard, mouse, touch screen, audio, camera). However, these solutions (e.g., authentication, CAPTCHAs, anti-phishing tools) are used by a much wider variety of people, and in more varied situations, than they were ever evaluated with. While accessible and environment-aware solutions exist, research on them has been narrow, often targeted at specific disability conditions (e.g., vision impairment) or situational impairments (e.g., viewing private information in a public space). In general, marginalized groups, situational impairments, emotions, stress, and the social context are under-represented in the design of privacy and security solutions. To make these solutions more inclusive, we need to take into consideration a wider range of potential users, including marginalised and stigmatised groups, and a wider range of contexts of use and needs. We also need to consider not only the individual but also the cultural, social and physical environment in which they exist.


i) To discuss the experiences, challenges and requirements of vulnerable or marginalized groups

ii) To identify research questions that address the specific needs of these populations, and any common concerns and explore methods for answering them.

iii) To identify barriers to and ethical concerns of research with these populations, and discuss avenues for research in consideration of those concerns.

We expect participation from those who value the protection of user and corporate privacy and security when using digitally-connected technologies, who seek to ensure that technology is accessible and appropriate for a wider user base, or who endeavour to improve the experience of vulnerable groups and those with extraordinary privacy or security requirements.

We are soliciting short papers (1 page) for brief discussion during the workshop:

i) Proposals of design principles, processes, methodologies and/or solutions for specific situations, or generalizable to support a wide range of groups or operational environments

ii) Design fictions describing future technologies that would support a group of users or a range of groups or operational environments

Submissions should be made via the WIPS 2018 HotCRP site:

Questions about the workshop, including submissions, should be sent to the organizers (see below).

Lynne Coventry is a research professor in the School of Health and Life Sciences. She is director of PactLab, a research group exploring the role of technology in our everyday lives. Her research focuses on the interaction between psychology, design and security/privacy behaviours for a wide range of user types and contexts of use, from children and cyberbullying, and security compliance in the workplace, through to older adults, assistive technology design, and privacy for stigmatised groups.

Abigail Marsh is a Ph.D. candidate in the Societal Computing program, part of the Institute for Software Research, at Carnegie Mellon University, where she is advised by Dr. Lorrie Cranor. Her research is focused on the usable privacy concerns of children, particularly within families with pre-adolescent and adolescent children. She will begin a tenure-track professorship at Macalester College in the Fall of 2018.

Yang Wang is an Assistant Professor in the School of Information Studies (aka iSchool) at Syracuse University where he co-directs the Social Computing Systems (SALT) lab. His research is centered around usable privacy and security, and social computing. He has been examining privacy issues and building privacy-enhancing technologies in different domains such as personalized systems, social media, and online behavioral advertising. He has been leading an inclusive authentication project funded by The National Institute on Disability and Rehabilitation Research (NIDRR) and an inclusive privacy project funded by the National Science Foundation (NSF) CAREER program in the US. 

Privacy and Security for Everyone, Anytime, Anywhere

3rd Workshop on Inclusive Privacy and Security (WIPS)




