Description
1. What are people’s expectations about what information needs to be protected and in what context?
2. Which privacy-enhancing techniques could feasibly be applied to limit the devices’ listening? How do people perceive their trade-offs and acceptability?
3. Which interfaces and affordances would allow users to express their privacy preferences and explore the implications of their choices?
This dissertation explores these questions through a combination of surveys, user studies, and prototype evaluations. Major conclusions include:
1. People exhibit nuanced and heterogeneous preferences, notably in relation to other members of their households, and are especially wary of undisclosed data flows to third parties. They are most protective of financial data and of other information whose exposure could harm them.
2. Block-listing and filtering approaches may be the most feasible to implement. In combination with existing techniques and privacy-friendly design choices, they can address users’ immediate needs. More complex privacy needs, however, must be addressed with content-based controls, which require further research in privacy and natural language understanding.
3. People appreciate the control that install-time and runtime permissions provide over their own data, but both models present user-experience challenges. Transparency-based approaches may be comparatively frictionless.