Abstract: This article explores the gap between privacy and design in the context of "lateral privacy"—privacy issues arising among users of a service rather than between users and the service provider—on social networking sites (SNSs) and other platforms, by analyzing the privacy concerns lodged against the introduction of Facebook's News Feed in 2006. Our analysis reveals that the dominant theory of privacy put forth by regulators, privacy as individual control, offers little insight into the experiences of privacy violation claimed by users. More importantly, we show that this theory is ill-equipped to guide the design of SNSs and platforms to avoid similar harms in the future. A rising tide of privacy blunders on social networking sites and platforms has driven the search for new regulatory approaches, and privacy regulators across the globe are increasingly demanding that the Fair Information Practice Principles (FIPPs), the embodiment of privacy as individual control, inform the design of technical systems through Privacy By Design. The call for Privacy By Design—the practice of embedding privacy protections into products and services at the design phase, rather than after the fact—reflects policymakers' growing recognition of the power of technology not only to implement but also to settle policy through architecture, configuration, interfaces, and default settings. We argue that regulators would do well to ensure that the concept of privacy they direct companies to embed actually affords the desired forms of protection; ideally, a widely used set of methods and tools would aid in translating privacy into design. Today, neither is true. We identify three gaps in the "informational self-determination" approach that limit its responsiveness to lateral privacy design decisions in SNSs and platforms, and then explore three alternative theories of privacy that provide compelling explanations of the privacy harms exemplified in platform environments. On the basis of this descriptive utility, we argue that these theories provide more robust grounding for efforts by SNS and platform developers to address lateral privacy concerns in the design of technical artifacts. Unlike the FIPPs, which can be applied across contexts, these theories require privacy to be discovered, not just implemented. To bridge this discovery gap, we turn to the field of Human-Computer Interaction ("HCI") and the related field of Value Sensitive Design ("VSD") to identify tools and methodologies that would aid designers in discovering and ultimately embedding these contextual, socially oriented understandings of privacy in technical artifacts. Finally, we offer some tentative thoughts on the form and substance of regulations that would prompt corporations to invest in these HCI approaches to privacy.