Network Working Group                                       T. Bray, Ed.
Internet-Draft                                       Textuality Services
Intended status: Standards Track                          April 11, 2015
Expires: October 13, 2015


              Privacy Choices for Internet Data Services
                     draft-bray-privacy-choices-01

Abstract

   This document argues in favor of Internet service providers
   deploying technologies which offer increased privacy to users of
   their services.  The discussion is independent of any particular
   privacy technology.  The approach is to consider common objections
   to the deployment of such technologies, and to show that these
   objections are not well-founded.

Status of This Memo

   This Internet-Draft is submitted in full conformance with the
   provisions of BCP 78 and BCP 79.

   Internet-Drafts are working documents of the Internet Engineering
   Task Force (IETF).  Note that other groups may also distribute
   working documents as Internet-Drafts.  The list of current
   Internet-Drafts is at http://datatracker.ietf.org/drafts/current/.

   Internet-Drafts are draft documents valid for a maximum of six
   months and may be updated, replaced, or obsoleted by other documents
   at any time.  It is inappropriate to use Internet-Drafts as
   reference material or to cite them other than as "work in progress."

   This Internet-Draft will expire on October 13, 2015.

Copyright Notice

   Copyright (c) 2015 IETF Trust and the persons identified as the
   document authors.  All rights reserved.

   This document is subject to BCP 78 and the IETF Trust's Legal
   Provisions Relating to IETF Documents
   (http://trustee.ietf.org/license-info) in effect on the date of
   publication of this document.  Please review these documents
   carefully, as they describe your rights and restrictions with
   respect to this document.  Code Components extracted from this
   document must include Simplified BSD License text as described in
   Section 4.e of the Trust Legal Provisions and are provided without
   warranty as described in the Simplified BSD License.
Table of Contents

   1.  Introduction
   2.  Summary
   3.  Terminology
   4.  Background
     4.1.  Asymmetric failure cost
     4.2.  Privacy technology cost
   5.  Common objections to privacy-technology deployment
     5.1.  Free public data
     5.2.  Privacy at user option
     5.3.  Failures of privacy technology
     5.4.  Affordability of privacy technology
   Author's Address

1.  Introduction

   Privacy issues are becoming increasingly important to users of
   Internet services, and the providers of those services must choose
   how much privacy it is appropriate to provide their users.

   When discussing the deployment of privacy technology, certain
   objections are encountered repeatedly: that privacy protection is
   inappropriate for freely-public information and "brochure-ware",
   that it is too flawed to be worthwhile, that privacy choices are
   best left to end users, and that the cost of deploying privacy
   protection is too high.

   This document considers these arguments and shows that they are
   flawed; the conclusion is that in almost every case, the best
   choice for a service provider and its users is the one that
   maximizes privacy.

2.  Summary

   This document attempts to establish the following:

   1.  Whether or not information is considered "public" is not a good
       criterion for choosing whether or not to deploy privacy
       technologies for its users.

   2.  Privacy choices are difficult and context-dependent, so it is
       inappropriate to ask users to make them.

   3.
       Privacy technologies offer benefits to users of data services
       even when those technologies are imperfect.

   4.  Cost should not be a significant factor when considering the
       deployment of privacy technologies.

3.  Terminology

   The term "data service" means any Internet-mediated offering that
   is accessible to the general public.  Examples would include Web
   sites, HTTP APIs, streaming media, and various flavors of chat.

   In this document, "privacy protection" means technology whose
   deployment increases the cost and difficulty, for anyone but the
   user and provider of a data service, of ascertaining who is
   accessing which services and what messages are being exchanged
   between the user and the service.  Obvious examples are encryption
   and authentication technologies.

4.  Background

   This section establishes two background facts that will serve to
   support this document's central arguments.

4.1.  Asymmetric failure cost

   There are two classes of privacy-related failure in the operation
   of data services.  A positive failure occurs when privacy was
   provided but was not necessary; a negative failure occurs when
   privacy was not provided, but was necessary for prudent use of the
   data service.

   The cost of these failure classes is not symmetric; negative
   failures can endanger businesses, property, and lives, while
   positive failures usually incur at most a little extra expense.

4.2.  Privacy technology cost

   A wide variety of privacy technologies are available to Internet
   data service providers.  They include public-key infrastructure,
   transport-level encryption, server-side encryption-at-rest, and
   token-based authentication/authorization technologies which reduce
   the use of passwords.
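   By way of illustration (a sketch added here, not part of the
   original draft), the engineering cost of transport-level encryption
   is now small in mainstream tooling.  In Python's standard library,
   for instance, a client-side TLS context arrives with certificate
   verification and hostname checking already switched on:

```python
import ssl

# Illustrative sketch only: Python's standard library configures
# transport-level encryption with secure defaults.
# create_default_context() loads the system's trusted CA certificates
# and enables both certificate verification and hostname checking.
def make_client_context() -> ssl.SSLContext:
    return ssl.create_default_context()

ctx = make_client_context()

# The secure settings are the defaults; the deploying engineer does
# not need to hand-configure the PKI machinery.
assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True
```

   Wrapping an ordinary socket via ctx.wrap_socket(sock,
   server_hostname=...) then yields an encrypted connection; the point
   is that the code cost, like the monetary cost discussed below, is
   now modest.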
   In every case, the monetary and engineering cost of acquiring the
   necessary resources and deploying the required software has been
   falling steadily in recent years, both absolutely and as a
   proportion of the total cost of service development and deployment.

5.  Common objections to privacy-technology deployment

5.1.  Free public data

   It is reasonable to question whether, for freely-available public
   data such as the contents of an online reference work or a
   promotional Web site, it makes sense to deploy privacy protection.

   Unfortunately, it is very difficult to predict when a person
   accessing online information might suffer negative consequences.
   For example, some governments criminalize certain behaviors to the
   extent that accessing free public reference documents concerning
   that behavior could lead to arrest and prosecution.

   Bearing this in mind, and given the asymmetric cost of privacy
   failure modes, the conclusion is that the "public" or "free" status
   of information is not a good argument against the deployment of
   privacy technology.

   There is another, subtler point: if one groups available data
   services into those which are non-controversial and thus require no
   privacy protection, and those which are controversial and do, some
   will conclude that anything with privacy protection must be
   controversial and thus subject to suspicion.  This effect is better
   avoided.

5.2.  Privacy at user option

   It is often argued that privacy choices are best left to the users
   of data services, and thus that opt-in privacy is an appropriate
   strategy.

   However, the technical and social factors forming the context for
   such choices are complex; even experts often disagree on privacy
   requirements.  Thus, the end users of a data service are likely not
   well-equipped to make good choices.
   Bearing this in mind, and given the asymmetric cost of privacy
   failure modes, it is usually best to remove the necessity for
   making these choices, by always providing the maximum practical
   amount of privacy protection.

5.3.  Failures of privacy technology

   Internet privacy technologies are known to be imperfect.
   Cryptographic algorithms have been compromised, and there is
   widespread dissatisfaction with the public-key infrastructure
   (PKI).

   Furthermore, it is widely agreed that an attacker who wishes to
   attack a target's privacy has many means of bypassing privacy
   protection, ranging from social engineering to hardware hacking to
   zero-day exploits.

   Therefore, it is reasonable to question the deployment of privacy
   protection, which may create an unrealistic expectation of safety
   that is not in fact achievable.

   However, this line of argument fails on economic grounds.
   Deployments of privacy technology, however imperfect, generally
   have the effect of increasing the cost to an attacker of invading
   end-users' privacy.  Every time that cost goes up, certain
   surveillance activities, whether by government bodies or criminals,
   become uneconomic and will be abandoned, with the effect of
   globally increasing the security and privacy of Internet data
   services.

5.4.  Affordability of privacy technology

   Privacy technologies are not free; there are monetary costs for
   access to the PKI, and bandwidth/computation costs related to
   encryption, authentication, and authorization.

   Service providers may find it difficult to justify such expenses,
   particularly those who have severe budget constraints.

   However, the monotonic decline in privacy technology costs
   decreases the force of this argument with every passing year.
   It is hard to imagine a situation where an organization can afford
   to acquire server resources, domain names, Internet connectivity,
   and software deployment expertise, but still cannot afford to offer
   privacy protection.

   There is a subtle related issue: those who are operating on low
   budgets are often providing data services to disadvantaged groups,
   whose members may be in particular need of privacy protection.

Author's Address

   Tim Bray (editor)
   Textuality Services

   Email: tbray@textuality.com
   URI:   https://www.tbray.org/