Privacy Protection in the Age of Artificial Intelligence (人工智能时代的隐私保护)
  • English Title: Privacy Protection in the Age of Artificial Intelligence
  • Author: Zheng Zhifeng (郑志峰)
  • Keywords: artificial intelligence; privacy protection; personal information; algorithm; big data
  • Journal: 法律科学(西北政法大学学报) / Science of Law (Journal of Northwest University of Political Science and Law)
  • Affiliation: School of Civil and Commercial Law, Southwest University of Political Science and Law
  • Publication Date: 2018-12-27
  • Year: 2019
  • Issue: v.37; No.234
  • Funding: China Law Society Youth Research Project (CLS(2017)Y02); Humanities and Social Sciences Planning Project of the Chongqing Municipal Education Commission (17SKG005); Young Faculty Academic Innovation Team Project of Southwest University of Political Science and Law (2017XZCXTD-04); Key Research and Innovation Project of the Artificial Intelligence Law Institute, Southwest University of Political Science and Law (2018-RGZN-JS-ZD-03)
  • Language: Chinese
  • Record ID: DOUB201902005
  • Page Count: 10
  • CN: 61-1470/D
  • Pages: 53-62
Abstract
With the development of big data, the Internet of Things, deep learning, and related technologies, an age of artificial intelligence is arriving in which "everything is connected, everyone is online, and everything is algorithmic." Data is the foundation of artificial intelligence and algorithms are its essence; the more intelligent an AI system becomes, the more it depends on being fed data and supported by algorithms, and this dependence gives rise to a serious privacy crisis. On the one hand, artificial intelligence greatly enhances the capacity for privacy intrusion and makes private information far easier to obtain; on the other hand, AI's privacy-infringing conduct is highly deceptive, and the harm it causes is more severe. The traditional legal framework for privacy protection is ill-equipped to respond: it can neither effectively protect people's privacy nor fully realize the value of personal information, whereas the EU's new General Data Protection Regulation has made many useful explorations. Facing the new challenges of the age of artificial intelligence, China should improve the legal system for privacy protection, attach importance to technical approaches to privacy protection, explore market mechanisms for privacy protection, and establish ethical principles for privacy protection.
References
﹝1﹞Joseph Jerome. Why Artificial Intelligence May Be the Next Big Privacy Trend﹝N﹞.IAPP, Oct 17,2016.
    ﹝2﹞Ryan Calo. Robots and Privacy, in Patrick Lin, Keith Abney, George A. Bekey(eds):Robot Ethics:The Ethical and Social Implications of Robotics﹝M﹞.The MIT Press,2012.
    ﹝3﹞[US] Theresa M. Payton, Theodore Claypoole. Privacy in the Age of Big Data﹝M﹞.Trans. Zheng Shuhong. Shanghai: Shanghai Scientific & Technical Publishers,2017.
    ﹝4﹞Tim Moynihan. Alexa and Google Home Record What You Say. But What Happens to That Data? ﹝N﹞.Wired, November 5,2016.
    ﹝5﹞Saito Rodger. Artificial Intelligence is Going to Supercharge Surveillance﹝N﹞.Sanvada, January 25, 2018.
    ﹝6﹞Adrienne LaFrance. How Self-Driving Cars Will Threaten Privacy﹝N﹞.The Atlantic, Mar 21, 2016.
    ﹝7﹞John Frank Weaver. Robots Are People Too: How Siri, Google Car, and Artificial Intelligence Will Force Us to Change Our Laws﹝M﹞.Praeger, 2014.
    ﹝8﹞Richard Bates, Nicholas Blackmore.The Privacy Challenges of Big Data and Artificial Intelligence﹝N﹞.KennedysLaw,June 1,2017.
    ﹝9﹞George Nott. ‘Explainable Artificial Intelligence’: Cracking Open the Black Box of AI﹝N﹞.Computerworld, April 10, 2017.
    ﹝10﹞Kaori Ishii. Comparative Legal Study on Privacy and Personal Data Protection for Robots Equipped with Artificial Intelligence: Looking at Functional and Technological Aspects﹝J﹞.AI & Soc,2017,(32):7-20.
    ﹝11﹞B.J. Fogg. Persuasive Technologies: Using Computers to Change What We Think and Do﹝M﹞.Morgan Kaufmann Publishers,2003.
    ﹝12﹞Samuel D. Warren, Louis D. Brandeis. The Right to Privacy﹝J﹞.Harv. L. Rev.,1890,(4):205.
    ﹝13﹞Hannah Arendt. The Human Condition﹝M﹞.The University of Chicago Press,1958.
    ﹝14﹞Cass R. Sunstein. Infotopia: How Many Minds Produce Knowledge﹝M﹞.Oxford University Press,2006.
    ﹝15﹞Yu Sheng. People's Daily Online's Second Commentary on Algorithmic Recommendation: Don't Let Algorithms Trap You in an "Information Cocoon"﹝N/OL﹞.People's Daily Online, 2017-09-20 [2018-10-31]. http://opinion.people.com.cn/n1/2017/0919/c1003-29544724.html.
    ﹝16﹞Zhang Xinbao. From Privacy to Personal Information: The Theory and Institutional Arrangement of Re-weighing Interests﹝J﹞.China Legal Science,2015,(3):38-59.
    ﹝17﹞Omer Tene. Privacy Law’s Midlife Crisis: A Critical Assessment of the Second Wave of Global Privacy Laws﹝J﹞.Ohio St. L.J.,2013,(74):1220.
    ﹝18﹞Deirdre K. Mulligan, Jennifer King. Bridging the Gap Between Privacy and Design﹝J﹞.U. Pa. J. Const. L.,2011-2012,(14):990-1000.
    ﹝19﹞Fred H. Cate. The Failure of Fair Information Practice Principles, in Jane K. Winn (ed.):Consumer Protection in the Age of the ‘Information Economy’ ﹝M﹞.Routledge,2006.
    ﹝20﹞Wu Handong. Institutional Arrangements and Legal Regulation in the Age of Artificial Intelligence﹝J﹞.Science of Law,2017,(5):128-134.
    ﹝21﹞Wu Jun. The Age of Intelligence: Big Data and the Intelligent Revolution Redefine the Future﹝M﹞.Beijing: CITIC Press Group,2016.
    ﹝22﹞Giovanni Buttarelli. A Smart Approach: Counteract the Bias in Artificial Intelligence﹝N﹞.Europa,November 8,2016.
    ﹝23﹞Alexandra Rengel. Privacy-Invading Technologies and Recommendations for Designing a Better Future for Privacy Rights﹝J﹞.Intercultural Hum. Rts. L. Rev.,2013,8:224-229.
    ﹝24﹞Xu Ming. The Privacy Crisis in the Big Data Era and Its Tort Law Response﹝J﹞.China Legal Science,2017,(1):134-149.
    ﹝25﹞Rand Hindi. Will Artificial Intelligence Be Illegal in Europe Next Year? ﹝N﹞. Entrepreneur, August 9,2017.
    ﹝26﹞Lilian Edwards, Michael Veale. Slave to the Algorithm: Why a Right to an Explanation Is Probably Not the Remedy You Are Looking for﹝J﹞.Duke L. & Tech. Rev.,2017-2018,(16):49.
    ﹝27﹞Tian Xinyue. A Commentary on the New Rules of the EU General Data Protection Regulation﹝J﹞.Wuhan University International Law Review,2016,(2):466-479.
    ﹝28﹞Simon Davies. Why Privacy by Design is the Next Crucial Step for Privacy Protection﹝R﹞.ICOMP,2010.
    ﹝29﹞Ann Cavoukian. Privacy by Design﹝R﹞.Information and Privacy Commissioner,Ontario,Canada,2013.
    ﹝30﹞Tom Field. Privacy by Redesign: A New Concept﹝N﹞.BankInfoSecurity, June 28, 2011.
    ﹝31﹞Ryan Calo. Peeping HALs: Making Sense of Artificial Intelligence and Privacy﹝J﹞.Eur. J. Legal Stud.,2010, (2):190.
    ﹝32﹞Ruth Janal. Data Portability - A Tale of Two Concepts﹝J﹞.J. Intell. Prop. Info. Tech. & Elec. Com. L.2017, (8):60.
    ﹝33﹞Anna Ohlden. Landmark Resolution Passed To Preserve The Future Of Privacy﹝N﹞. Science 2.0, Oct. 29,2010.
    ﹝34﹞Kashmir Hill. Why “Privacy by Design” Is the New Corporate Hotness﹝N﹞.Forbes,Jul. 28, 2011.
    ﹝35﹞Cui Congcong, et al. Research on Personal Information Protection Law﹝M﹞.Beijing: Beijing University of Posts and Telecommunications Press,2015.
    ﹝36﹞G.W. van Blarkom, J.J. Borking, J.G.E. Olk. Handbook of Privacy and Privacy-Enhancing Technologies:The Case of Intelligent Software Agents﹝M﹞.College bescherming persoonsgegevens,2003.
    ﹝37﹞Information Commissioner’s Office. Data Protection Guidance Note: Privacy Enhancing Technologies﹝R﹞.Information Commissioner’s Office, UK,2007.
    ﹝38﹞Seda Gürses. PETs and Their Users: a Critical Review of the Potentials and Limitations of the Privacy as Confidentiality Paradigm﹝J﹞. Identity in the Information Society,2010,(3):555-558.
    ﹝39﹞ENISA. Privacy by Design in Big Data: An Overview of Privacy Enhancing Technologies in the Era of Big Data Analytics﹝R﹞.European Union Agency for Network and Information Security,2015.
    ﹝40﹞Andy Greenberg. Apple’s ‘Differential Privacy’ Is About Collecting Your Data—But Not Your Data﹝N﹞.Wired, June 13,2016.
    ﹝41﹞Inga Kroenera, David Wrighta. A Strategy for Operationalizing Privacy by Design﹝J﹞.The Information Society,2014,(30):361.
    ﹝42﹞Information Commissioner’s Office. Conducting Privacy Impact Assessments Code of Practice﹝R﹞.Information Commissioner’s Office, UK,2014.
    ﹝43﹞[US] Lawrence Lessig. Code: Version 2.0﹝M﹞.Trans. Li Xu, Shen Weiwei. Beijing: Tsinghua University Press,2009.
    ﹝44﹞Liu Deliang. Property Rights Protection of Personal Information﹝J﹞.Chinese Journal of Law,2007,(3):87-91.
    ﹝45﹞Tom Simonite. Sell Your Personal Data for $8 a Month﹝N﹞.MIT Technology Review, February 12, 2014.
    ﹝46﹞Eduardo Porter. Your Data Is Crucial to a Robotic Age, Shouldn’t You Be Paid for It? ﹝N﹞.The New York Times, March 6, 2018.
    ﹝47﹞Stephen Gardner. Artificial Intelligence Poses Data Privacy Challenges﹝N﹞.Bloomberg,October 26,2016.
    ﹝48﹞Zhang Yuhong, Qin Zhiguang, Xiao Le. The Discriminatory Nature of Big Data Algorithms﹝J﹞.Studies in Dialectics of Nature,2017,(5):83-85.
    ﹝49﹞Ding Xiaodong. Algorithms and Discrimination: Algorithmic Ethics and Legal Interpretation from the US Educational Affirmative Action Cases﹝J﹞.Peking University Law Journal,2017,(6):1609-1623.
    ﹝50﹞John Brandon. How Social Robots Will Invade Our Homes and What We Can Do About It﹝N﹞.Fox News, December 22, 2017.
    (1) It should be noted that countries and regions differ in how they use the concepts of privacy, data, and personal information. For example, Taiwan commonly uses "个人资料" (personal records), the EU and its member states mostly use "personal data" (个人数据), Japan and South Korea typically use "personal information" (个人信息), and the United States tends to speak of "personal privacy." Judging from the contexts of use, the terms are largely interchangeable. However, under Articles 110, 111, and 127 of China's General Provisions of the Civil Law, privacy, data, and personal information appear to be distinguished from one another. In the author's view, privacy and personal information overlap: privacy includes at least spatial privacy and informational privacy, and much personal information is itself a form of privacy. Data, by contrast, refers to the aggregate of information viewed from the data controller's perspective, covering not only personal information but also data automatically generated by systems. It must also be emphasized that since the advent of Internet technology, infringements of privacy committed through infringements of personal information have increased sharply; the protection of personal information has therefore become a central part of privacy protection in the age of artificial intelligence.
    (2) For example, autonomous vehicles from Google, Tesla, and others have been involved in several serious traffic accidents. See Karissa Bell, Google Says It Bears Some Responsibility for First Accident Caused by Self-Driving Car, Mashable, Mar. 1, 2016; Chuck Jones, Tesla’s Autonomous Driving Fatal Accident Needs To Be Kept In Context, Forbes, July 1, 2016.
    (3) For a detailed discussion of spatial privacy, decisional privacy, and informational privacy, see Herman T. Tavani, Ethics & Technology: Controversies, Questions, and Strategies for Ethical Computing, Wiley, 2015, pp.117-119.
    (4) The Fair Information Practice Principles were proposed by the United States in the 1970s and are the most important legal guide to personal information protection worldwide; more than one hundred countries have put them into practice. For example, the OECD's 1980 Recommendation Concerning Guidelines Governing the Protection of Privacy and Transborder Flows of Personal Data (the "OECD Guidelines"), the 1990 UN guidelines on personal information protection, the EU's 1995 Data Protection Directive, and APEC's 2005 Privacy Framework can all be regarded as representative implementations of the Fair Information Practice Principles. See Deirdre K. Mulligan & Jennifer King, Bridging the Gap Between Privacy and Design, 14 U. Pa. J. Const. L. 989, 2012, p.990; Frederik Zuiderveen Borgesius, Jonathan Gray, Mireille van Eechoud, Open Data, Privacy, and Fair Information Principles: Towards a Balancing Framework, 30 Berkeley Technology Law Journal 2073, 2016, pp.2101-2102.
    (5) According to a 2008 US study, it would take a user 244 hours per year to read every privacy policy encountered; today the figure must be several times higher. See Susan Landau, Control Use of Data to Protect Privacy, 347 Science Issue 504, 2015, p.504.
    (6) The EU General Data Protection Regulation formally took effect on May 25, 2018, and EU member states have recently been passing implementing legislation in succession.
    (7) It should be noted that Article 25 of the EU General Data Protection Regulation uses the phrase "data protection by design" rather than "privacy by design." But, as noted above, because data, privacy, and personal information are not strictly distinguished abroad, the United States and Canada mostly speak of "privacy by design," while the EU speaks of "data protection by design"; other formulations such as "privacy protection by design" and "privacy and data protection by design" also exist. For convenience, this article refers to them collectively as "privacy by design."
    (8) Chinese scholars currently disagree sharply over whether a separate book on personality rights should be enacted.
    (9) For example, Article 5 of the Interpretation of the Supreme People's Court and the Supreme People's Procuratorate on Several Issues Concerning the Application of Law in Handling Criminal Cases of Infringing Citizens' Personal Information distinguishes what counts as "serious circumstances" for different kinds of personal information: for location and movement information, communication content, credit information, and property information, fifty items or more suffice; for accommodation records, communication records, health and physiological information, transaction information, and the like, five hundred items or more are required. This reflects the legal approach of protecting different categories of personal information in different ways.
    (10) See Article 8 of the EU General Data Protection Regulation.
    (11) Of course, the exercise of the right to be forgotten is conditional and subject to exceptions. See Zheng Zhifeng, Research on the Right to Be Forgotten in the Network Society, Studies in Law and Business, No. 6, 2015, p.60.
    (12) On the right to explanation of algorithms, see Zhang Linghan, Research on the Right to Explanation of Automated Commercial Decision-Making, Science of Law, No. 3, 2018, pp.65-74.
    (13) On the distinction between sensitive and general information, see Qi Aimin, Saving Personality in the Information Society, Peking University Press, 2009, p.102.
    (14) The Declaration on Ethics and Data Protection in Artificial Intelligence establishes six ethical principles for protecting human rights in the development of AI; the fourth expressly provides for privacy by design and further develops it into ethics by design. See Wu Shenkuo et al. (trans.), Declaration on Ethics and Data Protection in Artificial Intelligence (full text), https://xiaobaiban.net/news/read-6fPcgW9MU3_2B5egWb.htm, last visited October 31, 2018.
    (15) For the privacy-by-design provisions of the bill, see Sec. 107, Accountability of Administration Discussion Draft: Consumer Privacy Bill of Rights Act of 2015.
    (16) It should be noted that privacy by default should properly be regarded as a component of privacy by design, although Article 25 of the EU General Data Protection Regulation places the two side by side.
    (17) The value of data itself is not easy to assess, but it is not indeterminable. See Rhett Allain, How Much Does Your Data Cost?, Wired, June 21, 2011.
    (18) The conference made clear that the creation, development, and use of any artificial intelligence system should fully respect human rights, in particular the right to the protection of personal data and privacy and the right to human dignity, and should provide solutions that enable individuals to control and understand AI systems. At the same time, the conference established six ethical principles for protecting human rights in the development of AI. See Wu Shenkuo et al. (trans.), Declaration on Ethics and Data Protection in Artificial Intelligence (full text), https://xiaobaiban.net/news/read-6fPcgW9MU3_2B5egWb.htm, last visited October 31, 2018.
    (19) For example, many companies are developing autonomous vehicles without brakes, accelerators, or steering wheels, intending to hand the entire driving task over to the system. In the author's view, this line of development violates the principle of control: surrendering all command in the human-machine relationship to the machine is extremely dangerous. See Gail Sullivan, Google’s New Driverless Car Has No Brakes or Steering Wheel, The Washington Post, May 28, 2014.
