Data Protection and Algorithm Regulation under the Meta-Regulation Model: The EU General Data Protection Regulation as a Sample
  • English title: Data Protection and Algorithm Regulation Based on the Mode of Meta-Regulation: Taking the EU General Data Protection Regulation as a Sample
  • Author: Cheng Ying
  • Keywords: data protection; algorithms; transparency; big data; EU General Data Protection Regulation
  • Institution: Institute for Human Rights, China University of Political Science and Law
  • Journal: Science of Law (Journal of Northwest University of Political Science and Law) [法律科学(西北政法大学学报)]
  • Publication date: 2019-06-11
  • Year: 2019
  • Issue: Vol. 37, No. 236
  • Language: Chinese
  • Record ID: DOUB201904005
  • CN: 61-1470/D
  • Pages: 50-57 (8 pages)
Abstract
        Algorithms and data protection have formed a complex relationship of mutual restraint and mutual reinforcement, which reflects the broader interaction between technology and law. Breakthroughs in algorithmic technology have exacerbated power imbalances and technical risks, leaving the individual-rights approach ineffective. Protecting individuals' legal interests therefore depends on strengthening the responsibility of data controllers. When the regulator lacks the necessary resources or information, it is appropriate to adopt the meta-regulation model: through positive and negative incentives, data controllers themselves are induced to make self-regulatory responses to the problems. This model is clearly reflected in the EU data protection reform. Building on a critique of the algorithm clause in Article 22 of the GDPR, a data-controller self-governance mechanism should be developed as a complement. Guided by the principles of transparency and accountability, data controllers are obliged to construct a complete algorithmic review mechanism through tools such as data protection impact assessments and data protection by design; at the same time, innovative methods of algorithmic explanation can help correct power imbalances and provide remedies for individual rights.
References
[1] Hildebrandt M. Profiling: From Data to Knowledge[J]. Datenschutz und Datensicherheit – DuD, 2006, 30(9): 548-552.
    [2] Frank Pasquale. The Black Box Society[M]. Translated by Zhao Yanan. Beijing: CITIC Press, 2015. (in Chinese)
    [3] Mayer-Schönberger V. Generational Development of Data Protection in Europe[M]// Agre P E, Rotenberg M (eds). Technology and Privacy: The New Landscape. Cambridge: MIT Press, 1997.
    [4] Hildebrandt M. The Dawn of a Critical Transparency Right for the Profiling Era[M]// Bus J, et al (eds). Digital Enlightenment Yearbook, 2012.
    [5] Burkert H. Towards a New Generation of Data Protection Legislation[M]// Gutwirth S, et al (eds). Reinventing Data Protection?. Springer, 2009.
    [6] Sloan R H, Warner R. When Is an Algorithm Transparent? Predictive Analytics, Privacy, and Public Policy[J]. IEEE Security & Privacy, 2018, 16(3): 18-25.
    [7] Macenaite M. The "Riskification" of European Data Protection Law through a Two-fold Shift[J]. European Journal of Risk Regulation, 2017, 8(3): 506-540.
    [8] Ronald Bachmann, Guido Kemper, et al. The Second Half of the Big Data Era: Data Governance, Drive and Monetization[M]. Translated by Liu Zhize, et al. Beijing: Beijing United Publishing Co., 2017. (in Chinese)
    [9] Pei Wei. The Conflict between Personal Information Big Data and Criminal Due Process and Its Reconciliation[J]. Chinese Journal of Law (法学研究), 2018(2): 42-61. (in Chinese)
    [10] Beck U. Five Minutes with Ulrich Beck: "Digital Freedom Risk Is One of the Most Important Risks We Face in Modern Society"[EB/OL]. LSE European Politics and Policy (EUROPP) Blog, 2 Apr 2014. http://eprints.lse.ac.uk/72044/, accessed August 15, 2018.
    [11] Liu Gang. Risk Regulation: German Theory and Practice[M]. Beijing: Law Press China, 2012. (in Chinese)
    [12] Garland D. The Culture of Control: Crime and Social Order in Contemporary Society[M]. The University of Chicago Press, 2001.
    [13] Teubner G. Juridification: Concepts, Aspects, Limits, Solutions[M]// Teubner G (ed). Juridification of Social Spheres: A Comparative Analysis in the Areas of Labor, Corporate, Antitrust and Social Welfare Law, 1987.
    [14] Robert Baldwin, Martin Cave, Martin Lodge. The Oxford Handbook of Regulation[M]. Translated by Song Hualin, et al. Shanghai: Shanghai Joint Publishing Company, 2017. (in Chinese)
    [15] Colin Scott. Regulation, Governance and Law: Frontier Issues[M]. Translated by An Yongkang, reviewed by Song Hualin. Beijing: Tsinghua University Press, 2018. (in Chinese)
    [16] Zhou Hanhua. Exploring an Incentive-Compatible Approach to Personal Data Governance: The Legislative Direction of China's Personal Information Protection Law[J]. Chinese Journal of Law (法学研究), 2018(2): 1-23. (in Chinese)
    [17] Bygrave L A. Minding the Machine: Article 15 of the EC Data Protection Directive and Automated Profiling[J]. Computer Law & Security Review, 2001, 17(1): 17-24.
    [18] Wachter S, Mittelstadt B, Floridi L. Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation[J]. International Data Privacy Law, 2017, 7(2): 76-99.
    [19] Lawrence Lessig. Code: Version 2.0[M]. Translated by Li Xu, Shen Weiwei. Beijing: Tsinghua University Press, 2009. (in Chinese)
    [20] Goodman B, Flaxman S. European Union Regulations on Algorithmic Decision-Making and a "Right to Explanation"[J]. AI Magazine, 2016, 38(3): 1-9.
    [21] Weitzner D J, et al. Transparent Accountable Data Mining: New Strategies for Privacy Protection[M]. MIT Technical Report, 2016.
    [22] Kroll J A, Huey J, Barocas S, et al. Accountable Algorithms[J]. University of Pennsylvania Law Review, 2016, 165(3): 633-705.
    [23] Kuner C, et al. Machine Learning with Personal Data: Is Data Protection Law Smart Enough to Meet the Challenge?[J]. International Data Privacy Law, 2017, 7(1): 1-2.
    [24] Goodman B W. A Step Towards Accountable Algorithms? Algorithmic Discrimination and the European Union General Data Protection[C]// 29th Conference on Neural Information Processing Systems (NIPS 2016), Barcelona. NIPS Foundation, 2016.
    [25] Wu Handong. Institutional Arrangements and Legal Regulation in the Age of Artificial Intelligence[J]. Science of Law (法律科学), 2017(5): 128-136. (in Chinese)
    [26] Zhang Linghan. On the Right to Explanation of Automated Commercial Decision-Making Algorithms[J]. Science of Law (法律科学), 2018(3): 65-74. (in Chinese)
    [27] Wachter S, Mittelstadt B, Russell C. Counterfactual Explanations Without Opening the Black Box: Automated Decisions and the GDPR[EB/OL]. https://ssrn.com/abstract=3063289, accessed 2018-06-22.
    [28] Burrell J. How the Machine 'Thinks': Understanding Opacity in Machine Learning Algorithms[J]. Big Data & Society, 2016, 3(1): 1-12.
    (1) The notification provisions of Articles 10 and 11 list only a few items of information, such as the purposes of the processing, the identity of the controller, and the recipients of the data. See Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995.
    (2) See Loi no. 78-17 du 6 janvier 1978 relative à l'informatique, aux fichiers et aux libertés. Quoted from Mendoza I, Bygrave L A. The Right Not to Be Subject to Automated Decisions Based on Profiling[M]// Synodinou T E (ed). EU Internet Law: Regulation and Enforcement. Springer Netherlands, 2017, p. 79.
    (3) E.g., Article 9(1)(a) of the Council of Europe's Modernised Convention for the Protection of Individuals with Regard to the Processing of Personal Data (2018), available at https://search.coe.int/cm/Pages/result_details.aspx?ObjectId=09000016807c65bf, accessed 4 October 2018; and the Council of Europe's Guidelines on the Protection of Individuals with Regard to the Processing of Personal Data in the World of Big Data (2017; T-PD(2017)01).
    (4) According to the Article 29 Working Party's guidelines, legal effects include, for example, cancellation of a contract or withdrawal of housing benefits as a result of automated decision-making; similarly significant effects include credit approval, e-recruitment, health services, education, or reliance on automated decisions that significantly affect an individual's habits or choices. Targeted advertising, or a recommender system suggesting when a user should do household chores, is generally not considered to have significant effects, but it may be so characterized where it exploits the data subject's vulnerabilities (e.g., big-data-enabled price discrimination against existing customers) or sensitive information (e.g., political leanings). See Article 29 Working Party's Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053, accessed 17 May 2018.
    (5) E.g., Germany, Austria, Belgium, Finland, Norway, Portugal and Sweden. See Wachter S, Mittelstadt B, Floridi L. Why a Right to Explanation of Automated Decision-Making Does Not Exist in the General Data Protection Regulation. International Data Privacy Law, 2017, 7(2): 76-99.
    (6) Article 29 Working Party's Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053, accessed 29 April 2018.
    (7) See the legislative documents on the algorithm clause of the 1995 Data Protection Directive: "Machines use increasingly complex software and expert systems, whose results have an apparently objective and incontrovertible character, to which human decision-makers tend to attach too much weight, thereby abdicating their own responsibility." See COM(92) 422 final-SYN 287, 15.10.1992, p. 26.
    (8) See COM(90) 314 final-SYN 287, 13.9.1990, p. 29.
    (9) See GDPR Article 13(2)(f), Article 14(2)(g) and Article 15(1)(h).
    (10) The European Commission, the European Parliament and the Council of the European Union held clearly different views on where to place the "right to explanation". Out of concern over the harms of automated decision-making, the Commission proposed placing it in Article 20; the Parliament proposed placing it in Recital 58; the Council held that it should not be included at all. The final compromise was to include it in Recital 71.
    (11) See GDPR Article 36.
    (12) See GDPR Article 35(9).
    (13) See Article 29 Working Party's Guidelines on Data Protection Officers, http://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612048, accessed 5 June 2018.
    (14) See Yu Xingzhong. Algorithmic Society and Human Nature. China Law Review (中国法律评论), 2018(2); Clarke R. The Digital Persona and Its Application to Data Surveillance. The Information Society, 10(2): 77-92; Chen Hua. Allen Institute CEO: Artificial Intelligence Will Not Wipe Out Humanity. 信息与电脑(理论版), 2016(21).
    (15) See Great Britain, Select Committee on Artificial Intelligence. AI in the UK: Ready, Willing and Able?, https://publications.parliament.uk/pa/ld201719/ldselect/ldai/100/10002.htm, accessed 12 September 2018.
    (16) See Wu Shenkuo, et al (trans). Declaration on Ethics and Data Protection in Artificial Intelligence, http://smart.blogchina.com/547246125.html, accessed 10 November 2018.
