Similar Literature: 20 results found
1.
There has naturally been a good deal of discussion of the forthcoming General Data Protection Regulation. One issue of interest to all data controllers, and of particular concern for researchers, is whether the GDPR expands the scope of personal data through the introduction of the term ‘pseudonymisation’ in Article 4(5). If all data which have been ‘pseudonymised’ in the conventional sense of the word (e.g. key-coded) are to be treated as personal data, this would have serious implications for research. Administrative data research, which is carried out on data routinely collected and held by public authorities, would be particularly affected, as the sharing of de-identified data could constitute the unconsented disclosure of identifiable information. Instead, however, we argue that the definition of pseudonymisation in Article 4(5) GDPR will not expand the category of personal data, and that there is no intention that it should do so. The definition of pseudonymisation under the GDPR is not intended to determine whether data are personal data; indeed, it is clear that all data falling within this definition are personal data. Rather, it is Recital 26 and its requirement of a ‘means reasonably likely to be used’ which remains the relevant test as to whether data are personal. This leaves open the possibility that data which have been ‘pseudonymised’ in the conventional sense of key-coding can still be rendered anonymous. There may also be circumstances in which data which have undergone pseudonymisation within one organisation could be anonymous for a third party. We explain how, with reference to the data environment factors as set out in the UK Anonymisation Network's Anonymisation Decision-Making Framework.
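As a purely illustrative aside on the ‘conventional sense’ of pseudonymisation mentioned above: key-coding replaces direct identifiers with a code and keeps the code-to-identity key separately, typically with the original data controller. The minimal Python sketch below is only a toy illustration under assumed, hypothetical field names; it is not drawn from the article or from the Anonymisation Decision-Making Framework.

```python
import secrets

def key_code(record, direct_identifiers=("name", "email")):
    """Replace direct identifiers with a random code; return the pseudonymised
    record and the separately held key mapping the code back to the identity."""
    code = secrets.token_hex(8)
    identity = {field: record.pop(field, None) for field in direct_identifiers}
    record["pseudonym"] = code
    return record, {code: identity}

# Hypothetical example: only the key-coded record would be shared for research,
# while the key table stays with the original data controller.
original = {"name": "Jane Doe", "email": "jane@example.org", "diagnosis": "J45"}
shared_record, key_table = key_code(dict(original))
print(shared_record)  # e.g. {'diagnosis': 'J45', 'pseudonym': '3f9c...'}
print(key_table)      # e.g. {'3f9c...': {'name': 'Jane Doe', 'email': 'jane@example.org'}}
```

Whether such a key-coded dataset is anonymous for a recipient with no access to the key is then, on the abstract's argument, a question for the Recital 26 ‘means reasonably likely to be used’ test rather than for Article 4(5).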

2.
The precision and effectiveness of Artificial Intelligence (AI) models are highly dependent on the availability of genuine, relevant, and representative training data. AI systems tested and validated on poor-quality datasets can produce inaccurate, erroneous, skewed, or harmful outcomes (actions, behaviors, or decisions), with far-reaching effects on individuals' rights and freedoms. Appropriate data governance for AI development poses manifold regulatory challenges, especially regarding personal data protection. An area of concern is compliance with rules for the lawful collection and processing of personal data, which implies, inter alia, that using databases for AI design and development should rest on a clear and precise legal ground: the prior consent of the data subject or another specific valid legal basis. Faced with this challenge, the European Union's personal data protection legal framework does not provide a preferred, one-size-fits-all answer, and the best option will depend on the circumstances of each case. Although there is no hierarchy among the different legal bases for data processing, in doubtful cases consent is generally understood by data controllers as a preferred or default choice for lawful data processing. Notwithstanding this perception, obtaining data subjects' consent is not without drawbacks for AI developers or AI data controllers, as they must meet (and demonstrate) various requirements for the validity of consent. As a result, data subjects' consent may not be a suitable and realistic option for AI development purposes. In view of this, it is necessary to explore the possibility of basing this type of personal data processing on lawful grounds other than the data subject's consent, specifically the legitimate interests of the data controller or third parties. Given its features, legitimate interest could help to meet the challenge of the quality, quantity, and relevance of data curation for AI training. The aim of this article is to provide an initial conceptual approach to support the debate about data governance for AI development in the European Union (EU), as well as in non-EU jurisdictions with European-style data protection laws. Based on the rules set by the EU General Data Protection Regulation (GDPR), the paper starts from the relevance of adequate data curation and processing for designing trustworthy AI systems, followed by a legal analysis and conceptualization of some of the difficulties data controllers face in lawfully processing personal data. After reflecting on the legal standards for obtaining data subjects' valid consent, the paper argues that legitimate interests (if certain criteria are met) may better match the purpose of building AI training datasets.

3.
This article argues that Google's essentially blanket and unsafeguarded dissemination to webmasters of URLs delisted under the Google Spain judgment discloses claimants' personal data and cannot be justified either on the purported basis of their consent or on a legal requirement; instead, it seriously infringes European data protection standards. Such disclosure would only be compatible with the contextually sensitive context of initial collection where it (i) was reasonably necessary and explicitly limited to the purposes of checking the legality of the initial decision and/or bona fide research, and (ii) prevented unauthorised repurposing or other misuse through robust safeguards. Strict necessity thresholds would need to apply where disclosure involved special categories of data or was subject to reasoned objection by a data subject, and international transfers would require further controls, ideally as provided by the European Commission's standard contractual clauses. Disclosing identifiable data on removals to end users would directly and fundamentally undermine a data subject's rights and, therefore, ipso facto violate purpose limitation and legality, irrespective of whether rights are claimed in data protection, defamation or civil privacy. The public's legitimate interests in receiving information on personal data removals are best secured through safeguarded scientific research, which search engines should facilitate.

4.
The commodification of digital identities is an emerging reality in the data-driven economy. Personal data of individuals represent monetary value in the data-driven economy and are often considered a counter-performance for “free” digital services or for discounts on online products and services. Furthermore, customer data and profiling algorithms are already considered a business asset and protected through trade secrets. At the same time, individuals do not seem to be fully aware of the monetary value of their personal data; they tend to underestimate their economic power within the data-driven economy and to passively succumb to the propertization of their digital identity. One way to increase consumers'/users' awareness of their own personal information could be to make them aware of the monetary value of their personal data. In other words, if individuals are shown the “price” of their personal data, they can acquire higher awareness of their power in the digital market and thus be effectively empowered to protect their information privacy. This paper analyzes whether consumers/users should have a right to know the value of their personal data. After analyzing how EU legislation is already developing in the direction of propertization and monetization of personal data, different models for quantifying the value of personal data are investigated. These models are discussed not to determine the actual prices of personal data, but to show that the monetary value of personal data can be quantified, a conditio sine qua non for a right to know the value of one's personal data. Next, active choice models, in which users are offered the option to pay for online services either with their personal data or with money, are discussed. It is concluded, however, that these models are incompatible with EU data protection law. Finally, practical, moral and cognitive problems of pricing privacy are discussed as an introduction to further research. We conclude that such research is needed to see to what extent these problems can be solved or mitigated. Only then can it be determined whether the benefits of introducing a right to know the value of one's personal data outweigh the problems and hurdles related to it.

5.
The existence of a fundamental right to the protection of personal data in European Union (EU) law is nowadays undisputed. Established in the EU Charter of Fundamental Rights in 2000, it is increasingly permeating EU secondary law and is expected to play a key role in the future EU personal data protection landscape. The right's reinforced visibility has rendered manifest the co-existence of two possible and contrasting interpretations as to what it has come to mean. While some envision it as a primarily permissive right, enabling the processing of such data under certain conditions, others picture it as having a prohibitive nature, implying that any processing of data is a limitation of the right, be it legitimate or illegitimate. This paper investigates existing tensions between different understandings of the right to the protection of personal data and explores the assumptions and conceptual legacies underlying both approaches. It traces their historical lineages and, focusing on the right to personal data protection as established by the EU Charter, analyses the different arguments that can ground contrasting readings of its Article 8. It also reviews the conceptualisations of personal data protection present in the literature, and finally contrasts all these perspectives with the construal of the right by the EU Court of Justice.

6.
Adding to the current debate, this article focuses on the personal data and privacy challenges posed by private industry's use of smart mobile devices that provide location-based services to users and consumers. Directly relevant to personal data protection are valid concerns about the collection, retention, use and accessibility of this kind of personal data, in relation to which a key issue is whether valid consent is ever obtained from users. While it is indisputable that geo-location technologies serve important functions, their potential use for surveillance and invasion of privacy should not be overlooked. Thus, in this study we address the question of how a legal regime can ensure the proper functionality of geo-location technologies while preventing their misuse. In doing so, we examine whether information gathered from geo-location technologies is a form of personal data, how it relates to privacy and whether current legal protection mechanisms are adequate. We argue that geo-location data are indeed a type of personal data. Not only is this kind of data related to an identified or identifiable person, it can also reveal core biographical personal data. What is needed is a strengthening of the existing law that protects personal data (including location data), and a flexible legal response that can accommodate ever-evolving and as-yet-unknown advances in technology.

7.
Corporate mergers and acquisitions (M&A) are one of the important settings for delineating the boundary between personal information rights and the interests of business operators. In cases of a pure equity acquisition, or of corporate merger, division or change of corporate form, no personal information protection issue arises in principle. An asset acquisition, by contrast, involves the transfer of personal data, and the reasonable interests of both parties to the acquisition and of the target company's creditors must then be taken into account. Both US law and EU law provide exception mechanisms for asset acquisitions under which data may be transferred without user consent. At the core of these mechanisms is a balancing of interests between the data subject and the enterprise: on the one hand, from the data subject's perspective, whether the original purpose for which the data were collected can still be achieved after the asset acquisition; on the other hand, from the enterprise's perspective, whether it has a reasonable interest in the data transaction. Where both parties to the M&A transaction have a reasonable interest in the transfer of user data, enterprises should be allowed to transfer personal data in the course of the transaction, but users should be given the right to opt out, either in advance or afterwards.

8.
Financial Intelligence Units (FIUs) are key players in the current Anti-Money Laundering and Countering the Financing of Terrorism (AML/CFT) legal system. FIUs are specialised bodies positioned between private financial institutions and states' law enforcement authorities, which renders them a crucial middle link in the chain of information exchange between the private and public sectors. Considering that a large share of this information is personal data, its processing must meet minimum data protection standards. Yet the EU data protection legal framework is composed of two main instruments, i.e. the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED), which provide different thresholds for the protection of personal data. The aim of this paper is to clarify the data protection legal regime applicable to the processing of personal data by FIUs for AML/CFT purposes. To that end, the paper provides an overview of the nature and goals of AML/CFT policy and discusses the problem of the diversity of existing FIU models. Further, it proposes a number of arguments for and against the possibility of applying either the GDPR or the LED to the processing of personal data by FIUs, and reflects on how convincingly these arguments can be used depending on the specificities of a given FIU model.

9.
In the last few years there has been a lot of buzz around a so-called ‘right to be forgotten’. Especially in Europe, this catchphrase is heavily debated in the media, in court and by regulators. Since a clear definition has not (yet) emerged, the following article will try to lift the veil on this vague concept. The first part will weigh the right's pros and cons against each other. It will appear that the ‘right to be forgotten’ clearly has merit, but needs better definition to avoid negative consequences. As such, the right is nothing more than a way to give (back) individuals control over their personal data and make the consent regime more effective. The second part will then evaluate the potential implementation of the right. Measures are required at the normative, economic, technical and legislative levels. The article concludes by proposing a ‘right to be forgotten’ that is limited to data processing situations where the individual has given his or her consent. Combined with a public interest exception, this should (partially) restore the power balance and allow individuals more effective control over their personal data.

10.
The deployment of pervasive information and communication technologies (ICTs) within smart city initiatives transforms cities into extraordinary apparatuses of data capture. ICTs such as smart cameras, sound sensors and lighting technology try to infer and affect persons' interests, preferences, emotional states, and behaviour. It should be no surprise, then, that contemporary legal and policy debates on privacy in smart cities are dominated by a debate focused on data and, therefore, on data protection law. In other words, data protection law is the go-to legal framework for regulating data processing activities within smart cities and similar initiatives. While this may seem obvious, a number of important hurdles might prevent data protection law from being (successfully) applied to such initiatives. In this contribution, we examine one such hurdle: whether the data processed in the context of smart cities actually qualify as personal data, thus falling within the scope of data protection law. This question is explored not only through a theoretical discussion but also by taking an illustrative example of a smart city-type initiative – the Stratumseind 2.0 project and its living lab in the Netherlands (the Stratumseind Living Lab; SLL). Our analysis shows that the requirement of ‘identifiability’ might be difficult to satisfy in the SLL and similar initiatives. This is so for two main reasons. First, a large amount of the data at stake does not qualify as personal data, at least at first blush. Most of it relates to the environment, such as data about the weather, air quality, sound and crowding levels, rather than to identified or even likely identifiable individuals. This is connected to the second reason: the aim of many smart city initiatives (including the SLL) is not to identify and target specific individuals but to manage or nudge them as a multiplicity – a combination of the environment, persons and all of their interactions. This is done by trying to affect the ‘atmosphere’ on the street. We thus argue that a novel type of profiling operation is at stake; rather than relying on individual or group profiling, the SLL and similar initiatives rely upon what we have called ‘atmospheric profiling’. We conclude that it remains highly uncertain whether smart city initiatives like the SLL actually process personal data. Yet they still pose risks to a wide variety of rights and freedoms which data protection law is meant to protect, and a need for regulation remains.

11.
A successful criminal or civil legal system response to assaults against intimate partners (intimate partner violence; IPV) usually relies on the victim's participation in the legal process, including having contact with the prosecutor, filing charges, and/or applying for an order of personal protection. Using data abstracted from criminal and civil legal system records for a county-wide cohort of 990 female IPV victims over a 4-year period, we examine the impact of having children, and of specific child factors, on victims' engagement with the criminal prosecution of their abusive partners and/or with seeking a personal protection order (PPO) in the civil court system. Having children increased victims' contact with the prosecutor and their applications for PPOs, but did not increase their likelihood of wanting to file or drop charges. The findings support prior work suggesting both the importance and the complexity of children in mothers' decision-making. Policy makers and service providers may want to assess survivors' thoughts about the role children play in their decision-making. Additionally, if survivors are offered interventions to help their children address the impact of IPV exposure, they may be more willing to engage with services.

12.
On 16 July 2020, the Grand Chamber of the European Court of Justice rendered its landmark judgment in Case C-311/18 Data Protection Commissioner v. Facebook Ireland Ltd and Maximillian Schrems (“Schrems II”). The Grand Chamber invalidated the Commission decision on the adequacy of the data protection provided by the EU-US Privacy Shield. It considered, however, that the Commission decision on the standard contractual clauses (“SCCs”) for the transfer of personal data to processors established in third states remained legally valid. The legal effects of the judgment should first be clarified. In addition, the judgment has far-reaching implications for companies which transfer personal data from the EU to the US. It also has far-reaching implications for transfers of personal data from the EU to other third states and, last, for the UK in the context of Brexit.

13.
Anonymization is viewed as an instrument by which personal data can be rendered anonymous so that they can be processed further, without harming data subjects' private lives, for purposes that are beneficial to the public good. Anonymization is fair if the possibility of re-identification can be practically excluded and the data processor does all that he or she can to ensure this. As many papers have warned, simply removing the primary personal identifiers, such as the name, residential address, phone number and email address, is not enough for fair anonymization. Therefore, new guidance documents, and even legal rules such as the HIPAA Privacy Rule on de-identification, may improve the security of anonymization. Researchers continuously test the efficiency of these methods and simulate re-identification attacks. Since the US and Canada do not have a population registry, re-identification experiments there were carried out with the help of other publicly available databases, such as census data or voter databases. Unfortunately, neither of these is complete and sufficiently detailed, so the computed risk was only an estimate. The author obtained zip code, gender and date-of-birth distribution data from the Hungarian population registry and computed re-identification risks in several simulated cases. This paper also gives an insight into the legal environment of Hungarian personal medical data protection legislation.
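To make the kind of risk computation described in this abstract concrete, the following minimal Python sketch estimates re-identification risk as the share of records that are unique on the quasi-identifiers zip code, gender and date of birth (a simple k-anonymity style uniqueness measure). It is only an illustration under assumed data: the sample records and function names are hypothetical and are not taken from the author's study or from the Hungarian registry data.

```python
from collections import Counter

# Hypothetical quasi-identifier tuples (zip_code, gender, date_of_birth).
# The study described above used distribution data from the Hungarian
# population registry; these five records are invented for illustration only.
records = [
    ("1011", "F", "1984-03-02"),
    ("1011", "F", "1984-03-02"),
    ("1011", "M", "1990-07-15"),
    ("9400", "F", "1975-11-30"),
    ("9400", "M", "1975-11-30"),
]

def uniqueness_risk(rows):
    """Share of records that are unique on their quasi-identifier combination,
    a crude proxy for straightforward re-identification risk."""
    counts = Counter(rows)
    unique = sum(1 for n in counts.values() if n == 1)
    return unique / len(rows)

def k_anonymity(rows):
    """Smallest equivalence-class size over all quasi-identifier combinations."""
    return min(Counter(rows).values())

print(f"share of unique records: {uniqueness_risk(records):.2f}")  # 0.60
print(f"k-anonymity of the sample: {k_anonymity(records)}")        # 1
```

Reliable estimates of course require population-level distributions rather than a small sample, which is precisely why access to registry-level data, as in the study above, matters.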

14.
This contribution is an attempt to facilitate a meaningful European discussion on the propertization of personal data by explaining the idea as it emerged in its ‘mother jurisdiction’, the United States. The piece starts with an overview of how the current US legal system addresses the data protection problem and whether, according to US commentators, the law does so effectively. The contribution then presents propertization of personal information as an alternative to the existing data protection regime and as one way to fill the alleged gaps in the US data protection system. The article maps the US propertization debate. Pro-propertization arguments are considered from an economic perspective as well as from the perspective of the limitations of the US legal and political system. It then analyses proposals on how property rights in personal data would have to be regulated, if at all, should the idea of propertization be accepted. The main points of criticism of propertization are also sketched. The article concludes with a brief summary of the US propertization discourse and, most importantly, with a list of lessons Europeans can learn from their American counterparts engaged in the debate in its home jurisdiction. Among the main messages are that the outcome of the debate depends on the definition of the problem propertization is called on to tackle, and that it is the substance of the actual rights with regard to personal data that matters, not whether we label them as property rights.

15.
This paper rethinks the reasons for, and the nature and means of, personal data protection. The reasons for personal data protection are that it could promote the fairness and effectiveness of information flow, help individuals develop their independent personality, and equip them to deal with risks. With respect to the nature of personal data, this paper argues that such data should not be perceived from a purely individualistic point of view. Rather, there should be a contextualized understanding of the data, which considers the appropriate information flow of personal data within a particular context. Regarding the legal framework of personal data protection, this paper suggests that consumer protection law and public law are better equipped to protect personal data than tort, contract, or property law.

16.
Zhang Tao (张涛), Modern Law Science (现代法学), 2022(1): 125-143
Open government data is not a static, one-off act but a dynamic, systemic process. Drawing on data lifecycle theory, open government data can be broken down into five stages: data collection, transformation, storage, publication, and use. Under the latest rules established by the Personal Information Protection Law and the Data Security Law, personal information protection risks may exist simultaneously at every stage of the open government data lifecycle. However, the prevailing personal information protection paradigm in open government data takes a "result-based approach": it focuses on the state of government data at the moment of publication and relies on technical anonymisation, which makes it difficult to respond effectively to personal information protection risks in open government data. By contrast, a "process-based approach" fits the government data lifecycle, the proceduralisation of personal information protection, and whole-process data security management, and can remedy the shortcomings of the result-based approach. By distributing the precautionary principle and procedural, technical, economic, educational, and legal measures across every stage of the open government data lifecycle, personal information protection risks can be minimised and a dynamic balance achieved between personal information protection and open government data.

17.
The opportunity to use extensive genetic data, personal information, and family medical history for research purposes may be naturally appealing to the personal genetic testing (PGT) industry, which is already coupling direct-to-consumer (DTC) products with social networking technologies, as well as to potential industry or institutional partners. This article evaluates the transformation in research that the hybrid of PGT and social networking will bring about and, highlighting the challenges associated with a new paradigm of "patient-driven" genomic research, focuses on the consequences of shifting the structure, locus, timing, and scope of research through genetic crowd-sourcing. This article also explores potential ethical, legal, and regulatory issues that arise from the hybrid of personal genomic research and online social networking, particularly regarding informed consent, institutional review board (IRB) oversight, and ownership/intellectual property (IP) considerations.

18.
The Civil Code recognises personal information as a type of personality interest, establishing, in both theory and legislation, a private-law dimension to the protection of personal information in China. The protection of individual rights and interests has thus become an important dimension and thread for constructing and understanding personal information protection. Because the public goals and functions of personal information protection may be overshadowed or dissolved by an approach centred on the protection of private individual interests, social risk control should also be treated as an important dimension of personal information protection. Social risk control has always been a foundational purpose of personal data protection in the electronic age, and it has strong explanatory power and a dynamic constructive role for the relevant theories and institutions of personal information protection. The social risk control approach and the individual rights protection approach diverge on fundamental questions, such as the basic relationship between personal information and privacy, general versus context-specific protection, and the relationship between the underlying right and the protective right. In implementing the Personal Information Protection Law, the social risk control approach helps to interpret and apply the law reasonably, to match the magnitude of risk with appropriate control measures, and, while balancing the relevant legislative values, to release the flow of information.

19.
In two recent judgments, the Court of Justice of the European Union stated that ‘The right to the protection of personal data is not, however, an absolute right, but must be considered in relation to its function in society’ (Eifert, para 48). This paper considers the ‘non-absolute’ nature of the right to data protection. As a relatively new right, its boundaries in the Charter are still somewhat unexplored. This paper considers five aspects that can be seen as setting boundaries to the otherwise absolute nature of the right to data protection: (a) consideration of the function of the right to data protection in society; (b) positive delimitations of the right that follow from its formulation in Article 8 of the Charter; (c) limitations on the right provided for in Article 52 of the Charter; (d) the close connection with Article 7 of the Charter and Article 8 ECHR; and (e) the detailed provisions of current data protection secondary legislation and the future data protection regulatory framework. Based on reflections on each of these boundary-setting aspects, the paper argues that, in spite of occasional vagueness and conflicting approaches within each of these aspects, understanding of the right to data protection has evolved since its first formulation in the Charter. There is a subtle and gradual distancing from the initial understanding of a close relationship with the right to private and family life. This gradual distancing is a positive development, as the two rights have different foundations, scopes and purposes. Yet it is only when both are taken together that the shared common objective of providing effective protection to citizens' personal and family life can be achieved.

20.
On 26 July 2017, the Grand Chamber of the European Court of Justice rendered its seminal Opinion 1/15 on the agreement on Passenger Name Record (PNR) data between the EU and Canada. The Grand Chamber considered that the Council decision on the conclusion, on behalf of the Union, of the agreement between the EU and Canada on the transfer and processing of PNR data must be based jointly on Article 16(2) on the protection of personal data and Article 87(2)(a) on police co-operation among member states in criminal matters of the Treaty on the Functioning of the EU, but not on Article 82(1)(d) on judicial co-operation in criminal matters in the EU. The Grand Chamber also considered that the agreement is incompatible with Article 7 on the right to respect for private life, Article 8 on the right to the protection of personal data, Article 21 on non-discrimination and Article 52(1) on the principle of proportionality of the Charter of Fundamental Rights of the EU, since it does not preclude the transfer, use and retention of sensitive data. In addition to the requirement to exclude such data, the Grand Chamber listed seven requirements that the agreement must include, specify, limit or guarantee in order to be compatible with the Charter. The opinion of the Grand Chamber has far-reaching implications for the agreement on PNR data between the EU and Canada. It also has far-reaching implications for international agreements on PNR data between the EU and other third states. Last, it has far-reaching implications for Directive (EU) 2016/681 of 27 April 2016 on PNR data.
