Similar Documents
20 similar documents found.
1.
Privacy by Design is now enjoying widespread acceptance. The EU has recently expressly included it as one of the key principles in the revised data protection legal framework. But how do Privacy by Design and data anonymisation work in practice? In this article the authors address this question from a practical point of view by analysing a case study on EU Financial Intelligence Units (“FIUs”) using the Ma3tch technology as an additional feature to the existing exchange of information via the FIU.NET decentralised computer network. They present, analyse, and evaluate Ma3tch technology from the perspective of personal data protection. The authors conclude that Ma3tch technology can be seen as a valuable example of Privacy by Design. It achieves data anonymisation and enhances data minimisation and data security, which are the fundamental elements of Privacy by Design. Therefore, it may not only improve the exchange of information among FIUs and allow the data processing to be in line with applicable data protection requirements, but it may also substantially contribute to the protection of privacy of the related data subjects. At the same time, the case study clearly shows that Privacy by Design needs to be supported and complemented by appropriate organisational and technical procedures to ensure that the technology solutions devised to protect privacy in fact do so.
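The abstract does not disclose Ma3tch's internal algorithm, so the following sketch is only an assumed illustration of the general idea under evaluation: counterpart FIUs can detect that they hold data on the same subject by exchanging salted, keyed hashes instead of raw identifiers. The identifier format, the shared salt and the helper names are hypothetical, not part of FIU.NET.

```python
import hashlib
import hmac

def anonymise(identifier: str, shared_salt: bytes) -> str:
    """Map a raw identifier (e.g. name plus date of birth) to an irreversible
    token with a keyed hash; only the token leaves the originating FIU."""
    normalised = identifier.strip().lower().encode("utf-8")
    return hmac.new(shared_salt, normalised, hashlib.sha256).hexdigest()

def build_filter(local_subjects: list[str], shared_salt: bytes) -> set[str]:
    """Each FIU publishes only a set of anonymised tokens (data minimisation)."""
    return {anonymise(s, shared_salt) for s in local_subjects}

def find_hits(my_subjects: list[str], remote_filter: set[str], shared_salt: bytes) -> list[str]:
    """A hit tells the querying FIU that a counterpart holds data on the same
    subject, so a targeted information request can follow."""
    return [s for s in my_subjects if anonymise(s, shared_salt) in remote_filter]

if __name__ == "__main__":
    salt = b"example-shared-secret"                      # hypothetical shared key
    fiu_a = ["jane doe 1980-01-01", "john smith 1975-05-05"]
    fiu_b = ["john smith 1975-05-05", "ann lee 1990-09-09"]
    print(find_hits(fiu_a, build_filter(fiu_b, salt), salt))  # one overlapping subject
```

Whether such tokens count as anonymised or merely pseudonymised data depends on how the key is managed and how easily tokens can be re-identified, which is precisely the kind of organisational and technical safeguard the authors say must accompany the technology.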

2.
The EU General Data Protection Regulation (GDPR) devotes particular attention to the protection of personal data of children. The rationale is that children are less aware of the risks and the potential consequences of the processing of their personal data on their rights. Yet, the text of the GDPR offers little clarity as to the actual implementation and impact of a number of provisions that may significantly affect children and their rights, leading to legal uncertainty for data controllers, parents and children. This uncertainty relates for instance to the age of consent for processing children's data in relation to information society services, the technical requirements regarding parental consent in that regard, the interpretation of the extent to which profiling of children is allowed and the level of transparency that is required vis-à-vis children. This article aims to identify a number of key issues and questions – both theoretical and practical – that raise concerns from a multi-dimensional children's rights perspective, and to clarify remaining ambiguities in the run-up to the actual application of the GDPR from 25 May 2018 onwards.

3.
The aim of this paper is to present tools that are used by financial institutions to implement legal requirements with regard to anti-money laundering (AML) and combating the financing of terrorism (CFT). The fight against money laundering and terrorist financing is a two-stage procedure, where state authorities outsource a major part of their tasks and responsibilities to private entities. The bulk of the task falls to financial institutions, which have to monitor the transactions of their customers pursuant to international, European and national laws as well as internal policies. To comply with the legal requirements, they use sophisticated surveillance software and collect large amounts of personal data. The European Union introduced two Proposals (COM(2013) 45 final and COM(2013) 44 final) to update the European Union's AML/CFT regime. Even though the Proposals have not yet become legal acts, it is quite obvious that new tools to support financial institutions will emerge. The paper examines the amendments foreseen by the Proposals, as well as already existing tools that will remain relevant, and discusses the related privacy considerations.

4.
This article examines the two major international data transfer schemes in existence today – the European Union (EU) model which at present is effectively the General Data Protection Regulation (GDPR), and the Asia-Pacific Economic Cooperation (APEC) Cross Border Privacy Rules system (CBPR), in the context of the Internet of Things (IoT). While IoT data ostensibly relates to things, i.e. products and services, it impacts individuals and their data protection and privacy rights, and raises compliance issues for corporations especially in relation to international data flows. The GDPR regulates the processing of personal data of individuals who are EU data subjects including cross border data transfers. As an EU Regulation, the GDPR applies directly as law to EU member nations. The GDPR also has extensive extraterritorial provisions that apply to processing of personal data outside the EU regardless of place of incorporation and geographical area of operation of the data controller/processor. There are a number of ways that the GDPR enables lawful international transfer of personal data including schemes that are broadly similar to APEC CBPR. APEC CBPR is the other major regional framework regulating transfer of personal data between APEC member nations. It is essentially a voluntary accountability scheme that initially requires acceptance at country level, followed by independent certification by an accountability agent of the organization wishing to join the scheme. APEC CBPR is viewed by many in the United States of America (US) as preferable to the EU approach because CBPR is considered more conducive to business than its counterpart schemes under the GDPR, and therefore is regarded as the scheme most likely to prevail. While there are broad areas of similarity between the EU and APEC approaches to data protection in the context of cross border data transfer, there are also substantial differences. This paper considers the similarities and major differences, and the overall suitability of the two models for the era of the Internet of Things (IoT) in which large amounts of personal data are processed on an on-going basis from connected devices around the world. This is the first time the APEC and GDPR cross-border data schemes have been compared in this way. The paper concludes with the author expressing a view as to which scheme is likely to set the global standard.

5.
This article uses the example of the cryptocurrency Bitcoin and the General Data Protection Regulation (GDPR) to show how distributed networks challenge existing legal mechanisms of allocating responsibility. The Bitcoin network stores personal data by automated means. Furthermore, full nodes qualify as establishments and the network offers a service to citizens in the EU. The data processing within the Bitcoin network therefore falls into the material and territorial scope of the GDPR. To protect data subjects, the GDPR allocates responsibility to the controller, who determines the ‘how’ and the ‘why’ of the data processing. However, the distributed structure of the Bitcoin network blurs the lines between actors who are responsible and actors who are worth protecting. Neither the Bitcoin users running lightweight nodes or full nodes nor the miners determine the ‘how’ and the ‘why’ of the data processing. They carry out their network activities according to the Bitcoin protocol, which can only be adopted and enforced by a collective of full nodes and miners. Members of this collective are joint controllers under Article 26 GDPR, which obliges them to clearly and transparently determine their respective responsibilities for compliance with the GDPR. However, this mechanism fails because of the very structure it aims to eliminate. Therefore, a solution to allocating responsibility for data protection in distributed networks lies outside the GDPR.
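To make the technical premise concrete, the sketch below (a simplification, not the real Bitcoin address scheme, which uses SHA-256 plus RIPEMD-160 and Base58Check or bech32 encoding) shows how an address acts as a pseudonym derived from a user's public key and how every full node ends up holding an identical copy of the transaction records, which is why no single node determines the purposes and means of that processing.

```python
import hashlib
from dataclasses import dataclass

def toy_address(public_key_hex: str) -> str:
    """Simplified stand-in for Bitcoin address derivation: the address is a
    hash of the public key, i.e. a pseudonym rather than anonymous data."""
    first = hashlib.sha256(bytes.fromhex(public_key_hex)).digest()
    return hashlib.sha256(first).hexdigest()[:40]

@dataclass(frozen=True)
class TxRecord:
    sender: str        # pseudonymous address
    receiver: str      # pseudonymous address
    amount_btc: float
    block_time: int    # Unix timestamp

def broadcast(tx: TxRecord, node_ledgers: list[list[TxRecord]]) -> None:
    """Automated processing: every full node appends the same record, yet no
    individual node decides the 'how' and 'why' of that processing."""
    for ledger in node_ledgers:
        ledger.append(tx)

if __name__ == "__main__":
    alice = toy_address("02" + "11" * 32)        # hypothetical compressed public key
    bob = toy_address("03" + "22" * 32)
    nodes: list[list[TxRecord]] = [[], [], []]   # three full nodes
    broadcast(TxRecord(alice, bob, 0.5, 1700000000), nodes)
    print(all(len(ledger) == 1 for ledger in nodes))  # True: identical replication
```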

6.
This article reports on preliminary findings and recommendations of a cross-discipline project to accelerate international business-to-business automated sharing of cyber-threat intelligence, particularly IP addresses. The article outlines the project and its objectives and the importance of determining whether IP addresses can be lawfully shared as cyber threat intelligence. The goal of the project is to enhance cyber-threat intelligence sharing throughout the cyber ecosystem. The findings and recommendations from this project enable businesses to navigate the international legal environment and develop their policy and procedures to enable timely, effective and legal sharing of cyber-threat information. The project is the first of its kind in the world. It is unique in both focus and scope. Unlike the cyber-threat information sharing reviews and initiatives being developed at country and regional levels, the focus of this project and this article is on business-to-business sharing. The scope of this project in terms of the 34 jurisdictions reviewed as to their data protection requirements is more comprehensive than any similar study to date. This article focuses on the sharing of IP addresses as cyber threat intelligence in the context of the new European Union (EU) data protection initiatives agreed in December 2015 and formally adopted by the European Council and Parliament in April 2016. The new EU General Data Protection Regulation (GDPR) applies to EU member countries, a major focus of the international cyber threat sharing project. The research also reveals that EU data protection requirements, particularly the currently applicable law of the Data Protection Directive 95/46/EC (1995 Directive) (the rules of which the GDPR will replace in practice in 2018), generally form the basis of current data protection requirements in countries outside Europe. It is expected that this influence will continue and that the GDPR will shape the development of data protection internationally. In this article, the authors examine whether static and dynamic IP addresses are “personal data” as defined in the GDPR and its predecessor the 1995 Directive that is currently the model for data protection in many jurisdictions outside Europe. The authors then consider whether sharing of that data by a business without the consent of the data subject can be justified in the public interest so as to override individual rights under Articles 7 and 8(1) of the Charter of Fundamental Rights of the European Union, which underpin EU data protection. The analysis shows that the sharing of cyber threat intelligence is in the public interest so as to override the rights of a data subject, as long as it is carried out in ways that are strictly necessary in order to achieve security objectives. The article concludes by summarizing the project findings to date, and how they inform international sharing of cyber-threat intelligence within the private sector.
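The legal analysis turns on what is actually shared. As a hedged illustration (the record fields and the necessity check are assumptions, not the project's published schema or a STIX profile), the sketch below builds a business-to-business indicator trimmed to what is strictly necessary for the security objective, the condition the authors identify for overriding data-subject rights.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import ipaddress
import json

@dataclass
class IpIndicator:
    ip: str            # the IP address itself: may be personal data under the GDPR
    threat_type: str   # e.g. "c2", "scanning", "phishing"
    first_seen: str    # ISO 8601 timestamp
    confidence: int    # 0-100, producer's assessment

def make_indicator(raw_ip: str, threat_type: str, confidence: int) -> IpIndicator:
    """Validate the address and record only what is strictly necessary for the
    security objective: no subscriber names, no payloads, no unrelated metadata."""
    ipaddress.ip_address(raw_ip)  # raises ValueError if malformed
    return IpIndicator(
        ip=raw_ip,
        threat_type=threat_type,
        first_seen=datetime.now(timezone.utc).isoformat(timespec="seconds"),
        confidence=confidence,
    )

if __name__ == "__main__":
    indicator = make_indicator("203.0.113.42", "c2", 85)   # documentation-range IP
    print(json.dumps(asdict(indicator)))                   # payload a peer would receive
```

Keeping the record to the address, threat context, timestamp and confidence, and leaving out subscriber details or captured traffic, is one concrete way to evidence the "strictly necessary" test the article describes.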

7.
This paper aims to critically assess the information duties set out in the General Data Protection Regulation (GDPR) and national adaptations when the purpose of processing is scientific research. Due to the peculiarities of the legal regime applicable to the research context, information about the processing plays a crucial role for data subjects. However, the analysis points out that the information obligations, or mandated disclosures, introduced in the GDPR are not entirely satisfying and present some flaws. In addition, the GDPR information duties risk suffering from the same shortcomings usually addressed in the literature about mandated disclosures. The paper argues that the principle of transparency, developed as a “user-centric” concept, can support the adoption of solutions that embed behavioural insights and thus better serve the rationale of the information provision.

8.
The final inclusion of an "in accordance with the Constitution" clause in the Personal Information Protection Law (《个人信息保护法》) signals a shift in the underlying logic of the legal system for personal information protection. The civil-law principle of differentiated protection of rights and interests is difficult to apply to the constitutional legal order as a whole. The right to personal information should be established as a fundamental right of constitutional rank, and the private-law and public-law mechanisms of personal information protection should be coordinated on the basis of the doctrine that fundamental rights operate both as subjective defensive rights against the state and as an objective order of values radiating into every field of law. By interpreting the right to freedom and privacy of correspondence and the human dignity clause under the umbrella of the human rights clause, a right to personal information in the nature of a "bundle of fundamental rights" can be justified doctrinally. Its concrete protection, however, should be allocated to different fundamental-rights provisions, yielding a differentiated, multi-layered structure. The proprietary-control conception of personal information protection has its limits, most visibly in the decline of the notice-and-consent model. The normative goal of the right to personal information should be reoriented towards the free development of personality, understood as freedom from interference with one's personality by others. This shift from a right of control to a right of personality development helps to regulate the improper use of information already collected, break down "information cocoons", ease the tension between the protection and the use of personal information, and effectively protect individual self-determination within the tripartite "individual-platform-state" relationship, while leaving room for the data industry to develop.

9.
Precision and effectiveness of Artificial Intelligence (AI) models are highly dependent on the availability of genuine, relevant, and representative training data. AI systems tested and validated on poor-quality datasets can produce inaccurate, erroneous, skewed, or harmful outcomes (actions, behaviors, or decisions), with far-reaching effects on individuals' rights and freedoms. Appropriate data governance for AI development poses manifold regulatory challenges, especially regarding personal data protection. An area of concern is compliance with rules for lawful collection and processing of personal data, which implies, inter alia, that using databases for AI design and development should be based on a clear and precise legal ground: the prior consent of the data subject or another specific valid legal basis. In the face of this challenge, the European Union's personal data protection legal framework does not provide a preferred, one-size-fits-all answer, and the best option will depend on the circumstances of each case. Although there is no hierarchy among the different legal bases for data processing, in doubtful cases, consent is generally understood by data controllers as a preferred or default choice for lawful data processing. Notwithstanding this perception, obtaining data subjects' consent is not without drawbacks for AI developers or AI-data controllers, as they must meet (and demonstrate) various requirements for the validity of consent. As a result, data subjects' consent may not be a suitable and realistic option to serve AI development purposes. In view of this, it is necessary to explore the possibility of basing this type of personal data processing on lawful grounds other than the data subject's consent, specifically, the legitimate interest of the data controller or third parties. Given its features, the legitimate interest basis could help to meet the challenge of quality, quantity, and relevance of data curation for AI training. The aim of this article is to provide an initial conceptual approach to support the debate about data governance for AI development in the European Union (EU), as well as in non-EU jurisdictions with European-like data protection laws. Based on the rules set by the EU General Data Protection Regulation (GDPR), this paper starts by referring to the relevance of adequate data curation and processing for designing trustworthy AI systems, followed by a legal analysis and conceptualization of some difficulties data controllers face for lawful processing of personal data. After reflecting on the legal standards for obtaining the data subject's valid consent, the paper argues that legitimate interests (if certain criteria are met) may better match the purpose of building AI training datasets.

10.
The EU faces substantive legislative reform in data protection, specifically in the form of the General Data Protection Regulation (GDPR). One of the new elements in the GDPR is its call to establish data protection certification mechanisms, data protection seals and marks to help enhance transparency and compliance with the Regulation and allow data subjects to quickly assess the level of data protection of relevant products and services. To this effect, it is necessary to review privacy and data protection seals afresh and determine how data protection certification mechanisms, seals or marks might work given the role they will be called to play, particularly in Europe, in facilitating data protection. This article reviews the current state of play of privacy seals, the EU policy and regulatory thrusts for privacy and data protection certification, and the GDPR provisions on certification of the processing of personal data. The GDPR leaves substantial room for various options on data protection certification, which might play out in various ways, some of which are explored in this article.

11.
This article offers an interdisciplinary analysis of the General Data Protection Regulation (GDPR) in the context of electronic identification schemes. Gov.UK Verify, the UK Government's electronic identification scheme, and its compatibility with some important aspects of EU data protection law are reviewed. An in-depth examination of Gov.UK Verify's architecture and the most significant constituent elements of both the Data Protection Directive and the imminent GDPR – notably the legitimising grounds for the processing of personal data and the doctrine of joint controllership – highlights several flaws inherent in Gov.UK Verify's development and mode of operation. This article advances the argument that Gov.UK Verify is incompatible with some major substantive provisions of the EU Data Protection Framework. It also provides some general insight as to how to interpret the requirement of a legitimate legal basis and the doctrine of joint controllership. It ultimately suggests that the choice of the appropriate legal basis should depend upon a holistic approach to the relationship between the actors involved in the processing activities.

12.
The year 2017 has seen many EU and UK legislative initiatives and proposals to consider and address the impact of artificial intelligence on society, covering questions of liability, legal personality and other ethical and legal issues, including in the context of data processing. In March 2017, the Information Commissioner's Office (UK) updated its big data guidance to address the development of artificial intelligence and machine learning, and to provide guidance in the light of the General Data Protection Regulation (GDPR), which will apply from 25 May 2018. This paper situates the ICO's guidance in the context of wider legal and ethical considerations and provides a critique of the position adopted by the ICO. On the ICO's analysis, the key challenge for artificial intelligence processing personal data is in establishing that such processing is fair. This shift reflects the potential for artificial intelligence to have negative social consequences (whether intended or unintended) that are not otherwise addressed by the GDPR. The question of ‘fairness’ is an important one for addressing the imbalance between big data organisations and individual data subjects, with a number of ethical and social impacts that need to be evaluated.

13.
The paper illustrates obligations emerging under Articles 9 and 89 of EU Regulation 2016/679 (the General Data Protection Regulation, hereinafter “GDPR”) in the context of health-related data processing for research purposes. Furthermore, through a comparative analysis of the national implementations of the GDPR on the topic, the paper highlights a few practical issues that researchers may face while complying with the GDPR obligations and other ethical requirements. The results of the analyses allow a model to be built for achieving an acceptable standard of accountability in health-related data research. The legal remarks are framed within the myth of Ulysses.

14.
Although the protection of personal data is harmonized within the EU by Directive 95/46/EC and will be further harmonized by the General Data Protection Regulation (GDPR) in 2018, there are significant differences in the ways in which EU member states implemented the protection of privacy and personal data in national laws, policies, and practices. This paper presents the main findings of a research project that compares the protection of privacy and personal data in eight EU member states: France, Germany, the UK, Ireland, Romania, Italy, Sweden, and the Netherlands. The comparison focuses on five major themes: awareness and trust, government policies for personal data protection, the applicable laws and regulations, implementation of those laws and regulations, and supervision and enforcement. The comparison of privacy and data protection regimes across the EU shows some remarkable findings, revealing which countries are frontrunners and which countries are lagging behind on specific aspects. For instance, the roles of and interplay between governments, civil rights organizations, and data protection authorities vary from country to country. Furthermore, with regard to privacy and data protection there are differences in the intensity and scope of political debates, information campaigns, media attention, and public debate. New concepts like privacy impact assessments, privacy by design, data breach notifications and big data are on the agenda in some but not in all countries. Significant differences exist in (the levels of) enforcement by the different data protection authorities, due to different legal competencies, available budgets and personnel, policies, and cultural factors.

15.
Article 35 of the GDPR introduces the legal obligation to perform DPIAs in cases where the processing operations are likely to present high risks to the rights and freedoms of natural persons. This obligation is part of a change of approach in the GDPR towards a modified compliance scheme in terms of a reinforced principle of accountability. The DPIA is a prominent example of this approach given that it has an inclusive, comprehensive and proactive nature. Its importance lies in the fact that it forces data controllers to identify, assess and ultimately manage the high risks to the rights and freedoms. However, what is first and foremost important for a meaningful performance of DPIAs is to have a common and objective understanding of what constitutes a risk in the field of data protection and of how to assess its likelihood and severity. The legislature has approached these concepts via the method of denotation, that is, by giving examples of (highly) risky processing operations. This article suggests a complementary approach, the connotation of these concepts, and explains the added value of such a method. By way of a case study, the article also demonstrates the importance of performing complete and accurate DPIAs, in terms of contributing to improving the protection of personal data.
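Article 35 does not prescribe a scoring formula; a common way to operationalise "likelihood and severity", assumed here purely for illustration rather than taken from the Regulation or regulatory guidance, is a simple risk matrix in which a high combined score triggers mitigation and, where high residual risk remains, prior consultation with the supervisory authority under Article 36.

```python
from enum import IntEnum

class Likelihood(IntEnum):
    REMOTE = 1
    POSSIBLE = 2
    LIKELY = 3

class Severity(IntEnum):
    LIMITED = 1
    SIGNIFICANT = 2
    MAXIMUM = 3

def risk_level(likelihood: Likelihood, severity: Severity) -> str:
    """Toy likelihood x severity matrix; the thresholds are illustrative,
    not drawn from the GDPR or supervisory-authority guidance."""
    score = likelihood * severity
    if score >= 6:
        return "high"      # mitigate, document residual risk, consider Art. 36 consultation
    if score >= 3:
        return "medium"    # mitigate and re-assess
    return "low"           # document and monitor

if __name__ == "__main__":
    # Example: large-scale profiling where unauthorised disclosure is possible
    print(risk_level(Likelihood.POSSIBLE, Severity.MAXIMUM))  # "high"
```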

16.
This article explores existing data protection law provisions in the EU and in six other jurisdictions from around the world - with a focus on Latin America - that apply to at least some forms of the processing of data typically part of an Artificial Intelligence (AI) system. In particular, the article analyzes how data protection law applies to “automated decision-making” (ADM), starting from the relevant provisions of the EU's General Data Protection Regulation (GDPR). Rather than being a conceptual exploration of what constitutes ADM and how “AI systems” are defined by current legislative initiatives, the article proposes a targeted approach that focuses strictly on ADM and how data protection law already applies to it in real-life cases. First, the article shows how GDPR provisions have been enforced in courts and by Data Protection Authorities (DPAs) in the EU in numerous cases where ADM is at the core of the facts considered. After showing that the safeguards in the GDPR already apply to ADM in real-life cases, even where ADM does not meet the high threshold in its specialized provision in Article 22 (“solely” ADM which results in “legal or similarly significant effects” on individuals), the article includes a brief comparative law analysis of six jurisdictions that have adopted general data protection laws (Brazil, Mexico, Argentina, Colombia, China and South Africa) and that are visibly inspired by GDPR provisions or its predecessor, Directive 95/46/EC, including those that are relevant for ADM. The ultimate goal of this study is to support researchers, policymakers and lawmakers in understanding how existing data protection law applies to ADM and profiling.

17.
The General Data Protection Regulation (GDPR) will come into force in the European Union (EU) in May 2018 to meet current challenges related to personal data protection and to harmonise data protection across the EU. Although the GDPR is anticipated to benefit companies by offering consistency in data protection activities and liabilities across the EU countries and by enabling more integrated EU-wide data protection policies, it poses new challenges to companies. They are not necessarily prepared for the changes and may lack awareness of the upcoming requirements and the GDPR's coercive measures. The implementation of the GDPR requirements demands substantial financial and human resources, as well as training of employees; hence, companies need guidance to support them in this transition. The purposes of this study were to compare the current Data Protection Directive 95/46/EC with the GDPR by systematically analysing their differences and to identify the GDPR's practical implications, specifically for companies that provide services based on personal data. This study aimed to identify and discuss the changes introduced by the GDPR that would have the most practical relevance to these companies and possibly affect their data management and usage practices. Therefore, a review and a thematic analysis and synthesis of the article-level changes were carried out. Through the analysis, the key practical implications of the changes were identified and classified. As a synthesis of the results, a framework was developed, presenting 12 aspects of these implications and the corresponding guidance on how to prepare for the new requirements. These aspects cover business strategies and practices, as well as organisational and technical measures.

18.
Against the common perception of data protection as a road-block, we demonstrate that the GDPR can work as a research enabler. The study shows that European data protection law's regulatory pillars, the first related to the protection of the fundamental right to data protection and the second to the promotion of the free flow of personal data, result in an architecture of layered data protection regimes, which tighten or relax data subjects' rights and data protection safeguards depending on whether the processing activities are grounded in public or merely economic interests. Each of the identified data protection regimes shapes different “enabling regulatory spots” for the processing of sensitive personal data for research purposes.

19.
The processing of personal data across national borders by both governments and the private sector has increased exponentially in recent years, as has the need for legal protections for personal data. This article examines calls for a global legal framework for data protection, and in particular suggestions that have been made in this regard by the International Law Commission and various national data protection authorities. It first examines the scope of a potential legal framework, and proceeds to analyze the status of data protection in international law. The article then considers the various options through which an international framework could be enacted, before drawing some conclusions about the form and scope such a framework could take, the institutions that could coordinate the work on it, and whether the time is ripe for a multinational convention on data protection.

20.
The rise of biometric data use in personal consumer objects and governmental (surveillance) applications is irreversible. This article analyses the latest attempt by the General Data Protection Regulation (EU) 2016/679 and the Directive (EU) 2016/680 to regulate biometric data use in the European Union. We argue that, by making an artificial distinction between various categories of biometric data, the new Regulation fails to provide the clear rules and protection that are much needed out of respect for fundamental rights and freedoms. This distinction neglects the case law of the European Court of Human Rights and serves the interests of large (governmental) databases. While we support the regulation of use and the GDPR's general prohibition on using biometric data for identification, we regret this limited, subjective and use-based approach. We argue that the collection, storage and retention of biometric images in databases should also be tackled (an objective approach). We further argue that, based on the distinctions made in the GDPR, several categories of personal data relating to physical, physiological or behavioural characteristics emerge, to which different regimes apply. Member States are left to adopt or modify their more specific national rules, which are eagerly awaited. We contend that the complex legal framework risks posing headaches for bona fide companies deploying biometric data for multifactor authentication and that the new legal regime does not reach its goal of striking a balance between the free movement of such data and the protection of citizens. Law enforcement authorities also need clear guidance, and it is questionable whether Directive (EU) 2016/680 provides this.
