Similar Articles
 20 similar articles found (search time: 15 ms)
1.
The paper illustrates the obligations emerging under Articles 9 and 89 of EU Regulation 2016/679 (the General Data Protection Regulation, hereinafter “GDPR”) within health-related data processing for research purposes. Furthermore, through a comparative analysis of the national implementations of the GDPR on the topic, the paper highlights a few practical issues that researchers might face while fulfilling GDPR obligations and other ethical requirements. The results of the analysis allow the construction of a model for achieving an acceptable standard of accountability in health-related data research. The legal remarks are framed within the myth of Ulysses.

2.
The paper examines how the EU General Data Protection Regulation (GDPR) applies to the development of AI products and services, drawing attention to the differences between academic and commercial research. The GDPR aims to encourage innovation by providing several exemptions from its strict rules for scientific research. Still, the GDPR defines scientific research broadly, encompassing both academic and commercial research. However, corporations conducting commercial research might not have in place the same level of ethical and institutional safeguards as academic researchers. Furthermore, corporate secrecy and opaque algorithms in AI research might pose barriers to oversight. The aim of this paper is to stress the limits of the GDPR research exemption and to find the proper balance between privacy and innovation. The paper argues that commercial AI research should not benefit from the GDPR research exemption unless it serves a public interest and has safeguards similar to those of academic research, such as review by research ethics committees. Since the GDPR provides this broad exemption, it is crucial to clarify the limits and requirements of scientific research before the application of AI drastically transforms this field.

3.
Janet Gilboy, Law & Policy, 1998, 20(2): 135–155
This paper focuses on government's enlistment of private entities (industries, businesses, professionals) to help in public enforcement. Despite the popularity of this enforcement strategy, there is little systematic research about the behavior of private enforcers in these systems. This article draws together the widely scattered popular and scholarly literature to examine: (a) the settings in which this strategy occurs, (b) its common features, and (c) the responses of private parties to their compelled participation. I then examine existing explanations for private parties' responses. Very little empirical research has been published. I argue that what exists suggests limitations of a conventional cost-benefit approach to explaining noncompliance, as well as highlights the need for a broader perspective that considers how cultural factors may influence private enforcers' responses to their mandated duties.

4.
This article uses the example of the cryptocurrency Bitcoin and the General Data Protection Regulation (GDPR) to show how distributed networks challenge existing legal mechanisms of allocating responsibility. The Bitcoin network stores personal data by automated means. Furthermore, full nodes qualify as establishments and the network offers a service to citizens in the EU. The data processing within the Bitcoin network therefore falls into the material and territorial scope of the GDPR. To protect data subjects, the GDPR allocates responsibility to the controller, who determines the ‘how’ and the ‘why’ of the data processing. However, the distributed structure of the Bitcoin network blurs the lines between actors who are responsible and actors who are worth protecting. Neither the Bitcoin users running lightweight nodes or full nodes nor the miners determine the ‘how’ and the ‘why’ of the data processing. They carry out their network activities according to the Bitcoin protocol, which can only be adopted and enforced by a collective of full nodes and miners. Members of this collective are joint controllers under Article 26 GDPR, which obliges them to clearly and transparently determine their respective responsibilities for compliance with the GDPR. However, this mechanism fails because of the very structure it aims to eliminate. Therefore, a solution to allocating responsibility for data protection in distributed networks lies outside the GDPR.

5.
In the United States, there is an ongoing debate about requiring health care professionals to report intimate partner violence (IPV) to law enforcement agencies. A comprehensive examination of the perspectives of those required to report abuse is critical, as their roles as mandated reporters often pose legal, practical, moral, and ethical questions. Even so, the perspective of health care professionals who are required to report is often overlooked, and research is scarce on mandated reporters who work outside of clinical settings, such as nurses who engage in home visitation with clients. The purpose of this study was to examine nurse home visitors' perspectives regarding the mandatory reporting of IPV, specifically focusing on their attitudes toward reporting, perceived awareness of reporting requirements, and intended reporting behaviors. A web-based survey was administered to nurses in the Nurse-Family Partnership home visitation program across the United States. A total of 532 completed surveys were returned (response rate = 49%). In terms of support for reporting IPV, 40% of nurses indicated that they should "always" be required to report. Almost half of the sample indicated that they would report a case of IPV, yet less than one-third of participants were aware of a legal mandate. Attitudes and support toward reporting as well as the perception of a reporting requirement significantly predicted intention to report. Furthermore, 29% of participants did not know if they were required to report IPV perpetrated against their clients. Comprehensive information about mandatory reporting duties is needed for health care professionals in home visitation settings. The findings of the current study highlight the need to reduce variation among practitioners and establish consistent program practices that are grounded in the program's principles, supported by existing research, and compliant with existing state policies.

6.
This article examines the intersection of the GDPR and selected due process requirements in the context of automated administrative decision-making. It finds that the safeguards for decisions based solely on automated data processing provided by the GDPR coincide with or serve a comparable function to traditional administrative due process elements such as the duty to give reasons, the duty of care principle, and the right to a hearing. The automation of decision-making by public authorities across the EU will therefore be regulated by an overlap of national administrative procedures and the GDPR. This overlap, however, leads to a paradoxical problem: on the one hand, the GDPR is an inflexible legal instrument aimed at setting out in detail the rights of data subjects and the obligations of data controllers, and it does not offer national legislators much room to align its terms with national administrative procedures. On the other, the GDPR's broad language makes it susceptible to interpretations embedded in the elaborated practices of the national administrations. The unclear relationship between national administrative procedures and the GDPR may undermine its main purpose – to establish an equal level of protection in all EU Member States through its ‘consistent and homogenous application’. After outlining the main challenges in this regard, the article concludes with a call for further research and regulatory framework adjustments aimed at developing a better governance regime for automated administrative decision-making that would allow for embracing technological progress while keeping threats to individual rights in check.

7.
Financial Intelligence Units (FIUs) are key players in the current Anti-Money Laundering and Countering the Financing of Terrorism (AML/CFT) legal system. FIUs are specialised bodies positioned between private financial institutions and states’ law enforcement authorities, which renders them a crucial middle link in the chain of information exchange between the private and public sectors. Considering that a large share of this information is personal data, its processing must meet the minimum data protection standards. Yet, the EU data protection legal framework is composed of two main instruments, i.e. the General Data Protection Regulation (GDPR) and the Law Enforcement Directive (LED), which provide different thresholds for the protection of personal data. The aim of this paper is to clarify the applicable data protection legal regime for the processing of personal data by FIUs for AML/CFT purposes. To that end, the paper provides an overview of the nature and goals of AML/CFT policy and discusses the problem of the diversity of existing FIU models. Further, it proposes a number of arguments in favour of and against the possibility of applying either the GDPR or the LED to the processing of personal data by FIUs, and reflects on how convincingly these arguments can be used depending on the specificities of a given FIU model.

8.
The EU General Data Protection Regulation (GDPR) devotes particular attention to the protection of personal data of children. The rationale is that children are less aware of the risks and the potential consequences of the processing of their personal data on their rights. Yet, the text of the GDPR offers little clarity as to the actual implementation and impact of a number of provisions that may significantly affect children and their rights, leading to legal uncertainty for data controllers, parents and children. This uncertainty relates for instance to the age of consent for processing children's data in relation to information society services, the technical requirements regarding parental consent in that regard, the interpretation of the extent to which profiling of children is allowed and the level of transparency that is required vis-à-vis children. This article aims to identify a number of key issues and questions – both theoretical and practical – that raise concerns from a multi-dimensional children's rights perspective, and to clarify remaining ambiguities in the run-up to the actual application of the GDPR from 25 May 2018 onwards.

9.
This article explores existing data protection law provisions in the EU and in six other jurisdictions from around the world - with a focus on Latin America - that apply to at least some forms of the processing of data typically part of an Artificial Intelligence (AI) system. In particular, the article analyzes how data protection law applies to “automated decision-making” (ADM), starting from the relevant provisions of the EU's General Data Protection Regulation (GDPR). Rather than being a conceptual exploration of what constitutes ADM and how “AI systems” are defined by current legislative initiatives, the article proposes a targeted approach that focuses strictly on ADM and how data protection law already applies to it in real-life cases. First, the article shows how GDPR provisions have been enforced in Courts and by Data Protection Authorities (DPAs) in the EU, in numerous cases where ADM is at the core of the facts of the case considered. After showing that the safeguards in the GDPR already apply to ADM in real-life cases, even where ADM does not meet the high threshold in its specialized provision in Article 22 (“solely” ADM which results in “legal or similarly significant effects” on individuals), the article includes a brief comparative law analysis of six jurisdictions that have adopted general data protection laws (Brazil, Mexico, Argentina, Colombia, China and South Africa) and that are visibly inspired by GDPR provisions or its predecessor, Directive 95/46/EC, including those that are relevant for ADM. The ultimate goal of this study is to support researchers, policymakers and lawmakers in understanding how existing data protection law applies to ADM and profiling.

10.
This article examines the two major international data transfer schemes in existence today – the European Union (EU) model, which at present is effectively the General Data Protection Regulation (GDPR), and the Asia-Pacific Economic Cooperation (APEC) Cross Border Privacy Rules system (CBPR) – in the context of the Internet of Things (IoT).

While IoT data ostensibly relates to things, i.e. products and services, it impacts individuals and their data protection and privacy rights, and raises compliance issues for corporations, especially in relation to international data flows. The GDPR regulates the processing of personal data of individuals who are EU data subjects, including cross-border data transfers. As an EU Regulation, the GDPR applies directly as law to EU member nations. The GDPR also has extensive extraterritorial provisions that apply to processing of personal data outside the EU regardless of the place of incorporation and geographical area of operation of the data controller/processor. There are a number of ways that the GDPR enables lawful international transfer of personal data, including schemes that are broadly similar to APEC CBPR.

APEC CBPR is the other major regional framework regulating transfer of personal data between APEC member nations. It is essentially a voluntary accountability scheme that initially requires acceptance at country level, followed by independent certification by an accountability agent of the organization wishing to join the scheme. APEC CBPR is viewed by many in the United States of America (US) as preferable to the EU approach because CBPR is considered more conducive to business than its counterpart schemes under the GDPR, and therefore is regarded as the scheme most likely to prevail.

While there are broad areas of similarity between the EU and APEC approaches to data protection in the context of cross-border data transfer, there are also substantial differences. This paper considers the similarities and major differences, and the overall suitability of the two models for the era of the Internet of Things (IoT), in which large amounts of personal data are processed on an ongoing basis from connected devices around the world. This is the first time the APEC and GDPR cross-border data schemes have been compared in this way. The paper concludes with the author expressing a view as to which scheme is likely to set the global standard.

11.
In the Internet of Things (IoT), identification and access control technologies provide essential infrastructure to link data between a user's devices with unique identities, and provide seamless and linked up services. At the same time, profiling methods based on linked records can reveal unexpected details about users' identity and private life, which can conflict with privacy rights and lead to economic, social, and other forms of discriminatory treatment. A balance must be struck between identification and access control required for the IoT to function and user rights to privacy and identity. Striking this balance is not an easy task because of weaknesses in cybersecurity and anonymisation techniques. The EU General Data Protection Regulation (GDPR), set to come into force in May 2018, may provide essential guidance to achieve a fair balance between the interests of IoT providers and users. Through a review of academic and policy literature, this paper maps the inherent tension between privacy and identifiability in the IoT. It focuses on four challenges: (1) profiling, inference, and discrimination; (2) control and context-sensitive sharing of identity; (3) consent and uncertainty; and (4) honesty, trust, and transparency. The paper will then examine the extent to which several standards defined in the GDPR will provide meaningful protection for privacy and control over identity for users of IoT. The paper concludes that in order to minimise the privacy impact of the conflicts between data protection principles and identification in the IoT, GDPR standards urgently require further specification and implementation into the design and deployment of IoT technologies.

12.
There has naturally been a good deal of discussion of the forthcoming General Data Protection Regulation. One issue of interest to all data controllers, and of particular concern for researchers, is whether the GDPR expands the scope of personal data through the introduction of the term ‘pseudonymisation’ in Article 4(5). If all data which have been ‘pseudonymised’ in the conventional sense of the word (e.g. key-coded) are to be treated as personal data, this would have serious implications for research. Administrative data research, which is carried out on data routinely collected and held by public authorities, would be particularly affected as the sharing of de-identified data could constitute the unconsented disclosure of identifiable information.

Instead, however, we argue that the definition of pseudonymisation in Article 4(5) GDPR will not expand the category of personal data, and that there is no intention that it should do so. The definition of pseudonymisation under the GDPR is not intended to determine whether data are personal data; indeed it is clear that all data falling within this definition are personal data. Rather, it is Recital 26 and its requirement of a ‘means reasonably likely to be used’ which remains the relevant test as to whether data are personal. This leaves open the possibility that data which have been ‘pseudonymised’ in the conventional sense of key-coding can still be rendered anonymous. There may also be circumstances in which data which have undergone pseudonymisation within one organisation could be anonymous for a third party. We explain how, with reference to the data environment factors as set out in the UK Anonymisation Network's Anonymisation Decision-Making Framework.
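Key-coding, the conventional pseudonymisation technique this abstract refers to, can be illustrated with a minimal sketch. The record fields, function name, and key-table design below are hypothetical assumptions for illustration, not drawn from the paper; the legal point is that whether the coded dataset remains personal data for a given recipient turns on whether that recipient has a means reasonably likely to be used to obtain the key.

```python
# Illustrative sketch of key-coding (conventional pseudonymisation): direct
# identifiers are replaced with random codes, and the code-to-identity key
# table is held separately from the research dataset by the data custodian.
import secrets

def key_code(records, id_field):
    key_table = {}  # code -> original identifier (retained only by the custodian)
    coded = []
    for rec in records:
        code = secrets.token_hex(8)  # random, non-derivable code
        key_table[code] = rec[id_field]
        # Copy the record without the direct identifier, adding the code.
        coded.append({**{k: v for k, v in rec.items() if k != id_field},
                      "code": code})
    return coded, key_table

# Hypothetical administrative records:
patients = [{"name": "Alice", "age": 34}, {"name": "Bob", "age": 51}]
coded, key_table = key_code(patients, "name")
assert all("name" not in rec for rec in coded)  # no direct identifiers remain
```

On the paper's argument, the coded records may be anonymous in the hands of a third party with no realistic means of obtaining `key_table`, even though they remain pseudonymised personal data for the organisation that holds the key.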

13.
The EU faces substantive legislative reform in data protection, specifically in the form of the General Data Protection Regulation (GDPR). One of the new elements in the GDPR is its call to establish data protection certification mechanisms, data protection seals and marks to help enhance transparency and compliance with the Regulation and allow data subjects to quickly assess the level of data protection of relevant products and services. To this effect, it is necessary to review privacy and data protection seals afresh and determine how data protection certification mechanisms, seals or marks might work given the role they will be called to play, particularly in Europe, in facilitating data protection. This article reviews the current state of play of privacy seals, the EU policy and regulatory thrusts for privacy and data protection certification, and the GDPR provisions on certification of the processing of personal data. The GDPR leaves substantial room for various options on data protection certification, which might play out in various ways, some of which are explored in this article.

14.
Increasingly widespread adoption of health information technology tools in clinical care increases interest in ethical and legal issues related to the use of these tools for public health and the effects of these uses on the clinician-patient relationship. It is argued that patients, clinicians, and society have generally uncontroversial duties to support civil society's public health mission, information technology supports this mission, and the effects of automated and computerized public health surveillance are likely to have little if any effect on the clinician-patient relationship. It is also suggested, nevertheless, that electronic public health surveillance raises interesting and important ethical issues, some of which can be addressed if not resolved by empirical research, especially regarding patient preferences about secondary use of health data and their moral obligation to contribute to population-based health.

15.
Data protection regulations are undergoing a global reform. The European Commission proposed a reform of the EU data protection framework in 2012. One major driver for the reform has been research on consumer perceptions indicating that consumers are worried about their personal privacy. However, there has been practically no research on the perceptions of companies (the controllers of the personal data) regarding the data protection reform. This research analyses the awareness of, and the willingness to act towards compliance with, the proposed General Data Protection Regulation (GDPR) in Finland in 2013. The GDPR will replace the Finnish Personal Data Act and therefore plays a central role in Finnish privacy regulation. This research found that the general level of awareness was low: only 43% of the controllers were aware of the forthcoming reform. The willingness to act, or to take steps towards compliance, was even lower: 31% of controllers said that they were planning to act towards compliance during the year. These results indicate that companies are quite unfamiliar with the reform, which correlates with other relevant studies in Europe. Personal data are said to be the oil of the digital economy, the hottest commodity on the market today. Some companies understand this, but the majority seem to ignore it, at least as far as their awareness of the reform is concerned, even though the reform captures many of the best practices regarding the processing of personal data.

16.
The General Data Protection Regulation (GDPR) will come into force in the European Union (EU) in May 2018 to meet current challenges related to personal data protection and to harmonise data protection across the EU. Although the GDPR is anticipated to benefit companies by offering consistency in data protection activities and liabilities across the EU countries and by enabling more integrated EU-wide data protection policies, it poses new challenges to companies. They are not necessarily prepared for the changes and may lack awareness of the upcoming requirements and the GDPR's coercive measures. The implementation of the GDPR requirements demands substantial financial and human resources, as well as training of employees; hence, companies need guidance to support them in this transition. The purposes of this study were to compare the current Data Protection Directive 95/46/EC with the GDPR by systematically analysing their differences and to identify the GDPR's practical implications, specifically for companies that provide services based on personal data. This study aimed to identify and discuss the changes introduced by the GDPR that would have the most practical relevance to these companies and possibly affect their data management and usage practices. Therefore, a review and a thematic analysis and synthesis of the article-level changes were carried out. Through the analysis, the key practical implications of the changes were identified and classified. As a synthesis of the results, a framework was developed, presenting 12 aspects of these implications and the corresponding guidance on how to prepare for the new requirements. These aspects cover business strategies and practices, as well as organisational and technical measures.

17.
Discussion about vulnerable individuals and communities has spread from research ethics to consumer law and human rights. According to many theoreticians and practitioners, the framework of vulnerability allows the formulation of an alternative language to articulate problems of inequality, power imbalances and social injustice. Building on this conceptualisation, we try to understand the role and potential of the notion of vulnerable data subjects. The starting point for this reflection is the wide-ranging development, deployment and use of data-driven technologies that may pose substantial risks to human rights, the rule of law and social justice. Implementation of such technologies can lead to discrimination, the systematic marginalisation of different communities, and the exploitation of people in particularly sensitive life situations. Considering those problems, we recognise the special role of personal data protection and call for its vulnerability-aware interpretation. This article makes three contributions. First, we examine how the notion of vulnerability is conceptualised and used in philosophy, human rights and European law. We then confront those findings with the presence and interpretation of vulnerability in data protection law and discourse. Second, we identify two problematic dichotomies that emerge from the theoretical and practical application of this concept in data protection. Those dichotomies reflect the tensions within the definition and manifestation of vulnerability. To overcome the limitations arising from those two dichotomies, we support the idea of layered vulnerability, which seems compatible with the GDPR and the risk-based approach. Finally, we outline how the notion of vulnerability can influence the interpretation of particular provisions in the GDPR. In this process, we focus on issues of consent, Data Protection Impact Assessment, the role of Data Protection Authorities, and the participation of data subjects in decision-making about data processing.

18.
The goal of this contribution is to understand the notion of risk as it is enshrined in the General Data Protection Regulation (GDPR), with a particular focus on Art. 35, which provides for the obligation to carry out data protection impact assessments (DPIAs), the first risk management tool to be enshrined in EU data protection law, and which therefore contains a number of key elements for grasping the notion. The adoption of this risk-based approach has not come without a number of debates and controversies, notably on the scope and meaning of the risk-based approach. Yet, what has so far remained outside the debate is the very notion of risk itself, which underpins the whole risk-based approach. The contribution uses the notions of risk and risk analysis as tools for describing and understanding risk in the GDPR. One of the main findings is that GDPR risk is about “compliance risk” (i.e., the lower the compliance, the greater the consequences for data subjects' rights). This stance is in direct contradiction with a number of positions arguing for a strict separation between compliance and risk issues. This contribution instead sees issues of compliance and risk to data subjects' rights and freedoms as deeply interconnected. The conclusion will use these discussions as a basis to address the long-standing debate on the differences between privacy impact assessments (PIAs) and DPIAs. It will also warn that, ultimately, the way risk is defined in the GDPR is somewhat irrelevant: what matters most is the methodology used and the type of risk at work therein.

19.
We study variability in General Data Protection Regulation (GDPR) awareness in relation to digital experience in the 28 European countries of the EU27 and the UK, through secondary analysis of the Eurobarometer 91.2 survey conducted in March 2019 (N = 27,524). Education, occupation, and age are the strongest sociodemographic predictors of GDPR awareness, with little influence of gender, subjective economic well-being, or locality size. Digital experience is significantly and positively correlated with GDPR awareness in a linear model, but this relationship proves to be more complex when we examine it through a typological analysis. Using an exploratory k-means cluster analysis, we identify four clusters of digital citizenship across both dimensions of digital experience and GDPR awareness: the off-line citizens (22%), the social netizens (32%), the web citizens (17%), and the data citizens (29%). The off-line citizens rank lowest in internet use and GDPR awareness; the web citizens rank at about average values, while the data citizens rank highest in both digital experience and GDPR knowledge and use. The fourth identified cluster, the social netizens, have a discordant profile, with remarkably high social network use, below-average online shopping experiences, and low GDPR awareness. Digitalization in human capital and general internet use is a strong country-level correlate of the national frequency of the data citizen type. Our results confirm previous studies of the low privacy awareness and skills associated with intense social media consumption, but we find that young generations are evenly divided between the rather carefree social netizens and the strongly invested data citizens. In order to achieve the full potential of the GDPR in changing surveillance practices while fostering consumer trust and responsible use of Big Data, policymakers should more effectively engage the digitally connected social netizens in the public debate over data use and protection. Moreover, they should enable all types of digital citizens to exercise their GDPR rights and to support the creation of value from data, while defending the right to protection of personal data.
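The exploratory k-means typology this abstract describes can be sketched in miniature. The data below are synthetic, and the two score dimensions and the four-cluster choice are illustrative assumptions mirroring the study's design, not its actual Eurobarometer dataset.

```python
# Illustrative sketch (not the study's code): grouping respondents by two
# standardised scores -- digital experience and GDPR awareness -- into four
# clusters with a minimal k-means, echoing the typology described above.
import random

random.seed(0)

def kmeans(points, k, iters=50):
    centroids = random.sample(points, k)  # initialise from random data points
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean distance).
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p[0] - centroids[c][0]) ** 2
                                                + (p[1] - centroids[c][1]) ** 2)
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its assigned points.
        centroids = [(sum(p[0] for p in cl) / len(cl), sum(p[1] for p in cl) / len(cl))
                     if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

# Synthetic respondents around four hypothesised profiles
# (off-line, social netizen, web citizen, data citizen):
profiles = [(-1.5, -1.5), (1.0, -1.0), (0.0, 0.0), (1.5, 1.5)]
points = [(cx + random.gauss(0, 0.2), cy + random.gauss(0, 0.2))
          for cx, cy in profiles for _ in range(50)]

centroids, clusters = kmeans(points, k=4)
print(sorted(len(cl) for cl in clusters))
```

On real survey data one would standardise the input variables and validate the choice of cluster count (e.g. with silhouette scores) rather than fixing k = 4 in advance, as exploratory cluster analyses typically do.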

20.
Machine-learning (‘ML’) models are powerful tools which can support personalised clinical judgments, as well as patients’ choices about their healthcare. Concern has been raised, however, as to their ‘black box’ nature, in which calculations are so complex they are difficult to understand and independently verify. In considering the use of ML in healthcare, we divide the question of transparency into three different scenarios:
  • 1) Solely automated decisions. We suggest these will be unusual in healthcare, as Article 22(4) of the General Data Protection Regulation presents a high bar. However, if solely automated decisions are made (e.g. for inpatient triage), data subjects will have a right to ‘meaningful information’ about the logic involved.
  • 2) Clinical decisions. These are decisions made ultimately by clinicians—such as diagnosis—and the standard of transparency under the GDPR is lower due to this human mediation.
  • 3) Patient decisions. Decisions about treatment are ultimately taken by the patient or their representative, albeit in dialogue with clinicians. Here, the patient will require a personalised level of medical information, depending on the severity of the risk and how much they wish to know.
In the final category of decisions made by patients, we suggest European healthcare law sets a more personalised standard of information requirement than the GDPR. Clinical information must be tailored to the individual patient according to their needs and priorities; there is no monolithic ‘explanation’ of risk under healthcare law. When giving advice based (even partly) on a ML model, clinicians must have a sufficient grasp of the medically-relevant factors involved in the model output to offer patients this personalised level of medical information. We use the UK, Ireland, Denmark, Norway and Sweden as examples of European health law jurisdictions which require this personalised transparency to support patients' rights to make informed choices. This adds to the argument for post-hoc rationale explanations of ML to support healthcare decisions in all three scenarios.


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号