Similar Documents
20 similar documents found (search time: 851 ms)
1.
X-chromosomal short tandem repeats (X-STRs) are very useful in complex paternity cases because they are inherited by male and female offspring in different ways. They complement autosomal STRs (as-STRs), allowing higher paternity probabilities to be attained. These probabilities are expressed as a likelihood ratio (LR). The formulae needed to calculate the LR depend on the genotype combinations of the suspected pedigrees. The LR can also be obtained using Bayesian networks (BNs), graphical representations of real situations that make complex probabilities easy to calculate. In the present work, two BNs are presented, designed to derive LRs for half-sister/half-sister and mother/daughter/paternal-grandmother relationships. These networks were validated against known formulae and proved useful in suspect pedigree situations other than those for which they were developed. The BNs were applied in two paternity cases. The application of the mother/daughter/paternal-grandmother BN highlighted the complementary value of X-STRs to as-STRs. The same case evaluated without the mother showed that missing information tends to be conservative if the alleged father is the biological father and nonconservative otherwise. The half-sisters case shows a limitation of statistical interpretation when allelic frequencies are high.
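As a hedged illustration of the LR arithmetic described above (a generic sketch, not the paper's Bayesian networks): per-locus LRs from independent markers multiply, and Bayes' theorem converts the combined LR into a posterior probability of paternity. All LR values and the prior below are hypothetical.

```python
from functools import reduce

def combined_lr(locus_lrs):
    """Multiply per-locus likelihood ratios (assumes independent loci)."""
    return reduce(lambda a, b: a * b, locus_lrs, 1.0)

def posterior_probability(lr, prior=0.5):
    """Posterior probability of paternity from an LR and a prior, via Bayes."""
    prior_odds = prior / (1.0 - prior)
    posterior_odds = lr * prior_odds
    return posterior_odds / (1.0 + posterior_odds)

# Hypothetical per-locus LRs for a handful of X-STR markers.
locus_lrs = [2.5, 1.8, 4.0, 0.9, 3.2]
lr = combined_lr(locus_lrs)
w = posterior_probability(lr, prior=0.5)
```

Note that a locus LR below 1 (the 0.9 here) weakens the combined evidence but does not exclude; exclusion requires incompatible genotypes.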

2.
The performance of likelihood ratio (LR) methods for evidence evaluation has in the past been represented using, for example, Tippett plots. We propose empirical cross-entropy (ECE) plots as a metric of accuracy based on the statistical theory of proper scoring rules, interpretable in information-theoretic terms as the information given by the evidence, which quantifies the calibration of LR values. We present results for a case example using a glass database from real casework, comparing performance with both Tippett and ECE plots. We conclude that ECE plots allow clearer comparisons of LR methods than previous metrics and provide a theoretical criterion for deciding whether a given method should be used for evidence evaluation, an improvement over Tippett plots. A set of recommendations for the use of the proposed methodology by practitioners is also given.
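A minimal sketch of the empirical cross-entropy idea, assuming the standard ECE expression from the proper-scoring-rules literature evaluated at a single prior (an ECE *plot* sweeps the prior); the LR values below are invented for illustration. An uninformative method (LR = 1 always) has ECE = 1 bit at even prior odds; a well-calibrated, discriminating method scores lower.

```python
import math

def ece(ss_lrs, ds_lrs, prior_odds=1.0):
    """Empirical cross-entropy of a set of LR values at a given prior.
    ss_lrs: LRs for same-source (H1-true) comparisons.
    ds_lrs: LRs for different-source (H2-true) comparisons."""
    p1 = prior_odds / (1.0 + prior_odds)
    p2 = 1.0 - p1
    term_ss = sum(math.log2(1.0 + 1.0 / (lr * prior_odds)) for lr in ss_lrs) / len(ss_lrs)
    term_ds = sum(math.log2(1.0 + lr * prior_odds) for lr in ds_lrs) / len(ds_lrs)
    return p1 * term_ss + p2 * term_ds

# Hypothetical LRs: large for same-source pairs, small for different-source.
good = ece([20.0, 50.0, 100.0], [0.05, 0.02, 0.1])
flat = ece([1.0, 1.0, 1.0], [1.0, 1.0, 1.0])   # uninformative: LR = 1 always
```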

3.
The reporting of a likelihood ratio (LR) calculated by probabilistic genotyping software has become more common since 2015 and has allowed more complex mixtures to be used at court. The meaning of "inconclusive" LRs, and how to communicate the significance of low LRs at court, is now important. We present a method here using the distribution of LRs obtained from nondonors. The nondonor distribution is useful for examining calibration and discrimination for profiles that have produced LRs below about 10⁴. In this paper, a range of mixed DNA profiles of varying quantity were constructed, and the LR distribution for the minor contributor over a number of nondonors was compared with the expectation for a calibrated system. It is demonstrated that conditioning genotypes should be used where reasonable given the background information, to decrease the rate of nondonor LRs above 1. In all 17 cases examined, the LR for the minor donor was higher than the nondonor LRs, and in 12 of the 17 cases the 99.9 percentile of the nondonor distribution was lower when appropriate conditioning information was used. The output of the tool is a graph showing the position of the LR for the person of interest against the nondonor LR distribution. This may assist communication between scientists and the court.
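The comparison described above can be sketched as follows, under the assumption (mine, for illustration) that nondonor log10(LR) values are roughly Gaussian and centred well below zero; all numbers are hypothetical, not the paper's data.

```python
import random

random.seed(1)

# Hypothetical nondonor log10(LR) values: centred well below 0, as expected
# for a calibrated system evaluating true non-contributors.
nondonor_log10_lrs = [random.gauss(-6.0, 2.0) for _ in range(10000)]

def percentile(values, q):
    """Simple percentile by sorting (q in [0, 100])."""
    s = sorted(values)
    idx = min(len(s) - 1, int(round(q / 100.0 * (len(s) - 1))))
    return s[idx]

p999 = percentile(nondonor_log10_lrs, 99.9)
rate_above_1 = sum(1 for v in nondonor_log10_lrs if v > 0) / len(nondonor_log10_lrs)

poi_log10_lr = 4.0  # hypothetical LR of 10^4 for the person of interest
exceeds = poi_log10_lr > p999
```

In a graph like the one the tool produces, the person-of-interest LR would be plotted as a vertical line against the histogram of `nondonor_log10_lrs`.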

4.
Abstract: Likelihood ratios (LRs) provide a natural way of computing the value of evidence under competing propositions. We propose LR models for classification and comparison that extend the ideas of Aitken, Zadora, and Lucy and of Aitken and Lucy to include consideration of zeros. Instead of substituting zeros with a small value, we view the presence of zeros as informative and model it using Bernoulli distributions. The proposed models are used for the evaluation of forensic glass (comparison and classification problems) and paint data (comparison problem). Two hundred and sixty-four glass samples were analyzed by scanning electron microscopy coupled with energy-dispersive X-ray spectrometry, and 36 acrylic topcoat paint samples by pyrolysis gas chromatography coupled with mass spectrometry. The proposed LR model gave very satisfactory results for the glass comparison problem and for most of the classification tasks for glass. Results for the comparison of paints were also highly satisfactory, with only 3.0% false positive and 2.8% false negative answers.
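A drastically simplified sketch of how a Bernoulli model makes zeros informative (this is my toy version, not the authors' model): treat presence/absence of a feature, such as a trace element, as Bernoulli with population probability theta; under the same-source proposition the two items share one underlying state, under the different-source proposition the states are independent draws.

```python
def bernoulli_lr(state_recovered, state_control, theta):
    """LR contribution of a presence/absence feature.
    theta: population probability that the feature (e.g. an element) is present.
    Under Hp the two items share one underlying state; under Hd the states
    are independent draws from the population."""
    if state_recovered != state_control:
        return 0.0           # impossible under this simplified Hp model
    if state_recovered:      # both present: P(Hp) = theta, P(Hd) = theta**2
        return 1.0 / theta
    return 1.0 / (1.0 - theta)   # both absent: shared absence is also informative

lr_both_present = bernoulli_lr(True, True, theta=0.25)   # rare element shared
lr_both_absent = bernoulli_lr(False, False, theta=0.25)
```

The key point matching the abstract: a shared zero contributes an LR above 1 rather than being discarded or replaced by an arbitrary small value.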

5.
The relation between the potassium concentration in the vitreous humor, [K+], and the postmortem interval (PMI) has been studied by several authors. Many formulae are available; they are based on a correlation test and linear regression using the PMI as the independent variable and [K+] as the dependent variable, and the estimation of the confidence interval is based on this formulation. In forensic work, however, it is necessary to use [K+] as the independent variable to estimate the PMI. Although all authors have obtained the PMI by direct use of these formulae, this is an inexact approach that leads to false estimates. What is required is to exchange the variables, obtaining a new equation in which [K+] is the independent variable and the PMI the dependent one. The regression line obtained from our data is [K+] = 5.35 + 0.22 PMI; refitting with the variables exchanged gives PMI = 2.58[K+] - 9.30. When only nonhospital deaths are considered, the results improve considerably: we then obtain [K+] = 5.60 + 0.17 PMI and, correspondingly, PMI = 3.92[K+] - 19.04.
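The abstract's central point, that refitting the regression with the variables exchanged is not the same as algebraically inverting the forward line, can be demonstrated on synthetic data (the data below are invented, not the study's): whenever the scatter gives r² < 1, the refitted slope equals r²/b and is strictly smaller than the algebraic inverse 1/b.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical (PMI hours, [K+] mmol/L) data with scatter about a linear trend.
pmi = [2, 5, 8, 12, 16, 20, 24, 30, 36, 48]
k = [5.9, 6.2, 7.4, 7.6, 9.1, 9.3, 10.9, 11.5, 13.6, 15.8]

a_fwd, b_fwd = fit_line(pmi, k)   # [K+] = a_fwd + b_fwd * PMI
a_inv, b_inv = fit_line(k, pmi)   # PMI  = a_inv + b_inv * [K+]  (refitted)

# Algebraic inversion of the forward line is NOT the same as refitting:
b_alg = 1.0 / b_fwd
```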

6.
7.
There is an apparent paradox that the likelihood ratio (LR) approach is an appropriate measure of the weight of evidence when forensic findings have to be evaluated in court, while it is typically not used by bloodstain pattern analysis (BPA) experts. This commentary evaluates how the scope and methods of BPA relate to several types of evaluative propositions and methods to which LRs are applicable. As a result of this evaluation, we show how specificities in scope (BPA being about activities rather than source identification), gaps in the underlying science base, and the reliance on a wide range of methods render the use of LRs in BPA more complex than in some other forensic disciplines. Three directions are identified for BPA research and training, which would facilitate and widen the use of LRs: research in the underlying physics; the development of a culture of data sharing; and the development of training material on the required statistical background. An example of how recent fluid dynamics research in BPA can lead to the use of LR is provided. We conclude that an LR framework is fully applicable to BPA, provided methodic efforts and significant developments occur along the three outlined directions.

8.
The pubic bone is considered one of the best sources of information for determining sex using skeletal remains, but can be easily damaged postmortem. This problem has led to the development of nonpelvic methods for cases when the pubic bone is too damaged for analysis. We approached this problem from a different perspective. In this article, we present an approach using new measurements and angles of the proximal femur to recreate the variation in the pubic bone. With a sample from the Terry Collection (n > 300), we use these new variables along with other traditional measurements of the femur and hipbone to develop two logistic regression equations (femur and fragmentary hipbone, and femur only) that are not population specific. Tests on an independent sample (Grant Collection; n = 37-40) with a different pattern of sexual dimorphism resulted in an allocation accuracy of 95-97% with minimal difference by sex.
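Applying a fitted logistic regression equation for sex allocation works as follows; the coefficients, measurements, and cutoff below are hypothetical placeholders, not the published Terry Collection equations.

```python
import math

def allocate_sex(measurements, coefs, intercept, cutoff=0.5):
    """Apply a fitted logistic regression: p = 1 / (1 + exp(-(b0 + b . x))).
    Returns ('male', p) if p exceeds the cutoff, else ('female', p).
    Coefficients here are hypothetical, not those published for the
    Terry Collection sample."""
    z = intercept + sum(b * x for b, x in zip(coefs, measurements))
    p = 1.0 / (1.0 + math.exp(-z))
    return ("male" if p > cutoff else "female"), p

# Hypothetical femur-only model: head diameter and epicondylar breadth (mm).
coefs = [0.25, 0.12]
intercept = -21.0
sex, p = allocate_sex([47.0, 81.0], coefs, intercept)
```

Reporting `p` alongside the allocation, rather than the label alone, keeps borderline cases visible.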

9.
In R v T [2010] EWCA Crim 2439, [2011] 1 Cr App Rep 85, the Court of Appeal indicated that ‘mathematical formulae’, such as likelihood ratios, should not be used by forensic scientists to analyse data where firm statistical evidence did not exist. Unfortunately, when considering the forensic scientist's evidence, the judgment consistently commits a basic logical error, the ‘transposition of the conditional’, which indicates that the Bayesian argument has not been understood, and it extends the confusion surrounding that argument. The judgment also fails to distinguish between the validity of the relationships in a formula and the precision of the data. We explain why the Bayesian method is the correct logical method for analysing forensic scientific evidence, how it works, and why ‘mathematical formulae’ can be useful even where firm statistical data are lacking.
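The transposition of the conditional mentioned above is the error of reading P(E | not H), the probability of the evidence given innocence, as P(not H | E), the probability of innocence given the evidence. A small worked example with invented numbers makes the gap concrete:

```python
def posterior_prob_h(p_e_given_h, p_e_given_not_h, prior_h):
    """Bayes' theorem: P(H|E) from the two likelihoods and the prior."""
    num = p_e_given_h * prior_h
    den = num + p_e_given_not_h * (1.0 - prior_h)
    return num / den

# A match probability of 1 in 1000 means P(E | not H) = 0.001.
# Transposing the conditional wrongly reads this as P(not H | E) = 0.001,
# i.e. guilt "999 times in 1000". The posterior also depends on the prior:
p_e_given_not_h = 0.001
prior_h = 1.0 / 10001.0     # e.g. one suspect among 10,000 alternatives
p_h_given_e = posterior_prob_h(1.0, p_e_given_not_h, prior_h)
```

Here P(H | E) comes out at 1/11, roughly 0.09, nowhere near the 0.999 the transposed reading suggests.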

10.
11.
Knowledge generated in universities can serve as an important base for the commercialization of innovation. One mechanism for commercialization is the creation of a new company by a scientist. We shed light on this process by examining the role of scientist characteristics, access to resources and key university conditions in driving the likelihood of a scientist to start a company. Our sample comprises 1,899 university scientists across six different scientific fields. We make a methodological contribution by using self-reported data from the scientists themselves, whereas most previous research relied on university or public data. Our consideration of six scientific fields is a substantive contribution and reveals that scientist startups are heterogeneous in nature. Our findings are largely consistent with extant research on the role of individual and university variables in scientist entrepreneurship; in addition, we uncover the novel finding that the type of research field is also a key driver of scientist startup activity.  相似文献   

12.
Abstract: The aim of this study was to assess the efficiency of likelihood ratio (LR)-based measures applied to various classification problems for glass objects described by elemental composition and refractive index (RI) values, and to compare LR-based methods with other classification methods such as support vector machines (SVM) and naïve Bayes classifiers (NBC). One hundred and fifty-three glass objects (23 building windows, 25 bulbs, 32 car windows, 57 containers, and 16 headlamps) were analyzed by scanning electron microscopy coupled with an energy-dispersive X-ray spectrometer. Refractive indices for building and car windows were measured before (RIb) and after (RIa) an annealing process. The proposed scheme for classifying glass fragments demonstrates some efficiency, although the classification of car windows (c) and building windows (w) must be treated carefully because of their very similar elemental content. However, a combination of elemental content and information on the change in RI during annealing (ΔRI = RIa - RIb) gave very promising results. An LR model for the classification of glass fragments into use-type categories for forensic purposes gives slightly higher misclassification rates than SVM and NBC. However, the results obtained by all three approaches were very similar, especially for the car window versus building window classification problem. The LR model can therefore be recommended because of the ease of interpretation of LR-based measures of certainty.

13.
PENDULUM--a guideline-based approach to the interpretation of STR mixtures
Several years ago, a theory for interpreting mixed DNA profiles was proposed that included consideration of peak area using the method of least squares. This method of mixture interpretation has not been widely adopted because of the complexity of the associated calculations. Most reporting officers (ROs) employ an experience- and judgement-based approach to the interpretation of mixed DNA profiles. Here we present an approach that formalises the thinking behind this experience and judgement, written into a computer program called PENDULUM. The program uses a least-squares method to estimate the pre-amplification mixture proportion for two potential contributors. It then calculates the heterozygous balance for all of the potential sets of genotypes. A list of "possible" genotypes is generated using a set of heuristic rules. External to the program, the candidate genotypes may then be used to formulate likelihood ratios (LRs) based on alternative casework propositions. The system is not a black-box approach; rather, it has been integrated into the method currently used by the reporting officers at the Forensic Science Service (FSS). The time saved by automating the routine calculations associated with mixture analysis is significant. In addition, the program assists in unifying reporting processes, thereby improving the consistency of reporting.
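A toy version of the least-squares step (a sketch of the general idea, not PENDULUM itself): given two candidate genotypes and observed peak areas at one locus, grid-search the mixture proportion that minimises the squared difference between observed and expected relative areas. The genotypes and peak areas below are invented.

```python
def expected_areas(mx, genotype1, genotype2, alleles):
    """Expected relative peak area per allele for a two-person mixture.
    Each contributor donates area per allele copy, scaled by proportion."""
    return [mx * genotype1.count(a) / 2.0 + (1.0 - mx) * genotype2.count(a) / 2.0
            for a in alleles]

def estimate_mx(observed, genotype1, genotype2, alleles, steps=1000):
    """Grid-search the mixture proportion minimising the sum of squared
    differences between observed and expected relative areas."""
    total = float(sum(observed))
    obs = [o / total for o in observed]
    best_mx, best_ss = 0.0, float("inf")
    for i in range(steps + 1):
        mx = i / steps
        exp = expected_areas(mx, genotype1, genotype2, alleles)
        ss = sum((o - e) ** 2 for o, e in zip(obs, exp))
        if ss < best_ss:
            best_mx, best_ss = mx, ss
    return best_mx

# Hypothetical locus: major contributor (16,17) near 70%, minor (18,19) near 30%.
alleles = [16, 17, 18, 19]
observed = [700, 690, 310, 300]    # peak areas (rfu)
mx = estimate_mx(observed, (16, 17), (18, 19), alleles)
```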

14.
The Bayesian approach provides a unified and logical framework for the analysis of evidence and for reporting results, in the form of likelihood ratios (LRs), from the forensic laboratory to the court. In this contribution we clarify how the biometric scientist or laboratory can adapt conventional biometric systems or technologies to work according to this Bayesian approach. Forensic systems providing their results in the form of LRs will be assessed through Tippett plots, which give a clear representation of LR-based performance both for targets (the suspect is the author/source of the test pattern) and for non-targets. However, the computation of LR values, especially from biometric evidence, is still an open issue. Reliable estimation techniques with good generalization properties are required for estimating the between- and within-source variabilities of the test pattern, as are variance-restriction techniques in within-source density estimation to account for the variability of the source over time. Fingerprint, face, and on-line signature recognition systems are adapted to work according to this Bayesian approach, showing both the range of likelihood ratios in each application and the suitability of these biometric techniques for daily forensic work.
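The quantities a Tippett plot summarises can be sketched directly (the LR values below are invented): one curve is the proportion of target LRs above each threshold, the other the same for non-targets, and the crossing of LR = 1 gives the two rates of misleading evidence.

```python
def misleading_evidence_rates(target_lrs, nontarget_lrs):
    """Rates of misleading evidence in the Tippett-plot sense:
    targets with LR < 1 and non-targets with LR > 1."""
    rmep = sum(1 for lr in target_lrs if lr < 1.0) / len(target_lrs)
    rmed = sum(1 for lr in nontarget_lrs if lr > 1.0) / len(nontarget_lrs)
    return rmep, rmed

def tippett_points(lrs, thresholds):
    """Proportion of LRs greater than each threshold (one Tippett curve)."""
    return [sum(1 for lr in lrs if lr > t) / len(lrs) for t in thresholds]

# Hypothetical scores from a biometric system already mapped to LRs.
targets = [8.0, 30.0, 0.7, 120.0, 15.0]
nontargets = [0.02, 0.4, 1.6, 0.01, 0.3]
rmep, rmed = misleading_evidence_rates(targets, nontargets)
curve = tippett_points(targets, [1.0, 10.0, 100.0])
```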

15.
How scientists commercialise new knowledge via entrepreneurship
In this paper, we explore how university-based scientists overcome the barriers to appropriating the returns from new knowledge via entrepreneurship; and we examine how a university-based technology transfer office (TTO), with an incubation facility, can assist scientists in the commercialisation process. We identify how scientists overcome three barriers to commercialisation. First, we find that scientists take account of traditional academic rewards when considering the pay-offs of commercialisation activity. Second, scientists recognise the commercial value of new knowledge when market-related knowledge is embedded in their research context, and/or when they develop external contacts with those with market knowledge. Third, the deliberate efforts of scientists to acquire market information results in individuals or organisations with market knowledge learning of the new knowledge developed by the scientists; and intermediaries can help individuals or organisations with resources learn of new knowledge developed by scientists. We find that the TTO, principally through an enterprise development programme (CCDP), played an important role in the commercialisation process. The principal benefit of the TTO is in the domain of putting external resource providers in contact with scientists committed to commercialisation. Our findings have important implications for scientists and for those interested in promoting commercialisation via entrepreneurship.
Correspondence: Dipti Pandya

16.
In July 1996, a sheep named Dolly was born in Scotland. What makes Dolly's birth noteworthy is that she is the result of the first successful cloning attempt using the nucleus of an adult cell. The technique that led to Dolly's birth involved transferring the nucleus of a mammary cell from an adult sheep to the enucleated egg cell of an unrelated sheep, with gestation occurring in a third sheep. The possibility of applying this technique to human reproduction raised concerns worldwide, with several countries moving for immediate bans on human cloning. In the United States, President Clinton requested that the National Bioethics Advisory Commission ("NBAC"), a multidisciplinary group composed of scientists, lawyers, educators, theologians, and ethicists, study the implications of cloning and issue recommendations. The Commission consulted other scientists, ethicists, theologians, lawyers, and citizens with interests in this advancing technology and concluded that "at this time it is morally unacceptable for anyone in the public or private sector, whether in a research or clinical setting, to attempt to create a child using somatic cell nuclear transfer cloning." This Article was included in a larger work prepared at the request of, and submitted to the Commission by, law professor Lori B. Andrews. Cloning through nuclear transfer will change the way we create and define families. This Article explores how existing law relating to parentage, surrogacy, egg donation, and artificial insemination may apply in the cloning context to clarify the parent-child relationship established through cloning.

17.
The accurate determination of the postmortem interval (PMI) from the formation of adipocere presents a significant challenge to forensic scientists interested in determining the time of death. Several attempts have been made to determine the time since death, but to date this has been difficult because previous approaches have been mainly qualitative, focusing on the later stages of the degradation process. This work presents preliminary results from an experimental model of postmortem adipocere formation using liquid chromatography. Three pig cadavers were submerged in distilled water, chlorinated water, and saline water. Fresh specimens resulting from degradation of the subcutaneous fat were obtained from the pigs at two-week intervals over a period of ten weeks and subjected to chromatographic analysis. By correlating the rate of disappearance of hydrolyzed fatty acids with the formation of hydroxystearic and oxostearic acids after death, a simple, quantitative analytical method was developed for the determination of PMI. Experimental observation of the chemistry of adipocere formation indicated that adipocere can form only a few hours after death and continues to form until oleic acid degradation saturates after several weeks. Different time courses were obtained for cadavers immersed in distilled, chlorinated, and saline water, respectively. This work has not solved the time-since-death problem outright, but it may represent an approach that has not been adequately explored.

18.
Anthropometric techniques, commonly used by anthropologists and adopted by medical scientists, have been employed to estimate body size for over a hundred years. With the increasing frequency of mass disasters, identifying an isolated lower extremity, and the stature of the person to whom it belonged, has created problems in investigating the identity of some of the victims. Despite the need, there is a lack of systematic studies for identifying fragmented and dismembered human remains. The purpose of this paper is to analyze anthropometric relationships between dimensions of the lower extremity and body height. The analysis is based on a sample of middle-class male (N=203) and female (N=108) adult Turks residing in Istanbul. The participants are mostly students and staff members of a medical school, and military personnel. The measurements taken are stature, trochanteric height, thigh length, lower leg length, leg length, and foot height, breadth, and length. Of the five variables entered into the regression analysis, all but foot breadth contribute, with leg length first, followed by thigh and foot lengths, and finally foot height in males (R²). Formulae were also calculated individually for some of these measurements, which yielded smaller R² values. Student's t-test to assess intraobserver error in the measurements taken by an individual anthropometrist showed no statistically significant difference. In conclusion, the study suggests that living height can be estimated from various dimensions of the lower extremity. Differences between populations must be considered before applying such functions to other groups.

19.
Stature reconstruction is important because it provides a forensic anthropological estimate of a person's height in the living state, playing a vital role in the identification of individuals from their skeletal remains. Regression formulae for stature estimation have been generated for indigenous South Africans based on measurements of the long bones of the upper and lower extremities and the calcaneus. Since these bones are not always available for forensic analysis, it became necessary to use other bones, such as the skull, for stature estimation. The aim of the present study was to investigate the usefulness of certain skull measurements of indigenous South Africans in the estimation of adult stature. Ninety-nine complete skeletons obtained from the Raymond A. Dart Collection, School of Anatomical Sciences, University of the Witwatersrand, were used. Total skeletal height (TSH) was calculated for each skeleton using Fully's (anatomical) method. Six variables were measured on each skull, and TSH was regressed onto these cranial measurements to obtain regression formulae. The correlation coefficients obtained ranged between 0.40 and 0.54. The standard errors of estimate from the current study (4.37 to 6.24) are high compared with those obtained for stature estimation based on intact long bones and the calcaneus. Therefore, the equations presented in this study should be used with caution in forensic cases in which only the skull is available for human identification.
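The standard error of estimate (SEE) quoted above is computed from the regression residuals as SEE = sqrt(SSE / (n - 2)); a sketch with invented cranial-measurement data (not the Dart Collection values) shows the calculation:

```python
import math

def fit_with_see(x, y):
    """OLS fit y = a + b*x plus the standard error of estimate,
    SEE = sqrt(sum (y - yhat)^2 / (n - 2))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    return a, b, math.sqrt(sse / (n - 2))

# Hypothetical (cranial measurement mm, total skeletal height cm) pairs.
gol = [180, 185, 178, 192, 188, 175, 183, 190]
tsh = [158, 163, 155, 171, 166, 152, 160, 169]
a, b, see = fit_with_see(gol, tsh)
estimate = a + b * 186   # point estimate for a new skull, reported with +/- SEE
```

The larger the SEE relative to the spread of statures, the wider the plausible range around each estimate, which is why skull-based equations warrant caution.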

20.
In this article, the performance of a score-based likelihood ratio (LR) system for comparing fingerprints with fingermarks is studied. The system is based on an automated fingerprint identification system (AFIS) comparison algorithm and focuses on fingerprint comparisons in which the fingermarks contain 6–11 minutiae. The hypotheses under consideration are evaluated at the level of the person, not the finger. The LRs are presented with bootstrap intervals indicating the sampling uncertainty involved. Several aspects of performance are measured: leave-one-out cross-validation is applied, and rates of misleading evidence are studied in two ways. A simulation study is performed to assess the coverage of the bootstrap intervals. The results indicate that the evidential strength for same-source comparisons that do not meet the Dutch twelve-point standard may be substantial. The methods used can be generalized to measure the performance of score-based LR systems in other fields of forensic science.
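A sketch of how a bootstrap interval around a score-based LR can be obtained, under two simplifying assumptions of mine (the paper's method differs in detail): comparison scores are modelled with Gaussian densities, and a percentile bootstrap resamples both score sets to propagate sampling uncertainty into the LR. All scores are simulated.

```python
import random
import math

random.seed(7)

def gauss_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def score_lr(score, target_scores, nontarget_scores):
    """Score-based LR: ratio of Gaussian densities fitted to each score set."""
    def fit(xs):
        mu = sum(xs) / len(xs)
        sd = math.sqrt(sum((x - mu) ** 2 for x in xs) / (len(xs) - 1))
        return mu, sd
    mu_t, sd_t = fit(target_scores)
    mu_n, sd_n = fit(nontarget_scores)
    return gauss_pdf(score, mu_t, sd_t) / gauss_pdf(score, mu_n, sd_n)

# Hypothetical AFIS comparison scores.
targets = [random.gauss(60, 10) for _ in range(200)]
nontargets = [random.gauss(20, 8) for _ in range(200)]
case_score = 55.0

# Bootstrap: resample both score sets with replacement, recompute the LR.
boot = []
for _ in range(500):
    bt = [random.choice(targets) for _ in targets]
    bn = [random.choice(nontargets) for _ in nontargets]
    boot.append(score_lr(case_score, bt, bn))
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot)) - 1]
point = score_lr(case_score, targets, nontargets)
```

Reporting `(lo, hi)` alongside `point` conveys how much of the stated evidential strength rests on the finite calibration sets.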


Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号