
Showing papers in "Journal of Forensic Sciences in 2020"


Journal ArticleDOI
TL;DR: Focuses on the applications of nanoparticles in developing and detecting latent fingerprints; nano‐based techniques hold immense future potential in fingerprint investigations.
Abstract: Emerging nanotechnology and progressive instrumentation together have vast applications in the field of forensic science. A few prominent examples are gold nanoparticles for improving the efficiency of the polymerase chain reaction and atomic force microscopy for examining ink and bloodstains. Characteristics like distinct ridge details of fingerprints could be obtained by applying different nanoparticles such as silver, zinc oxide, silicon dioxide, aluminum oxide, gold (with silver physical developer), europium, fluorescent carbon, and amphiphilic silica on a range of object surfaces, and among all, gold is most commonly used. Fingerprints are considered noteworthy evidence in any crime scene, and nano-based techniques hold immense future potential in fingerprint investigations. Therefore, this paper focuses on the applications of nanoparticles in developing and detecting latent fingerprints.

48 citations


Journal ArticleDOI
TL;DR: It is the authors’ firm opinion that morphoscopic traits are forensic anthropology’s Lost Cause, and it is time for the field to abandon the façade of their importance and utility.
Abstract: Editor, Historians of the 21st century will examine the year 2020 and the confluence of seismic events which impacted everyday life in the United States and served to highlight systemic inequalities that have permeated the nation for centuries. Between the devastating COVID-19 pandemic and the homicides of numerous Black Americans at the hands of law enforcement officials, we have all been reminded about the fragility of life, and the failures of our society to live up to the ideals enshrined in the foundational documents which established the United States of America over two centuries ago. Tackling these failures seems overwhelming at times; however, changes can be enacted with candid and reflexive discussions about the status quo. In writing this letter, we direct our comments to the forensic anthropology community in the United States in hopes of sparking a discussion about the long-standing practice of ancestry estimation and changes that are frankly long overdue. Herein, we argue two primary points: (i) we urge an immediate moratorium on the use of morphoscopic cranial traits in the estimation of ancestry given the lack of comprehensive inquiry into why the traits exist and the fact that their use serves to bolster the debunked biological race concept; and (ii) we issue a call to action for forensic anthropologists to scrutinize the way that ancestry estimates in forensic anthropology reports might hinder identification efforts. Starting with the modernization of forensic anthropology in the mid-twentieth century, the estimation of ancestry has figured prominently in our methodological toolkit. It has been accepted as fact that estimates of “race,” “ancestry,” “population affinity,” or “bioaffinity” (hereafter ancestry) are compulsory for the process of human identification, in concert with other parameters such as age-at-death to include or exclude possible missing persons in cases when unidentified human remains are recovered. 
However, at this moment in the histories of our discipline and our country, it is time to rethink the forensic anthropological canon and critically evaluate why we have relied on traits that we do not understand. Indeed, morphoscopic traits may best be viewed through a historical lens that signals a time when many leading anthropologists believed that Homo sapiens could be winnowed into a small number of discrete groups. Ultimately, it is our firm opinion that morphoscopic traits are forensic anthropology’s Lost Cause and it is time for our field to abandon the façade of their importance and utility. Today, forensic anthropologists hold that estimating ancestry, which includes the use of morphoscopic traits and craniometric distances, is possible because variables related to worldwide human variation are reflected in the skeleton and are ultimately linked to heritability (1,2). However, herein lies the problem: The skeletal nonmetric traits which for decades have been studied by biological anthropologists and population geneticists are not the morphoscopic traits utilized by forensic anthropologists to generate estimates of an unknown decedent’s ancestry. Practitioners of forensic anthropology have glossed over this point and have not only failed to explicitly recognize that the heritability of morphoscopic traits remains unknown, but have also committed a similarly egregious transgression: the failure to initiate lines of empirical inquiry into this question. The former is a problem because of the faulty logic inherent in the assumption that these traits have classificatory power when we do not understand important details such as how and why a person inherited the traits from their ancestors; the latter is particularly telling and highlights the inextricable nature of the science we practice and the society in which we were enculturated—namely, one where the reality of social race hierarchies and differences dictates part of our worldview. 
Indeed, the fact that we have arrived at a point where we could formulate these opinions is a result of the confluence of our unique lived experiences in this particular time and place. It may be that the intention of proponents of morphoscopic traits is not to imply anything about heritability given one recent definition (e.g., “quasicontinuous variables of the cranium that can be reflected as soft-tissue differences [emphasis ours] in the living” [3]); however, trait application has not been satisfactorily uncoupled from the blatantly typological approach, thoroughly debunked by biological anthropologists going back decades. In this way, part of contemporary forensic anthropological practice is a throwback to an earlier time when ideas about biological determinism and essentialist aspects of the races were taken as gospel truths and used to justify racialized hierarchies leading to both overt and covert structural inequities which continue today. Moreover, studies including morphoscopic traits such as the “post-bregmatic depression” or “nasal bone contour” have not provided a suitable explanation as to why varying constellations of morphoscopic traits such as these have classificatory power that correspond with common socially recognized labels related to ancestry or ethnicity. The idea that morphoscopic traits can adequately be used to place decedents into discrete groups has been perpetuated in the pages of this journal without any deliberate appraisal of the underlying assumption that a finite number of groups exist; or critically, acknowledgement of the harm of connecting social race to skeletal traits insofar that it only serves to sustain the falsified biological race concept and misinforms the public, to include law enforcement and the medicolegal community, about human variation. 
Of further major concern is that studies involving these traits have generally not been conducted through the lenses of evolutionary theory or ecogeographic variation, foundational theories for biological anthropology. Indeed, every other parameter commonly evaluated by forensic anthropologists is grounded in an explanatory framework that ties back to particular accepted bodies of knowledge (4). Rather, when it comes to morphoscopic traits in particular, typology appears to be the only theoretical framework in attendance. Taken together, these failures are frankly unacceptable and underscore why this approach should be removed from everyday forensic anthropological practice. Just as troubling as the scientific and social issues raised by the use of morphoscopic traits are overall ancestry estimates. As discussed, most forensic anthropologists operate under an explicit assumption that ancestry is a critical part of the biological profile. Along with other estimated parameters, we produce an …

43 citations


Journal ArticleDOI
TL;DR: There are measurable differences in the prevalence of risk factors between lone‐actor terrorists and the general population, however, no single factor “predicts” violent extremism.
Abstract: Improvements have been made in identifying the prevalence of risk factors/indicators for violent extremism. A consistent problem is the lack of base rates. How to develop base rates is of equal concern. This study has two aims: (i) to compare two methods for developing base rates: the Unmatched Count Technique (UCT) and direct questioning; and (ii) to generate base rates in a general population sample and compare these to a sample of lone-actor terrorists (n = 125). We surveyed 2108 subjects from the general population. Participants were recruited from an online access panel and randomly assigned to one of three conditions: direct survey, control, or UCT. Survey items were based on a lone-actor terrorist codebook developed from the wider literature. Direct questioning was more suitable under our study conditions where UCT resulted in deflation effects. Comparing the base rates identified a number of significant differences: (i) lone-actor terrorists demonstrated propensity indicators related to a cognitive susceptibility, and a crime- and/or violence-supportive morality more often; the general sample demonstrated protective factors more often, (ii) lone-actor terrorists demonstrated situational indicators related to a crime- and/or violence-supportive morality more often, whereas the general sample experienced situational stressors more often, (iii) lone-actor terrorists demonstrated indicators related to exposure to extremism more often. Results suggest there are measurable differences in the prevalence of risk factors between lone-actor terrorists and the general population. However, no single factor "predicts" violent extremism. This bears implications for our understanding of the interrelation of risk and protective factors, and for the risk assessment of violent extremism.
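The Unmatched Count Technique mentioned above estimates a sensitive behavior's base rate from the difference in mean item counts between a treatment group (innocuous items plus the sensitive item) and a control group (innocuous items only). A minimal sketch with invented response counts, not the study's data:

```python
def uct_base_rate(treatment_counts, control_counts):
    """Unmatched Count Technique estimate: the sensitive item's
    prevalence is the difference in mean reported counts between
    the treatment group (list + sensitive item) and the control
    group (list only)."""
    mean_t = sum(treatment_counts) / len(treatment_counts)
    mean_c = sum(control_counts) / len(control_counts)
    return mean_t - mean_c

# Invented responses: both groups saw 4 innocuous items; the
# treatment group's list also included the sensitive item.
control = [2, 1, 3, 2, 2, 1, 3, 2]
treatment = [3, 2, 3, 2, 3, 2, 3, 2]
print(f"Estimated base rate: {uct_base_rate(treatment, control):.3f}")  # 0.500
```

Because respondents report only a count, never which items apply, the technique affords anonymity; the deflation effects reported above occur when treatment-group means fall below what direct questioning yields.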

39 citations


Journal ArticleDOI
TL;DR: Full tooth segmentation and a DenseNet CNN optimize automated dental stage allocation for age estimation; it was hypothesized that segmenting only the third molar could further improve automated stage allocation performance.
Abstract: Staging third molar development is commonly used for age estimation in subadults. Automated developmental stage allocation to the mandibular left third molar in panoramic radiographs has been examined in a pilot study. This method used an AlexNet Deep Convolutional Neural Network (CNN) approach to stage lower left third molars, which had been selected by manually drawn bounding boxes around them. This method (bounding box AlexNet = BA) still contained parts of surrounding structures which may have affected the automated stage allocation performance. We hypothesize that segmenting only the third molar could further improve the automated stage allocation performance. Therefore, the current study aimed to determine and validate the effect of lower third molar segmentations on automated tooth development staging. Retrospectively, 400 panoramic radiographs were collected, processed, and segmented in three ways: bounding box (BB), rough (RS), and full (FS) tooth segmentation. A DenseNet201 CNN was used for automated stage allocation. Automated staging results were compared with reference stages (allocated by human observers), overall and per stage. FS rendered the best results with a stage allocation accuracy of 0.61, a mean absolute difference of 0.53 stages and a Cohen's linear kappa of 0.84. Misallocated stages were mostly neighboring stages, and DenseNet201 rendered better results than AlexNet by increasing the percentage of correctly allocated stages by 3% (BA compared to BB). FS increased the percentage of correctly allocated stages by 7% compared to BB. In conclusion, full tooth segmentation and a DenseNet CNN optimize automated dental stage allocation for age estimation.
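The accuracy metrics reported above (mean absolute difference in stages, Cohen's linear kappa) can be computed directly from paired stage allocations. A minimal pure-Python sketch using invented stage labels, not the study's data:

```python
def mean_abs_difference(pred, ref):
    """Mean absolute difference in allocated stages."""
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(pred)

def linear_weighted_kappa(pred, ref, n_stages):
    """Cohen's kappa with linear weights: disagreements are penalized
    in proportion to how many stages apart they are. Stages must be
    coded 0 .. n_stages-1."""
    n = len(pred)
    obs = [[0.0] * n_stages for _ in range(n_stages)]
    for p, r in zip(pred, ref):
        obs[r][p] += 1 / n                      # observed proportions
    col = [sum(obs[i][j] for i in range(n_stages)) for j in range(n_stages)]
    row = [sum(obs[i][j] for j in range(n_stages)) for i in range(n_stages)]
    num = den = 0.0
    for i in range(n_stages):
        for j in range(n_stages):
            w = abs(i - j) / (n_stages - 1)     # linear disagreement weight
            num += w * obs[i][j]
            den += w * row[i] * col[j]          # chance-expected disagreement
    return 1 - num / den

pred = [0, 1, 2, 2]  # invented automated allocations
ref = [0, 1, 2, 1]   # invented human reference stages
print(mean_abs_difference(pred, ref))                 # 0.25
print(round(linear_weighted_kappa(pred, ref, 3), 3))  # 0.714
```

Linear weighting matters here precisely because, as the abstract notes, most misallocations were neighboring stages: those count as small disagreements rather than full errors.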

32 citations


Journal ArticleDOI
TL;DR: The need to create good practice for 3D printing across the forensic science process, the need to develop accurate and admissible 3D printed models while exploring the techniques, accuracy and bias within the courtroom, and calls for the alignment of future research and agendas perhaps in the form of a specialist working group are highlighted.
Abstract: There has been a rapid development and utilization of three-dimensional (3D) printing technologies in engineering, health care, and dentistry. Like many technologies in overlapping disciplines, these techniques have proved to be useful and hence incorporated into the forensic sciences. Therefore, this paper describes how the potential of using 3D printing is being recognized within the various sub-disciplines of forensic science and suggests areas for future applications. For instance, the application can create a permanent record of an object or scene that can be used as demonstrative evidence, preserving the integrity of the actual object or scene. Likewise, 3D printing can help with the visualization of evidential spatial relationships within a scene and increase the understanding of complex terminology within a courtroom. However, while the application of 3D printing to forensic science is beneficial, currently there is limited research demonstrated in the literature and a lack of reporting, skewing the visibility of the applications. Therefore, this article highlights the need to create good practice for 3D printing across the forensic science process, the need to develop accurate and admissible 3D printed models while exploring the techniques, accuracy, and bias within the courtroom, and calls for the alignment of future research and agendas, perhaps in the form of a specialist working group.

32 citations


Journal ArticleDOI
TL;DR: This work represents the first use of Rapid DNA Identification in a mass casualty event, and the results support the use ofRapid DNA as an integrated tool with conventional disaster victim identification modalities.
Abstract: In November 2018, Butte County, California, was decimated by the Camp Fire, the deadliest wildfire in state history. Over 150,000 acres were destroyed, and at its peak, the fire consumed eighty acres per minute. The speed and intensity of the oncoming flames killed scores of people, and weeks before the fire was contained, first responders began searching through the rubble of 18,804 residences and commercial buildings. As with most mass disasters, conventional identification modalities (e.g., fingerprints, odontology, hardware) were utilized to identify victims. The intensity and duration of the fire severely degraded most of the remains, and these approaches were useful in only 22 of 84 cases. In the past, the remaining cases would have been subjected to conventional DNA analysis, which may have required months to years. Instead, Rapid DNA technology was utilized (in a rented recreational vehicle outside the Sacramento morgue) in the victim identification effort. Sixty-nine sets of remains were subjected to Rapid DNA Identification and, of these, 62 (89.9%) generated short tandem repeat profiles that were subjected to familial searching; essentially all these profiles were produced within hours of sample receipt. Samples successfully utilized for DNA identification included blood, bone, liver, muscle, soft tissue of unknown origin, and brain. In tandem with processing of 255 family reference samples, 58 victims were identified. This work represents the first use of Rapid DNA Identification in a mass casualty event, and the results support the use of Rapid DNA as an integrated tool with conventional disaster victim identification modalities.

31 citations


Journal ArticleDOI
TL;DR: A liquid chromatography quadrupole time‐of‐flight mass spectrometry (LC‐QTOF‐MS) assay was developed, validated, and implemented for forensic toxicology testing and more than 20 emerging NPS were detected for the first time.
Abstract: Novel psychoactive substances (NPS) are synthetic drugs that pose serious public health and safety concerns. A multitude of NPS have been identified in the United States, often implicated in forensic investigations. The most common and effective manner for identifying NPS is by use of mass spectrometry, and its true utility lies in nontargeted acquisition techniques. During this study, a liquid chromatography quadrupole time-of-flight mass spectrometry (LC-QTOF-MS) assay was developed, validated, and implemented for forensic toxicology testing. A SCIEX TripleTOF™ 5600+ with SWATH® acquisition was used. Resulting data were compared against an extensive library database containing more than 800 compounds. The LC-QTOF-MS assay was applied to the reanalysis of biological sample extracts to discover emergent NPS. More than 3,000 sample extracts were analyzed, and more than 20 emerging NPS were detected for the first time. Among these were isopropyl-U-47700, 3,4-methylenedioxy-U-47700, fluorofuranylfentanyl, N-methyl norfentanyl, 2F-deschloroketamine, 3,4-methylenedioxy-alpha-PHP, eutylone, and N-ethyl hexedrone.

28 citations


Journal ArticleDOI
TL;DR: This article addresses how issues of impartiality and bias relate to forensic work, and how one can effectively evaluate and mitigate those risks, to meet ISO/IEC 17020 and 17025 requirements.
Abstract: The ISO/IEC 17020 and 17025 standards both include requirements for impartiality and the freedom from bias. Meeting these requirements for implicit cognitive bias is not a simple matter. In this article, we address these international standards, specifically focusing on evaluating and mitigating the risk to impartiality, and quality assurance checks, so as to meet accreditation program requirements. We cover their meaning to management as well as to practitioners, addressing how these issues of impartiality and bias relate to forensic work, and how one can effectively evaluate and mitigate those risks. We then elaborate on specific quality assurance policies and checks and identify when corrective action may be appropriate. These measures will not only serve to meet ISO/IEC 17020 and 17025 requirements, but also enhance forensic work and decision-making.

28 citations


Journal ArticleDOI
TL;DR: Combining the increased cardiac weight and/or gastric volume and toxicology data identifying 5‐Fluoro‐ADB, it is hypothesized that abuse of this substance may precipitate a dysrhythmia and cause sudden death.
Abstract: Forty-three fatalities involving the potent synthetic cannabinoid, 5-Fluoro-ADB, are summarized. For each case, a description of the terminal event, autopsy findings, cause of death, qualitative identification of 5-Fluoro-ADB and its ester hydrolysis metabolite, 5-Fluoro-ADB metabolite 7, in urine, and the quantitative values obtained in the blood specimens are outlined. Central blood concentrations ranged from 0.010 to 2.2 ng/mL for 5-Fluoro-ADB and 2.0 to 166 ng/mL for 5-Fluoro-ADB metabolite 7. Peripheral blood concentrations ranged from 0.010 to 0.77 ng/mL and 2.0 to 110 ng/mL for 5-Fluoro-ADB and 5-Fluoro-ADB metabolite 7, respectively. The majority of cases resulted in central to peripheral blood concentration ratios greater than 1 for 5-Fluoro-ADB (58%) and 5-Fluoro-ADB metabolite 7 (71%) suggesting that postmortem redistribution occurs to some extent. Combining the increased cardiac weight and/or gastric volume and toxicology data identifying 5-Fluoro-ADB, it is hypothesized that abuse of this substance may precipitate a dysrhythmia and cause sudden death.
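The central-to-peripheral (C/P) comparison used above is a simple quotient of the two blood concentrations; ratios above 1 are commonly read as consistent with postmortem redistribution. A sketch with hypothetical concentrations, not values from the case series:

```python
def cp_ratio(central_ng_ml, peripheral_ng_ml):
    """Central-to-peripheral blood concentration ratio; values > 1
    are commonly read as consistent with postmortem redistribution."""
    return central_ng_ml / peripheral_ng_ml

# Hypothetical case concentrations (ng/mL), not from the series above.
r = cp_ratio(0.50, 0.25)
print(f"C/P = {r:.2f}: {'redistribution suggested' if r > 1 else 'no signal'}")
```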

26 citations


Journal ArticleDOI
TL;DR: All 471 sharp force homicides in Denmark during 1992–2016 are analyzed with special focus on aspects that are relevant to forensic pathologists, including the distribution of wounds and organ injuries, to show strong sex differences in both victims and offenders.
Abstract: Sharp force trauma is a common homicide method. The weapon is typically a knife, which is easily accessible and does not require special skills. We have analyzed all 471 sharp force homicides in Denmark during 1992-2016 with special focus on aspects that are relevant to forensic pathologists, including the distribution of wounds and organ injuries. Most homicides were committed inside with a kitchen knife. The front left thorax was the most common area to be affected by sharp force trauma. In 18.9% of the victims, there was only one sharp injury, the majority on the thorax. The most common trajectory for stab wounds was directly posterior with no deviation to the sides or up/down followed by directly anterior. The heart (including pericardium) and lungs (including hemo- and pneumothorax) had injuries in more than 75% of the victims. 67% of victims were males. Female victims had more sharp force injuries and defense wounds than male victims. Most females were killed in domestic homicides (73.7%), most commonly in partner killings (56.4%). In contrast, many male victims were killed in a setting of nightlife/intoxication (34.0%) most by a friend/acquaintance delivering a few stab wounds. The results clearly show strong sex differences in both victims and offenders. This could be useful for shaping policies and public opinion, and as a route for understanding the developments in interpersonal violence. In the narrow setting of death investigation, our results will provide an evidence-based approach to understanding the injury patterns in sharp force homicide.

23 citations


Journal ArticleDOI
TL;DR: The developed age predictor model (APM) for blood samples of deceased individuals, based on five age‐correlated genes, seems to be informative and could have potential application in forensic analysis.
Abstract: Age estimation using DNA methylation levels has been widely investigated in recent years because of its potential application in forensic genetics. The main aim of this study was to develop an age predictor model (APM) for blood samples of deceased individuals based on five age-correlated genes. Fifty-one samples were analyzed through the bisulfite polymerase chain reaction (PCR) sequencing method for DNA methylation evaluation in genes ELOVL2, FHL2, EDARADD, PDE4C, and C1orf132. Linear regression was used to analyze relationships between methylation levels and age. The model using the highest age-correlated CpG from each locus revealed a correlation coefficient of 0.888, explaining 76.3% of age variation, with a mean absolute deviation from the chronological age (MAD) of 6.08 years. The model was validated in an independent test set of 19 samples producing a MAD of 8.84 years. The developed APM seems to be informative and could have potential application in forensic analysis.
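The age predictor described above is a linear regression of chronological age on methylation levels. A single-CpG, pure-Python illustration of the approach with invented beta values (the published model uses five loci and different coefficients):

```python
def fit_linear(x, y):
    """Ordinary least-squares fit of y = a + b*x, here one CpG's
    methylation level (x) predicting chronological age (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b  # intercept, slope

def mad(pred, actual):
    """Mean absolute deviation of predicted from chronological age."""
    return sum(abs(p - q) for p, q in zip(pred, actual)) / len(pred)

# Invented ELOVL2-like methylation beta values rising with age.
meth = [0.40, 0.50, 0.60, 0.70, 0.80]
age = [20, 35, 50, 65, 80]
a, b = fit_linear(meth, age)
preds = [a + b * m for m in meth]
print(f"age = {a:.1f} + {b:.1f} * methylation, MAD = {mad(preds, age):.2f}")
```

Reporting MAD in years, as the study does, keeps the error metric in the units that matter for casework, unlike a bare correlation coefficient.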

Journal ArticleDOI
TL;DR: A proficiency test was applied to the validated method with z‐scores within ±2, demonstrating the accuracy of the method for the determination of drugs of abuse in the hair of individuals suspected of abusing drugs.
Abstract: A method using liquid chromatography-tandem mass spectrometry (LC-MS/MS) to simultaneously quantify amphetamines, opiates, ketamine, cocaine, and metabolites in human hair is described. Hair samples (50 mg) were extracted with methanol utilizing cryogenic grinding. Calibration curves for all the analytes were established in the concentration range 0.05-10 ng/mg. The recoveries were above 72%, except for AMP at the limit of quantification (LOQ), which was 48%. The accuracies were within ±20% at the LOQ (0.05 ng/mg) and between -11% and 13.3% at 0.3 and 9.5 ng/mg, respectively. The intraday and interday precisions were within 19.6% and 19.8%, respectively. A proficiency test was applied to the validated method with z-scores within ±2, demonstrating the accuracy of the method for the determination of drugs of abuse in the hair of individuals suspected of abusing drugs. The hair concentration ranges, means, and medians are summarized for abused drugs in 158 authentic cases.

Journal ArticleDOI
TL;DR: This extensive developmental validation data provides support for broad use of the ANDE Rapid DNA Identification System by agencies and accredited forensic laboratories in single‐source suspect‐evidence comparisons, local database searches, and DVI.
Abstract: A developmental validation was performed to demonstrate reliability, reproducibility, and robustness of the ANDE Rapid DNA Identification System for processing of crime scene and disaster victim identification (DVI) samples. A total of 1705 samples were evaluated, including blood, oral epithelial samples from drinking containers, samples on FTA and untreated paper, semen, bone, and soft tissues. This study was conducted to address the FBI's Quality Assurance Standards on developmental validation and to accumulate data from a sufficient number of unique donors and sample types to meet NDIS submission requirements for acceptance of the ANDE Expert System for casework use. To date, no Expert System has been approved for such samples, but the results of this study demonstrated that the automated Expert System performs similarly to conventional laboratory data analysis. Furthermore, Rapid DNA analysis demonstrated accuracy, precision, resolution, concordance, and reproducibility that were comparable to conventional processing, along with appropriate species specificity, limit of detection, and performance in the presence of inhibitors. No lane-to-lane or run-to-run contamination was observed, and the system correctly identified the presence of mixtures. Taken together, the ANDE instrument, I-Chip consumable, FlexPlex chemistry (a 27-locus STR assay compatible with all widely used global loci, including the CODIS core 20 loci), and automated Expert System successfully processed and interpreted more than 1200 unique samples with over 99.99% concordant CODIS alleles. This extensive developmental validation data provides support for broad use of the system by agencies and accredited forensic laboratories in single-source suspect-evidence comparisons, local database searches, and DVI.

Journal ArticleDOI
TL;DR: A paper‐based analytical device that can run a library of 12 colorimetric tests at the same time, each detecting different chemical functional groups and materials found in illicit drugs, distractor substances, and cutting agents is discussed.
Abstract: As drug overdose deaths across the United States continue to rise, there is increasing interest in field testing of illicit substances. This work discusses a paper-based analytical device (idPAD) that can run a library of 12 colorimetric tests at the same time, each detecting different chemical functional groups and materials found in illicit drugs, distractor substances, and cutting agents. The idPAD requires no electricity, costs less than $2 USD, and requires minimal training to operate. The results of the 12 tests form a color barcode which is "read" by comparison to standard images. The accuracy of the idPAD was assessed using samples of heroin, cocaine HCl, crack, and methamphetamine at concentrations of 25%-100% in a lactose matrix, as well as pure lactose. Based on 840 "reads" by three different users, the idPAD showed 95% sensitivity and 100% specificity for detecting these drugs; the most common error was mistaking cocaine HCl for crack or crack for cocaine HCl. In a second step, samples of heroin, cocaine, and methamphetamine (n = 30) and distractor substances (pharmaceuticals, cutting agents, and other illicit drugs, n = 64) were tested by two readers, yielding a sensitivity of 100% and specificity of 97%. Targeted substances were detected reliably at 55-180 μg/lane, and the idPAD was found to be stable for at least 3 months when stored at room temperature. The library approach used in the idPAD may provide the accuracy and robustness necessary for a presumptive field drug test.
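The performance figures above reduce to standard sensitivity and specificity calculations over read outcomes. A sketch with hypothetical tallies, not the study's counts:

```python
def sensitivity(tp, fn):
    """Fraction of drug-containing samples correctly flagged."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of drug-free samples correctly cleared."""
    return tn / (tn + fp)

# Hypothetical tallies from reading color barcodes against standards.
tp, fn = 57, 3   # drug present: detected / missed
tn, fp = 64, 0   # drug absent: correctly negative / false alarm
print(f"sensitivity = {sensitivity(tp, fn):.0%}, "
      f"specificity = {specificity(tn, fp):.0%}")  # 95%, 100%
```

For a presumptive field test, high specificity is the priority: a false positive can trigger an unjustified arrest, whereas a miss is caught by confirmatory laboratory analysis.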

Journal ArticleDOI
TL;DR: This study profiles VOCs released from three postmortem bacterial isolates using solid‐phase microextraction arrow (SPME Arrow) and gas chromatography–mass spectrometry (GC‐MS) to improve understanding of underlying mechanisms for decomposition VOC production.
Abstract: Volatile organic compounds (VOCs) are by-products of cadaveric decomposition and are responsible for the odor associated with decomposing remains. The direct link between VOC production and individual postmortem microbes has not been well characterized experimentally. The purpose of this study was to profile VOCs released from three postmortem bacterial isolates (Bacillus subtilis, Ignatzschineria indica, I. ureiclastica) using solid-phase microextraction arrow (SPME Arrow) and gas chromatography-mass spectrometry (GC-MS). Species were inoculated in headspace vials on Standard Nutrient Agar and monitored over 5 days at 24°C. Each species exhibited a different VOC profile that included common decomposition VOCs. VOCs exhibited upward or downward temporal trends over time. Ignatzschineria indica produced a large amount of dimethyl disulfide. Other compounds of interest included alcohols, aldehydes, aromatics, and ketones. This provides foundational data linking decomposition odor with specific postmortem microbes, improving understanding of the underlying mechanisms of decomposition VOC production.

Journal ArticleDOI
TL;DR: The study found that TrueAllele is a reliable method for analyzing DNA mixtures containing up to ten unknown contributors.
Abstract: Most DNA evidence is a mixture of two or more people. Cybergenetics TrueAllele® system uses Bayesian computing to separate genotypes from mixture data and compare genotypes to calculate likelihood ratio (LR) match statistics. This validation study examined the reliability of TrueAllele computing on laboratory-generated DNA mixtures containing up to ten unknown contributors. Using log(LR) match information, the study measured sensitivity, specificity, and reproducibility. These reliability metrics were assessed under different conditions, including varying the number of assumed contributors, statistical sampling duration, and setting known genotypes. The main determiner of match information and variability was how much DNA a person contributed to a mixture. Observed contributor number based on data peaks gave better results than the number known from experimental design. The study found that TrueAllele is a reliable method for analyzing DNA mixtures containing up to ten unknown contributors.
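A likelihood ratio compares the probability of the evidence under the hypothesis that the person of interest contributed to the mixture against the hypothesis that an unrelated person did, and match information is conventionally reported on a log10 scale. A toy calculation with invented probabilities (not TrueAllele output):

```python
import math

def log10_lr(p_e_h1, p_e_h2):
    """log10 likelihood ratio: probability of the evidence if the
    person of interest is a mixture contributor (H1), divided by the
    probability if an unrelated person is the contributor (H2)."""
    return math.log10(p_e_h1 / p_e_h2)

# Invented probabilities, chosen only to illustrate the scale.
print(f"log(LR) = {log10_lr(1e-3, 1e-9):.1f}")  # 6.0
```

A log(LR) of 6 means the evidence is a million times more probable under the contributor hypothesis; working in log units is also what makes the study's variability metrics (spread of log(LR) across runs) additive and comparable.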

Journal ArticleDOI
TL;DR: A blind quality control program was successfully developed and implemented in the Toxicology, Seized Drugs, Firearms, Latent Prints, Forensic Biology, and Multimedia sections at the Houston Forensic Science Center.
Abstract: A blind quality control (QC) program was successfully developed and implemented in the Toxicology, Seized Drugs, Firearms, Latent Prints (Processing and Comparison), Forensic Biology, and Multimedia (Digital and Audio/Video) sections at the Houston Forensic Science Center (HFSC). The program was put into practice based on recommendations set forth in the 2009 National Academy of Sciences report and is conducted in addition to accreditation-required annual proficiency tests. The blind QC program allows HFSC to test its entire quality management system and provides a real-time assessment of the laboratory's proficiency. To ensure the blind QC cases mimicked real casework, the workflow for each forensic discipline and their evidence submission processes were assessed prior to implementation. Samples are created and submitted by the HFSC Quality Division, which knows the expected answers. Results from 2015 to 2018 show that of the 973 blind samples submitted, 901 were completed, and only 51 were discovered by analysts as being blind QC cases. Implementation data suggests that this type of program can be employed at other forensic laboratories.

Journal ArticleDOI
TL;DR: The flurry of activities in establishing error rates for the forensic sciences has largely overlooked some fundamental issues that make error rates a problematic construct and limit the ability to obtain a meaningful error rate.
Abstract: Establishing error rates is crucial for knowing how well one is performing, determining whether improvement is needed, measuring whether interventions are effective, as well as for providing transparency. However, the flurry of activities in establishing error rates for the forensic sciences has largely overlooked some fundamental issues that make error rates a problematic construct and limit the ability to obtain a meaningful error rate. These include knowing the ground truth, establishing appropriate databases, determining what counts as an error, characterizing what is an acceptable error rate, ecological validity, and transparency within the adversarial legal system. Without addressing these practical and theoretical challenges, the very notion of a meaningful error rate is limited.

Journal ArticleDOI
TL;DR: ATR FTIR spectroscopy, a rapid, nondestructive, sensitive, reliable, and safe alternative to other analytical techniques, has been used to differentiate 31 cosmetic foundation creams belonging to 23 different brands, supported by chemometric methods.
Abstract: Cosmetic foundation creams are encountered as evidentiary material in criminal investigations, particularly in cases of sexual and physical assault against women. Analyzing foundation cream exhibits is challenging because they are typically recovered in trace quantities and have similar hues. In the present study, ATR FTIR spectroscopy, which is a rapid, nondestructive, sensitive, reliable, and safe alternative to other analytical techniques, has been used to differentiate 31 cosmetic foundation creams belonging to 23 different brands, supported by chemometric methods such as principal component analysis (PCA) and linear discriminant analysis (LDA). The discriminating power of visual analysis is found to be 98.0%, while PCA and LDA increase the discriminating power to 99.3% and 100%, respectively. A blind test was conducted with three samples treated as unknown to the analyst, all of which were correctly linked with their respective sources. Further, the effect of substrates such as tissue paper (dry and wet) and white cotton cloth on sample analysis is also examined to simulate actual forensic casework conditions. Stains on these substrates could also be identified and linked to their parent products. The reported method provides significant results for the differentiation of cosmetic foundation creams.
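The two-step chemometric pattern the abstract describes (PCA for dimensionality reduction, then LDA for class separation) can be sketched as follows. The synthetic "spectra" below are illustrative stand-ins for ATR-FTIR data, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic stand-in for ATR-FTIR spectra: 3 "brands", 10 replicate
# spectra each, 200 wavenumber points (illustrative data only).
n_brands, n_reps, n_points = 3, 10, 200
base = rng.normal(size=(n_brands, n_points))
X = np.vstack([base[b] + 0.05 * rng.normal(size=(n_reps, n_points))
               for b in range(n_brands)])
y = np.repeat(np.arange(n_brands), n_reps)

# Reduce dimensionality with PCA, then separate classes with LDA.
model = make_pipeline(PCA(n_components=5), LinearDiscriminantAnalysis())
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```

PCA compresses the correlated wavenumber channels into a few components before LDA finds the directions that best separate brands, the same role it plays in the chemometric workflow above.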

Journal ArticleDOI
TL;DR: Three commercially available integrated rapid DNA instruments were tested as a part of a rapid DNA maturity assessment in July of 2018 and it was demonstrated that 95% of all heterozygous alleles were above 59% heterozygote balance and the precision was below the standard 0.5 bp deviation.
Abstract: Three commercially available integrated rapid DNA instruments were tested as a part of a rapid DNA maturity assessment in July of 2018. The assessment was conducted with sets of blinded single-source reference samples provided to participants for testing on the individual rapid platforms within their laboratories. The data were returned to the National Institute of Standards and Technology (NIST) for review and analysis. Both FBI-defined automated review (Rapid DNA Analysis) and manual review (Modified Rapid DNA Analysis) of the datasets were conducted to assess the success of genotyping the 20 Combined DNA Index System (CODIS) core STR loci and full profiles generated by the instruments. Genotype results from the multiple platforms, participating laboratories, and STR typing chemistries were combined into a single analysis. The Rapid DNA Analysis resulted in a success rate of 80% for full profiles (85% for the 20 CODIS core loci) with automated analysis. Modified Rapid DNA Analysis resulted in a success rate of 90% for both the CODIS 20 core loci and full profiles (all attempted loci per chemistry). An analysis of the peak height ratios demonstrated that 95% of all heterozygous alleles were above 59% heterozygote balance. Base-pair sizing precision was below the standard 0.5 bp deviation for both the ANDE 6C System and the RapidHIT 200.
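The heterozygote balance metric reported above is conventionally the smaller of the two allele peak heights divided by the larger. A minimal sketch with hypothetical RFU values (not data from the assessment):

```python
def heterozygote_balance(peak1: float, peak2: float) -> float:
    """Peak height ratio (smaller/larger) for a heterozygous locus."""
    lo, hi = sorted((peak1, peak2))
    return lo / hi

# Illustrative RFU peak heights for three heterozygous loci.
pairs = [(1200, 980), (800, 610), (1500, 1430)]
ratios = [heterozygote_balance(a, b) for a, b in pairs]

# Fraction of loci meeting the 59% balance threshold cited in the study.
fraction_ok = sum(r > 0.59 for r in ratios) / len(ratios)
print(ratios, fraction_ok)
```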

Journal ArticleDOI
TL;DR: The PARQ‐Gap may be useful for both clinicians and forensic practitioners in evaluating children of separating and divorced parents when there is a concern about the possible diagnosis of parental alienation.
Abstract: Parental alienation (rejection of a parent without legitimate justification) and realistic estrangement (rejection of a parent for a good reason) are generally accepted concepts among mental health and legal professionals. Alienated children, who were not abused, tend to engage in splitting and lack ambivalence with respect to their parents; estranged children, who were maltreated, usually perceive their parents in an ambivalent manner. The hypothesis of this study was that a psychological test-the Parental Acceptance-Rejection Questionnaire (PARQ)-will help to distinguish severely alienated from nonalienated children. The PARQ, which was used to identify and quantify the degree of splitting for each participant, was administered to 45 severely alienated children and 71 nonalienated children. The PARQ-Gap score-the difference between each child's PARQ: Father score and PARQ: Mother score-was introduced and defined in this research. Using a PARQ-Gap score of 90 as a cut point, this test was 99% accurate in distinguishing severely alienated from nonalienated children. This research presents a way to distinguish parental alienation from other reasons for contact refusal. The PARQ-Gap may be useful for both clinicians and forensic practitioners in evaluating children of separating and divorced parents when there is a concern about the possible diagnosis of parental alienation.
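As a sketch of how the PARQ-Gap cut point reported above could be applied, the gap is the absolute difference between a child's PARQ: Father and PARQ: Mother scores, compared against the cut point of 90. The scores and helper names below are hypothetical; this is a screening illustration, not the study's scoring procedure or a diagnostic tool.

```python
def parq_gap(parq_father: int, parq_mother: int) -> int:
    """Absolute difference between a child's PARQ: Father and
    PARQ: Mother scores."""
    return abs(parq_father - parq_mother)

def exceeds_cut_point(gap: int, cut_point: int = 90) -> bool:
    """Apply the cut point reported in the study; a screening flag
    only, not a diagnosis of parental alienation."""
    return gap >= cut_point

# Hypothetical (father, mother) score pairs for three children.
children = [(225, 70), (110, 95), (140, 150)]
flags = [exceeds_cut_point(parq_gap(f, m)) for f, m in children]
print(flags)  # only the first, highly split pair exceeds the cut point
```

A large gap reflects the splitting described in the abstract: one parent scored near the acceptance extreme and the other near the rejection extreme.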

Journal ArticleDOI
TL;DR: The performance of handheld Raman devices for detecting opioids and related substances including fentanyl and several analogs are described and the parent‐daughter electronic transfer method was successful and effective, which permits the ability to develop methods in the laboratory that can be seamlessly pushed out to field devices.
Abstract: This study describes the performance of handheld Raman devices for detecting one hundred opioids and related substances including fentanyl and several analogs. Using a single "parent" device, signatures (spectra) with excellent signal-to-noise ratios were generated using <5 mg of most compounds. The signatures were added to a method (library), which was electronically transferred to three "daughter" devices. The devices were able to discriminate different salt forms and isomers. On average, the daughter devices yielded a true-positive rate of 97.3% for generating an alarm for opioids and were 93.3% effective for correctly identifying the opioid. The devices yielded true-negative, false-positive, and false-negative rates of 100%, 0%, and 2.7%, respectively, where false negatives were due to weak signal and fluorescence. These data demonstrate that the parent-daughter electronic transfer method was successful and effective, permitting methods developed in the laboratory to be seamlessly pushed out to field devices.
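Library matching of the kind described above is commonly scored by comparing an unknown spectrum against each stored signature with a similarity measure such as cosine similarity. The toy intensity vectors and substance names below are purely illustrative, not real Raman signatures or the devices' proprietary matching algorithm.

```python
import math

def cosine_similarity(a, b):
    """Spectral match score between two intensity vectors (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "library" signatures and an unknown spectrum (hypothetical
# intensity vectors and names, not real Raman data).
library = {
    "substance_A": [0.1, 0.9, 0.2, 0.7],
    "substance_B": [0.8, 0.1, 0.6, 0.2],
}
unknown = [0.12, 0.88, 0.25, 0.65]

scores = {name: cosine_similarity(unknown, ref)
          for name, ref in library.items()}
best = max(scores, key=scores.get)
print(best, round(scores[best], 3))
```

In a fielded device, a score above a tuned threshold would raise an alarm; weak signal or fluorescence depresses the score, which is consistent with the false-negative causes noted in the abstract.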

Journal ArticleDOI
TL;DR: The experiences, lessons learned, and program details are shared to assist other forensic service providers with developing their own blind testing programs, which would ultimately lead to improved quality assurance.
Abstract: Blind proficiency testing is ideal for testing crime laboratory personnel because the elements of analyst bias and anticipation are removed. However, sending proficiency tests through the laboratory system as real casework is difficult. The substantial challenges with preparing and administering blind tests may prevent laboratory managers from initiating blind testing. In 2015, the Harris County Institute of Forensic Sciences committed to improving its crime laboratory's proficiency testing program by adding blind tests. The goal was to test the whole system, from evidence receipt to report release. With careful planning, trial-and-error, and ongoing assessment of available resources, not only was the program proven to be feasible, but there was also clear understanding of how to optimize our program. In this article, we share our experiences, lessons learned, and program details to assist other forensic service providers with developing their own blind testing programs, which would ultimately lead to improved quality assurance.

Journal ArticleDOI
TL;DR: The ability to detect both IGSR and OGSR simultaneously provides a selective testing platform for gunshot residues that can provide a powerful field‐testing technique and assist with decisions in case management.
Abstract: The increasing demand for rapid methods to identify both inorganic and organic gunshot residues (IGSR and OGSR) makes electrochemical methods an attractive screening tool for modernizing current practice. Our research group has previously demonstrated that electrochemical screening of GSR samples delivers a simple, inexpensive, and sensitive analytical solution that is capable of detecting IGSR and OGSR in less than 10 min per sample. In this study, we expand our previous work by increasing the number of GSR markers and applying machine learning classifiers to the interpretation of a larger population data set. Utilizing bare screen-printed carbon electrodes, the detection and resolution of seven markers (IGSR: lead, antimony, and copper; OGSR: nitroglycerin, 2,4-dinitrotoluene, diphenylamine, and ethyl centralite) was achieved with limits of detection (LODs) below 1 µg/mL. A large population data set was obtained from 395 authentic shooter samples and 350 background samples. Various statistical methods and machine learning algorithms, including critical thresholds (CT), naive Bayes (NB), logistic regression (LR), and neural networks (NN), were utilized to calculate the performance and error rates. Neural networks proved to be the best predictor for the dichotomous classification of shooter versus nonshooter samples. Accuracies for the studied population were 81.8% (CT), 88.1% (NB), 94.7% (LR), and 95.4% (NN). The ability to detect both IGSR and OGSR simultaneously provides a selective testing platform for gunshot residues that can serve as a powerful field-testing technique and assist with decisions in case management.
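The classifier comparison described above can be sketched with off-the-shelf implementations of naive Bayes, logistic regression, and a small neural network. The synthetic seven-marker data below are an illustrative stand-in for the electrochemical measurements, not the study's population data set.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for the seven electrochemical GSR markers:
# "shooter" samples have elevated marker signals (illustrative only).
n = 200
shooters = rng.normal(1.0, 0.6, size=(n, 7))
background = rng.normal(0.0, 0.6, size=(n, 7))
X = np.vstack([shooters, background])
y = np.array([1] * n + [0] * n)

models = {
    "NB": GaussianNB(),
    "LR": LogisticRegression(max_iter=1000),
    "NN": MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                        random_state=0),
}
results = {}
for name, model in models.items():
    # 5-fold cross-validated accuracy, mirroring the shooter vs.
    # nonshooter dichotomous question.
    results[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {results[name]:.3f}")
```

On real data the ranking can differ from this toy example; the study found the neural network best, followed by logistic regression.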

Journal ArticleDOI
TL;DR: One suggestion for future research is outlined: studies on contextual bias in forensic decisions should be conducted in collaboration between forensic scientists and cognitive psychologists, so that rigorous and ecologically valid experiments can be created to assess how task-irrelevant contextual information influences forensic analysis and judgments in operationally valid settings.
Abstract: In recent years, a number of studies have demonstrated that forensic examiners can be biased by task-irrelevant contextual information. However, concerns relating to methodological flaws and ecological validity limit how much the current body of knowledge can be applied to real-life operational settings. The current review takes a narrative approach to synthesizing the literature across forensic science. Further, the review considers three main issues: (i) primary research on contextual bias within forensic science; (ii) methodological criticisms of this research; and (iii) an alternative perspective that task-irrelevant contextual information does not always lead to error. One suggestion for future research is outlined: studies on contextual bias in forensic decisions should be conducted in collaboration between forensic scientists and cognitive psychologists. Only then can rigorous and ecologically valid experiments be created that will be able to assess how task-irrelevant contextual information influences forensic analysis and judgments in operationally valid settings.

Journal ArticleDOI
TL;DR: An epigenetic model for age prediction was validated in a sample of the Italian population of different ages covering the whole span of adult life and was identified as the best performing model across a plethora of candidates.
Abstract: Forensic DNA phenotyping refers to an emerging field of forensic sciences aimed at the prediction of externally visible characteristics of unknown sample donors directly from biological materials. The aging process significantly affects most of the above characteristics, making the development of a reliable method of age prediction very important. Today, the so-called "epigenetic clocks" represent the most accurate models for age prediction. Since they are technically not achievable in a typical forensic laboratory, forensic DNA technology has triggered efforts toward the simplification of these models. The present study aimed to build an epigenetic clock using a set of methylation markers from five different genes in a sample of the Italian population covering the whole span of adult life. In a sample of 330 subjects, 42 selected markers were analyzed with a machine learning approach to build an age prediction model. A ridge linear regression model including eight of the proposed markers was identified as the best performing model across a plethora of candidates. This model was tested on an independent sample of 83 subjects, providing a median error of 4.5 years. In the present study, an epigenetic model for age prediction was validated in a sample of the Italian population. However, its applicability to advanced ages still represents the main limitation in forensic casework.
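The modeling step described above — ridge regression from a handful of methylation markers to chronological age, evaluated by median error on a held-out set — can be sketched as follows. The eight synthetic markers and their age-dependent slopes below are illustrative assumptions, not the study's CpG sites or data.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import median_absolute_error

rng = np.random.default_rng(2)

# Synthetic stand-in for 8 CpG methylation markers whose levels drift
# linearly with age (illustrative data, not the study's markers).
n = 330
age = rng.uniform(18, 90, size=n)
slopes = np.linspace(-0.005, 0.005, 8)  # hand-set drift per marker
X = 0.5 + np.outer(age, slopes) + rng.normal(0, 0.02, size=(n, 8))

X_train, X_test, y_train, y_test = train_test_split(
    X, age, test_size=0.25, random_state=0)

# Ridge linear regression, the model family the paper found best.
model = Ridge(alpha=1.0).fit(X_train, y_train)
err = median_absolute_error(y_test, model.predict(X_test))
print(f"median absolute error: {err:.1f} years")
```

The ridge penalty stabilizes the coefficients when markers are correlated, which is typical of methylation data; the error on this toy data is not comparable to the study's 4.5-year figure.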

Journal ArticleDOI
TL;DR: It is important that there is sufficient awareness of the prodrug concept and potential impact and associated forensic implications, not just for chemical analysis but also for toxicological considerations when a substance has been used.
Abstract: The concept of a substance acting as a prodrug for an intended drug is not new; it has long been known and utilized within medicine, with particular benefits for efficacy and patient safety. Prodrugs of psychoactive substances are likewise not particularly new, but the concept has extended to prodrugs of new psychoactive substances (NPS). The continuing evolution of NPS has been a constant forensic challenge. In some countries, this constant evolution has led to the introduction of various alternative methods of drug control. Whether for this reason or in the pursuit of user experimentation, prodrugs of NPS have been discussed, developed, and exploited, posing some distinct forensic challenges. This is especially the case in toxicological analysis of biological fluids and, for some substances, in forensic chemical analysis, owing to the inherent instability of the prodrug or its metabolism in the body. Particular examples of NPS prodrugs include 1-propanoyl-LSD, 1-butanoyl-LSD, 1-acetyl-LSD, and 2C-B-AN, in addition to associated substances and medicines that may be used for an intended pharmacological effect. Various prodrugs for stimulant and hallucinogenic substances in particular have appeared in the literature, been discussed within drug user forums, and been made available for purchase online. Presently, drug monitoring data from national and international systems indicate that prodrugs are not widely available or problematic. Nevertheless, it is important that there is sufficient awareness of the prodrug concept, its potential impact, and the associated forensic implications, not just for chemical analysis but also for toxicological considerations when a substance has been used.

Journal ArticleDOI
TL;DR: A method involving direct extraction of hair shaft proteins more sensitive than previously published methods regarding GVP detection was found to provide reproducible results and some of the specific GVP identifications depend on the sample preparation method.
Abstract: Recent reports have demonstrated that genetically variant peptides (GVPs) derived from human hair shaft proteins can be used to differentiate individuals of different biogeographic origins. We report a method involving direct extraction of hair shaft proteins that is more sensitive for GVP detection than previously published methods. It involves one step for protein extraction and was found to provide reproducible results. A detailed proteomic analysis of these data is presented, leading to the following four results: (i) a peptide spectral library was created and made available for download; it contains all identified peptides from this work, including GVPs, and when appropriately expanded with diverse hair-derived peptides, it can provide a routine, reliable, and sensitive means of analyzing hair digests; (ii) an analysis of artifact peptides arising from side reactions is also made using a new method for finding unexpected modifications; (iii) detailed analysis of the gel-based method employed clearly shows the high degree of cross-linking or protein association involved in hair digestion, with major GVPs eluting over a wide range of high molecular weights while others apparently arise from distinct non-cross-linked proteins; and (iv) finally, we show that some of the specific GVP identifications depend on the sample preparation method.

Journal ArticleDOI
TL;DR: This review examines the technique of bomb pulse dating and its use in the identification of differentially preserved unknown human remains and demonstrated reliable and accurate results.
Abstract: In cases where there is limited antemortem information, the examination of unidentified human remains as part of the investigation of long-term missing persons cases is a complex endeavor and consequently requires a multidisciplinary approach. Bomb pulse dating, which involves the analysis and interpretation of 14C concentration, is one technique that may assist in these investigations by providing an estimate of year of birth and year of death. This review examines the technique of bomb pulse dating and its use in the identification of differentially preserved unknown human remains. Research and case studies implementing bomb pulse dating have predominantly been undertaken in the Northern Hemisphere and have demonstrated reliable and accurate results. Limitations were, however, identified throughout the literature. These included the small sample sizes used in previous research and case studies, which limited the statistical significance of the findings, as well as technique-specific issues. Such limitations highlight the need for future research.

Journal ArticleDOI
TL;DR: This feasibility study investigated the application of convolutional neural network, a form of deep learning AI, to PMCT head imaging in differentiating fatal head injury from controls, demonstrating an accuracy of between 70% and 92.5%, with difficulties in recognizing subarachnoid hemorrhage and in distinguishing congested vessels and prominent falx from head injury.
Abstract: Postmortem computed tomography (PMCT) is a relatively recent advancement in forensic pathology practice that has been increasingly used as an ancillary investigation and screening tool. One area of clinical CT imaging that has recently garnered considerable research interest is "artificial intelligence" (AI), such as in screening and computer-assisted diagnostics. This feasibility study investigated the application of a convolutional neural network, a form of deep learning AI, to PMCT head imaging in differentiating fatal head injury from controls. PMCT images of a transverse section of the head at the level of the frontal sinus from 25 cases of fatal head injury were combined with 25 non-head-injury controls and divided into training and testing datasets. A convolutional neural network was constructed using Keras and was trained on the training data before being assessed against the testing dataset. The results of this study demonstrated an accuracy of between 70% and 92.5%, with difficulties in recognizing subarachnoid hemorrhage and in distinguishing congested vessels and a prominent falx from head injury. These results are promising for potential applications as a screening tool or in computer-assisted diagnostics in the future.
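The core building block of a convolutional neural network like the one described above can be sketched as a 2D cross-correlation followed by a ReLU activation. The toy image and hand-set edge-detecting kernel below are illustrative stand-ins, not the study's Keras architecture or its learned filters.

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid-mode 2D cross-correlation -- the core operation of a
    convolutional layer (no padding, stride 1)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

def relu(x: np.ndarray) -> np.ndarray:
    """Rectified linear activation: keep positive responses only."""
    return np.maximum(x, 0.0)

# Toy 6x6 "image" with a bright region on the right, and a vertical
# edge-detecting kernel -- a hand-set stand-in for a learned filter.
image = np.zeros((6, 6))
image[:, 3:] = 1.0
kernel = np.array([[-1.0, 1.0],
                   [-1.0, 1.0]])

feature_map = relu(conv2d(image, kernel))
print(feature_map)
```

In a full CNN, many such filters are learned from labeled examples and stacked with pooling and dense layers; here the single filter simply lights up along the dark-to-bright boundary, the kind of local feature that, at scale, lets a network distinguish injury patterns from controls.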