
Abstract

BLACKWELL, N. J. POLYSCORE: A comparison of accuracy. June 1996, Report No. DoDPI95-R-0001. Department of Defense Polygraph Institute, Ft. McClellan, AL 36205.

Using data collected under a mock crime scenario paradigm, four versions of the Johns Hopkins University Applied Physics Laboratory (APL) algorithm-based scoring system were evaluated for consistency in scoring accuracy. The four versions were: (a) PASS 2.0, (b) POLYSCORE 2.3, (c) POLYSCORE 2.9, and (d) POLYSCORE 3.0. Each version's rate of agreement and disagreement with ground truth was examined, and the same evaluations were made for the psychophysiological detection of deception (PDD) examiners who collected the data. The PDD examiners in this evaluation had an overall accuracy rate of 72.27% when compared to ground truth. The overall accuracy rates generated by the algorithm versions (edited dataset) were: (a) PASS 2.0, 63.03%; (b) POLYSCORE 2.3, 67.72%; (c) POLYSCORE 2.9, 72.27%; and (d) POLYSCORE 3.0, 68.91%. With inconclusive decisions eliminated, the recomputed accuracy rate for the PDD examiners was 79.63%, while each version of the algorithm was comparable (PASS 2.0, 78.95%; POLYSCORE 2.3, 79.21%; POLYSCORE 2.9, 83.50%; POLYSCORE 3.0, 82.83%), with both POLYSCORE 2.9 and POLYSCORE 3.0 exceeding the examiners' level of accuracy. In addition to overall accuracy and accuracy by test format, the effects of subjective manipulation of the data were discussed, and information was provided on the occurrence of decision reversals and statistical outliers.

Key words: POLYSCORE, Axciton, computerized scoring algorithms, Zone Comparison Test (ZCT), polygraph, forensic psychophysiology, psychophysiological detection of deception (PDD).
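
The recomputed rates above follow from a simple change of denominator: inconclusive decisions count against accuracy in the overall rate but are dropped entirely when inconclusives are eliminated. The short Python sketch below illustrates that arithmetic; the case counts used (86 correct, 22 errors, 11 inconclusives) are hypothetical values chosen only because they reproduce the examiners' reported percentages, since the abstract does not give the underlying counts.

    def accuracy_rates(correct, incorrect, inconclusive):
        """Return (overall accuracy, accuracy with inconclusives eliminated)."""
        total = correct + incorrect + inconclusive
        overall = correct / total                   # inconclusives count against accuracy
        decisive = correct / (correct + incorrect)  # inconclusives dropped from the denominator
        return overall, decisive

    # Hypothetical counts consistent with the examiners' reported rates.
    overall, decisive = accuracy_rates(correct=86, incorrect=22, inconclusive=11)
    print(f"overall: {overall:.2%}  excluding inconclusives: {decisive:.2%}")
    # overall: 72.27%  excluding inconclusives: 79.63%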

Director's Foreword

As the discipline of Forensic Psychophysiology evolves, computer hardware and software have become increasingly important to the administration and evaluation of psychophysiological detection of deception (PDD) examinations. Such automation decreases the physical and mental demands placed on examiners by reducing the effort required to operate the instrument. This allows examiners to concentrate their efforts on the interview and interrogation portions of the examination. While automation can increase examiner efficiency, there are potential penalties. As PDD examinations become more automated, the examiner surrenders some control over the examination to the hardware and software manufacturers. Those practicing Forensic Psychophysiology must remain vigilant to ensure that the hardware and software used accurately record and evaluate physiologic responses.

This report describes the results obtained when the same data were evaluated by examiners and by four versions of a computer program designed to assess physiologic responses recorded during PDD examinations. It is essential that such comparative studies be completed and reported to validate our increasing reliance on computer software. It should be noted that the reported comparisons were made using data collected following a laboratory mock crime, while the computer program was designed using the results of actual criminal examinations. This difference may have influenced the overall accuracy rates if there are intrinsic differences between data collected following actual and mock crimes.

Michael H. Capps

Director