Effect of changes in testing parameters on the cost-effectiveness of two pooled test methods to classify infection status of animals in a herd

Citation

Title Effect of changes in testing parameters on the cost-effectiveness of two pooled test methods to classify infection status of animals in a herd
Author(s) Locksley L. McV, Joshua M. O’Brien, Sharon K. Hietala, Ian A. Gardner
Journal Preventive Veterinary Medicine
Date 2010
Volume 94
Issue 3-4
Start page 202
End page 212
Abstract Monte Carlo simulation was used to determine optimal fecal pool sizes for identification of all Mycobacterium avium subsp. paratuberculosis (MAP)-infected cows in a dairy herd. Two pooling protocols were compared: a halving protocol involving a single retest of negative pools followed by halving of positive pools, and a simple protocol involving a single retest of negative pools but no halving of positive pools. For both protocols, all component samples in positive pools were then tested individually. In the simulations, the distributions of the number of tests required to classify all individuals in an infected herd were generated for various combinations of prevalence (0.01, 0.05 and 0.1), herd size (300, 1000 and 3000), pool size (5, 10, 20 and 50) and test sensitivity (0.5–0.9). Test specificity was fixed at 1.0 because fecal culture for MAP yields few or no false-positive results. Optimal performance was determined primarily on the basis of a comparison of the distributions of the numbers of tests needed to detect MAP-infected cows using the Mann–Whitney U test statistic. Optimal pool size was independent of both herd size and test characteristics, regardless of protocol. When sensitivity was the same for each pool size, pool sizes of 20 and 10 performed best for both protocols at prevalences of 0.01 and 0.1, respectively, while for a prevalence of 0.05, pool sizes of 10 and 20 were optimal for the simple and halving protocols, respectively. When sensitivity decreased with increasing pool size, the results changed for prevalences of 0.05 and 0.1, with a pool size of 50 being optimal, especially at a prevalence of 0.1. Overall, the halving protocol was more cost-effective than the simple protocol, especially at higher prevalences. 
For detection of MAP using fecal culture, we recommend use of the halving protocol and pool sizes of 10 or 20 when the prevalence is suspected to range from 0.01 to 0.1 and there is no expected loss of sensitivity with increasing pool size. If a loss in sensitivity is expected and the prevalence is thought to be between 0.05 and 0.1, the halving protocol and a pool size of 50 are recommended. Our findings are broadly applicable to other infectious diseases under comparable testing conditions.
DOI 10.1016/j.prevetmed.2010.01.005
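The two pooling protocols described in the abstract can be illustrated as a test-counting simulation. The following is a minimal sketch, not the authors' simulation code: function names are invented, specificity is fixed at 1.0 as stated in the abstract, and half-pools are assumed not to receive a retest (the abstract does not specify this).

```python
import random

def count_tests_simple(statuses, pool_size, se, rng):
    """Tests used by the simple protocol: test each pool, retest
    negative pools once, then test every member of a positive pool
    individually. Specificity is fixed at 1.0, as in the abstract."""
    tests = 0
    for i in range(0, len(statuses), pool_size):
        pool = statuses[i:i + pool_size]
        tests += 1
        positive = any(pool) and rng.random() < se
        if not positive:                 # single retest of negative pools
            tests += 1
            positive = any(pool) and rng.random() < se
        if positive:                     # resolve every component sample
            tests += len(pool)
    return tests

def count_tests_halving(statuses, pool_size, se, rng):
    """Tests used by the halving protocol: as above, but a positive
    pool is first split into two halves; only members of positive
    halves are tested individually (halves assumed not retested)."""
    tests = 0
    for i in range(0, len(statuses), pool_size):
        pool = statuses[i:i + pool_size]
        tests += 1
        positive = any(pool) and rng.random() < se
        if not positive:                 # single retest of negative pools
            tests += 1
            positive = any(pool) and rng.random() < se
        if positive:                     # halve, then test positive halves
            mid = len(pool) // 2
            for half in (pool[:mid], pool[mid:]):
                tests += 1
                if any(half) and rng.random() < se:
                    tests += len(half)
    return tests

# Example run: herd of 300, prevalence 0.05, pool size 10, sensitivity 0.7.
rng = random.Random(1)
herd = [1 if rng.random() < 0.05 else 0 for _ in range(300)]
print(count_tests_simple(herd, 10, 0.7, rng))
print(count_tests_halving(herd, 10, 0.7, rng))
```

Averaging these counts over many simulated herds would reproduce the kind of test-count distributions the study compares across prevalence, herd size, pool size and sensitivity.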

Using APA 6th Edition citation style:

McV, L. L., O'Brien, J. M., Hietala, S. K., & Gardner, I. A. (2010). Effect of changes in testing parameters on the cost-effectiveness of two pooled test methods to classify infection status of animals in a herd. Preventive Veterinary Medicine, 94(3-4), 202-212. doi:10.1016/j.prevetmed.2010.01.005

