Currently, the global test used to detect deficiencies of intrinsic pathway coagulation factors in patients with bleeding is the activated partial thromboplastin time (APTT). The APTT reagent used should be sensitive to reductions in the coagulation factors most commonly associated with bleeding, such as FVIII and FIX. The literature states that it is desirable for APTT systems to be sensitive to factor levels in the 30% to 40% range.¹
Numerous reagent systems for performing the APTT assay are available, each differing in the characteristics of the activator used, the platelet phospholipid substitute, the calcium chloride concentration, and the recommended time for pre‐incubating the test plasma with the activator.² It is therefore no surprise that the practice of determining the sensitivity of APTT reagents to coagulation factor levels has become somewhat controversial. A normal APTT does not necessarily indicate that all clotting factors in the intrinsic pathway are normal; a deficiency of one intrinsic clotting factor can be masked by an elevated level of another factor.
The CLSI guideline clearly states that the value obtained should be considered an “estimate” and should not be considered absolute because of the variables in the materials used.³
An alternative approach that has been suggested is for each laboratory to determine the factor sensitivity of its own instrument and reagent system using 20 well‐characterized samples from patients with inherited coagulation deficiencies, with activity levels ranging from 10% to 50% for each of the intrinsic coagulation factors (VIII, IX, XI, and XII).⁴
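As a rough illustration of how such sensitivity data might be analyzed, the sketch below interpolates between paired (factor activity, APTT) measurements to estimate the activity level at which the APTT first exceeds the reference-range upper limit. This is a minimal, hypothetical example: the function name, the sample values, and the 36-second upper limit are all invented for demonstration, and a laboratory would substitute its own validated reference range and measured results per its instrument/reagent system and applicable CLSI guidance.

```python
# Hypothetical sketch: estimating the factor sensitivity of an APTT
# reagent system from paired (factor activity %, APTT in seconds)
# results on characterized factor-deficient plasmas.
# All numeric values below are invented for illustration only.

def sensitivity_threshold(samples, aptt_upper_limit):
    """Estimate the factor activity (%) at which the APTT crosses the
    reference-range upper limit, by linear interpolation between the
    two bracketing samples. Returns None if the APTT never crosses
    the limit within the sampled activity range."""
    # Sort by activity; APTT is expected to shorten as activity rises.
    pts = sorted(samples)
    for (a_lo, t_lo), (a_hi, t_hi) in zip(pts, pts[1:]):
        if t_lo >= aptt_upper_limit > t_hi:
            # Fraction of the interval at which the APTT equals the limit.
            frac = (t_lo - aptt_upper_limit) / (t_lo - t_hi)
            return a_lo + frac * (a_hi - a_lo)
    return None

# Invented example data: (FVIII activity %, APTT seconds).
samples = [(10, 52.0), (20, 44.0), (30, 38.5), (40, 34.0), (50, 31.5)]

threshold = sensitivity_threshold(samples, aptt_upper_limit=36.0)
print(round(threshold, 1))  # prints 35.6
```

With these made-up numbers, the reagent system would be judged sensitive to FVIII levels down to roughly 35%, which would meet the 30% to 40% sensitivity target cited above. Real validation would of course rest on the measured data, not on interpolation between two points alone.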
See a full list of George King Bio-Medical Congenital Factor Deficient Plasmas here.