Is site submission quality less important than data analysis?

Clinical trial sponsors have focused for years on finding the right investigators who can recruit patients, and choosing the best core laboratories to do the necessary data analysis. Without a doubt, having the best thought leaders associated with a trial can increase the level of success (defined as pursuing the right path and abandoning the wrong one) and thus lower the risk of the whole trial enterprise.

[Image: It's about quality, not speed]

Unfortunately for the ever-increasing number of imaging trials, the way data is collected has continued to ignore the workload imposed on trial coordinators, zeroing in instead on ways to cut costs. Submissions are typically assembled by hand, then sent via courier, fax, or simple home-grown electronic transfer systems. That’s putting the emPHÁsis on the wrong syLLÁble: a focus on reducing cost without improving process.

Non-imaging trials have benefited greatly from the use of EDC. In retrospect, if the trial community had known in 1995 what it knows now, it would not have taken more than 10 years to reach the current levels of adoption. EDC’s success is not in electronic forms; it is in the fact that it channels users’ input through a well-thought-out workflow, which in turn reduces errors and increases the quality of the data being reported. In the EDC equivalent for imaging trials, the platform is fully aware of the data structure and its content, and is therefore able to guide the process of submission assembly. As I have said in the past, electronically moving large data blobs without understanding their meaning and content is a way of propagating problems… faster!
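
To make the contrast concrete, here is a minimal sketch of what content-aware intake can look like, using the open-source pydicom library. The form fields, file names, and checks are hypothetical placeholders; a real platform would derive them from the protocol and imaging charter.

```python
# A minimal sketch of content-aware intake validation using pydicom.
# The form field names and rules below are hypothetical; real trials
# define them in the protocol and imaging charter.
import pydicom

def validate_submission(dicom_paths, form):
    """Cross-check DICOM headers against the coordinator's form
    before the submission is accepted, not after it is delivered."""
    errors = []
    # Catch missing or incomplete form fields before anything moves.
    for field in ("subject_id", "visit", "scan_date"):
        if not form.get(field):
            errors.append(f"form field '{field}' is missing")
    for path in dicom_paths:
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        # Catch image/form discrepancies at the source.
        if ds.get("PatientID") != form.get("subject_id"):
            errors.append(f"{path}: PatientID does not match the form")
        if ds.get("Modality") != form.get("expected_modality"):
            errors.append(f"{path}: unexpected modality {ds.get('Modality')}")
    return errors  # an empty list means the submission may proceed

# Usage: block the upload and tell the coordinator what to fix.
issues = validate_submission(
    ["scan001.dcm"],  # placeholder file name
    {"subject_id": "SUBJ-001", "expected_modality": "MR",
     "visit": "Baseline", "scan_date": "2011-06-01"},
)
if issues:
    print("Submission held at the site:", *issues, sep="\n  ")
```

Because the platform understands the data it is handling, the error surfaces at the site, at the moment of entry, instead of becoming a query weeks later.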

Why, then, are simplistic 1990s-style “web send” solutions developed in-house being marketed? Since the directive from sponsors has been for core labs to cut the costs associated with collecting images and image data, electronic transport was viewed as a way to eliminate courier shipments. Soon after, having an electronic submission capability became a requirement for doing business. Many labs quickly developed systems to allow for electronic transport of images using existing technologies (sFTP or HTTPS), as I’ve discussed in previous blogs. Unfortunately, at best these approaches have done nothing more than accelerate the delivery of the same (or less) information than you would find in a courier envelope. They have done nothing to eliminate queries related to data entry errors, missing or incomplete forms, data discrepancies between images and forms, and so on. Yet these systems have been marketed to sponsors as a way to achieve cost savings. “Look! No more courier costs!” Some have even gone so far as to say that these systems are “FREE!” Unfortunately, as the old man once said, there is nothing more expensive than “free.”
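
For illustration, this is roughly what such a home-grown transport amounts to: a bare sFTP upload via the paramiko library. The hostname, credentials, and file names are placeholders. Note that nothing here understands, or even looks at, what is being sent.

```python
# What a typical home-grown "web send" boils down to: a bare sFTP
# upload (paramiko library). Hostname, credentials, and file names
# are placeholders. The transfer moves bytes, not meaning -- a wrong
# PatientID or a missing form travels exactly as fast as good data.
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("corelab.example.com", username="site042", password="...")
sftp = client.open_sftp()
sftp.put("submission.zip", "/incoming/submission.zip")  # no validation
sftp.close()
client.close()
```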

What these sFTP- and HTTPS-based systems have failed to do is provide the proper context and workflow for the site submission, and they have allowed the same quality and patient-privacy compliance issues to continue propagating unabated. User frustration with these outdated technologies’ inability to move massive DICOM files reliably (lost images, lack of notifications, separate workflows and delivery methods for ancillary data, etc.) only delays benefits the industry badly needs.
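
A workflow-aware platform can put a privacy gate in front of the transfer instead. The sketch below, again hypothetical, flags identifying DICOM header tags before anything leaves the site; the tag list is illustrative, not a complete de-identification profile.

```python
# A hypothetical privacy gate run before transfer: flag identifying
# DICOM header tags still present in a file. The tag list is only
# illustrative; see DICOM PS3.15 for full de-identification profiles.
import pydicom

PHI_TAGS = ("PatientName", "PatientBirthDate", "PatientAddress",
            "PatientTelephoneNumbers", "ReferringPhysicianName")

def phi_findings(path):
    """Return identifying header values left in a file, so the issue
    is caught before transmission rather than by a downstream query."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    return {tag: str(ds.get(tag)) for tag in PHI_TAGS if ds.get(tag)}

findings = phi_findings("scan001.dcm")  # placeholder file name
if findings:
    print("Hold submission; de-identify these tags first:", findings)
```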

Ken Getz at Tufts pointed out that the single most amazing change he has witnessed in the industry is that “drug development conditions have NOT been measurably altered in spite of dramatic and profound changes in technology solutions now available.” Realizing the value of these solutions will require changes in the way the corporate Life Science ecosystem operates: getting over the not-invented-here syndrome, and adopting and supporting systems and methods that lower the risk of running clinical trials while significantly enhancing outcomes. One part of this progress is based on increasing submission quality, not just speed.
