Verify Then Trust

One concept for accelerating the development of drugs for neglected diseases (i.e., those whose treatment is not reimbursed by insurance companies) is application of the “open source” innovation model, in which distributed groups of researchers, mostly at academic laboratories, share tasks and pool information to advance drug discovery and development. I have written about the utility of this approach, but not favorably. In my post, “Open source Sesame” (3/11/11), I commented on a draft study, “Open Source for Neglected Diseases: Magic Bullet or Mirage?” put out by the Center for Global Health R and D Policy Assessment (the final report is at CGHRDPA report), and noted two fundamental limitations of the open source approach as applied to drug development. First, it is really only applicable to the discovery of drug candidates and therefore unlikely to have much of an effect on overall costs. Second, the early stages need to be informed and guided by continuous input from those responsible for the later stages, that is, a positive feedback loop, as is the case in companies that continuously prune their R and D efforts to get to a product.

As it turns out, I had missed a third, possibly fatal, flaw in the utilization of academic labs for distributed, or possibly any, drug discovery: the high rate of irreproducible results. This topic took on a personal angle when I caught up recently with a friend of mine who is CEO of a company based on academic research and who is now in the process of selling the company’s assets because duplicating (proving) the original research turned out to be beyond the scope of his start-up’s investors. Upon further research, I found that, in September, Asher Mullard wrote a Nature Reviews Drug Discovery article on this topic, and his analysis is an eye-opener (Nature Reviews). His starting point is a report by the major pharma company, Bayer, which found that in-house experimental data did not match published results in 65% of target validation projects, leading to project discontinuation, and that confirmation of results by another academic group did not improve data reliability. He also cites two academic studies of gene expression and proteomics data which found that the vast majority of results were not reproducible. As for the causes, Mr. Mullard references a 2005 essay in which the author, John Ioannidis, noted that they include “investigator prejudice, incorrect statistical methods, competition in hot fields and publishing bias, whereby journals are more likely to publish positive and novel findings than negative or incremental results” (PLOS Medicine article), but also that academic labs do not work to the same standards as industrial labs, particularly in incrementally verifying materials and results. As for the professional investor view of the problem (after all, the VCs are the ones we expect to take risks and go where pharma companies fear to tread [but not neglected diseases, yet]), Mr. Mullard, as well as my friend, refers to a post from March of this year by Bruce Booth of Atlas Venture (Booth Blog). One of his points is: “The unspoken rule is that at least 50% of the studies published even in top tier academic journals – Science, Nature, Cell, PNAS, etc. – can’t be repeated with the same conclusions by an industrial lab.” Troubling to say the least.

Beyond being troubling, why is this important? After all, drug development is highly risky, and even the biggest and best companies fail at it (see the Fierce Biotech article on Phase III failures). Funding for neglected disease drug discovery and development, while growing over the past ten years, is still inadequate and needs to be used efficiently. Since the leaders in the field, product development programs like the Medicines for Malaria Venture (MMV) and the Drugs for Neglected Diseases initiative (DNDi), rely heavily on academic advisors and research, they need to adopt industrial standards and methods for data verification. I’m sure this is not on their “to-do” lists, but it should be.
