The Overlooked Role of Universities in the Drug Discovery Process
Drug discovery is the process by which novel therapeutic candidates are identified and ultimately developed into medications approved for human use. These candidates range from modified or unmodified natural products and extracts to small molecules and biologics. Forbes reports that the average cost of drug development for a major pharmaceutical company is between $4 billion and $11 billion, considerably higher than the more commonly cited $1 billion figure because it accounts for failure rates. Failure rates are, in fact, an important part of the drug discovery process.
Estimates suggest that an initial drug target has roughly a 1 in 30 chance of resulting in a product launch. Furthermore, reports indicate that it can take more than 12 years to progress from initial molecular target discovery through preclinical studies and, ultimately, to US Food and Drug Administration (FDA)-regulated human clinical trials. Costs accrue throughout this lengthy process for personnel, research materials, collaborations, patent filings, clinical trials, and regulatory applications.
Many people associate drug development with the large pharmaceutical companies that sell medications, including Johnson & Johnson, Pfizer, and Merck & Co., to name a few. In most cases, however, the critical initial discovery role of smaller organizations is overlooked.
A recent agreement between Eisai, one of the largest pharmaceutical companies by revenue, and the Johns Hopkins University Brain Science Institute (JHUBSi) exemplifies the current push toward university-private sector collaboration in drug discovery. Under this arrangement, JHUBSi takes the lead in identifying novel compounds with the potential to treat neurological disorders, and Eisai has the option to further develop and commercialize these leads through a licensing agreement. Once a licensing agreement is reached, the university stands to benefit financially from up-front payments, royalties, and negotiated milestone payments.
Interestingly, of the 252 new drugs approved by the FDA from 1998 to 2007, approximately 24% originated from university or biotechnology company research and were subsequently transferred to a pharmaceutical or biotechnology company for further development into a marketable product. These data underscore the fundamental role of university research and suggest that technology transfer is an essential component of therapeutic development.
In fact, university research has contributed to the development of many well-known therapeutics, including insulin as a treatment for diabetes, various tuberculosis antibiotics, Allegra, and multiple chemotherapeutic agents such as cisplatin. University research has also driven technology transfer and product commercialization in other areas, including various vaccines, medical devices such as the ultrasound and the pacemaker, and everyday items such as the seat belt and Google.
While large pharmaceutical companies are integral to the development of novel drugs, universities and biotechnology companies play an important, and often overlooked, role as well. Industry-academia collaborations may represent the future of drug development, and it is predicted that in the years to come, more than 24% of novel drugs will originate from technologies developed by universities and biotechnology companies.
Bains, W. (2004). Failure rates in drug discovery and development: will we ever get any better? Drug Discovery World, Fall 2004.
Herper, M. (2012). The Truly Staggering Cost Of Inventing New Drugs. Forbes, October 2012.
Kneller, R. (2010). The importance of new companies for drug discovery: origins of a decade of new drugs. Nature Reviews Drug Discovery, 9(11), 867-882. DOI: 10.1038/nrd3251