Each year, seemingly simple infections with no drug to combat them kill 50,000 people in the US and 700,000 worldwide. Worse, recent studies from the World Health Organization (WHO) found that antibiotic resistance is spreading around the world, all while our preparedness for future pandemics, like the 2014-15 Ebola scare, was found woefully inadequate. And while the number of documented diseases keeps growing, the number of newly discovered cures shrinks with every passing decade.
This is the world our pharmaceutical industry is struggling against.
To be fair, your overall health today is far better than it would have been just 100 years ago. Back then, average life expectancy was just 48 years. These days, most people can expect to one day blow out the candles on their 80th birthday cake.
The largest contributor to this doubling of life expectancy was the discovery of antibiotics, the first of which, penicillin, entered widespread use in 1943. Before that drug became available, life was far more fragile.
Common illnesses like strep throat or pneumonia were life-threatening. Surgeries we take for granted today, like inserting pacemakers or replacing knees and hips in the elderly, would have carried a one-in-six mortality rate. A simple scratch from a thorn bush or a gash from a workplace accident could have left you at risk of serious infection, amputation, and in some cases, death.
And according to the WHO, this is a world we could potentially return to—a post-antibiotic era.
Antibiotic resistance is becoming a global threat
Simply put, an antibiotic drug is a tiny molecule designed to attack a target bacteria. The rub is that over time, bacteria build a resistance to that antibiotic to a point where it's no longer effective. That forces Big Pharma to work constantly on developing new antibiotics to replace the ones bacteria become resistant to. Consider this:
Penicillin was introduced in 1943; resistance to it appeared in 1945;
Vancomycin was introduced in 1972; resistance to it appeared in 1988;
Imipenem was introduced in 1985; resistance to it appeared in 1998;
Daptomycin was introduced in 2003; resistance to it appeared in 2004.
This cat-and-mouse game is speeding up faster than Big Pharma can afford to keep pace with. It takes up to a decade and billions of dollars to develop a new class of antibiotics, while bacteria spawn a new generation every 20 minutes, growing, mutating, and evolving until one generation finds a way to overcome the drug. It's reaching a point where it's no longer profitable for Big Pharma to invest in new antibiotics, as they become obsolete so quickly.
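To get a feel for how lopsided this race is, here is a back-of-the-envelope calculation of how many bacterial generations fit into a single drug-development cycle, using the figures above (a 20-minute doubling time and a 10-year development timeline; both are rough round numbers, not precise biology):

```python
# Rough arithmetic: bacterial generations per drug-development cycle.
# Assumes one generation every 20 minutes and a 10-year development
# timeline, per the figures in the text.

MINUTES_PER_GENERATION = 20
YEARS_OF_DEVELOPMENT = 10

minutes = YEARS_OF_DEVELOPMENT * 365 * 24 * 60  # minutes in ten years
generations = minutes // MINUTES_PER_GENERATION

print(generations)  # 262800
```

In other words, by the time one new antibiotic clears development, the bacteria it targets have had on the order of a quarter-million generations in which to mutate.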
But why are bacteria overcoming antibiotics faster today than in the past? A couple of reasons:
Most of us overuse antibiotics instead of just toughing out an infection naturally. This exposes the bacteria in our bodies to antibiotics more often, allowing them the opportunity to build resistance to them.
We pump our livestock full of antibiotics, thereby introducing even more antibiotics into our systems through our diets.
As our population balloons from seven billion today to nine billion by 2040, bacteria will have more and more human hosts to live and evolve in.
Our world is so connected through modern travel that new strains of antibiotic-resistant bacteria can reach all corners of the world within a year.
The only silver lining in this current state of affairs is that 2015 saw the introduction of a groundbreaking antibiotic called teixobactin. It attacks bacteria in a novel way that scientists hope will keep us ahead of their eventual resistance for at least another decade, if not longer.
But bacterial resistance isn’t the only danger Big Pharma is tracking.
If you were to graph the number of unnatural deaths between 1900 and today, you would expect to see two large humps around 1914 and 1945: the two World Wars. However, you might be surprised to find a third hump between them, around 1918-19. This was the Spanish flu, and it killed an estimated 50 to 100 million people worldwide, far more than WWI itself.
Aside from environmental crises and world wars, pandemics are the only events that have the potential to rapidly wipe out over 10 million people in a single year.
The Spanish flu was our last major pandemic event, but in recent years, smaller outbreaks like SARS (2003), H1N1 (2009), and the 2014-15 West African Ebola epidemic have reminded us that the threat is still out there. What the latest Ebola outbreak also revealed is that our ability to contain these pandemics leaves much to be desired.
That’s why advocates like Bill Gates are now working with international NGOs to build a global biosurveillance network to better track, predict, and hopefully prevent future pandemics. This system will track global health reports at the national level and, by 2025, at the individual level, as a larger percentage of the population starts tracking its health via increasingly powerful apps and wearables.
Yet, while all this real-time data will allow organizations, like the WHO, to react faster to outbreaks, it won't mean anything if we aren't able to create new vaccines fast enough to stop these pandemics in their tracks.
Working in quicksand to design new drugs
The pharmaceutical industry has seen huge advances in the technology now at its disposal. From the enormous drop in the cost of sequencing a human genome, from $100 million to under $1,000 today, to the ability to catalog and decipher the exact molecular makeup of diseases, you’d think Big Pharma has everything it needs to cure every illness in the book.
Well, not quite.
Today, we’ve been able to decipher the molecular makeup of about 4,000 diseases, much of this data gathered during the past decade. But of those 4,000, how many do we have treatments for? About 250. Why is this gap so large? Why aren’t we curing more diseases?
While the tech industry blossoms under Moore’s Law—the observation that the number of transistors per square inch on integrated circuits doubles roughly every two years—the pharmaceutical industry suffers under Eroom’s Law (‘Moore’ spelled backward)—the observation that the number of drugs approved per billion US dollars of R&D spending, adjusted for inflation, halves roughly every nine years.
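A nine-year halving is easy to underestimate, so here is a small sketch of what that decay curve looks like. The 1950 starting point of 30 approved drugs per $1 billion is an assumed round figure for illustration only, not a sourced statistic; only the halving period comes from Eroom's Law itself:

```python
# Illustration of Eroom's Law: drugs approved per $1B of
# inflation-adjusted R&D spending, halving every nine years.
# The base_rate of 30 drugs per $1B in 1950 is an assumed,
# illustrative figure, not a sourced number.

def drugs_per_billion(year, base_year=1950, base_rate=30.0, halving_years=9):
    """Approved drugs per $1B R&D, halving every `halving_years` years."""
    return base_rate * 0.5 ** ((year - base_year) / halving_years)

for year in (1950, 1977, 2004):
    print(year, round(drugs_per_billion(year), 2))
# 1950 30.0   -> starting rate (assumed)
# 1977 3.75   -> three halvings later
# 2004 0.47   -> six halvings later
```

Whatever the true starting rate, the shape of the curve is the point: under a steady nine-year halving, R&D productivity falls by roughly 64x over half a century.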
There is no one person or process to blame for this crippling decline in pharmaceutical productivity. Some blame how drugs are funded, others blame the overly stifling patent system, the excessive costs of testing, the years needed for regulatory approval—all these factors play a part in this broken model.
Luckily, there are some promising trends that together could help break Eroom's downward curve.
Medical data on the cheap
The first trend is one we already touched on: the cost of collecting and processing medical data. Whole-genome sequencing costs have fallen from roughly $100 million to below $1,000. And as more people start tracking their health through specialized apps and wearables, the ability to collect data at enormous scale will finally become possible (a point we’ll touch on below).
Democratized access to advanced health tech
A large factor behind the falling cost of processing medical data is the falling cost of the technology doing that processing. Putting aside the obvious examples, like cheaper access to supercomputers that can crunch large data sets, smaller medical research labs can now afford manufacturing equipment that used to cost tens of millions of dollars.
One of the trends gaining a great deal of interest is 3D chemical printing, which will allow medical researchers to assemble complex organic molecules, up to and including fully ingestible pills customized to the patient. By 2025, this technology will allow research teams and hospitals to print chemicals and custom prescription drugs in-house, without depending on outside vendors. Future 3D printers will eventually print more advanced medical equipment, as well as the simple surgical tools needed for sterile operating procedures.
Testing new drugs
Among the costliest and most time-consuming aspects of drug creation is the testing phase. New drugs need to pass computer simulations, then animal trials, then limited human trials, and then regulatory approvals before getting approved for use by the general public. Luckily, there are innovations happening at this stage as well.
Chief among them is an innovation we can bluntly describe as body parts on a chip. Instead of silicon and circuits, these tiny chips contain real organic fluids and living cells, structured so as to simulate a specific human organ. Experimental drugs can then be injected into these chips to reveal how they would affect a real human body. This bypasses the need for animal testing, offers a more accurate representation of a drug’s effects on human physiology, and lets researchers run hundreds to thousands of tests, across hundreds to thousands of drug variants and dosages, on as many of these chips in parallel, thereby speeding up the drug-testing phase considerably.
Then, when it comes to human trials, startups like myTomorrows will better connect terminally ill patients with these new, experimental drugs. This helps people close to death access drugs that might save them, while offering Big Pharma test subjects who (if cured) might expedite the regulatory approval process to get these drugs to market.
The future of healthcare is not mass produced
The aforementioned innovations in antibiotic development, pandemic preparedness, and drug development are already happening and should be well established by 2020-2022. However, the innovations we'll explore over the rest of this Future of Health series will reveal how the true future of healthcare lies not in creating life-saving drugs for the masses, but for the individual.