Tuesday, July 22, 2014

Public funding is essential for new technologies


I was recently driving through the Sierra Nevada range in California, and the references to the gold rush reminded me of a related concept I've been thinking about lately: early-stage technology advances require government funding. In fact, I've come to believe public funding is not only important for accelerating technology, but that without it, there would be no noticeable progress. This is highly relevant to the current state of life sciences funding, given the challenges with NIH and NSF budgets. To illustrate the point, consider the following three steps required to strike it rich in the gold rush.

1. Explore and discover. This is the well-established activity of academic basic research institutions. It is often characterized by mavericks or small groups wandering in uncharted territory where no 'sane' person would spend their time or money. Public funds are an efficient and equitable way to fund such endeavors (by supporting a relatively large number of investigators with relatively small awards). Just like the early days of gold mining, this works because if the search is spread wide enough, there are bound to be a lucky few who find gold.

2. Dig holes and build roads. The reality is that discovering gold is only the beginning of the process of creating wealth. The next steps can be described as two-fold: a) high risk/reward ventures often led by startup companies (dig holes), and b) support of basic infrastructure to increase the flow of progress (build roads). It's reasonably intuitive that road building falls in the realm of government funding. In life sciences, this is served by advisory/regulatory agencies, a trained workforce, the healthcare system, and related laws. What is often less appreciated is the role of public funding for hole digging. In earlier days, a lot of this was funded by venture capital and corporate research, but the burden is now heavily carried by programs such as SBIR grants and funding from non-profit endowments (the Gates Foundation being the most famous).

3. Bring in the heavy machinery. Once the vein of gold is located, big business is ready to move in. Since this task requires a lot of capital, lots of people, countless transactions to execute, and generally longer timescales, corporate entities can easily out-compete the smaller players. It is not a coincidence that this is also where all the big money is made.

I was lucky enough to have been able to follow this trajectory (from 1 to 2 to 3) with a technology that originated in graduate school, led to a startup, and was later absorbed into a large corporation. From this experience, I have gained an appreciation of how the loop gets closed.

The forward direction is fairly intuitive to most of us living in a capitalistic society. On this path, business decisions are driven by net present value (NPV)-- essentially a measure of how much money a company can make. The practical implication is that big money tends to chase easy money. In the example above, most disciplined investors no longer believe the NPV is sufficiently attractive in step 2, and will wait until step 3. It should also be fairly apparent that government funding creates more positive NPVs for businesses to take advantage of. In most cases, without the initial injection of "free money" from public sources, business would slow down substantially.
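The NPV calculation behind these decisions is simple enough to sketch. Below is a minimal Python illustration; the discount rate, cash flows, and hurdle-rate comparison are hypothetical numbers chosen for the example, not figures from any real venture.

```python
def npv(rate, cashflows):
    """Net present value: each future cash flow discounted back to today.

    cashflows[0] is the year-0 amount (usually the negative initial
    investment); cashflows[t] arrives t years out.
    """
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical venture: $100 invested now, returning $30/year for 5 years.
flows = [-100, 30, 30, 30, 30, 30]

print(round(npv(0.10, flows), 2))  # positive at a 10% discount rate -> fundable
print(round(npv(0.25, flows), 2))  # negative at a 25% hurdle rate -> investors wait
```

The same cash flows flip from attractive to unattractive as the discount rate (the investor's required return) rises, which is one way to read "big money chases easy money": public funding that de-risks step 2 effectively lowers the rate at which private capital is willing to engage.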

The arrow that transfers wealth back is less intuitive-- but comes in the form of taxes. Government funding is not (and should not be) based on NPV, and instead tries to maximize other values such as westward expansion, education, and job creation. Given that public funding is helping companies make money, it is only fair that some of this is fed back to governments to continue funding the early innovation pipeline. In the current capitalist system, this occurs "naturally" through income and employment taxes. Without debating the politics of tax structure, suffice it to say that this transfer of wealth is necessary for proper function of the innovation engine.

I have been able to see both sides of this cycle-- the elation of receiving "no strings attached" funding to launch an early stage idea, as well as forking over eye-popping amounts of money in the form of taxes. The correct balance of these two forces is crucial for proper support of technology innovation.

Monday, July 14, 2014

Three Signs it's Time to Invest in Biological Technology

When I started down the "bioengineering technology" path in the late '90s, the natural assumption was that it would quickly follow the rapid market success of the internet. There was widespread optimism in the power of research and engineering, nearly endless capital to fund bold ideas, and a heartening stream of good news in the form of breakthrough discoveries (DNA sequencing, stem cells, systems biology). It has now been fifteen years or so since that prominent wave of activity. While there have been some notable successes (personal genomics), the general sentiment is that biological technology has not caught the wave of market success that was expected. It's definitely not time to give up on the industry, but it may be useful to ponder what the appropriate timing is for the coming market success.

While it is never possible to predict market success, the following three signs are pretty good indicators that the wave is upon us.

1. A hero company emerges
A key trigger to signal the dawn of a new era is the emergence of "hero" companies. We are all familiar with examples in the internet industry-- Google, Amazon, Facebook. These are companies that are widely admired, exceptionally well funded with nearly bottomless revenue streams, possess exceptionally deep employee talent pools, are constantly in the news, and seem to effortlessly increase market value by billions of dollars at a time. The key here is that it doesn't matter who the hero company is; the fact that one has emerged is a sure sign of broader market success. The momentum and wake of such a company stimulates the entire industry-- driving new investment capital, becoming an acquirer of start-ups, providing training grounds for talent, fueling a swell of public investment, raising awareness in mass media to drive additional revenue, and so on.

In the cellular technology space, we have yet to see a company rise to this level. At various times, there were strong contenders coming from various fields such as stem cell science, miniaturization and automation systems, novel analysis instruments, and synthetic cell engineering. These cases were highlighted by big venture funding investments (generally above the 9-digit mark), impressive technology and products, and genuine excitement among the scientific communities. In almost all cases, these heroes-in-training hit a plateau as their acceleration slowed. Much of this was due to lofty expectations of investors, but also due to factors highlighted in #2 and #3 on this list. However, there is no rule that a hero has to be an overnight success, and some of the more patient companies are starting to show renewed momentum.

2. Publicly funded infrastructure is understood by the public that funds it
For those of us in the right age group, we first started using products such as email, web browsers, smart phones, and tablets at stages when most of the public (think parents) would comment "I've heard of that gizmo... very cool technology, but I can't imagine it'll be mainstream." While there are multiple factors leading to this transition, one of the central aspects is that a broad infrastructure needs to be in place for entrepreneurs to launch their companies and products from. Even for fast-moving fields like electronics or the internet, it took decades of work to put together the underlying systems and knowledge pools necessary for commercial success. On top of that, there is a time period when these systems are only used and understood by small groups-- the military, professional scientists, enthusiasts. At some point, this knowledge crosses the threshold to the public, and revenues start to flow.

The infrastructure for biology is a bit foreign to most consumers. A common misconception is that biological technology is a component of the bio-pharmaceutical industry. While there is some overlap, the underlying infrastructure is different. By my definition, a key difference is that the products created via biological technology will not be "drugs." They may be used by bio-pharmaceutical companies, but could just as easily find applications directly in consumer hands. Some of the initial steps have taken hold, such as the creation of a generation of bio-engineers to fill academic, scientific, industry, and innovation talent pools (this still requires heavy life support from government funding until commercial funding is ready). This has led to commercial application of many core technologies, building up manufacturing, marketing, and sales channels. Probably the two biggest infrastructure challenges to enable a biological technology renaissance are the creation of a more transparent healthcare marketplace, and regulatory (and popular) understanding of how to interpret personal biological information. Interestingly, both topics are trending in the public awareness.

3. Mass market applications
It is nearly impossible to achieve #1 or #2 without delivering a product that has mass market appeal. Often, a novel technology will initially be used to develop products marketed towards a niche user group. For biological technology, this is almost always the research community (academic and bio-pharmaceutical). It should only be a matter of time before an enterprising company figures out how to deliver a product that expands beyond this niche. Stay tuned for future posts that will speculate on where these may come from.

The emergence of these three elements for biological (dynamic cell) technology is likely within five years (if funding of basic science and infrastructure continues). Fifteen years after the first wave, many of the initial biological technologies have matured in the "real world," and there is an impending convergence of biology with electronic device and personal information products. In the coming years, there will likely be a lot of risk, turnover, and competition in the innovation marketplace as the best companies position themselves for long term success. As every entrepreneur inherently knows, it's great to be able to spot the coming wave, but if you are not already there, it's too late. The good news for everyone else is that regardless of which company wins, the outcomes will be truly impressive, and likely to last for many years.

Saturday, July 5, 2014

Finding the "Killer App" and the World Cup

The legend of the "killer app" is a story every new entrepreneur learns from the elders. There are many variations, but the general idea is that scores of brilliant researchers and brave businessfolk work many years on breakthrough technologies and products with little to show for it-- until one day, the stars align, a "killer app" is born, and a lucky few get to join the ranks of the self-made billionaire. There are plenty of real-life examples of this in every industry (and sub-industry), and from a narrative perspective, it is hard to argue that this is one of the defining themes in technology innovation. What I hope to explain in this post is that while this story is true for the successful "killer apps," it does not imply that there is any way to predict beforehand what that application will be. I will use the World Cup as a crutch to illustrate this point, which at the time of this post is entering the semi-final round in Brazil.

If you are an innovator in an emerging market space, you should definitely strive to find the "killer app." (Note: there really isn't a good definition of, or criteria for, what constitutes a "killer app," but if your investors ever say you've found it, have a party. My own vague definition is that success will transform what was previously a "niche application" into one that the whole world, including your grandmother, is using.) In the midst of this quest, don't waste your time trying to convince anyone that you know the answer. If you must, assert that you have discovered a possible answer. Consider the question "do you know who will win the World Cup?" The correct answer (until time-travel is possible) is "of course not." But poll your friends and you are unlikely to get that response. Further, if you are clever enough to re-ask this question over the course of the tournament (being careful not to offend anyone), I'm sure you'll also find that individual responses will change over time. "I was sure that team A was great, but they are all screwed up, so of course team B will win." When the final game is over, there will be a definitive champion, and then you can brag about your foresight.

My graduate work was in the field of microfluidics, and in those days (as I'm sure is still true today), we spent many nights thinking about potential applications for the technology. Even more telling, there were hundreds of peer-reviewed publications on novel applications, and probably only a handful have reached any level of commercial success. As some readers may know, the dominant application for biological microfluidics today is next-generation sequencing sample preparation. Suffice it to say, this was not obvious to many experts 10-15 years ago. This doesn't mean that most of us didn't think about this application, or were blind to the factors for its success. But the fact is that it was one application out of many that could have taken off. Of course, there were a number of people who did bet big on this application, and they were amply rewarded for it. It is worth noting that even in the face of uncertainty, passion and conviction for your team is critical. Just think of being able to say, if your team does win the World Cup, "I knew it from the beginning-- and I was right!" It's also helpful to know that just like sports fans, the technology community appreciates a passionate fan of a losing team much more than an opportunistic supporter of the victor.

The field of dynamic cell analysis has yet to find its "killer app," although many have been in contention for a number of years. Continued advances in the field suggest a high probability that one will emerge, but when and which one is impossible to predict. In my view, the current "bracket" of contenders (in no particular order) includes the fields of: in vitro toxicity testing (organ-on-a-chip), precision stem cell control, cell-based weapons against cancer heterogeneity, manipulating/decoding brain neuron firing, pathogen detection (food, virus, infection), environmental monitoring (climate change, ecosystem health), bio-energy production, lab-grown food, cell-based consumer health devices, and cell-based manufacturing (many of these will be detailed in future posts). There are plenty of intrepid teams working in all of these areas (and other interesting areas that are not listed), and when a victor does get crowned, we should make sure to celebrate-- and then start preparations for the next round.

Tuesday, July 1, 2014

Why Optical Imaging Will Continue to Rule Cellular Analysis

A stroll through any cell biology research lab will likely reveal crammed benches full of optical microscopy systems. While instruments such as plate readers, flow cytometers, and cell counters are still commonly used, there has been a strong trend (most evident in the past 5 years) towards better imaging as the method of choice. It only takes a few quick web searches to find the expansion of microscopy technologies as well as the adoption of imaging by the previously "non-imaging" analysis instruments. This fervent competition among the instrument vendors has led to a strong pace of innovation and excellent product options for the research scientist. In this post, I'd like to take a step back and ponder whether this "imaging arms race" is leading down the right path for the cellular analysis community.

The underlying demand for improved imaging technology is the desire to visualize living cells at full resolution and in real time. Human intuition teaches us that "seeing is believing," and this applies to the study of a single cell the same as it would to an exotic giant squid. While this assumption takes a bit of a leap of faith, it is likely to be correct for cellular research. The core reason is that like the study of animal behavior, cellular biology is dynamic and non-deterministic. (In contrast, a phenomenon such as gravitational force does not require seeing to believe.) There is also sufficient scientific evidence that intracellular processes are largely "see-able"-- such as ionic flux, protein binding/localization, genetic switches, morphology/movement, etc. It is also worth highlighting the emotional power of visualization, which undoubtedly drives many purchasing decisions (for research equipment as well as in our living rooms). Our brains are wired to respond to full color, three-dimensional, 60 fps, "retina display" visuals by triggering associations with reality and truth. It is not surprising that it is rare to find a "cell paper" that does not have figures with microscope visuals. The prevalence of imaging in cell biology is deeply entrenched, and is unlikely to diminish.

The dynamic cell analysis community also receives key benefits from the much larger economy around displays, cameras, light sources, software, and digital storage driven by the same human desire for visualization. In fact, it is hard to imagine modern microscopy existing without the substantial investments made in the consumer electronics markets-- digital cameras, LED light sources, terabyte hard drives. Even more importantly, this scale is driving down costs of components such that in the near future, a cellular imaging instrument could become near-ubiquitous. (Not too long ago, cellphone CMOS cameras were thought to be too low quality for everyday use.) Another aspect of this ecosystem that has yet to spill over to biology is the tremendous progress being made in image analysis/recognition and 3D graphics software. The combined economies in security/surveillance, movies, and gaming will surely blow this space up in both capability and cost. This confluence of technology ecosystems is a great sign for the future of scientific imaging instruments.

As with any emerging technology, a key challenge is to convince the broader public to support and adopt an often foreign concept. For most people, the application of cellular analysis technology is far from mainstream, and evokes images of scientists in biohazard suits. While this acclimation and communication takes its own course (to be covered in a future post), the cellular analysis field has a strong advantage in being able to appeal to the sense of dynamic visualization (a fancy term for story). There is an inherent cinematic quality in watching a cell rebel and turn cancerous, a pluripotent cell commit to a certain fate, or a cell succumb to a pathogen invasion. For these reasons, the continued drive towards better imaging instruments, and the community's commitment to acquire better and more powerful visuals, is definitely a good thing.