Tuesday, March 15, 2016

Tech transfer is good for academics and good for society

In the life sciences business, patents have an enormous influence on success and profitability. Whether this should be the case is an interesting debate, but not the topic of this post.

The current IP landscape in the life sciences generally consists of: a) research institutions (largely academic universities) generating the lion's share of innovative intellectual property, some of which ends up as filed patent applications, and b) operating (profitable) businesses getting increasingly large, and relying more on in-licensing to drive innovation. This is quite an amazing phenomenon, and may be a central reason the United States continues to dominate life science innovation.

The current bottleneck in this process is that only a small proportion of the IP filed at universities ends up getting licensed by businesses. This inefficiency is unfortunate, since much of this research is funded by taxpayers, and properly translating it into businesses would accelerate progress, create jobs, and boost productivity.

A key way to alleviate this bottleneck is to encourage inventors to pursue licensing of their inventions. The typical route most academics take is to dump their technical data on the Tech Transfer office and assume everything will take care of itself (or, if it doesn't, they are already off to the next thing). There is a huge benefit for inventors who take it at least one step further-- making sure the information is easily understandable, making the effort to "sell" the license to potential partners, and building a network of relationships with those partners. It's a small effort to lock in "free" revenue streams for up to 20 years. In fact, finding a licensee early often means they will cover the (usually extensive) legal burden of prosecuting the patent to grant. (Most new inventors are shocked at how complex the journey from filed application to granted patent turns out to be.)

Not only is this lucrative for the inventor (and the business), but there is a feel-good factor that comes with getting novel technology into the "real world"-- especially compared with the alternative of having all that hard work sit on the shelf, losing value with each passing month.

Sunday, April 26, 2015

3 Realizations from the AACR (American Association for Cancer Research) Annual Meeting

In April, one of the biggest gatherings of cancer researchers met in Philadelphia to discuss the current progress in the field. I was lucky enough to attend most of the conference this year, and three things stuck in my head:

1. The number of attendees and the intellectual power focused on cancer research is truly inspirational. Seeing rooms full of thousands of highly trained scientists and doctors dedicating their careers to solving this challenge instills a sense of hope that this menacing disease can be conquered.

2. It surprised me how pivotal cell biology is to the future of cancer treatment. The seeming tidal wave of good results from immunotherapies and T-cell therapies is almost too good to be true-- and promises to usher in a new class of treatments for patients.

3. Even the most astonishing new technology or discovery takes time to reach the market-- 15 years seems typical even for the biggest success stories. So while killer applications like adoptive T-cell therapy and precision medicine (informed by gene sequencing) were widely published over a decade ago, they are just now reaching the inflection point for clinical impact.

Wednesday, February 25, 2015

Is Biological Research the New Aspiration for Top Talent?

While most of us are exposed to the biological sciences in high school and college, the field has never really represented an aspirational career goal for young, highly talented individuals. For most of the past century, the respectable rationale for obtaining a life science degree was to then become a medical doctor. A much smaller subset successfully became rock-star professors-- but for the most part that path was reserved for the "genius types" (http://ascb.org/where-will-a-biology-phd-take-you/).

Common knowledge on college campuses (at least for the past 20 or so years) was that if you wanted to be really successful in life, you studied finance or computer science. And for the most part that premise has been supported by the trajectory of the economy. The effect is that each new generation of talented individuals naturally tends to follow the fields that are the most lucrative.

For a long time (at least since before the human genome was first published), it has also been common knowledge that the life sciences would be the successor (or at least the younger sibling) to the amazingly successful computer science boom. There are many explanations for why this has remained "5 years away" for a few decades, but I bet there is a high correlation with the career paths chosen by the "top talent" at our universities.

There is hope that we are near a time when the best and brightest will snub the six-figure offer from Wall Street or a social media giant, and instead accept an even more lucrative offer from a life science technology company. Private money is starting to flow more readily into basic research, such as the new Allen Institute for Cell Science (http://alleninstitutecellscience.org/). Investments in "health tech" by industry giants like Apple and Google, and by traditional VCs (http://rockhealth.com/2014/11/rock-weekly-digital-health-investments-soaring/), point to a general understanding by those with deep pockets that there is untapped value in the biological sciences. And perhaps most interesting is that the college campus "bioengineering start-up" is fast becoming the destination for the "cool kids."

Thursday, December 18, 2014

Is cell biology about to go commercial?


The recent American Society for Cell Biology annual meeting in Philadelphia lived up to its billing as a mecca for the latest and greatest scientific minds to "talk shop" (as it has for many decades). Cell biology is a field that takes a lot of pride in its academic heritage, and has generally been associated with more of an "ivory tower" vibe compared to the related fields of biochemistry, molecular biology, and genetics. At this year's meeting, two interesting and promising trends stuck in my mind that may signal a sea change for the field.

Trend 1: Science and technology are becoming one
While the concepts of quantitative biology, bioengineering, and computational biology have been around for many years, this year seemed to mark a turning point in the way these ways of thinking are applied and presented in cell biology. Whereas these disciplines used to feel like separate enclaves in the same hall, this year there was a serious (and largely successful) effort to showcase how co-mingling these elements can lead to exceptional science.

This trend was given an extra boost by the timely awarding of the Nobel Prize to the inventors of "super-resolution" microscopy technologies-- largely applied to cellular imaging. The prevalence of high technology as a cornerstone of good science has progressed to the point where investigators are now expected to showcase novel methodologies to be considered relevant.

Trend 2: Science and industry are coming together
Having attended ASCB for many years as both an academic and a commercial exhibitor, I always sensed a prevailing undertone that the exhibit hall was a necessary evil (there to pay for the event). As recently as 2013, many industry-sponsored sessions were assigned rooms down dark hallways and scheduled at inconvenient times. This year marked a significant change in both the physical atmosphere (a much cozier layout enabled less awkward mingling) and a real effort from both sides to "meet in the middle." For academics, this meant trying to engage exhibitors as enablers of their research (rather than sinkholes for grant funding); for exhibitors, it meant connecting on scientific values (rather than market and sales pitches).

I view these trends optimistically as a sign that the infrastructure of cell biology is getting ready for a market boom. Once collaboration between academics and industry reaches a critical threshold, the positive feedback loop leads to a rapid acceleration of growth, funding, and broad impact.


Wednesday, October 15, 2014

Is life science research ready for the cloud?

There is substantial buzz around (and lots of VC dollars currently being spent on) healthcare IT-- with folks like Apple and Google leading the charge. While this is a terrifically interesting topic to explore, my goal in this post is to recount similar activity around "web-enabled products" in the much smaller market of life science research. The biology research market often serves as a good example of the idiosyncrasies of markets and technologies involving living systems compared to computer systems. Customers in life science research are usually industry insiders in "B2B" roles, such as pharmaceutical, medical, or environmental R&D. While not a perfect analog, the relative slowness of web-based technology adoption in this marketplace should serve as a warning for over-optimistic investors expecting a mainstream "bio-connected" web to emerge in the next 3-5 years.

In the late 90's and throughout the early 2000's, the tidal wave of web-based companies spread into the life science research field-- most notably with applications such as electronic lab notebooks, bioinformatics software platforms/services, related enterprise data management software, and generally more user-friendly software interfaces (for instruments, web ordering, protocol sharing). As is typical in specialized markets such as biology research, no blockbuster company emerged, and most of the innovations have been modestly rolled into various products offered by more traditional incumbents.

So while innovations have not been lacking in this space, customer adoption has been stubborn, and excruciatingly slow by consumer web standards. For most of the life science research community, adoption of web tools (even something as mundane as e-commerce) substantially lags what one would find in their "personal lives." This is a bit puzzling, given that most customers in this space are highly educated (with Ph.D. training in science or engineering), likely graduated within the last decade, spend most of their professional day thinking about advanced technology, and probably spend much of their free time using web apps (Amazon, Facebook, etc.). If asked why adoption of better (known) technologies has been so slow, most would likely respond: "our current systems don't work that way," "corporate IT...," "we've been doing fine with pen and paper since the dawn of science, why change now?," or "grad student labor is cheap." In the pessimistic view, these are strong signs that the market is entrenched, with heavy barriers for startups to break through. In the optimistic view, it's a perfect place for disruption.

One of the great things about more recent IT-based startups is the stronger emphasis on "delivering solutions" over "providing technology." Examples of this next generation of VC-funded startups hoping to push the research customer into the internet age include Benchling (an MIT-founded company promoting cloud-based protocol/data sharing) and Emerald Therapeutics (who recently launched the Emerald Cloud Lab with the promise of moving typical lab bench work to a remote, web-based automated laboratory). Both have developed great interfaces that could substantially increase R&D productivity. Both are also early stage, and facing the chasm of "mainstream" market adoption. It is still too early to tell whether the life science research community is truly ready for web/cloud-enabled products, or whether it will take another 10 years to build the right foundation. Regardless, it should be very clear to the current generation of scientists that more of these kinds of tools are needed to overcome the complex challenges facing the healthcare/biomedical technology field.

Monday, September 1, 2014

Computational Image Analysis is a Centerpiece of Cell Biology

For most biologists, especially cell biologists, it doesn't take long before one realizes how important visual processing is for interpreting scientific data. Likely in the first few weeks of an undergraduate molecular biology lab, young scientists learn the importance of being able to differentiate a "good" gel from a "bad" one by the look of the bands. At a slightly more expert level, we are trained to discriminate the "look" of healthy cells from problematic cells under a microscope. The most seasoned scientists develop the uncanny skill of scanning hundreds of fluorescently stained cells and mentally processing a) whether the experiment ran correctly, and b) which "representative" field of view can be used in a publication. Typically, this skill is explained as intuition, and most scientists can quickly tick off a list of instances when such a process has led to novel discoveries and countless publications.

The success of visual intuition is one of the most amazing aspects of the human brain. In fact, the performance of human and animal brains on these tasks is far superior to that of modern computers, famously illustrated by Google's "cat recognition" problem. A recent publication from IBM Research in Science replicated neural architecture in a computer chip, and, not surprisingly, its performance was benchmarked on image recognition. Given this trajectory of computational and engineering efforts, it won't be long until the algorithms of visual intuition are more rigorously quantified (and more broadly exploited).

In the cell biology field, an excellent tool to help scientists grasp computational image analysis is CellProfiler, freeware developed at MIT/the Broad Institute. The program offers fairly powerful analysis tools (for which most companies were charging thousands of dollars), and is structured to maximize the learning potential of novice and intermediate users. Most importantly, widespread access is removing the mystery of image analysis for a whole generation of scientists.
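To demystify what an image analysis pipeline actually does under the hood, here is a toy sketch of its three core steps-- threshold, label, measure-- run on a synthetic "fluorescence image" of two cell-like spots. This is not CellProfiler code; it is a simplified illustration using NumPy and SciPy, and all the numbers (image size, spot positions, threshold) are made up for the example.

```python
import numpy as np
from scipy import ndimage

# Build a tiny synthetic "fluorescence image": dark background with two
# bright, roughly cell-sized Gaussian spots (purely illustrative data)
img = np.zeros((64, 64))
yy, xx = np.mgrid[0:64, 0:64]
for cy, cx in [(20, 20), (45, 40)]:
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 5.0 ** 2))

# Step 1: segment -- a global intensity threshold separates "cells"
# from background (real pipelines use adaptive methods like Otsu)
mask = img > 0.5

# Step 2: label -- connected-component labeling gives each cell an id
labels, n_cells = ndimage.label(mask)
print(f"cells found: {n_cells}")  # -> cells found: 2

# Step 3: measure -- extract a per-cell feature, here mean intensity,
# exactly the kind of per-object table a pipeline exports
mean_intensities = ndimage.mean(img, labels, index=range(1, n_cells + 1))
```

Real pipelines add many refinements (illumination correction, splitting touching cells, dozens of shape and texture features), but they are compositions of these same primitives.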

One of the first reactions when a biologist applies computational image analysis to a data set is disbelief. Even after the kinks of the routine are ironed out, and the analysis is performed on a "known" data set (for example, one that was previously published), the results often won't look right to the user. A typical case is one where an experimental condition promotes the expression of a target protein. In data from population averages (e.g. Western blots), it may be obvious that there is a 5X enhancement of expression after treatment. Looking at microscope images, one can clearly see the difference between control and experimental groups-- often highlighted by a prototypical case in the figure of a publication. However, upon doing the image analysis on hundreds of cells, a common result is tremendous heterogeneity in the sample: an average 5X enhancement may result from a 20X response in a small subset, and no change in a surprisingly large percentage of cells. The opposite is also often true-- a large phenotypic change is caused by a small shift in response across the entire population of cells. These types of results, while not contradictory to the previous experimental data, often make scientists uneasy.

As computational methods become more commonplace in understanding and engineering biological systems, it is important to embrace the messiness of single-cell data. While it may initially feel counterproductive compared to more traditional intuitive methods ("compound A causes translocation of TF Y" is "cleaner" than "A increases the likelihood of translocation in X% of cells by P- to Q-fold"), the reality is that our intuition operates by the same logic as (well-crafted) computational methods. The power of combining human and machine analysis of cell data should be to improve efficiency in discovery. Such an approach has (slowly) started to take root in histopathology, and will surely find many more applications in the life sciences.

Saturday, August 2, 2014

Why the microbiome is worth paying attention to



It is now common knowledge that the typical person carries around many more bacterial cells than human cells. While this fact should not be very surprising (I remember learning as a kid that the world was covered in germs), what is becoming extraordinarily interesting for biology is how these non-human cells affect our human ones. The old conventional wisdom was that this bulk of "harmless bacteria" was simply a passive passenger on the human body. To be fair, most biologists and doctors probably believe (and have for some time) that micro-organism populations on the human body play a key role in health-- but were forced to shrug off that intuition with "we just don't know." With improved technology (primarily next-generation sequencing), this barrier is coming down, and I anticipate a flood of important scientific findings in the coming years.

To put this in scale, the Human Microbiome Project identified thousands of bacterial species associated with the human body-- and, more importantly, variations between locations on the body and between individuals. More recent studies (such as this one from MIT) show (not surprisingly) that these bacterial populations change over time. And don't forget that these cells are interacting in populations within and across species.
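One way such between-site and between-person differences get quantified in microbiome studies is with a diversity index computed over the relative abundances of species. Below is a minimal sketch of the Shannon diversity index; the abundance vectors are entirely hypothetical, just meant to show how a diverse community scores higher than one dominated by a single species.

```python
import math

def shannon(abundances):
    """Shannon diversity index over relative abundances.

    Higher values mean a more even, species-rich community;
    a community dominated by one species scores near zero.
    """
    return -sum(p * math.log(p) for p in abundances if p > 0)

# Hypothetical relative abundances of bacterial species at two body sites
gut = [0.40, 0.30, 0.20, 0.10]    # fairly even community
skin = [0.85, 0.10, 0.04, 0.01]   # dominated by one species

print(f"gut diversity:  {shannon(gut):.2f}")
print(f"skin diversity: {shannon(skin):.2f}")
```

Comparing such indices across body sites, individuals, and time points is one of the basic analyses behind the population differences described above.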

What implications might this have for the industry?

1. A research area equivalent to cancer. From a cell biology point of view, the complexity of the microbiome is on par with (and possibly exceeds) that of cancer research. The great challenge of cancer research has been the realization that cancers consist of hundreds (or thousands) of genetic variants, interact with surrounding cell types, and change over time. Simply accounting for the number of experiments that could be performed on the cells and interactions within the microbiome indicates that this is an area that will last a long time and occupy many scientists (and represent a sizable market for research tool makers).

2. Direct associations with many health conditions. My prediction is that many health conditions modern medicine has relegated to "non-treatable" will be found to be caused (or substantially influenced) by the microbiome. These might include pervasive conditions like eczema, allergies, headaches, moodiness, lethargy, virus susceptibility, obesity, athletic performance, and fertility. (Note: this is pure conjecture, not based on any scientific evidence on my part.) The microbiome may also prove to be the scientific basis for many "alternative medicine" practices.

3. A portal to "engineer" bio-medicine. As the biotechnology industry has proven over the last 30 years, bacteria are relatively easy to manipulate and engineer. With the new tools available through synthetic biology, it should just be a matter of time before someone figures out how to engineer the microbiome in a way to benefit human health.

4. Merging of the "health" and "environment" industries. Since bacterial susceptibility to environmental changes will be much easier to prove biologically (versus linking directly to human outcomes), the pace of environmental health studies should drastically increase. For example, once scientists have determined "healthy" and "unhealthy" gut microbiome states, it stands to reason that food will be scrutinized for its effects on that composition (health foods, additives, etc.). While diet will likely be the main emphasis, other factors such as airborne particles, cosmetics, household materials/chemicals, and electromagnetic radiation could become areas of increased scrutiny.

While we are definitely in the early days of understanding the human microbiome, there is an exciting likelihood that research and technology in this space will have a substantial impact on our everyday lives.