Thursday, December 18, 2014

Is cell biology about to go commercial?




The recent American Society for Cell Biology annual meeting in Philadelphia lived up to its billing as a mecca for the latest and greatest scientific minds to "talk shop," as it has for many decades. Cell biology is a field that takes a lot of pride in its academic heritage, and has generally carried more of an "ivory tower" vibe than the related fields of biochemistry, molecular biology, and genetics. At this year's meeting, two interesting and promising trends stuck in my mind that may signal a sea change for the field.

Trend 1: Science and technology are becoming one
While the concepts of quantitative biology, bioengineering, and computational biology have been around for many years, this year seemed to mark a turning point in how these approaches are applied and presented in cell biology. Whereas these disciplines used to feel like separate enclaves in the same hall, this year there was a serious (and largely successful) effort to showcase how commingling these elements can lead to exceptional science.

This trend was given an extra boost by the timely awarding of the Nobel Prize to the inventors of "super-resolution" microscopy, a set of technologies applied largely to cellular imaging. Advanced technology has become such a cornerstone of good science that investigators are now expected to showcase novel methodologies to be considered relevant.

Trend 2: Science and industry are coming together
Having attended ASCB for many years as both an academic and a commercial exhibitor, I always sensed a prevailing undertone that the exhibit hall was a necessary evil (there to pay for the event). As recently as 2013, many industry-sponsored sessions were assigned rooms down dark hallways at inconvenient times. This year marked a significant change in both the physical atmosphere (a much cozier layout enabled less awkward mingling) and a real effort from both sides to "meet in the middle." For academics, this meant trying to engage exhibitors as enablers of their research (rather than sinkholes for grant funding); for the exhibitors, it meant connecting on scientific values (rather than market and sales pitches).

I view these trends optimistically as a sign that the infrastructure of cell biology is getting ready for a market boom. Once collaboration between academics and industry reaches a critical threshold, the positive feedback loop leads to a rapid acceleration of growth, funding, and broad impact.


Wednesday, October 15, 2014

Is life science research ready for the cloud?

There is substantial buzz around (and lots of VC dollars being spent on) healthcare IT at the moment, with folks like Apple and Google leading the charge. While this is a terrifically interesting topic to explore, my goal in this post is to recount similar activity around "web-enabled products" in the much smaller market of life science research. The biology research market often serves as a good example of the idiosyncrasies of markets and technologies involving living systems rather than computer systems. Customers in life science research are usually industry insiders, in "B2B" roles such as pharmaceutical, medical, or environmental R&D. While not a perfect analog, the relative slowness of web-based technology adoption in this marketplace should serve as a warning to over-optimistic investors expecting a mainstream "bio-connected" web to emerge in the next 3-5 years.

In the late '90s and throughout the early 2000s, the tidal wave of web-based companies spread into the life science research field-- most notably with applications such as electronic lab notebooks, bioinformatics software platforms/services, related enterprise data management software, and generally more user-friendly software interfaces (for instruments, web ordering, protocol sharing). As is typical in specialized markets such as biology research, no blockbuster company emerged, and most of the innovations have been modestly rolled into various products offered by more traditional incumbents.

So while innovations have not been lacking in this space, customer adoption has been stubborn and excruciatingly slow by consumer web standards. For most of the life science research community, adoption of web tools (even something as mundane as e-commerce) substantially lags what one would find in their "personal lives." This is a bit puzzling, given that most customers in this space are highly educated (Ph.D. training in science or engineering), likely graduated within the last decade, spend most of their professional day thinking about advanced technology, and probably spend most of their personal free time using web apps (Amazon, Facebook, etc.). If asked why adoption of better (known) technologies has been so slow, most would likely respond "our current systems don't work that way," "corporate IT...," "we've been doing fine with pen and paper since the dawn of science, why change now?," or "grad student labor is cheap." In the pessimistic view, these are strong signs that the market is entrenched, with heavy barriers for startups to break through. In the optimistic view, it's a perfect place for disruption.

One of the great things about more recent IT-based startups is the stronger emphasis on "delivering solutions" over "providing technology." Examples of this next generation of VC-funded startups hoping to push the research customer into the internet age include Benchling (an MIT-founded company promoting cloud-based protocol/data sharing) and Emerald Therapeutics (which recently launched the Emerald Cloud Lab with the promise of moving typical lab bench work to a remote, web-based automated laboratory). Both have developed great interfaces that could substantially increase R&D productivity. Both are also early stage, and facing the chasm of "mainstream" market adoption. It is still too early to tell whether the life science research community is truly ready for web/cloud-enabled products, or whether it will take another 10 years to build the right foundation. Regardless, it should be very clear to the current generation of scientists that more of these types of tools are needed to overcome the complex challenges facing the healthcare/biomedical technology field.

Monday, September 1, 2014

Computational Image Analysis is a Centerpiece of Cell Biology

For most biologists, especially cell biologists, it doesn't take long to realize how important visual processing is for interpreting scientific data. Likely in the first few weeks of an undergraduate molecular biology lab, young scientists learn to differentiate a "good" gel from a "bad" one by the look of the bands. At a slightly more expert level, we are trained to discriminate the "look" of healthy cells from problematic cells under a microscope. The most seasoned scientists develop the uncanny skill of scanning hundreds of fluorescently stained cells and mentally assessing a) whether the experiment ran correctly, and b) which "representative" field of view can be used in a publication. Typically, this skill is explained as intuition, and most scientists can quickly tick off a list of instances when such a process has led to novel discoveries and countless publications.

The success of visual intuition is one of the most amazing aspects of the human brain. In fact, the performance of human and animal brains on these tasks is far superior to that of modern computers, as famously illustrated by Google's "cat recognition" problem. A recent publication from IBM Research in Science replicated neural architecture in a computer chip, and not surprisingly, the test case for its performance was image recognition. Given this trajectory of computational and engineering efforts, it won't be long until the algorithms of visual intuition are more rigorously quantified (and more broadly exploited).

In the cell biology field, an excellent tool for helping scientists grasp computational image analysis is CellProfiler, free software developed at MIT/Broad Institute. This program offers fairly powerful analysis tools (of the kind for which most companies were charging thousands of dollars), and it is structured to maximize the learning potential of novice-to-intermediate users. Most importantly, widespread access is removing the mystery of image analysis for a whole generation of scientists.
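
To make this concrete, here is a minimal sketch of the kind of per-cell measurement such tools automate. It is not CellProfiler itself (which runs through its own pipeline interface); it uses the open-source scikit-image and NumPy libraries instead, and the file name and channel layout are hypothetical.

```python
# Minimal per-cell intensity measurement, sketched with scikit-image.
# Assumptions: a two-channel image file ("cells.tif") with a nuclear stain
# in channel 0 and the target protein in channel 1 -- both hypothetical.
import numpy as np
from skimage import io, filters, measure, morphology

stack = io.imread("cells.tif")               # assumed shape: (2, height, width)
nuclei, target = stack[0], stack[1]

# Segment nuclei with a global Otsu threshold, then remove small debris.
mask = nuclei > filters.threshold_otsu(nuclei)
mask = morphology.remove_small_objects(mask, min_size=100)
labels = measure.label(mask)                 # one integer label per detected nucleus

# Measure mean target-protein intensity inside each labeled nucleus.
props = measure.regionprops(labels, intensity_image=target)
per_cell = np.array([p.mean_intensity for p in props])

print(f"{per_cell.size} cells measured")
print(f"mean intensity {per_cell.mean():.1f}, CV {per_cell.std() / per_cell.mean():.2f}")
```

A dozen lines like these replace hours of manual scoring, and, just as importantly, they return a number for every cell rather than a single impression of the field.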

One of the first reactions when a biologist applies computational image analysis to a data set is disbelief. Even after the kinks of the routine are ironed out, and the analysis is performed on a "known" data set (for example, one that was previously published), the results often won't look right to the user. A typical case is one where an experimental condition promotes the expression of a target protein. In population-averaged data (e.g., Western blots), there appears to be an obvious 5X enhancement of expression after treatment. Looking at microscope images, one can clearly see the difference between control and experimental groups-- often highlighted by a prototypical example in a publication figure. However, upon running the image analysis on hundreds of cells, a common result is tremendous heterogeneity in the sample. An average 5X enhancement may result from a 20X increase in a small subset of cells and no change in a surprisingly large fraction. The opposite is also often true-- a large phenotypic change is caused by a small shift in response across the entire population of cells. These types of results, while not contradictory to the previous experimental data, often make scientists uneasy.
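
A toy simulation (the numbers are purely illustrative, not from any real experiment) shows how two very different single-cell distributions can produce the same ~5X population average:

```python
# Illustrative only: two hypothetical single-cell populations that both
# yield roughly a 5X average fold-change after treatment.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
control = rng.normal(loc=1.0, scale=0.2, size=n).clip(min=0.1)

# Scenario A: ~20% of cells respond strongly (~20X); the rest do not change.
responders = rng.random(n) < 0.21
scenario_a = np.where(responders, control * 20, control)

# Scenario B: every cell shifts up modestly, by ~5X on average.
scenario_b = control * rng.normal(loc=5.0, scale=1.0, size=n)

for name, treated in [("A (rare strong responders)", scenario_a),
                      ("B (uniform shift)", scenario_b)]:
    fold = treated.mean() / control.mean()
    frac_up = (treated > 2 * control).mean()   # cells above 2X their own baseline
    print(f"{name}: mean fold-change {fold:.1f}X, {frac_up:.0%} of cells above 2X")
```

Both scenarios would look identical on a Western blot, yet the underlying biology (and the best follow-up experiment) is entirely different, which is exactly the ambiguity that single-cell analysis exposes.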

As computational methods become more commonplace in understanding and engineering biological systems, it is important to embrace the messiness of single-cell data. While it may initially feel counterproductive compared with more traditional intuitive summaries ("compound A causes translocation of TF Y" is "cleaner" than "A increases the likelihood of translocation in X% of cells by P- to Q-fold"), the reality is that our intuition operates by the same logic as (well-crafted) computational methods. The power of combining human and machine analysis of cell data lies in improving the efficiency of discovery. Such an approach has (slowly) started to take root in histopathology, and will surely find many more applications in the life sciences.

Saturday, August 2, 2014

Why the microbiome is worth paying attention to



It is now common knowledge that the typical person is carrying around many more bacterial cells than human cells. While this fact should not be very surprising (I remember learning as a kid that the world was covered in germs), what is becoming extraordinarily interesting for biology is how these non-human cells affect our human ones. The old conventional wisdom was that this bulk of "harmless bacteria" was simply a passive passenger on the human body. To be fair, most biologists and doctors probably believe (and have for some time) that micro-organism populations on the human body play a key role in health-- but were forced to shrug off that intuition with "we just don't know." With improved technology (primarily next-generation sequencing) this barrier is coming down, and I anticipate a flood of important scientific findings in the coming years.

To put this in scale, the Human Microbiome Project identified thousands of bacterial species associated with the human body, and, more importantly, variations between locations on the body and between individuals. More recent studies (such as this one from MIT) show, not surprisingly, that these bacterial populations change over time. And don't forget that these cells are interacting in populations within and across species.

What implications might this have for the industry?

1. A research area equivalent to cancer. From a cell biology point of view, the complexity of the microbiome is on par with (and possibly exceeds) that of cancer research. The great challenge of cancer research has been the realization that cancer cells carry hundreds (or thousands) of genetic variants, interact with surrounding cell types, and change over time. Simply counting the number of experiments that could be performed on the cells and interactions within the microbiome indicates that this is an area that will last a long time and occupy many scientists (and represent a sizable market for research tool makers).

2. Direct associations with many health conditions. My prediction is that many health conditions that modern medicine has relegated to "non-treatable" will be found to be caused by (or substantially influenced by) the microbiome. This might include pervasive conditions like eczema, allergies, headaches, moodiness, lethargy, virus susceptibility, obesity, athletic performance, and fertility. (Note this is pure conjecture and not based on any scientific evidence on my part.) The microbiome may also prove to be the scientific basis for many "alternative medicine" practices.

3. A portal to "engineer" bio-medicine. As the biotechnology industry has proven over the last 30 years, bacteria are relatively easy to manipulate and engineer. With the new tools available through synthetic biology, it should just be a matter of time before someone figures out how to engineer the microbiome in a way to benefit human health.

4. Merging of "health" and "environment" industries. Since bacterial susceptibility to environmental changes will be much easier to prove biologically (vs. linking to human outcomes), the pace of environmental health studies should drastically increase. For example, once scientists have determined "healthy" and "unhealthy" gut microbiome states, it stands to reason that food will be scrutinized for effects on this composition (health foods, additives, etc.). While diet will likely be the main emphasis, other factors such as airborne particles, cosmetics, household materials/chemicals, and electromagnetic radiation could become areas of increased scrutiny.

While we are definitely in the early days of understanding the human microbiome, there is an exciting likelihood that research and technology in this space will have a substantial impact on our everyday lives.

Tuesday, July 22, 2014

Public funding is essential for new technologies


I was recently driving through the Sierra Nevada range in California, and the references to the gold rush reminded me of a related concept I've been thinking about lately: early-stage technology advances require government funding. In fact, I've come to believe public funding is not only important for accelerating technology, but that without it, there would be no noticeable progress. This is highly relevant to the current state of life sciences funding, given the challenges with NIH and NSF budgets. To illustrate the point, think about the following three steps required to strike it rich in the gold rush.

1. Explore and discover. This is the well-established activity of academic basic research institutions, often characterized by mavericks or small groups wandering in uncharted territory where no 'sane' person would spend their time or money. Public funds are an efficient and equitable way to fund such endeavors (by supporting a relatively large number of investigators with relatively small awards). Just like the early days of gold mining, this works because if the search is spread wide enough, there are bound to be a lucky few who find gold.

2. Dig holes and build roads. The reality is that discovering gold is only the beginning of the process of creating wealth. The next steps are two-fold: a) high risk/reward ventures often led by startup companies (dig holes), and b) support of basic infrastructure to increase the flow of progress (build roads). It's reasonably intuitive that road building falls in the realm of government funding. In life sciences, this is served by advisory/regulatory agencies, a trained workforce, the healthcare system, and related laws. What is often less appreciated is the role of public funding for hole digging. In earlier days, a lot of this was funded by venture capital and corporate research, but the burden is now heavily carried by programs such as SBIR grants and funding from non-profit endowments (the Gates Foundation being the most famous).

3. Bring in the heavy machinery. Once the vein of gold is located, big business is ready to move in. Since this task requires a lot of capital, lots of people, countless transactions to execute, and generally longer timescales, corporate entities can easily out-compete the smaller players. It is not a coincidence that this is also where all the big money is made.

I was lucky enough to have been able to follow this trajectory (from 1 to 2 to 3) with a technology that originated in graduate school, led to a startup, and was later absorbed into a large corporation. From this experience, I have gained an appreciation of how the loop gets closed.

The forward direction is fairly intuitive to most of us living in a capitalistic society. On this path, business decisions are driven by net present value (NPV)-- essentially a measure of how much money a company can make. The practical implication is that big money tends to chase easy money. In the example above, most disciplined investors no longer believe the NPV is sufficiently attractive in step 2, and will wait until step 3. It should also be fairly apparent that government funding creates more positive NPVs for businesses to take advantage of and make more money. In most cases, without the initial injection of "free money" by public sources, business would substantially slow down.
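
As a back-of-the-envelope illustration (with entirely made-up numbers), here is how a non-dilutive public grant can flip the NPV calculation a private investor would run on an early-stage project:

```python
# Hypothetical numbers only: NPV of an early-stage project with and without
# a non-dilutive public grant (e.g., an SBIR-style award) covering early R&D.
def npv(cash_flows, rate):
    """Net present value of yearly cash flows, year 0 first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

discount_rate = 0.20                              # high rate reflecting early-stage risk
project = [-2.0, -2.0, -1.0, 1.5, 2.5, 3.5]       # $M per year: R&D losses, then revenue
grant = [2.0, 2.0, 0.0, 0.0, 0.0, 0.0]            # public funding in the first two years

with_grant = [cf + g for cf, g in zip(project, grant)]

print(f"NPV without public funding: {npv(project, discount_rate):+.2f} $M")
print(f"NPV with public funding:    {npv(with_grant, discount_rate):+.2f} $M")
```

With these made-up figures, the project is NPV-negative on its own (roughly -$0.9M) and NPV-positive once the grant is included (roughly +$2.8M), which is the mechanism by which "free money" from public sources pulls private capital into step 2.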

The arrow that transfers wealth back is less intuitive-- it comes in the form of taxes. Government funding is not (and should not be) based on NPV, and instead tries to maximize other values such as westward expansion, education, and job creation. Given that public funding is helping companies make money, it is only fair that some of this is fed back to governments to continue funding the early innovation pipeline. In the current capitalist system, this occurs "naturally" through income and employment taxes. Without debating the politics of tax structure, suffice it to say that this transfer of wealth is necessary for proper function of the innovation engine.

I have been able to see both sides of this cycle-- the elation of receiving "no strings attached" funding to launch an early-stage idea, as well as forking over eye-popping amounts of money in the form of taxes. The correct balance of these two forces is crucial for proper support of technology innovation.

Monday, July 14, 2014

Three Signs it's Time to Invest in Biological Technology

When I started down the "bioengineering technology" path in the late '90s, the natural assumption was that it would quickly follow the rapid market success of the internet. There was widespread optimism in the power of research and engineering, nearly endless capital to fund bold ideas, and a heartening stream of good news in the form of breakthrough discoveries (DNA sequencing, stem cells, systems biology). We are now fifteen years or so past that prominent wave of activity. While there have been some notable successes (personal genomics), the general sentiment is that biological technology has not caught the wave of market success that was expected. It's definitely not time to give up on the industry, but it may be useful to ponder the appropriate timing of the coming market success.

While it is never possible to predict market success, the following three signs are pretty good indicators that the wave is upon us.

1. A hero company emerges
A key trigger signaling the dawn of a new era is the emergence of "hero" companies. We are all familiar with examples from the internet industry-- Google, Amazon, Facebook. These are companies that are widely admired, exceptionally well funded with nearly bottomless revenue streams, stocked with exceptionally deep talent pools, constantly in the news, and seemingly able to effortlessly increase market value by billions of dollars at a time. The key here is that it doesn't matter who the hero company is; the fact that one has emerged is a sure sign of broader market success. The momentum and wake of such a company stimulates the entire industry-- driving new investment capital, becoming an acquirer of start-ups, providing training grounds for talent, fueling a swell of public investment, raising awareness in mass media to drive additional revenue, and so on.

In the cellular technology space, we have yet to see a company rise to this level. At various times, there were strong contenders from fields such as stem cell science, miniaturization and automation systems, novel analysis instruments, and synthetic cell engineering. These cases were highlighted by big venture investments (generally in the nine-figure range), impressive technology and products, and genuine excitement among the scientific communities. In almost all cases, these heroes-in-training hit a plateau as their acceleration slowed. Much of this was due to the lofty expectations of investors, but also to the factors highlighted in #2 and #3 on this list. However, there is no rule that a hero has to be an overnight success, and some of the more patient companies are starting to show renewed momentum.

2. Publicly funded infrastructure is understood by the public that funds it
For those of us in the right age group, we first started using products such as email, web browsers, smartphones, and tablets at stages when most of the public (think parents) would comment, "I've heard of that gizmo... very cool technology, but I can't imagine it'll be mainstream." While there are multiple factors leading to this transition, one of the central aspects is that a broad infrastructure needs to be in place for entrepreneurs to launch their companies and products from. Even for fast-moving fields like electronics or the internet, it took decades of work to put together the underlying systems and knowledge pools necessary for commercial success. On top of that, there is a period when these systems are used and understood only by small groups-- the military, professional scientists, enthusiasts. At some point, this knowledge crosses the threshold to the public, and revenues start to flow.

The infrastructure for biology is a bit foreign to most consumers. A common misconception is that biological technology is a component of the bio-pharmaceutical industry. While there is some overlap, the underlying infrastructure is different. By my definition, a key difference is that the products created via biological technology will not be "drugs." They may be used by bio-pharmaceutical companies, but may just as easily find applications directly in consumer hands. Some of the initial steps have taken hold, such as the creation of a generation of bio-engineers to fill academic, scientific, industry, and innovation talent pools (this still requires heavy life support from government funding until commercial funding is ready). This has led to commercial application of many core technologies, building up manufacturing, marketing, and sales channels. Probably the two biggest infrastructure challenges to enabling a biological technology renaissance are the creation of a more transparent healthcare marketplace and a regulatory (and popular) understanding of how to interpret personal biological information. Interestingly, both topics are trending upward in public awareness.

3. Mass market applications
It is nearly impossible to achieve #1 or #2 without delivering a product that has mass market appeal. Often, a novel technology will initially be used to develop products marketed towards a niche user group. For biological technology, this is almost always the research community (academic and bio-pharmaceutical). It should only be a matter of time before an enterprising company figures out how to deliver a product that expands beyond this niche. Stay tuned for future posts that will speculate on where these may come from.

The emergence of these three elements for biological (dynamic cell) technology is likely within 5 years (if funding of basic science and infrastructure continues). Fifteen years after the first wave, many of the initial biological technologies have matured in the "real world," and there is an impending convergence of biology with electronic device and personal information products. In the coming years, there will likely be a lot of risk, turnover, and competition in the innovation marketplace as the best companies position themselves for long-term success. As every entrepreneur inherently knows, it's great to be able to spot the coming wave, but if you are not already there, it's too late. The good news for everyone else is that regardless of which company wins, the outcomes will be truly impressive, and likely to last for many years.

Saturday, July 5, 2014

Finding the "Killer App" and the World Cup

The legend of the "killer app" is a story every new entrepreneur learns from the elders. There are many variations, but the general idea is that scores of brilliant researchers and brave businessfolk work for many years on breakthrough technologies and products with little to show for it-- until one day, the stars align, a "killer app" is born, and a lucky few get to join the ranks of the self-made billionaires. There are plenty of real-life examples of this in every industry (and sub-industry), and from a narrative perspective, it is hard to argue against this being one of the defining themes in technology innovation. What I hope to explain in this post is that while this story is true for the successful "killer apps," it does not imply that there is any way to predict beforehand what that application will be. I will use the World Cup, which at the time of this post is entering the semi-final round in Brazil, as a crutch to illustrate this point.

If you are an innovator in an emerging market space, you should definitely strive to find the "killer app." (Note: there really isn't a good definition or criterion for what constitutes a "killer app," but if your investors ever say you've found it, have a party. My own vague definition is that success will transform what was previously a "niche application" into one that the whole world, including your grandmother, is using.) In the midst of this quest, don't waste your time trying to convince anyone that you know the answer. If you must, assert that you have discovered a possible answer. Consider the question "do you know who will win the World Cup?" The correct answer (until time travel is possible) is "of course not." But poll your friends and you are unlikely to get that response. Further, if you are clever enough to re-ask this question over the course of the tournament (being careful not to offend anyone), I'm sure you'll also find that individual responses change over time. "I was sure that team A was great, but they are all screwed up, so of course team B will win." When the final game is over, there will be a definitive champion, and then you can brag about your foresight.

My graduate work was in the field of microfluidics, and in those days (as I'm sure is still true today), we spent many nights thinking about potential applications for the technology. Even more telling, there were hundreds of peer-reviewed publications on novel applications, and probably only a handful have reached any level of commercial success. As some readers may know, the dominant application for biological microfluidics today is next-generation sequencing sample preparation. Suffice it to say, this was not obvious to many experts 10-15 years ago. That doesn't mean most of us didn't think about this application, or were blind to the factors behind its success; the fact is that it was one application out of many that could have taken off. Of course, a number of people did bet big on this application and were amply rewarded for it. It is worth noting that even in the face of uncertainty, passion and conviction for your team are critical. Just think of the feeling, if your team does win the World Cup, of being able to say "I knew it from the beginning-- and I was right!" It's also helpful to know that, just like sports fans, the technology community appreciates a passionate fan of a losing team much more than an opportunistic supporter of the victor.

The field of dynamic cell analysis has yet to find its "killer app," although many candidates have been in contention for a number of years. Continued advances in the field suggest a high probability that one will emerge, but when, and which one, is impossible to predict. In my view, the current "bracket" of contenders (in no particular order) includes: in vitro toxicity testing (organ-on-a-chip), precision stem cell control, cell-based weapons against cancer heterogeneity, manipulating/decoding brain neuron firing, pathogen detection (food, virus, infection), environmental monitoring (climate change, ecosystem health), bio-energy production, lab-grown food, cell-based consumer health devices, and cell-based manufacturing (many of these will be detailed in future posts). There are plenty of intrepid teams working in all of these areas (and other interesting areas not listed), and when a victor does get crowned, we should make sure to celebrate-- and then start preparations for the next round.

Tuesday, July 1, 2014

Why Optical Imaging Will Continue to Rule Cellular Analysis

A stroll through any cell biology research lab will likely reveal crammed benches full of optical microscopy systems. While instruments such as plate readers, flow cytometers, and cell counters are still commonly used, there has been a strong trend (most evident in the past 5 years) towards better imaging as the method of choice. It only takes a few quick web searches to find both the expansion of microscopy technologies and the adoption of imaging by previously "non-imaging" analysis instruments. This fervent competition among instrument vendors has led to a strong pace of innovation and excellent product options for the research scientist. In this post, I'd like to take a step back and ponder whether this "imaging arms race" is leading down the right path for the cellular analysis community.

The underlying demand for improved imaging technology is the desire to visualize living cells at full resolution and in real time. Human intuition teaches us that "seeing is believing," and this applies to the study of a single cell the same as it would to an exotic giant squid. While this assumption takes a bit of a leap of faith, it is likely to be correct for cellular research. The core reason is that, like the study of animal behavior, cell biology is dynamic and non-deterministic. (In contrast, a phenomenon such as gravitational force does not require seeing to believe.) There is also ample scientific evidence that intracellular processes are largely "see-able"-- ionic flux, protein binding/localization, genetic switches, morphology/movement, etc. It is also worth highlighting the emotional power of visualization, which undoubtedly drives many purchasing decisions (for research equipment as well as in our living rooms). Our brains are wired to respond to full-color, three-dimensional, 60 fps, "retina display" visuals by triggering associations with reality and truth. It is not surprising that one rarely finds a "cell paper" without microscope visuals in its figures. The prevalence of imaging in cell biology is deeply entrenched, and is unlikely to diminish.

The dynamic cell analysis community also benefits from the much larger economy around displays, cameras, light sources, software, and digital storage, driven by the same human desire for visualization. In fact, it is hard to imagine modern microscopy existing without the substantial investments made in the consumer electronics markets-- digital cameras, LED light sources, terabyte hard drives. Even more importantly, this scale is driving down component costs to the point where, in the near future, a cellular imaging instrument could become near-ubiquitous. (Not too long ago, cellphone CMOS cameras were thought to be too low quality for everyday use.) Another aspect of this ecosystem that has yet to spill over into biology is the tremendous progress being made in image analysis/recognition and 3D graphics software. The combined economies of security/surveillance, movies, and gaming will surely expand this space in capability while driving down cost. This confluence of technology ecosystems is a great sign for the future of scientific imaging instruments.

As with any emerging technology, a key challenge is to convince the broader public to support and adopt an often foreign concept. For most people, the application of cellular analysis technology is far from mainstream, and evokes images of scientists in biohazard suits. While this acclimation and communication takes its own course (to be covered in a future post), the cellular analysis field has a strong advantage in being able to appeal to the sense of dynamic visualization (a fancy term for story). There is an inherent cinematic quality in watching a cell's rebellion as it turns cancerous, or a pluripotent cell's decision to commit to a certain fate, or a cell succumbing to a pathogen invasion. For these reasons, the continued drive towards better imaging instruments, and the community's commitment to acquiring better and more powerful visuals, is definitely a good thing.

Monday, June 30, 2014

Welcome to the Dynamic Cell Blog

As most of us first learned in high school biology, life starts with the individual cell-- a self-contained microscopic machine. Whether as single-celled bacteria, coordinated masses of cells (what we commonly recognize as organisms), or populations of interacting cells (societies, ecosystems), the information carried by cells, and the actions they carry out, underlie our daily experiences on this planet.

How is it possible that such an immense realm of existence can be enabled by such a tiny object? The answer starts with the magic of DNA-- the 3 billion base pair code (in humans) that we all know as the blueprint of life. However, the real miracle lies in the dynamics of the living cell. For you sports fans out there, think of the genetic code as the playbook, and the dynamics as the match. A big reason many of us pay good money and spend inexplicable amounts of time following sporting events is that even when constrained by the same rules, no two matches are the same. In every split-second interval, there are countless small decisions, each affecting and reacting to other players, that ultimately determine the outcome. In this analogy, as in this blog, the premise is that the dynamics define the game. It's what makes biology exciting, and the power to understand and control the dynamic cell will change life as we know it.

The purpose of this blog is primarily to collect, curate, and comment on the advance of scientific and technological breakthroughs in understanding and controlling the dynamics of living biological cells. The associated Twitter feed (@DynamicCell) will provide timely updates on new research, publications, products, and applications relevant to this topic. The authors and contributors to this site are long-time enthusiasts in this field, with extensive professional experience at the leading edge of innovation. In general, we will lean towards the application of current breakthroughs to "real-world problems." The main target audience will be technically knowledgeable, but not necessarily expert on the subject. Given the rapid pace and inherent complexity of this space, we feel that providing well-selected articles and commentary will be an efficient way for the broader research and innovation community to stay current on dynamic cell analysis and the implications of its applications.

As a starting point, you will likely see a number of posts around currently hot areas in dynamic cell analysis-- live cell microscopy, predictive in vitro culture methods, biology of the brain, cancer heterogeneity (and personalized treatment), synthetic biology, and new enabling commercial products (hardware, software, wetware). My prediction is that as the field continues to advance, we will see more content on "consumer-level" products and applications in areas such as health monitoring, medical practice, and environmental impact. Our goal is to be agnostic of institutional or commercial biases, and to report as cleanly as possible on the state of the field.

Having spent nearly 15 years investigating the leading edge of dynamic cell analysis, I believe that 2014 is an opportune time to launch this blog. From my perspective, the pace of discovery and the emergence of applications have reached an inflection point within the last year. This is driven by the new critical mass of "bioengineering" expertise deployed in the professional ranks, the wealth of fundamental genetic information available to science, the falling costs of high-end analysis technology, and a renewed push by the research, pharma, and government communities to solve the "hard problems" of biology with applied solutions.

Thank you for visiting. Hopefully, like me, you can't wait to see what the future will bring.