March 3, 2017

Life Sciences in the Cloud: Everything Changes, All The Time


Earlier this month I gave a talk at the Amazon Web Services (AWS) “Cloud Computing and Life Sciences” event in Cambridge, UK. Eagle Genomics is an AWS Consulting Partner, and it’s great to be invited to speak at events like this because it helps keep us in touch with the trends in cloud computing. This blog post summarises the talk, in particular the trends towards cloud adoption that we’ve seen over the seven years since Eagle was founded.

Note: These views are from the perspective of a UK SME; we work with lots of clients on a variety of projects, so I have a pretty good overview of the industry. There’s only one caveat I’d add: Eagle is well known for our work with cloud solutions, particularly AWS, so there’s probably some selection bias towards cloud in our customer base.

From my vantage point, I see that science, attitudes, strategy, language, and skills are all changing in the world of genomics IT. I’ll take each of these areas in turn.

SCIENCE changing

The science that companies and academic institutions are doing is changing far faster than traditional IT infrastructure can keep pace with. Years ago it was sufficient - and fun - to build your own Beowulf cluster in a spare office; however, that just doesn’t cut it any more. The demands of modern science, particularly genomics, mean that in-house computing budgets, resources and expertise struggle to keep up. I’d argue that the cloud is the best solution for the majority of use-cases. The exceptions are the few institutions with very large-scale, nationally or internationally funded infrastructure, for example the European Bioinformatics Institute and the Wellcome Trust Sanger Institute in the UK, and the National Center for Biotechnology Information in the US.

ATTITUDES changing

Eagle has long-established relationships with a number of customers, and we see attitudes changing over the course of those relationships. For example, I recall a meeting in 2012 with a particular client where they said (I’m paraphrasing slightly) “It has to be internal unless there is a good reason to put it on the cloud”. Fast forward to 2015, and the same client was saying “It has to be on the cloud unless there is a good reason for it not to be”. So there’s a spectrum of cloud adoption: companies are starting from different places and moving along it at different rates, but the direction of movement is almost universally towards the cloud, not away from it.

That said, Eagle is a business and we need to work within whatever constraints our customers require, and some of those customers want an on-premise solution only. For that reason all Eagle solutions are deployable both on AWS and on-premise - we make use of technologies like Chef, Ansible and Docker to help with this, but that’s another blog post! This requirement does limit our use of AWS-specific technologies, but the economics of the situation mean that, for the moment at least, the additional overhead of building the ability to deploy anywhere is justified for us.
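To give a flavour of what “deploy anywhere” looks like in practice, here’s a minimal sketch in Python. This isn’t our actual tooling, and the image name and endpoint are placeholders; the point is simply that the same container image runs unchanged whether the Docker endpoint is a local daemon or a remote AWS host.

```python
"""Minimal sketch (not Eagle's actual tooling): run the same container image
against either the local Docker daemon (on-premise) or a remote AWS host,
just by switching the Docker endpoint. Image name and hostname are placeholders."""

import os
import subprocess

IMAGE = "example/analysis-pipeline:latest"  # hypothetical image name

# None means "use the local Docker socket"; a real remote setup would also use TLS.
TARGETS = {
    "on-premise": None,
    "aws": "tcp://ec2-host.example.com:2376",  # placeholder EC2 endpoint
}

def run_container(target: str) -> None:
    env = os.environ.copy()
    endpoint = TARGETS[target]
    if endpoint:
        env["DOCKER_HOST"] = endpoint  # point the Docker CLI at the remote daemon
    subprocess.run(["docker", "run", "--rm", IMAGE], env=env, check=True)

if __name__ == "__main__":
    run_container("on-premise")
```

The deployment artefact stays the same; only the target changes.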

STRATEGY changing

When cloud solutions first came along, they were seen as a version of your own IT infrastructure that happened to reside elsewhere. Now the influence runs the other way: cloud concepts and practices are shaping in-house computing resources, and this is exemplified by the language people use.

LANGUAGE changing

A few years ago, we talked about data centres, grids, network fabric, job schedulers and so on. Of course, those terms haven’t gone away, but terms and technologies that came of age on the cloud - virtualisation, job clustering, Docker and infrastructure automation - are increasingly being used by in-house IT teams, as evidenced by the rise of the slightly misleading term “private cloud”.

SKILLS changing

The cloud has led to the democratisation of (virtual) hardware availability; everyone’s a sysadmin now! We’ve seen the rise of the devops role. Using AWS as an example, there is still a low barrier to entry - a computer and a credit card is all you need - but the number and complexity of the services AWS offers are increasing rapidly. This means that we can no longer expect our bioinformatics consultants and software developers to keep up with all of the details of AWS services.
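To illustrate just how low that barrier is, here’s a minimal sketch using boto3, the AWS SDK for Python. It assumes AWS credentials are already configured, and the AMI ID is a placeholder rather than a real image:

```python
"""Launch a single EC2 instance in a few lines - a sketch of the low barrier
to entry, not a production deployment. Assumes AWS credentials are configured;
the AMI ID below is a placeholder."""

import boto3

ec2 = boto3.resource("ec2", region_name="eu-west-1")

instances = ec2.create_instances(
    ImageId="ami-00000000",   # placeholder AMI ID
    InstanceType="t2.micro",  # small, free-tier eligible instance
    MinCount=1,
    MaxCount=1,
)
print("Launched:", instances[0].id)
```

Of course, going from a single instance like this to a secure, cost-controlled, well-architected deployment is exactly where dedicated devops expertise comes in.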

So we’re starting to hire dedicated devops engineers, and in some cases looking at Platform as a Service (PaaS) solutions like Heroku as a more appropriate fit for some of our deployments, particularly eaglecore, our flagship metadata management product.

To summarise, we’re seeing that the pace of change is accelerating, the overwhelming momentum is towards the cloud, and this is enabling both new types of business and new companies. It’s been an exciting few years, and the next few promise to be even more so!

What do you see as the biggest changes in life science IT systems and architecture?

Topics: Ansible, AWS, Blog, Chef, Cloud, Cloud data, conference notes, CTO, Docker, eaglecore, Glenn Proctor, life sciences