Interested in how DevOps, IT Modernization and Agile practices can positively impact customer experience?
The future of pharmaceutical companies is changing. From new ways to develop and extend the use of drugs to direct patient communication, digital pharma is coming. Many industries have seen a surge in digital business models, but pharma has generally been seen as lagging, burdened by regulation, economic factors and traditional processes. While many pharma CIOs understand the importance of a digital transition and of aligning to a patient-centric model, the way forward is not clear. New value streams can be created through digital delivery of information and services, and more targeted therapies can be developed based on data and analytics. Additionally, infrastructure costs can be reduced through the on-demand benefits of the cloud. These initiatives may appear to compete for IT leaders’ attention and resources, slowing the decision-making process.

Defining tomorrow’s innovation today

Analysts predict that, due in large part to newly available technologies and resources, the next 20 years of innovation in the pharmaceutical industry will be defined by the evolution of digital strategies in the next two years. For this reason, leading pharmaceutical companies are focusing on public cloud implementation plans to create infrastructure that will enable new ways to develop and deliver services:

Customer Engagement – New methods and services for interacting with consumers: collecting data for more effective marketing, improving dosing accuracy, and building a more complete health profile.
Internet of Things – Enabling and capturing data from wearable objects, devices and instruments.
Personalized Medicine – Understanding how drugs affect individual patients, or groups of patients, using genomic analysis and trends.
Product Innovation – New R&D processes to discover new opportunities.
The rise of IoT has created a new generation of needs in the world of big data processing. We now need to handle data ingress from many sensors around the world and make real-time decisions to be executed by those devices. As such, it is no surprise that we see new services designed to process streaming data, such as Amazon Kinesis.
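To give a concrete sense of how Kinesis parallelizes a stream: records are routed to shards by the MD5 hash of their partition key, so all readings from one sensor stay in order on one shard. The sketch below mirrors that routing logic in plain Python; the sensor names and shard count are hypothetical, and a real producer would call the Kinesis PutRecord API rather than compute this itself.

```python
import hashlib

def shard_for_record(partition_key: str, shard_count: int) -> int:
    """Map a partition key to a shard index the way Kinesis does:
    take the MD5 hash of the key and find the shard whose hash-key
    range contains it (ranges split the 128-bit space evenly)."""
    hash_value = int(hashlib.md5(partition_key.encode("utf-8")).hexdigest(), 16)
    range_size = 2 ** 128 // shard_count
    return min(hash_value // range_size, shard_count - 1)

# Records from the same sensor always land on the same shard,
# so per-device ordering is preserved:
assert shard_for_record("sensor-42", 4) == shard_for_record("sensor-42", 4)
```

The practical consequence is that a good partition key (such as a device ID) spreads load evenly across shards while keeping each device's readings ordered.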
Flux7 CEO Aater Suleman will be speaking at DevOps Days Austin on Monday and Tuesday, May 4-5, 2015. Aater’s presentation, titled “IT Process Orchestration with Jenkins,” is during the Ignite block on May 5th, starting at 12:45. See the full program here.

DevOps is about improvement, and improving IT and process orchestration is a great place to focus. This ignite talk demonstrates how, at Flux7, we’re using Jenkins on the backend to power a front-end dashboard that eases many of the processes needed for continuous integration and regression tests, as well as load testing.

DevOps implementation and strategy has become a primary request from organizations moving to, or optimizing, cloud infrastructure. We’ve observed several key needs in our DevOps practice and frequently rely on Jenkins as the back-end engine to address them. Yet customers needed a more streamlined interface to simplify processes and management further. As such, we’ve created Fluxboard, a custom dashboard that automates some of the most frequently needed tasks and allows developers to easily perform infrastructure tasks.
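As a small illustration of how a dashboard can drive Jenkins on the backend, Jenkins exposes a buildWithParameters endpoint for remotely triggering parameterized jobs. The sketch below only constructs the trigger URL; the server address, job name and parameters are hypothetical, and a real dashboard would POST to this URL with a Jenkins API token.

```python
from urllib.parse import urlencode

def build_trigger_url(jenkins_url: str, job_name: str, params: dict) -> str:
    """Construct the Jenkins remote-trigger URL for a parameterized job.
    A dashboard backend can POST to this endpoint (authenticated with
    an API token) to kick off CI, regression, or load-test jobs."""
    query = urlencode(params)
    return f"{jenkins_url.rstrip('/')}/job/{job_name}/buildWithParameters?{query}"

url = build_trigger_url("https://ci.example.com", "regression-suite",
                        {"GIT_BRANCH": "develop", "TARGET_ENV": "staging"})
# → https://ci.example.com/job/regression-suite/buildWithParameters?GIT_BRANCH=develop&TARGET_ENV=staging
```

Wrapping the raw Jenkins API behind a function like this is what lets a front-end dashboard offer one-click infrastructure tasks without exposing Jenkins itself.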
How does the Internet of Things (IoT) change the way we develop, test and release software? Always-on connectivity introduces a new set of risks and challenges for development and operations teams. Last Tuesday, Flux7 CEO Aater Suleman (@futurechips) joined an online panel on this subject as part of Continuous Discussions (#c9d9), a series of community panels about agile, continuous delivery and DevOps. Other guest panelists were Shailesh Mangal, CTO of Zephyr, and Walter Buga, CEO of Arynga. Continuous Discussions is a community initiative by Electric Cloud, which powers continuous delivery at businesses like SpaceX, Cisco, GE and E*TRADE by automating their build, test and deployment processes.

The discussion focused on the implementation and practical development of IoT devices. The panelists were asked a number of questions about what they believed were the main changes that have occurred as IoT has moved from being a theoretical concept to the practical reality it is today.

The Essence of IoT

The first question the panelists were asked to ponder was: what is the essence of IoT? The panelists had varying views on where the focus of IoT lies. To Aater, “the essence is machine to machine communication, having these devices connect to each other over the Internet. I have built IoT device networks that did not involve the cloud, before the cloud actually existed, and when the amount of data was not that large, so it was not necessarily a data play.”

Velocity, Quality and Features

The panelists were then asked what they felt was most important in relation to IoT: velocity, quality or features. Aater felt that this very much depended on the particular IoT market segment you were focusing on.
As Docker containers pick up steam, Austin enjoyed the first Container Days (#cdatx) the last weekend in March. The event was modeled after the highly successful DevOps Days. It was a great event, and credit goes to Boyd Hemphill from StackEngine for spearheading the effort, and of course to the strong Austin DevOps community. Turnout was good and conversation lively, as the local software development community is actively embracing container strategies. In fact, some of the talks were so popular we had to move to larger rooms – thanks, Hawaiian Falls, for being so accommodating.

The first thing we noticed about the event was the companies represented. Companies of all sizes attended, from small startups to large companies like Google and IBM, really demonstrating the breadth and depth of container adoption that is happening. Containers are emerging as a critical technology for enabling agile development and architectures, and for creating business advantage both in the enterprise and in the startup community. This was evident as presenters and attendees discussed the challenges, opportunities and best practices for container-based DevOps. As organizations embrace the idea of containers, infrastructure must change to accommodate them. Yet, as far as container adoption has come since our own first foray in 2013, when Docker was still wet behind the ears and Ben Golub himself was presenting at Austin meetups, it is evident there is still a long way to go, and the next couple of years will continue to see interesting, innovative progress in this space.

Fittingly, the event was designed as an unconference, which allowed for many more one-on-one interactions with knowledgeable people. Being an unconference, we got to see the priorities of the developers more clearly than we would have with a top-down agenda.
Last week, our life sciences and healthcare solution practice was officially recognized as a Life Sciences Competency Partner by Amazon Web Services. We share this recognition with the seven original systems integration partners announced at the launch of the program in October 2014. The small size of this expert group is not reflective of the amount of innovation in the health sector, but rather of the complexity of the issues and the subject matter expertise required for success with cloud infrastructure in this area.

Indeed, digital health solutions, genomic medicine and the modernization of traditional healthcare are at an inflection point, driven by a variety of political, technical, legal/compliance, policy, and consumer pressures. The sector is gaining significant attention from investors, fueling potential solutions to a variety of challenges and spurring rapid rates of growth and change. New business models and new innovations are pushing the boundaries of traditional life sciences. Digital health, the convergence of the digital and genetics revolutions with health and healthcare, is transforming the way consumers and patients manage their health, prevent disease, and obtain healthcare services. This transformative thinking requires a new approach to infrastructure.

Juniper Research predicts that connected healthcare and fitness device services will produce $1.8 billion in annual revenues by 2019. According to Juniper’s new report, this is a sixfold increase over 2015, for which it predicts revenues of $320 million. This area, driven by pressure from consumers, is just one example of where infrastructure is providing business advantage. The ability to rapidly scale a business while maintaining customer satisfaction is most efficiently managed using cloud-based infrastructure. Pay-as-you-go usage and reduced capital expenses are welcome relief for startups as well as larger, budget-conscious organizations.
Last week, Amazon Web Services announced the availability of larger and faster Elastic Block Store volumes, something we’ve been looking forward to since the original announcement at re:Invent 2014. AWS continues to add rich features to its platform, and it can be difficult to stay on top of them and understand which new capabilities are going to impact an individual business, and how. CRN Magazine asked us for our thoughts about how this can help us in our aim to build the best, zero-administration infrastructures possible for our customers, and what it says about what AWS is trying to do now. You can read the full article here.

We think this is a significant new feature, as volumes of data are increasing and more businesses rely on big data or Internet of Things (IoT) information. Analysts predict that the number of IoT units will easily hit 26 billion by 2020. Real-time processing needs and data management challenges are becoming common in IoT deployments, which calls for efficient handling of capacity and real-time query management, in addition to challenges like security and compliance.

For example, one of our customers has built their business on genomic analytics and comparison. Several vast data sets, on the order of terabytes or petabytes, must be analyzed quickly in order to produce a solution in a timeframe that is acceptable for patients. Storing a single person’s DNA requires about 50 GB, so it does not take long before a data set reaches terabyte scale. To accomplish a solution for this last year, we needed seven servers per service solely because EBS volumes weren’t large enough. You can read that case study here.
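To see why the larger volumes matter at genomic scale, here is a rough back-of-the-envelope calculation. It assumes roughly 50 GB per genome, as above, and EBS per-volume limits of 1 TiB before the announcement and 16 TiB after; the data set size is a hypothetical example.

```python
import math

# Back-of-the-envelope sizing, assuming ~50 GB per genome and the
# EBS per-volume limits before (1 TiB) and after (16 TiB) the update.
GENOME_GB = 50
OLD_LIMIT_GB = 1024       # 1 TiB per volume
NEW_LIMIT_GB = 16 * 1024  # 16 TiB per volume

def volumes_needed(genomes: int, limit_gb: int) -> int:
    """Minimum number of EBS volumes to hold the data set."""
    return math.ceil(genomes * GENOME_GB / limit_gb)

# 1,000 genomes is roughly 50 TB of raw data:
print(volumes_needed(1000, OLD_LIMIT_GB))  # 49 volumes before the update
print(volumes_needed(1000, NEW_LIMIT_GB))  # 4 volumes after
```

Fewer, larger volumes mean fewer servers just to mount storage, which is exactly the constraint that forced the seven-servers-per-service layout described above.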
Healthcare providers, pharmaceutical manufacturers and biotechnology companies are spawning their own health tech start-up ecosystems to solve some of the most complex health problems. Often, this is accomplished through the use of high performance computing (HPC) and Big Data analytics. Patient-derived data, such as genomics, can now be compared against very large data sets to identify patterns, matches and other indicators that can lead to new treatment plans and, ultimately, better health outcomes. Organizations like Personal Peptides are using cloud computing infrastructure to quickly and economically analyze vast quantities of data for the purpose of creating precision medicine solutions.

Maintaining competitive advantage through efficient use of computing resources is imperative in the competitive field of biometrics, where being first to market and responding fast are critical to success. When patients are ill, delayed treatment is not an option. Finding the best treatment plan, among many options, must be accomplished quickly. Patients often cannot wait months for results.

“Without the strategic use of AWS capabilities, we would not have had the resources to produce a commercially viable product,” said Jahan Khalili, PhD, founder of Personal Peptides.

You can read more about how Personal Peptides is using its AWS solution here.

Both stream-based and batch-based data processing for analytics can be accomplished in AWS. It allows you to increase the speed of research by running high performance computing in the cloud, and to reduce costs by providing Cluster Compute or Cluster GPU servers on demand, without large capital investments. You have access to a full-bisection, high-bandwidth network for tightly coupled, IO-intensive workloads, which enables you to scale out across thousands of cores for throughput-oriented applications.
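The scale-out pattern for throughput-oriented batch analytics works because each data set can be scored independently. The sketch below shows the idea on a single machine with Python's multiprocessing; the scoring function is a hypothetical stand-in for a real analytics step, and on AWS the same fan-out would span many instances or cluster nodes.

```python
from multiprocessing import Pool

def score_sample(sample: list) -> int:
    """Stand-in for a CPU-bound analytics step, e.g. scoring one
    patient-derived data set against a reference."""
    return sum(x * x for x in sample)

if __name__ == "__main__":
    # Throughput-oriented work parallelizes naturally: each data set
    # is independent, so adding cores (or nodes) adds speed.
    samples = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
    with Pool(processes=3) as pool:
        results = pool.map(score_sample, samples)
    print(results)  # [14, 77, 194]
```

Tightly coupled workloads are different: they need the low-latency, full-bisection network mentioned above, because the workers must exchange data mid-computation rather than run independently.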
TECHNOLOGY FIRM: If you are a technology firm involved in the creation of software, whether robust enterprise-level specialized applications, mass-market consumer programs, or apps for mobile devices, DevOps is the most efficient and productive way to organize all the stages of your product development.

As its name suggests, DevOps is a hybrid of Development and Operations. Firms that follow the principles and practices of DevOps merge the stages of both parts of the production process, with full coordination between the relevant stakeholders, allowing development and operations to flow as a single path, side by side. With a DevOps approach in place, your development teams, operations staff and quality assurance employees, allied with your management, collaborate to ensure that you are able to deliver software on a continuous basis, making regular incremental improvements to what you are offering. All the key sections of your organization work together, not as disparate groups following different strategies and independent deadlines. This approach works as well for small start-up businesses as it does for large enterprises with huge teams.

DevOps works particularly well in a cloud environment. In the cloud there can be constant communication between those involved in software (traditionally the developer side of the spectrum) and hardware (traditionally more operations-focused). With DevOps there is much more of a flow between both sets of people. The best DevOps people have a full stack of knowledge crossing both the software and hardware divides: from chip to coding to release management.
The technological world is always changing. In fact, technology is moving at such a rapid pace that a recent trend has been constant pressure on digital businesses to deliver more software updates, more frequently. It’s now common for apps, software and even operating systems like Windows and iOS to receive small weekly updates, rather than the previous process of implementing major new changes every year or so.

For digital businesses, technology is the product itself, not just a means to make their main task easier, quicker or cheaper, as it is for many other types of business. They use technology to deliver their products and services: many use websites to provide service portals or information, as well as apps, social media marketing, analysis services, and much more. For these businesses, optimizing workflows in the most efficient manner is a critical business success factor. Developing software does not come cheap, and particularly if you are a start-up, or an established business diversifying into software development, there can be quite a delay before there are sufficient cash flows to pay all the bills. An efficient software development cycle that makes the best use of your developers can make a huge difference to your financial survival.

The Flux7 team recently produced a white paper, titled Startup Guide: Minimizing IT Investments and Maximizing Business Outcomes, that looks closely at this issue. Here are some of our findings. Well-organized developer workflows are essential for any organization. They help minimize the following problems: Developers come at a cost – a single hour of a developer’s time can cost as much as a month of your cloud server setup.
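To make the developer-cost point concrete, here is a hedged back-of-the-envelope comparison; both rates are hypothetical illustrations, not figures from the white paper.

```python
# Hypothetical rates for illustration only.
DEV_HOURLY_RATE = 75.0      # dollars per developer hour (assumption)
SERVER_MONTHLY_COST = 70.0  # dollars per month for a small cloud server (assumption)

# Hours of developer time that one month of server spend buys:
hours_equivalent = SERVER_MONTHLY_COST / DEV_HOURLY_RATE
print(round(hours_equivalent, 2))  # 0.93 — roughly one developer hour

# So a workflow that wastes even three developer hours a week costs
# far more than the infrastructure itself over a month:
wasted_hours_per_month = 3 * 4
print(wasted_hours_per_month * DEV_HOURLY_RATE)  # 900.0 dollars of lost time
```

At these assumed rates, a month of server spend equals about one developer hour, which is why workflow efficiency, not infrastructure cost, dominates the economics.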
Unique cloud strategies to gain business advantage

Cloud computing in healthcare is driving a new era of change. As traditional health providers strive to develop strategies for using cloud technology to create the efficiencies and agility seen in other businesses, digital health and cloud-native solution providers are managing a delicate balance of innovation, growth and the need for mature, secure infrastructure.

Healthcare start-ups are hot right now and have recently been among the most popular investments venture capitalists make. In 2014, there was $4.1 billion of venture financing for health start-ups, and this trend looks certain to continue. 2015 is poised to be another big year for healthcare start-ups as they reach new markets and offer new models of delivery. One reason for this is that start-ups are helping to herald a technology revolution in the health sector. This is aided by a new generation of entrepreneurs in young, established businesses who are clearly passionate about healthcare and tech. They’re eager to usher in a new generation of healthcare in the cloud that offers the agility, security, stability, scalability and real-time response needed to solve health challenges.

In established enterprises, it is often an innovation center that will broach the subject of cloud computing. These offshoots are able to use cloud infrastructure, often outside of the main business applications and structure, to gain the agility they need to create new solutions and means of delivery. For start-ups, organizations like our local Austin Technology Incubator have dedicated programs focused on human health: medical devices, diagnostics, research tools, and therapeutics. Their team has deep expertise in the regulatory pathways and reimbursement landscapes that start-ups must navigate. Similarly, navigating cloud infrastructure is not without challenges.
You Can’t Afford Cheap!

Some people always try to do things on the cheap. They will always pick the least expensive way of doing something, treating the lowest cost as the most important metric for any particular job. The problem is that a low up-front cost is not always a good guide to how much a job will cost overall. Indeed, a low up-front cost can often lead to expensive bills down the line as you try to recover from deficiencies encountered as a result of that low-cost option. Paying an expert to find you a solution generally ends up the cheapest option in the long run.

There is usually a solid reason why people undercut others. More often than not, it is because they do not have the skills, ability or speed to match their higher-charging brethren. They need to charge a lower rate so that they can win the work while they build their skills. In many cases, the reason you are paying a cheaper rate for something is that you are unwittingly consenting to being a guinea pig.

What Can Experts Do?

Anyone considered to be an expert will have the qualifications and experience to prove that they can live up to the term. They will be able to perform more quickly and efficiently than others. Their hourly rate may be higher than that of their more amateur competitors, but they will take fewer hours to do the necessary tasks. Often the overall job ends up cheaper because of this. They have built up a knowledge base through their experience. They know the best practices to follow and, just as importantly, they will have worked out which bad practices should be avoided.