Interested in how DevOps, IT Modernization and Agile practices can positively impact customer experience?
As we discussed recently, AWS microservices are being adopted widely across organizations and industries for their ability to increase service delivery and speed time to market while decreasing team overhead. As organizations begin traveling down the path to a microservices architecture, one hurdle they often run into is enterprise password management, or secret management. As the number of microservices increases, so too does the number of credentials, often exponentially, creating a need for effective and efficient management.

What is a Secret?

Secrets are credentials such as API keys, passwords, and SSH keys that a service needs to authenticate and communicate with other services, with cloud infrastructure, with traditional infrastructure such as an Oracle database, or with an external SaaS payment gateway. As the number of microservices grows, it's easy to see how the number of secrets grows along with them. With this proliferation, it becomes increasingly impractical to manage and control credentials manually.

Credential Management

At the end of the day, there are two approaches to credential management: distributed and centrally automated. Distributed management gives each individual team access to its needed credentials. In addition to being manual, a distributed approach lacks separation of control: the people with password access are often the same developers who write the code, which introduces unnecessary risk into the environment. Centrally automated management is a credential management system in which security is treated as a first-class citizen. Automated credential management systems encourage an AWS Security by Design approach to building microservices environments, giving developers powerful tools to automate secret management from day one while providing critical separation of duties.
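As one concrete sketch of the centrally automated approach, a service can pull its credentials at runtime from a central encrypted store, for example AWS SSM Parameter Store, instead of baking them into code. The parameter name below is a hypothetical placeholder, not from the original post:

```python
# Sketch: fetch a secret from a central, encrypted parameter store at
# startup rather than hard-coding it. The parameter path is hypothetical.

def build_get_parameter_request(name):
    """Build the request for an encrypted SSM Parameter Store lookup."""
    return {"Name": name, "WithDecryption": True}

def fetch_secret(ssm_client, name):
    """Retrieve and decrypt a secret via the injected SSM client."""
    resp = ssm_client.get_parameter(**build_get_parameter_request(name))
    return resp["Parameter"]["Value"]

# Example usage (requires boto3 and AWS credentials with ssm:GetParameter):
#   import boto3
#   ssm = boto3.client("ssm")
#   db_password = fetch_secret(ssm, "/prod/orders-service/db-password")
```

Passing the client in as an argument keeps the developer's code free of credentials and lets the security team control access to the store independently, which is the separation of duties the post describes.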
As DevOps consultants at Flux7, we believe that Continuous Delivery (CD) is a key tenet of successful DevOps. And as heavy users of Amazon Web Services (AWS), we have a keen interest in any tools or features that streamline CD for our clients within AWS. For this reason, we are pretty excited to dive into the Amazon Pipeline Starter Kit.

Now, you may be familiar with two services that Amazon has traditionally offered to help facilitate CD: AWS CodePipeline and AWS CodeDeploy. The Pipeline Starter Kit takes advantage of both of these services for people who don't want to set up the resources themselves. The starter kit includes an AWS CloudFormation template to create the pipeline and all of its resources. (The template uses the US East region.)

For those of you unfamiliar with these two services, or who could use a refresher, Amazon defines them as follows:

AWS CodePipeline is a continuous delivery service for fast and reliable application updates. CodePipeline builds, tests, and deploys your code every time there is a code change, based on the release process models you define. This enables you to rapidly and reliably deliver features and updates. With AWS CodePipeline, you only pay for what you use. There are no upfront fees or long-term commitments.

AWS CodeDeploy is a service that automates code deployments to any instance, including Amazon EC2 instances and instances running on-premises. AWS CodeDeploy makes it easier for you to rapidly release new features, helps you avoid downtime during application deployment, and handles the complexity of updating your applications. You can use AWS CodeDeploy to automate software deployments, eliminating the need for error-prone manual operations, and the service scales with your infrastructure so you can easily deploy to one instance or thousands.
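Launching a CloudFormation-packaged kit like this boils down to a single create-stack call. A minimal sketch follows; the stack name and template URL are placeholders (the kit's actual template URL is not reproduced here), and the IAM capability flag is an assumption based on the kit creating pipeline roles:

```python
# Sketch: launch a CloudFormation stack for a CD pipeline in us-east-1.
# Stack name and template URL below are hypothetical placeholders.

def build_create_stack_request(stack_name, template_url):
    """Assemble parameters for cloudformation create_stack()."""
    return {
        "StackName": stack_name,
        "TemplateURL": template_url,
        # Needed when a template creates IAM resources, as pipeline kits do.
        "Capabilities": ["CAPABILITY_IAM"],
    }

# Example usage (requires boto3 and AWS credentials):
#   import boto3
#   cfn = boto3.client("cloudformation", region_name="us-east-1")
#   cfn.create_stack(**build_create_stack_request(
#       "pipeline-starter-kit", "https://example.com/starter-kit.template"))
```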
Pundits have declared that 2016 is the year microservices graduate from early adopter to early mainstream adoption. The aggregate predictions are certainly right if the call volume here at Flux7 is any indication. We've been seeing this trend in full force as we field call after call from organizations across industries, from enterprises to startups, all looking for advice and expertise in building their own microservices architecture.

What are Microservices?

Before we dive into what is driving this trend, let's take a brief look at what microservices are in the first place and the benefits organizations derive from them. Microservices are an approach to addressing a large or complex business problem using a set of smaller, simpler services that work together; a microservice runs its own unique process that contributes to an overarching business goal. It's difficult to discuss microservices without mentioning companies like Amazon, Netflix and eBay, who have pioneered the use of microservices, thousands of them, to deliver their sites to customers. For example, data shows that a single Google query can consult up to 80 microservices before results are shown to the user.

Benefits

Microservices provide a host of benefits, and different organizations benefit differently from each. While we'll walk through how and why, the key here is to know that regardless of the size of your organization, microservices provide clear business benefits.

Better Human Resource Utilization: As employees in microservices environments are assigned to small teams that each work on their own microservice, better team management and utilization is achieved. Each team is able to work on its service independent of other teams, eliminating bottlenecks that occur with traditional monolithic efforts.
The Internet of Things (IoT) is rapidly moving from niche use cases to normal business. According to research firm IDC, about three-quarters of respondents have IoT deployment plans or already have efforts underway. Most enterprises don't view IoT as a sideshow or something that will provide peripheral benefit; they see these efforts as strategic to the business. The amount of data that IoT devices will generate will be staggering, as will the ability to control devices in the physical world like any other networked or Internet-connected device. New business models will rise, and existing businesses will become much more efficient and effective as they learn more about their supply chains and their customers than was ever before possible.

One area that is often overlooked when it comes to successful IoT initiatives is the infrastructure. Yet it takes an infrastructure that is agile, scalable, and secure for any IoT deployment to be sustainable over the years. Here are five of the key attributes that must be in place:

IoT Key #1: Build the right infrastructure. All the devices that you are going to deploy and connect are going to be very data chatty. This will strain existing networks, new wireless networks, storage, and your analytics tools. It's crucial that you build an infrastructure that is agile, adaptable, and can scale. For most enterprises, this means a cloud infrastructure. Based on conversations I've had with CIOs and DevOps teams that have gone down the IoT path already, one of the biggest snags they eventually ran into was having to redesign their supporting IoT infrastructure.
According to Innovative Retail Technologies, 52% of surveyed retailers plan to actively move applications to the cloud this year. The initially tepid response to cloud is waning as retailers learn more about its strengths for availability and innovation. Yet one question our AWS consultants frequently field from retailers is how to achieve AWS PCI compliance in the cloud.

As most readers of this blog know, the Payment Card Industry Data Security Standard, otherwise known as PCI DSS, is an information security standard requiring organizations to incorporate controls around customer data to prevent credit card fraud. There are several ways that AWS helps its retail clients build a foundation for PCI compliance, and it recently announced one more in the form of a Quick Start. Defined by a structure of 12 requirements (best practices and security controls) to keep credit card data safe and secure during transit, processing, and storage, PCI DSS requires organizations to build and maintain a secure network, protect cardholder data, maintain a vulnerability management program, implement strong security measures, test and monitor networks on a regular basis, and maintain an information security policy. AWS helps achieve this goal by providing an environment that is compliant with the standard. Certification, it should be noted, is the responsibility of the company; as we all know, compliance does not automatically translate to certification.

At Flux7, our AWS experts have been implementing PCI-compliant AWS solutions since 2014, well before AWS formalized its first PCI program. Since our first PCI deployment for a company called GoBold, we have been implementing AWS best practices for a host of companies looking to maximize the benefits of cloud computing with world-class security and regulatory compliance.
While we write frequently about backend technologies, as AWS consultants we also work with businesses on client systems, and as the hype around AWS WorkSpaces grows, we have been fielding an increasing number of inquiries about it. A managed desktop computing service in the cloud, AWS WorkSpaces enables users to access their files, applications, and other resources through a supported device, regardless of their physical location.
Automating common administrative tasks to improve workload reliability and decrease potential risk is a common theme our consultants at Flux7 help our clients with. Doing so simplifies administration, encourages security through consistency, and helps improve control over users and permissions. Amazon launched EC2 Run Command in October 2015 to help attain these benefits. Specifically, EC2 Run Command provides a simple way of automating common administrative tasks like installing software or patches, running shell commands, performing operating system changes, managing local groups and users, altering configuration files, and more in Windows instances. Two months later, in December 2015, Amazon released the same feature for Linux instances. Run Command allows users to execute commands at scale and provides visibility into the results, making it easy to manage instances. Run Command is accessible through the Commands page in the Amazon EC2 console or through the AWS CLI.

In May 2016, AWS updated the Run Command service to make it even better. Let's walk through the new features:

Document Management & Sharing

A command document is a JSON file that includes the information (description and explanation) about the command you want to run. If you have any command documents that you execute using EC2 Run Command, you can now manage and share them. This lowers the chance of errors and variability in your system. By clicking on a document, you can examine its function and parameters before running it. You can also share it publicly or privately with other AWS accounts.
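Beyond the console and CLI, Run Command can also be driven through the SSM API. A minimal sketch of sending a shell command to Linux instances via the built-in AWS-RunShellScript document follows; the instance ID and command are illustrative placeholders:

```python
# Sketch: run a shell command across Linux instances with EC2 Run Command.
# Instance IDs and the command itself are hypothetical examples.

def build_send_command_request(instance_ids, commands):
    """Assemble parameters for ssm send_command() using the built-in
    AWS-RunShellScript command document."""
    return {
        "InstanceIds": instance_ids,
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {"commands": commands},
    }

# Example usage (requires boto3 and AWS credentials):
#   import boto3
#   ssm = boto3.client("ssm")
#   resp = ssm.send_command(**build_send_command_request(
#       ["i-0123456789abcdef0"], ["yum -y update"]))
#   # The returned CommandId can be polled for per-instance results,
#   # which is the at-scale visibility the service provides.
```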
Service discovery is not new. The idea of a tool that can discover how processes and services talk to each other and help facilitate connections has been around for some time. However, with the rise of increasingly dynamic environments, the role service discovery plays continues to grow. Indeed, since the beginning of the year at Flux7 we have seen a surge of customers looking for container-based microservices architectures, whose dynamic nature highlights the need for service discovery. Microservices do in fact offer a great deal of agility and resiliency, and when coupled with container technology, they bring immense portability. However, this container-based microservice architecture does present a challenge that is an ideal example of why Amazon's reference architecture for AWS service discovery is so helpful: namely, keeping track of the information necessary to communicate with each service.

As I mentioned, Amazon Web Services has published a reference architecture, which Amazon describes as "a DNS- and load balancer-based solution to service discovery on Amazon EC2 Container Service (Amazon ECS) that relies on some of our higher level services without the need to provision extra resources." The service uses Amazon CloudWatch Events to invoke an AWS Lambda function, which can automatically create Amazon Route 53 entries for newly created services.

Why the Microservice Surge

Microservices are a way of breaking a single large monolithic application into smaller composable services. These services offer APIs that other services or outside parties can use to get certain tasks accomplished. Containers are a natural fit for microservices: they allow any application or language to be used, you can test and deploy the same artifact, and they solve the challenge of running distributed applications on an increasingly heterogeneous infrastructure.
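To make the CloudWatch Events to Lambda to Route 53 flow concrete, here is a simplified sketch of what such a Lambda function might do, not AWS's actual reference code. The hosted zone ID, internal domain suffix, and the event fields consumed are all illustrative assumptions:

```python
# Sketch of the Lambda side of a DNS-based discovery flow: on a
# service-change event, UPSERT an A record in a private hosted zone so
# other services can resolve the service by name. Zone ID, domain
# suffix, and event shape are simplified, hypothetical assumptions.

HOSTED_ZONE_ID = "Z2EXAMPLE"        # hypothetical private hosted zone
DOMAIN_SUFFIX = "service.internal"  # hypothetical discovery namespace

def build_upsert_change(service_name, ip_address):
    """Build a Route 53 change batch mapping a service name to an IP."""
    return {
        "Changes": [{
            "Action": "UPSERT",  # create or update the record in place
            "ResourceRecordSet": {
                "Name": "%s.%s" % (service_name, DOMAIN_SUFFIX),
                "Type": "A",
                "TTL": 60,  # short TTL so stale endpoints age out quickly
                "ResourceRecords": [{"Value": ip_address}],
            },
        }]
    }

def handler(event, context):
    """Lambda entry point, invoked by a CloudWatch Events rule."""
    import boto3  # available in the Lambda runtime
    route53 = boto3.client("route53")
    detail = event["detail"]  # assumed to carry service name and IP
    route53.change_resource_record_sets(
        HostedZoneId=HOSTED_ZONE_ID,
        ChangeBatch=build_upsert_change(detail["service"], detail["ip"]),
    )
```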
Creating a healthy security posture is one of the key factors in achieving PCI DSS certification, especially for enterprises. Truly, when it comes to security, even the smallest details are important and can cause huge problems. In this post, we'll talk about how to achieve better security outcomes with the help of version control and automation, and how this can help you with your PCI DSS certification.

The PCI Environment

As you most likely already know, PCI DSS is a set of security standards, and anyone handling credit card data and transactions must satisfy PCI requirements. Depending on your merchant level, requirements can be quite strict. For example, while Level 4 merchants can conduct a self-assessment in order to achieve certification, Level 1 merchants must hire and have on site an independent auditor to thoroughly assess the merchant's security. Regardless of where your organization sits in the merchant level spectrum, the key takeaway I'd like to emphasize here is that you are ultimately responsible for securing the environment where credit card data flows and/or is stored. This can be everything from servers at your datacenter, to computers at the office, and even the people who use those systems. Service providers like AWS have offerings that are validated as PCI DSS compliant by independent assessors. However, as AWS is only a service provider, it is not directly responsible for your security, nor your PCI certification. Said another way, while AWS provides a secure environment, your organization is responsible for securing its use of that environment and achieving PCI certification.
Amazon announced its Elastic Container Service (ECS) at re:Invent 2014 using Pristine as a case study. Given Flux7's Amazon expertise, it's likely no surprise to frequent readers of this blog that Pristine is a Flux7 customer we have been working with for some time now. While containers can run natively on VMs, and it is possible to either fix the container-to-VM mapping or write homegrown scripts to schedule containers on VMs, Docker orchestration engines including Kubernetes, Swarm, and ECS provide a strong foundation for Docker projects. Of these, ECS is a very popular choice for our customers wanting to use Docker in AWS. When creating ECS clusters for clients, our AWS consultants are commonly asked if they can leverage AWS auto-scaling, a hallmark use case of the elasticity that the cloud provides. However, until recently, it was only possible to auto-scale the actual resources in the cluster; individual services had to be scaled manually or with homegrown code.

Announcing Auto-Scaling of ECS Services

With this announcement, AWS has introduced a new feature that allows auto-scaling of ECS services. Note that auto-scaling was already possible for ECS clusters, where more nodes could be added to the cluster. Now, Amazon ECS can automatically scale container-based applications by dynamically growing and shrinking the number of tasks (containers) run by an Amazon ECS service. This new feature can increase the number of actual containers assigned to a single service based on dynamically collected metrics, such as the number of access queries to the service or the resource utilization of the containers constituting the service. In the past, ECS users could implement a basic form of auto-scaling using AWS Lambda functions, as outlined here: https://aws.amazon.com/blogs/compute/scaling-amazon-ecs-services-automatically-using-amazon-cloudwatch-and-aws-lambda/.
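Under the hood, ECS service auto-scaling works through Application Auto Scaling: you register the service's DesiredCount as a scalable target, then attach scaling policies tied to CloudWatch alarms. A minimal sketch of the registration step follows; the cluster and service names, and the task bounds, are hypothetical:

```python
# Sketch: register an ECS service's task count as a scalable target
# with Application Auto Scaling. Cluster/service names are hypothetical.

def build_scalable_target_request(cluster, service, min_tasks, max_tasks):
    """Assemble parameters for application-autoscaling
    register_scalable_target() for an ECS service."""
    return {
        "ServiceNamespace": "ecs",
        "ResourceId": "service/%s/%s" % (cluster, service),
        "ScalableDimension": "ecs:service:DesiredCount",
        "MinCapacity": min_tasks,
        "MaxCapacity": max_tasks,
    }

# Example usage (requires boto3 and AWS credentials):
#   import boto3
#   aas = boto3.client("application-autoscaling")
#   aas.register_scalable_target(**build_scalable_target_request(
#       "prod-cluster", "web-service", 2, 10))
#   # Scaling policies attached via put_scaling_policy() then adjust
#   # DesiredCount between these bounds as CloudWatch alarms fire.
```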
As AWS experts, we work closely with organizations that handle a wide variety of sensitive information, from patient health records to credit card data and more. As a result, we are always on the lookout for technology and best-practice-based improvements to cloud-based security.
It's rare to find a business that isn't grappling with growing pressures, whether from competitors becoming more effective through the use of emerging technologies, growing global competition, or even the increased effectiveness of machine learning and artificial intelligence. Those organizations that learn not only to embrace emerging technologies but to master them are going to be the victors in the immediate years ahead. Those that don't learn to master transformative technologies are not going to survive. In my interviews with CIOs, CTOs, and chief digital officers, many are wrestling with how they are going to leverage transformative technologies so that their enterprises can maintain relevance. Here are the primary technologies, as I see them, driving enterprise transformation today:

The accelerating adoption of cloud technologies: Saying that cloud is transformative isn't exactly news. However, the rate of adoption of cloud services among organizations is growing rapidly. The research firm Gartner estimates that by next year, 50 percent of enterprises will have embraced hybrid public/private cloud architectures. When Amazon started publicly breaking out its cloud revenue, we learned that Amazon Web Services earned $7.88 billion in cloud sales in 2015, an increase of almost 70% compared to the prior year. The research firm TBI forecasts the global public cloud market will grow from $80 billion last year to roughly $167 billion by 2020. Interestingly, in a survey, that firm found that nearly half of participants believe the public cloud meets or exceeds the security of private clouds. This move to cloud is also proving to be the foundation that will drive many of the innovative capabilities found in agile development, IoT, continuous deployment, citizen developers, and data analytics.