
Modern Apps and Microservices


Introduction

This post is part one of a two-part series that delves into an emerging approach to modern application architecture called Microservices, in which applications are composed of autonomous, independently deployed, scaled, and managed services. This approach to service architecture, along with the benefits of cloud platforms, provides the scalable, resilient, cross-platform foundation necessary for Modern Applications. In part one I provide an overview of Microservices along with their benefits, a logical architecture, and deployment scenarios. In part two of the series I will detail the design and implementation of RefM, a Microservice that provides application reference data.

The software development landscape has changed dramatically over the past decade. Disruptive technologies and design approaches have introduced entirely new types of applications and methods for building them. As Mikhail Shir of BlueMetal writes ‘…The Modern Application is user centric. It enables users to interact with information and people anywhere on any device. It scales resiliently and adapts to its environment. It is designed, architected, and developed using modern frameworks, patterns and methodologies. It is beautiful in its user experience as well as its technical implementation…’ In conjunction with these new user experiences is the need to connect to and interact with a variety of online services that provide information and transactions in a scalable, resilient, and cross-platform way.

The concept of distributed services is not new. Since the early days of object-oriented programming, the idea that one could provide ‘objects’ in a distributed network using RPC mechanisms and message queues, along with location transparency, has been the holy grail of software engineering. CORBA and DCOM were early attempts to provide a language- and OS-agnostic approach to distributed computing, but not without the heavy burden of complexity.

The Internet revolution brought about the evolution, and in many ways the simplification, of distributed computing with the introduction of Web Service protocols such as SOAP and REST. There has been much back and forth amongst the proponents of Service Oriented Architecture on which protocol should rule the day. Without rehashing those battles, suffice it to say that REST has become the primary choice today for defining APIs to cloud-hosted services. The key to applying REST is to understand that its CRUD style of API design is not focused on the underlying physical store, i.e. the database, but on the resources that are being accessed. As such, it is a good choice for API design and keeps the overall approach simple and straightforward.
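
To make the resource-oriented style concrete, here is a minimal sketch of a CRUD-style API using ASP.NET Web API 2, the framework assumed later in this post. The Customer resource, routes, and repository interface are hypothetical and exist only for illustration; attribute routing is assumed to be enabled via config.MapHttpAttributeRoutes().

```csharp
// Minimal sketch of a resource-oriented REST API with ASP.NET Web API 2.
// The Customer model and ICustomerRepository are hypothetical placeholders.
using System.Web.Http;

public class Customer
{
    public string Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer Find(string id);
    Customer Add(Customer customer);
}

[RoutePrefix("api/customers")]
public class CustomersController : ApiController
{
    private readonly ICustomerRepository _repository;

    // The repository is supplied by the project's dependency resolver.
    public CustomersController(ICustomerRepository repository)
    {
        _repository = repository;
    }

    // GET api/customers/{id}: the URI names the resource, not the database.
    [HttpGet, Route("{id}", Name = "GetCustomer")]
    public IHttpActionResult Get(string id)
    {
        var customer = _repository.Find(id);
        if (customer == null) return NotFound();
        return Ok(customer);
    }

    // POST api/customers: create a new resource and return its location.
    [HttpPost, Route("")]
    public IHttpActionResult Post(Customer customer)
    {
        var created = _repository.Add(customer);
        return CreatedAtRoute("GetCustomer", new { id = created.Id }, created);
    }
}
```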

Another important factor that is impacting how we think about distributed computing today is the emergence of commercial cloud platforms such as Amazon’s AWS and Microsoft’s Azure. These platforms provide pay-as-you-go access to compute and storage, easy access to a suite of common application services such as SQL and NoSQL databases, in-memory caching, and performance analytics, and they lend themselves to automating the development, test, staging, and production environments, providing the foundation for Continuous Delivery.

Evolution of Methodology, Process and Architecture

The early days of PC development leveraged techniques that came from the mainframe world and used a Waterfall methodology to deliver desktop and client/server applications. This process was gated in that a project phase could not start until the previous phase had completed. This approach resulted in long development cycles and, if a project was going poorly, as many did, the dreaded death march to release.

As the industry moved to web development, the methodology and process evolved to Agile/Scrum, which is a much healthier approach to team structure and project management. This method of developing software has worked very well for the past several years. As the industry moves to the cloud, Agile/Scrum is being leveraged as part of a high-velocity, continuous product development approach called Lean Engineering.

Lean Engineering defines a set of principles that guide the creation and deployment of software products at high velocity with low risk. By leveraging a Lean Engineering approach, the risk of validating new technology, making incremental changes in process, and bringing new products to market can be lowered, and a high-quality result can be achieved at a faster rate.

Lean Engineering leverages the Build-Measure-Learn cycle from Lean Startup and applies these methods to enterprise-class product development. The goal is to deliver high-quality, valuable software in an efficient, fast, and reliable manner through automation and more frequent release cycles. Using commercial cloud platforms, you can create a highly automated on-demand process to iterate quickly through the Build-Measure-Learn loop, incorporate it into your Continuous Integration and Continuous Deployment pipeline, and leverage built-in analytics tools.

Depends on How You Slice It

The interesting artifact of this evolution from desktop to web has been the horizontal slicing of application architecture to take advantage of the advancements in the platform. The first horizontal slice was client/server, where application data was placed in a server-hosted database such as SQL Server or Oracle, stored procedures handled database CRUD operations and business logic, and the client-facing portions of the application were delivered as desktop executables.


The next slice came with the advent of web-based applications targeting browsers. The code that dynamically generated the web pages, provided business logic, and accessed data was moved to a web server. This middle layer communicated with the database, which now resided in a third logical tier. This approach has served the industry well, but as cloud platforms mature, this 3-tier architecture is unable to fully realize the power of these cloud platforms.

Cloud platforms provide the ability to instantiate services on demand and scale them in an elastic manner. When 3-tier applications are hosted in the cloud, the monolithic nature of the architecture limits the ability to take full advantage of that elasticity, because the entire application must be scaled as a single unit. Our goal is to enable dynamic scaling of individual application components (Microservices). This is made possible by vertically slicing the application into a suite of Microservices.

Microservices Defined

Microservices are autonomous, scalable services that provide easy-to-use APIs for a particular business function. Martin Fowler has written on the topic and defines a Microservice as…

…an approach to developing a single application as a suite of small services, each running in its own process and communicating with lightweight mechanisms, often an HTTP resource API. These services are built around business capabilities and independently deployable by fully automated deployment machinery…

Determining what constitutes a Microservice depends on the nature of the application, but in general terms a Microservice is scoped to a particular business domain such as Customer, Product, or Order, or to cross-cutting concerns such as Workflow, Rules, or Reference Data. As you begin to adopt a Microservices approach to application architecture across the enterprise, the opportunity for reuse will grow. You will discover that new applications can be composed mostly from existing, reusable Microservices.
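
As a hedged illustration of this kind of composition, the sketch below assembles a hypothetical order-history view by calling separate Customer and Order Microservices over HTTP. The endpoint URLs and model shapes are invented for the example.

```csharp
// Hypothetical composition of two Microservices (Customer and Order) into one view.
// Endpoint URLs and models are assumptions made purely for illustration.
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Customer { public string Id { get; set; } public string Name { get; set; } }
public class Order { public string Id { get; set; } public decimal Total { get; set; } }

public class OrderHistoryView
{
    public Customer Customer { get; set; }
    public Order[] Orders { get; set; }
}

public class OrderHistoryService
{
    private static readonly HttpClient Http = new HttpClient();

    public async Task<OrderHistoryView> GetOrderHistoryAsync(string customerId)
    {
        // Each call crosses a service boundary; the application composes the results.
        var customerJson = await Http.GetStringAsync(
            "https://customers.example.com/api/customers/" + customerId);
        var ordersJson = await Http.GetStringAsync(
            "https://orders.example.com/api/orders?customerId=" + customerId);

        return new OrderHistoryView
        {
            Customer = JsonConvert.DeserializeObject<Customer>(customerJson),
            Orders = JsonConvert.DeserializeObject<Order[]>(ordersJson)
        };
    }
}
```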

In order for a Microservices architecture to be successful, each Microservice must exhibit a set of non-negotiable characteristics:

• Autonomous: A Microservice is a unit of functionality; it provides an API for a set of capabilities oriented around a business domain or common utility
• Isolated: A Microservice is a unit of deployment; it can be modified, tested, and deployed as a unit without impacting other areas of a solution
• Elastic: A Microservice is stateless; it can be horizontally scaled up and down as needed
• Resilient: A Microservice is designed for failure; it is fault tolerant and highly available (a minimal retry sketch follows this list)
• Responsive: A Microservice responds to requests in a reasonable amount of time
• Intelligent: The intelligence in a system is found in the Microservice endpoints, not ‘on the wire’
• Message Oriented: Microservices rely on HTTP or a lightweight message bus to establish a boundary between components; this ensures loose coupling, isolation, and location transparency, and provides the means to delegate errors as messages
• Programmable: Microservices provide APIs for access by developers and administrators
• Composable: Applications are composed from multiple Microservices
• Automated: The lifecycle of a Microservice is managed through automation that includes development, build, test, staging, production, and distribution
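
The Resilient and Responsive characteristics imply that callers must expect transient failures from the services they depend on. Below is a minimal retry-with-backoff sketch for an HTTP call between Microservices; the attempt count and delays are arbitrary assumptions, and a production service would typically add timeouts, jitter, and circuit breakers, often via a dedicated resilience library.

```csharp
// Minimal retry-with-exponential-backoff sketch for calls between Microservices.
// Attempt count and delays are illustrative only.
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ResilientHttp
{
    public static async Task<string> GetStringWithRetryAsync(
        HttpClient client, string url, int maxAttempts = 3)
    {
        for (var attempt = 1; ; attempt++)
        {
            try
            {
                return await client.GetStringAsync(url);
            }
            catch (HttpRequestException)
            {
                // Give up after the last attempt; otherwise back off: 1s, 2s, 4s, ...
                if (attempt >= maxAttempts) throw;
                await Task.Delay(TimeSpan.FromSeconds(Math.Pow(2, attempt - 1)));
            }
        }
    }
}
```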

What are the benefits of a Microservice Architecture?

There are many benefits to a Microservice Architecture. These benefits are not only technical in nature, but also improve the team culture. Instead of forming separate teams by job function, it is recommended that you create cross-functional teams that combine all roles: design, development, QA, IT, program and product management, stakeholders, and so on. These cross-functional teams own the Microservice product from start to finish, from concept to deployment. This approach will increase quality through ownership and create a positive working environment for all involved.

The benefits of a Microservice Architecture are:

• Evolutionary: They can be developed alongside existing monolithic applications, providing a bridge to a future state
• Open: They provide highly decoupled, language-agnostic APIs
• Tool/Language Agnostic: They are not bound to a single technology and are not a one-size-fits-all approach
• Deployment Flexibility: Services are deployed independently
• Scale Flexibility: On-demand scaling of services leads to better cost control
• Reusable: They can be reused and composed with other services
• Versioned: New APIs can be released without impacting clients that are using previous APIs (see the route-based sketch after this list)
• Replaceable: Services can be rewritten and replaced with minimal downstream impact
• Owned: Microservices are owned by one team from development through deployment
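
One common way to realize the Versioned benefit with ASP.NET Web API is route-based versioning: a new version is exposed at a new route while the old route stays live. The sketch below assumes a hypothetical reference-data resource; header or media-type versioning are equally valid alternatives.

```csharp
// Route-based API versioning sketch: v1 stays available while v2 evolves.
// The countries resource and response shapes are hypothetical.
using System.Collections.Generic;
using System.Web.Http;

[RoutePrefix("api/v1/countries")]
public class CountriesV1Controller : ApiController
{
    // v1 returns simple names; existing clients keep working unchanged.
    [HttpGet, Route("")]
    public IEnumerable<string> Get()
    {
        return new[] { "Canada", "Mexico", "United States" };
    }
}

[RoutePrefix("api/v2/countries")]
public class CountriesV2Controller : ApiController
{
    // v2 adds ISO codes without breaking v1 clients.
    [HttpGet, Route("")]
    public IEnumerable<object> Get()
    {
        return new[]
        {
            new { Code = "CA", Name = "Canada" },
            new { Code = "MX", Name = "Mexico" },
            new { Code = "US", Name = "United States" }
        };
    }
}
```

Clients that upgrade simply target the api/v2 route; older clients continue to call api/v1 untouched.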

Microservice Logical Architecture

While the term ‘micro’ appears in the name of this architecture pattern, a Microservice is not necessarily small. The term micro indicates that the surface area of capability is scoped, bounded to an area of a business domain. Another way to say this is that a Microservice does one thing and it does it well.

The scope of a Microservice goes beyond the public-facing APIs. A Microservice serves two masters; it provides a public façade that client applications and other services leverage to access its publicly accessible functions, and a private façade that gives the development and administration teams configuration, monitoring, batch processing, analytics, and other private capabilities. The two facades share the Models, a common code framework for accessing stores, invoking REST APIs, and serializing objects, and a common in-memory and physical repository. Taken together, these two facades and the common components represent the entire scope of the Microservice domain.

[Figure: Microservice logical architecture]

The components of the logical architecture, their access scope, and their purpose are:

• User Experience (Public): Any client that leverages the public SDK
• User Experience (Private): An Administrator’s Console that leverages both the public and private SDKs to provide a user experience for managing the Microservice
• SDK (Public): SDK used by client applications and services to access the Microservice capabilities (see the sketch after this list)
• SDK (Private): SDK used by the Microservice development team and administrators to manage and monitor the Microservice
• Models (On-the-Wire): The message formats that are emitted to and received from clients of the Microservice
• Models (In-the-Store): The message formats as they reside in storage; these may differ from what is communicated on the wire, so a transformation may be applied
• API (Public): The ‘public’ programmable interface, provided over a lightweight communication protocol such as HTTP, used by client services and applications
• API (Private): The ‘private’ programmable interface, provided over a lightweight communication protocol such as HTTP, used by the Microservice development team and administrators
• Logic (Public/Private): Rules, validations, and calculations that comprise the business logic for a particular Microservice
• Frameworks (Common): Common frameworks for communication, caching, messaging, storage, logging, security, and other cross-cutting concerns
• Store (Persistence): Relational, non-relational, in-memory, and/or message queue stores used for model persistence and loosely coupled messaging
• Automation (Deployment): Processes and tools for continuous deployment across development, build, test, staging, and production, providing high-velocity, high-quality release cycles
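
As a hedged sketch of the SDK and Models components above, the client class below wraps the public HTTP API and deserializes the on-the-wire model, so consuming applications never deal with raw HTTP or the service’s internal storage format. The base address, route, and Country model are assumptions made for illustration.

```csharp
// Hypothetical public SDK for a reference-data Microservice.
// Country is the on-the-wire model; the service's in-the-store format stays hidden.
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

public class Country
{
    public string Code { get; set; }
    public string Name { get; set; }
}

public class ReferenceDataClient
{
    private readonly HttpClient _http;

    public ReferenceDataClient(string baseAddress)
    {
        _http = new HttpClient { BaseAddress = new Uri(baseAddress) };
    }

    public async Task<IList<Country>> GetCountriesAsync()
    {
        var json = await _http.GetStringAsync("api/v1/countries");
        return JsonConvert.DeserializeObject<List<Country>>(json);
    }
}

// Usage (assumed endpoint):
//   var client = new ReferenceDataClient("https://refdata.example.com/");
//   var countries = await client.GetCountriesAsync();
```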

Microservice Deployment Scenarios

A benefit of Microservices is that they provide the flexibility to scale individual components based on performance profiles exhibited at runtime. They can also be composed to deliver just the right set of functionality for new user experiences that target current and emerging form factors. Today’s cloud platforms such as AWS and Azure provide Virtual Machine based mechanisms for deploying Microservices. Each cloud platform vendor provides various tools, APIs, and patterns for defining virtual machine configurations. For the sake of this discussion, we will use Microsoft Azure as the cloud platform and assume our Microservices were developed using ASP.NET Web API. The Microservices will be deployed in the same manner one would deploy a web site.

Azure provides the ability to define Resource Groups and deploy web sites (Microservices) into these Resource Groups. All the web sites in a Resource Group share the resources assigned to that Resource Group. Azure then lets you define Web Hosting Plans. Each Web Hosting Plan is associated with a Resource Group and defines the size of the virtual machines, the metric that will be used for elasticity, the number of instances currently running, and the maximum number to scale to when the scale metric triggers the need for a new instance.


Using this model, we could choose to configure our Microservices as follows:

  • The Reference Data and Customer Microservices are deployed to Resource Group A using Web Hosting Plan A and therefore run on the same VM instances
  • The Order Microservice is deployed to Resource Group B using Web Hosting Plan B, which defines a farm of at least four VMs


Alternatively, each Microservice can be deployed independently, into its own Resource Group with its own Web Hosting Plan, sized appropriately for the amount of load expected and scaled according to the metric defined in the Web Hosting Plan.


The key point here is that cloud platforms provide great flexibility in how each Microservice is deployed and scaled. Designing load-testing scenarios that emulate real-world usage of the overall system will inform your initial deployment configurations. Leveraging real-time monitoring and analytics over time will provide the data needed to tune the deployment configurations for optimal efficiency.

There is an emerging product called Docker that provides an even more granular approach to deploying Microservices. Docker is an open-source project that provides the ability to easily create lightweight, portable, self-sufficient containers for any application. The same container that a developer builds and tests on a laptop can run at scale, in production, on VMs, bare metal, OpenStack clusters, and public clouds. This capability is available today in Azure and AWS with support for Linux-based applications; Windows Container support is coming in early 2015. Using a container approach, one could deploy multiple Microservice containers to individual VMs to maximize utilization of each VM, helping to provide very flexible cost containment.


Summary

Modern Applications are designed, architected, and developed using modern frameworks, patterns, practices, and methodologies. They are beautiful in their user experience and are built on a foundation of scalable, resilient services. Microservice Architecture is an approach to delivering highly scalable, cross-platform RESTful APIs that can be developed, deployed, and scaled independently of one another, giving the Modern Application product team the greatest amount of flexibility and control.

BlueMetal recommends Lean Engineering, Responsive Design, and Microservice Architecture along with Cloud Platforms to deliver Modern Applications at high velocity. If you are interested in learning more about Microservices and how they can be applied to your next development project, please contact me at bobf@bluemetal.com or reach out to the Sales Directors in your area.

In Boston: Matt Bienfang, Vice President, Managing Director of Sales

mattbien@bluemetal.com

617.800.0332

In Chicago: Jay Mahlendorf, Client Director

jaym@bluemetal.com

312.219.9943

In New York: Brendan Fitzgerald, Client Director

bfitz@bluemetal.com

215.850.6569
