Posted by Bob Familiar On August 20, 2015
The Internet of Things is transforming business and entire industries as companies shift to creating products that are smart and connected. The IoT wave is here now, enabled by low-cost hardware, pervasive connectivity, and established cloud services that collapse implementation effort.
Please join your peers and IoT thought leaders at this invitation-only, half-day, in-person event. BlueMetal’s IoT innovators will share real-world experience and discuss:
- Ingesting large volumes of device and sensor telemetry (see the sketch after this list)
- Leveraging the Azure IoT Suite to transform and stage telemetry for alerts, notifications, real-time status and big data analytics
- Adopting BI dashboards to identify trends, provide real-time predictive analysis, and deliver enhancements
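As a point of reference for the first topic, here is a minimal sketch of pushing device telemetry into an Azure Event Hub (a common ingestion endpoint behind the Azure IoT Suite) using the azure-eventhub Python SDK. The connection string, hub name, and device payloads are placeholder assumptions for illustration, not details from the event.

```python
import json
from azure.eventhub import EventHubProducerClient, EventData

# Placeholder connection details -- substitute your own Event Hub settings.
CONNECTION_STR = "<event-hub-namespace-connection-string>"
EVENTHUB_NAME = "<event-hub-name>"

def send_telemetry(readings):
    """Send a batch of device/sensor readings to the Event Hub."""
    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONNECTION_STR, eventhub_name=EVENTHUB_NAME)
    with producer:
        batch = producer.create_batch()
        for reading in readings:
            # Each reading is serialized as a small JSON telemetry message.
            batch.add(EventData(json.dumps(reading)))
        producer.send_batch(batch)

if __name__ == "__main__":
    send_telemetry([
        {"deviceId": "sensor-01", "temperature": 21.5},
        {"deviceId": "sensor-02", "temperature": 22.1},
    ])
```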
Register for the Cambridge Event
Register for the New York Event
Register for the Chicago Event
Posted by Bob Familiar On January 5, 2015
This post is part one of a two-part series that delves into an emerging approach to modern application architecture called Microservices, in which applications are composed of autonomous services that are independently deployed, scaled, and managed. This approach to service architecture, combined with the benefits of cloud platforms, provides the scalable, resilient, cross-platform foundation necessary for Modern Applications. In part one I will provide an overview of Microservices along with their benefits, a logical architecture, and deployment scenarios. In part two I will detail the design and implementation of RefM, a Microservice that provides application reference data.
The software development landscape has changed dramatically over the past decade. Disruptive technologies and design approaches have introduced entirely new types of applications and methods for building them. As Mikhail Shir of BlueMetal writes ‘…The Modern Application is user centric. It enables users to interact with information and people anywhere on any device. It scales resiliently and adapts to its environment. It is designed, architected, and developed using modern frameworks, patterns and methodologies. It is beautiful in its user experience as well as its technical implementation…’ Alongside these new user experiences comes the need to connect to and interact with a variety of online services that provide information and transactions in a scalable, resilient, cross-platform way.
The concept of distributed services is not new. Since the early days of object-oriented programming, the idea that one could expose ‘objects’ across a distributed network using RPC mechanisms and message queues, with location transparency, has been the holy grail of software engineering. CORBA and DCOM were early attempts to provide a language- and OS-agnostic approach to distributed computing, but they came with a heavy burden of complexity.
The Internet revolution brought about the evolution, and in many ways the simplification, of distributed computing with the introduction of web service protocols such as SOAP and REST. There has been much back and forth among proponents of Service Oriented Architecture over which protocol should rule the day. Without rehashing those battles, suffice it to say that REST has become the primary choice today for defining APIs to cloud-hosted services. The key to applying REST is to understand that its CRUD-style API design is focused not on the underlying physical store, i.e. the database, but on the resources being accessed. As such, it is a good choice for API design and keeps the overall approach simple and straightforward.
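To make the resource orientation concrete, here is a minimal sketch of a CRUD-style REST API for a hypothetical reference-data resource (country codes), written with Flask. The entity names and in-memory store are illustrative assumptions, not the actual RefM design covered in part two; the point is that the routes are expressed in terms of the resource, not the database behind it.

```python
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

# Illustrative in-memory store; a real service would sit behind a repository,
# but the API is defined by the resource, not the storage engine.
countries = {"US": {"code": "US", "name": "United States"},
             "FR": {"code": "FR", "name": "France"}}

@app.route("/countries", methods=["GET"])
def list_countries():
    # Read the collection resource.
    return jsonify(list(countries.values()))

@app.route("/countries/<code>", methods=["GET"])
def get_country(code):
    # Read a single resource addressed by its URI.
    country = countries.get(code.upper())
    if country is None:
        abort(404)
    return jsonify(country)

@app.route("/countries", methods=["POST"])
def create_country():
    # Create a new resource from the request body.
    country = request.get_json()
    countries[country["code"].upper()] = country
    return jsonify(country), 201

@app.route("/countries/<code>", methods=["DELETE"])
def delete_country(code):
    # Remove the resource.
    countries.pop(code.upper(), None)
    return "", 204

if __name__ == "__main__":
    app.run()
```

Consumers address /countries/US as a resource; whether the data lives in SQL, a document store, or an in-memory cache is an implementation detail hidden behind the URI.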
Another important factor impacting how we think about distributed computing today is the emergence of commercial cloud platforms such as Amazon’s AWS and Microsoft’s Azure. These platforms provide pay-as-you-go access to compute and storage, easy access to a suite of common application services such as SQL and NoSQL databases, in-memory cache, and performance analytics, and they lend themselves to automating the development, test, staging, and production environments, providing the foundation for Continuous Delivery.
Posted by Bob Familiar On November 11, 2014
Lean Engineering defines a set of principles that guide the creation and deployment of software products at high velocity with low risk. A Lean Engineering approach lowers the risk of validating new technology, making incremental changes to process, and bringing new products to market, while achieving a high-quality result at a faster rate.
Every discipline requires a set of principles or assertions to build upon. As disciples of the practice of software engineering, we must define a clear set of unwavering principles that guide the process, methodology, and architecture of the products we create.