This week we announced new versions of several products in our Hitachi Content Platform portfolio: HCP v8.0, our object storage platform; HDI v6.1, our cloud gateway; HCP Anywhere v3 for file sync and share; and general availability of Hitachi Content Intelligence for big data exploration and analytics, which we first announced in 4Q2016. In my last post I covered many of the new features and capabilities integrated across this portfolio, which Gartner and IDC recognize as the only offering that lets organizations bring together object storage, file sync and share, and cloud storage gateways to create a tightly integrated, truly secure, simple and smart cloud storage solution.
One benefit I failed to mention is what this portfolio provides for the DevOps process. The agility and quality of these products are themselves a good example of the DevOps process used by the HCP development and QA teams. Hitachi Content Platform, recognized by the industry for cloud computing excellence, is also one of the tools in our own DevOps tool chain in Waltham, where we develop the Hitachi Content Platform portfolio.
Recently there have been articles about the difficulty of orchestrating the DevOps tool chain. A DevOps tool chain is a set of tools that aids in the development, delivery, and management of applications throughout the software development lifecycle. While DevOps has streamlined application development compared with the old "waterfall" approach, DevOps tool chains are often built from discrete and sometimes disconnected tools, making it difficult to see where the bottlenecks are in the application delivery pipeline. Many of these tools are excellent at their intended function but may not apply all the disciplines needed for enterprise data management.
HCP's main benefit for DevOps is its high availability, which insulates downstream test automation tools from software upgrades, hardware maintenance, and hardware failures, and likewise insulates them from availability issues in upstream tools. We use Jenkins for continuous integration; if Jenkins goes down or is being upgraded, the downstream test tools don't notice or care, since they fetch builds from the always-online HCP.
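As a minimal sketch of this pattern, a CI job can publish each build to an HCP namespace over an HTTP REST gateway so that downstream consumers depend only on HCP, never on the Jenkins server. The host name, namespace path, and key layout below are hypothetical placeholders, and a real deployment would add authentication headers:

```python
# Sketch: publishing a CI build artifact to an HCP namespace over HTTP.
# The namespace URL and path layout are hypothetical -- substitute your
# own tenant values and add authentication for a real deployment.
import urllib.request

HCP_NAMESPACE_URL = "https://builds.devops.hcp.example.com/rest"  # hypothetical


def artifact_url(product, branch, build_number, filename):
    """Derive a predictable object path so downstream test tools can fetch
    builds without knowing anything about the CI server that produced them."""
    return f"{HCP_NAMESPACE_URL}/{product}/{branch}/{build_number}/{filename}"


def publish_artifact(local_path, url):
    """PUT one artifact into the namespace; the CI server can go offline
    afterwards and downstream automation still reads the build from HCP."""
    with open(local_path, "rb") as f:
        req = urllib.request.Request(url, data=f.read(), method="PUT")
        urllib.request.urlopen(req)  # add auth/TLS handling in practice
```

The point of the predictable key scheme is that test automation can construct the same URL independently, keeping the build producer and consumers decoupled.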
HCP’s Metadata Query Engine (MQE) abstracts away where artifacts are located, and how they are named, within a namespace. As long as the objects are indexed, MQE will find them and present them to the client, regardless of object name and path. Even further downstream, after the automated tests are run, we again take advantage of HCP by storing the test results and logs on it (preferably in a separate namespace from the build artifacts). HCP’s security and encryption features ensure a secure enterprise environment, which is not always available with DevOps tools. DevOps is about automation, and HCP can automate space management by using retention and disposition to "age out" and delete old logs and builds, or tier them elsewhere for long-term storage (such as an HCP S storage node or a public cloud). HCP also provides an automated backup solution, using its replication feature to get copies of the backups off-site for disaster recovery. HCP Anywhere and HDI add further value in keeping a distributed environment secure and available.
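To illustrate the MQE lookup described above, a client can POST a query and let HCP return matching objects wherever they live. The endpoint URL and the exact query-body shape here are assumptions for illustration, not the definitive MQE request format — check the MQE documentation for your HCP release:

```python
# Sketch: finding indexed build artifacts by query rather than by exact
# name/path. The endpoint and JSON body shape are assumptions -- consult
# your HCP release's Metadata Query Engine documentation.
import json
import urllib.request

MQE_URL = "https://devops.hcp.example.com/query"  # hypothetical endpoint


def build_query(expression, max_results=100):
    """Construct a query body as JSON; MQE matches on indexed metadata,
    so callers never need to know an object's name or path up front."""
    return json.dumps({"object": {"query": expression, "count": max_results}})


def find_artifacts(expression):
    """POST the query and return the decoded result set."""
    req = urllib.request.Request(
        MQE_URL,
        data=build_query(expression).encode(),
        headers={"Content-Type": "application/json",
                 "Accept": "application/json"},
        method="POST",
    )
    return json.load(urllib.request.urlopen(req))
```

A test harness could then ask for, say, every gzipped build tagged with a given version, without hard-coding where the CI job stored it.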
There is no doubt that DevOps has contributed to the speed and agility of HCP development. In return, using HCP in the DevOps tool chain has made that development more secure and available, and has facilitated the quality integration of features and products across the HCP portfolio.
Enrico Signoretti, Head of Product Strategy at OpenIO, in a March 2017 Gigaom report called Sector Roadmap: Object storage for enterprise capacity-driven workloads, wrote the following: “The HCP (Hitachi Content Platform) is one of the most successful enterprise object storage platforms in the market. It has more than 1700 customers, with an average cluster capacity between 200 and 300TB. … Alongside the hardware ecosystem, HDI (remote NAS gateway) and HCP Anywhere (Sync & Share) stand out for the quality of their integration and feature set.”
If you already have an HCP or are planning to implement one, consider including DevOps as another tenant in your multitenant HCP. For reference, you can download this white paper, written by our HCP developers, on how they use HCP as a continuous integration build artifact storage system for DevOps.