Hu Yoshida

Hitachi Data Systems – A Blueprint for Building an Effective Data Center

Blog Post created by Hu Yoshida on Jun 14, 2017

Twenty-five years ago, when I returned to California after living in Japan, I was able to realize my dream of building my own custom home. At that time, home builders in California were buying up farmland and throwing up houses as fast as they could. I wanted something different: something with the quality and attention to detail that I was used to in Japan, but on a larger scale. I was fortunate to find an architect and contractor who shared my vision for quality. Building a house is a very manual process, with opportunities for human error at every step. While other contractors hired low-cost itinerant subcontractors who moved from site to site to do the framing or shingle the roof, our contractor kept the same subcontractors for each job to ensure quality execution and follow-through.


[Image: house plans]


My colleague in Infrastructure Solutions, Tony Huynh, believes that architecting and building an effective data center is akin to how a proven home builder creates a blueprint for a rock-solid home that customers buy with confidence in its long-term value. Both the IT architect and the home builder have similar qualities: solid engineering prowess, efficient execution, and a predictable, high-quality result. Conversely, a shoddy home builder can put something together quickly (and even more cheaply), but the results will reflect that effort. Tony contributed the following post to explain the benefits of using Hitachi Automation Director in building an effective data center.


The approach to implementing a modern data center should be looked at in its totality. This is especially true when you're considering deploying flash technology for your critical tier 0 and tier 1 applications. In addition to the core hardware "bits," an assessment should be made of the robustness of the underlying OS, as well as the complementary software that makes the solution complete and whole, like a solid brick house.


Intellectual property and engineering prowess have always mattered, now more than ever.

For our Hitachi VSP F all-flash and VSP G hybrid arrays, we offer Hitachi Automation Director, our data center automation software that helps reduce manual processes, in some cases by 90%, by using templates for automatic provisioning and other resource-intensive tasks. Reducing manual processes significantly lowers the probability of human error, and we all know the financial, brand, and customer impact of a single keystroke error.


Today, we have a large Hitachi Automation Director customer that previously spent more than 23 hours manually provisioning storage for their AIX servers, and they did this 100+ times a month (that's not a typo). With Automation Director, they have now reduced the same provisioning process to less than 50 minutes.



23 hours x ~100 times a month = ~2,300 manual hours. HIGH PROBABILITY OF HUMAN ERRORS



<50 minutes x ~100 times a month = ~83 hours. SIGNIFICANT REDUCTION IN HUMAN ERRORS

Imagine what can be accomplished with the additional ~2,200 hours of IT resources freed up per month.
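The arithmetic above can be checked in a few lines. This is a back-of-envelope sketch using the figures from the customer example (23 hours per manual run, under 50 minutes automated, roughly 100 runs a month); actual savings will vary by workload.

```python
# Back-of-envelope check of the provisioning-time savings described above.
runs_per_month = 100            # ~100 provisioning runs per month
manual_hours_per_run = 23       # manual provisioning time per run (hours)
automated_minutes_per_run = 50  # automated provisioning time per run (minutes)

manual_total = manual_hours_per_run * runs_per_month                # 2,300 hours
automated_total = automated_minutes_per_run * runs_per_month / 60   # ~83 hours
hours_freed = manual_total - automated_total                        # ~2,200 hours

print(f"Manual: {manual_total} h/month, automated: {automated_total:.0f} h/month, "
      f"freed up: {hours_freed:.0f} h/month")
```

Running this confirms the roughly 2,200 hours per month quoted above.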



We are constantly investing in features that help customers optimize their VSP F and VSP G flash deployments. Introducing Hitachi Automation Director 8.5.1, with automatic flash pool optimization.


This new feature reduces the manual processes associated with increasing flash pool sizes, which previously took seven steps.


With HAD 8.5.1, this can now be collapsed into two steps (see figure below). That's a roughly 70% reduction in manual steps, and the benefits compound, since HAD will automatically increase the pool size without further storage admin or user intervention.




When it comes to choosing a solution for critical flash workloads, it's important to look at the entirety of the solution. Your VSP flash deployments can reach their maximum effectiveness with Hitachi Automation Director by reducing manual provisioning processes by up to 90%. This directly and significantly lowers the risk of human error and redirects those IT resources to other strategic projects.