Hitachi Vantara has announced a series of hardware and software enhancements to its Hitachi Unified Compute Platform [UCP] family of converged, hyperconverged and rack-scale systems. On the hardware side, the company announced support for NVMe caching and for the new VSP [Virtual Storage Platform] systems, and introduced Hitachi’s first GPU servers. On the software application side, Hitachi Vantara announced new SAP HANA certifications and new Pentaho tiering for SAP data. Non-SAP enhancements include Oracle Enterprise Data Warehouse Optimization, and new Big Data reference architectures with Cloudera and MongoDB.
“Our UCP portfolio is very important to Hitachi Vantara,” said Chris Gugger, Director, Infrastructure Solutions Marketing at Hitachi Vantara. “When we sell a UCP system, we drive a lot of hardware and software with it, like the Hitachi Content Platform. UCP’s CI, HCI and rack-scale solutions also leverage a lot of our own IP – like our VSP. UCP is also important for its foundational capability.”
“From the application lens, our UCP CI and HCI are an opportunity for us to build out more development and engineering and solutions on a platform that can support Mode 1 [traditional] data centre apps, and next-gen Mode 2 apps like SAP HANA,” said Paula Phipps, Director, Solutions Marketing at Hitachi Vantara. “Our UCP systems are the perfect platform to add more integrations and more services, and get up to the application level to help customers become digital businesses.”
The hardware enhancements include new Intel Optane NVMe SSD caching in the Hitachi DS120 server.
“This is our first NVMe caching support, and will ensure both high performance and low latency transaction processing,” Gugger said. “NVMe caching is very hot right now. We have also added support for the new VSP storage models that we introduced last month, and which have been showing good momentum.” The VSP systems provide faster, more efficient performance and container support, utilizing cross-platform AI-powered analytics and IT automation software.
“This GPU server is a complete net-new,” Gugger indicated. “It is well suited for VDI, industries with high graphics workloads, and analytics workloads. We think this will appeal to channel partners for these reasons.”
New converged servers include the Hitachi DS220 and DS240.
“These new 2U servers for scale-up and scale-out will provide more server options,” Gugger said.
UCP rack-scale announcements include new Intel Skylake processor support on the RS VF120F, and support for the new VMware vSAN 6.7 and VCF 2.3 releases.
Hitachi Unified Compute Platform Advisor, the Hitachi UCP IT management and orchestration software, also has been enhanced.
“The automation has been enhanced with policy-based provisioning, and the addition of converged orchestration management of advanced storage replication,” Gugger indicated.
It is likely no coincidence that the announcement was made while the SAP SAPPHIRE event was underway in Orlando, because the SAP portion of the new applications ecosystem solutions is critical.
“We are announcing brand new SAP HANA appliance and TDI [Tailored Data Center Integration] certifications,” Phipps said. “The new appliance certifications scale up to 8 CPUs and 12TB and are ideally suited for SAP Business Suite 4 HANA mission-critical enterprise operations. The TDI certifications scale to 64 nodes per frame and feature new Hitachi VSP flash storage systems.
“We have also integrated with the SAP HANA Cockpit centralized management console,” Phipps added. This adds resource forecasting analytics and threshold alerts to the Hitachi Server and Storage Adapters for the SAP HANA Cockpit.
“The SAP component is critically important to the announcement because SAP has drawn a line in the sand by announcing that customers must move to HANA by 2025,” Phipps stressed. “In this business, for the data centre, that’s right around the corner. SAP has redesigned its entire business suite from top to bottom to work with HANA. After 2025, none of their ERP or CRM modules will work on anything but HANA, so this becomes very critical for our install base.”
New tiering for SAP data comes via a new Pentaho Data Integration capability.
“We are using our Pentaho software and its data integration capabilities to take the less frequently used data from production and automate tiering to a certified Hadoop cluster,” Phipps indicated.
A new Hitachi Solution for Databases with Oracle Enterprise Data Warehouse optimization offloads warm and cold data to a data lake consisting of a certified MongoDB cluster running on Hitachi UCP RS systems.
“This provides support for a new Oracle EDW use case,” Phipps said. “We are also now using Pentaho to take that less used or unused data over to a MongoDB.” This can reduce software licensing and hardware acquisition costs while providing scale-out capability and supporting analytics on both Oracle EDW and MongoDB.
Finally, Hitachi is announcing new pre-tested, pre-validated infrastructure featuring certified Cloudera Enterprise Data Hub and MongoDB Enterprise compute clusters, for Big Data analytics.
“These new application analytics are really important for both our global and regional SI partners,” Phipps said. “Our channel partners really need to work with both traditional and next-gen data centre apps, and with these we have done the engineering and development work for them so they can modernize themselves for their customers. They need these next-gen application integrations in particular to stay relevant.”