Cloud-scale analytics for the data centre

Combined offering delivers unified analytics and machine learning on a reliable platform.


Vertica has announced an integration with H3C ONEStor to bring the benefits of cloud-native analytics to enterprise data centres. Together, Vertica and H3C enable analytically driven companies to elastically scale capacity and performance as data volumes grow and machine learning initiatives become a business imperative – all from within hybrid environments.

“With this integration, data-driven leaders in the APAC region will benefit from a powerful combination of industry-leading platforms that accommodate any present and future strategic analytical and machine learning initiatives,” said Scott Richards, vice president and general manager of Vertica. “H3C has a solid presence in this region, enabling our joint customers to run Vertica’s cloud-optimised architecture with H3C’s ONEStor to meet the most demanding performance and financial requirements – from enterprise data centres or private clouds.”

Vertica with H3C ONEStor enables organisations to adopt cloud innovation for analytics wherever their data resides, even when the timing or cost of a full cloud migration makes it impractical. Combining the two technologies delivers fast analytics while simplifying data protection through straightforward backup and replication features. It also provides 99.9999999 percent (nine 9s) reliability for storing mission-critical data, bringing cloud technologies to on-premises deployments.
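The announcement doesn't spell out the integration mechanics, but Vertica's cloud-optimised architecture (Eon Mode) separates compute from a shared communal store on S3-compatible object storage, which is presumably where ONEStor fits in. As a minimal sketch, assuming an Eon Mode database already backed by an ONEStor bucket, the open-source vertica-python client can confirm where the communal storage lives; the host, credentials, and database name below are hypothetical placeholders.

```python
# A minimal sketch, assuming a Vertica Eon Mode database whose communal
# storage points at an S3-compatible ONEStor bucket. All connection
# details are hypothetical placeholders.
import vertica_python

conn_info = {
    "host": "vertica.example.internal",  # hypothetical host
    "port": 5433,
    "user": "dbadmin",
    "password": "changeme",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()
    # storage_locations lists every storage location the database knows
    # about; in Eon Mode the communal location is the shared object-store
    # path, as opposed to node-local depot and temp locations.
    cur.execute(
        "SELECT node_name, location_path, sharing_type "
        "FROM v_catalog.storage_locations "
        "ORDER BY sharing_type"
    )
    for node, path, sharing in cur.fetchall():
        print(f"{sharing:>10}  {node or '(all nodes)':<20}  {path}")
```

Because the data lives in the shared object store rather than on the compute nodes, capacity and compute can then be scaled on independent schedules.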

“We’re delighted to offer Vertica analytics and machine learning on top of H3C ONEStor. The analytical performance of Vertica combined with the data reliability of ONEStor offers ultra-large scale and capacity, unmatched analytical performance, and high data reliability,” said Yili Liu, VP of the Cloud and Intelligence Product Line at H3C.

The combined offering delivers high-performance analytics and machine learning with enterprise-grade object storage to enable organisations to:

•   Address scalability needs now and in the future – Elastically scale out to support terabytes to petabytes of data and thousands of users as analytical and machine learning needs grow.

•   Leverage the separation of compute and storage – Administrators can scale compute and storage resources independently to meet changing workload demands.

•   Simplify database operations – Built-in data-protection features make data loss extremely rare, delivering up to nine 9s of reliability.

•   Address all data consumer needs – Isolate analytical workloads to serve different data consumers – from business analysts to data scientists – without competing for resources, as the sketch below illustrates.
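In Vertica's architecture, workload isolation is typically achieved with subclusters: independent groups of compute nodes that share the same communal storage, so each consumer group gets dedicated compute over a single copy of the data. The sketch below, again using vertica-python with placeholder connection details, lists a database's subclusters and then trains a model with Vertica's in-database LINEAR_REG function, so the machine learning runs where the data lives. The sales table and its columns are hypothetical.

```python
# A sketch of workload isolation and in-database ML, assuming an Eon Mode
# database with subclusters already defined. All names are placeholders.
import vertica_python

conn_info = {
    "host": "ds-subcluster.example.internal",  # hypothetical: a node in the
    "port": 5433,                              # subcluster reserved for data science
    "user": "dbadmin",
    "password": "changeme",
    "database": "analytics",
}

with vertica_python.connect(**conn_info) as conn:
    cur = conn.cursor()

    # Each subcluster is an isolated pool of compute over the same
    # communal storage, so these workloads don't contend with others.
    cur.execute(
        "SELECT DISTINCT subcluster_name, is_primary FROM v_catalog.subclusters"
    )
    for name, is_primary in cur.fetchall():
        print(f"subcluster: {name} (primary: {is_primary})")

    # Train a model with Vertica's built-in ML: LINEAR_REG runs inside the
    # database, so no data leaves the cluster. The 'sales' table and its
    # columns are hypothetical.
    cur.execute(
        "SELECT LINEAR_REG('revenue_model', 'sales', "
        "'revenue', 'ad_spend, store_count')"
    )
    print(cur.fetchall())
```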
