Sears’ MetaScale partners with Hortonworks

PRNewswire

Hoffman Estates-based MetaScale, the IT managed services subsidiary of Sears Holdings Corp., said it is partnering with Hortonworks, a commercial vendor of Apache Hadoop.

The partnership will focus on case-specific, end-to-end solutions that use Apache Hadoop to reduce performance bottlenecks associated with data extraction. That could include easing mainframe capacity constraints and costs, as well as lowering overall business intelligence and data warehousing costs, the company said in a release.

As a result of the new partnership, MetaScale is positioned to help customers successfully deploy Apache Hadoop through the Hortonworks Data Platform (HDP), a highly scalable, open source, production-grade platform that stores, processes and analyzes massive volumes of data. Powered by Apache Hadoop, HDP features MapReduce, HDFS, HCatalog, Pig, Hive, HBase, Ambari and ZooKeeper — some of Apache Hadoop’s most popular and essential projects.

“We have witnessed time and again that traditional enterprises face numerous challenges when trying to move from an Apache Hadoop experiment to a production application with business value and payback. The partnership between MetaScale and Hortonworks addresses those challenges by bringing together HDP and MetaScale’s ability to design, build, host, integrate, and operate production-grade solutions on customer premises or on a private cloud,” said Krishna Nimmagadda, head of marketing and business development at MetaScale.

“We have successfully completed multiple Apache Hadoop initiatives that enable our customers to reduce cost and complexity. The exciting part is the underlying big data framework that comes with those initiatives, which often leads to new business models and revenue-enhancing opportunities,” she added.

Mitch Ferguson, Hortonworks vice president of business development, added that the partnership with MetaScale “offers organizations an easy-to-use and easy-to-deploy solution for developing and running Apache Hadoop-based applications in production.

“The combination of simplicity of deployment and expert skills that this creates will accelerate Apache Hadoop projects for organizations looking to exploit big data,” Ferguson added.
