Veristorm Launches vStorm Enterprise: Brings Hadoop to Mainframe for Secure, Affordable Analytics

Veristorm today announced the release of vStorm Enterprise, the first Hadoop-based Big Data platform to run natively on the leading mainframe servers. With drag-and-drop access to mainframe databases and files, and the first commercial Hadoop distribution for Linux on the mainframe, the vStorm Enterprise platform enables access to, and analytics of, both sensitive mainframe data and other big data sources in a secure, scalable environment.

“Mainframes process 60% of all transactions, so for our users the most valuable use cases for Big Data processing and analytics will leverage mainframe data,” said Sanjay Mazumder, Founder and CEO of Veristorm. “Governance, technical challenges, and skills shortages around COBOL make offloading complex and expensive. We have solved this problem by moving the workload to the infrastructure processing the data.”

Len Santalucia, CTO of Vicom Infinity, an IBM Premier Business Partner, said, "Our customers value our ability to bring them innovative solutions. The response to Veristorm’s Hadoop solution for Linux on mainframe servers has been tremendous because it allows users to gain more value from their investment by providing significant insights into their mainframe data. Veristorm is a strategic component that helps eliminate the mainframe quarantine and distributed server sprawl."

Secure Access to Essential Data

The vStorm Enterprise platform enables analytics on mainframe logs and customer data without delay or security risk. It can copy tables from database applications and files directly into Hadoop on Linux running on the mainframe. Additionally, the virtual appliance can connect to external x86 environments, avoiding the complexity of COBOL copybooks as well as the security risks, compliance challenges, and delays of offloading data.
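To illustrate the kind of work a data connector like this performs (this is not Veristorm's code, and the record layout below is hypothetical), here is a minimal Python sketch that decodes one fixed-width EBCDIC record, of the sort a COBOL copybook would describe, into ordinary values ready for Hadoop ingestion:

```python
import codecs

# Hypothetical record layout, as a COBOL copybook might define it:
#   CUST-ID    PIC X(6)
#   CUST-NAME  PIC X(20)
#   BALANCE    PIC 9(7)V99  (implied decimal point)
LAYOUT = [("cust_id", 6), ("cust_name", 20), ("balance", 9)]

def decode_record(raw: bytes) -> dict:
    """Decode one fixed-width EBCDIC record into a dict of Python values."""
    text = codecs.decode(raw, "cp037")  # EBCDIC (US) to Unicode
    record, offset = {}, 0
    for name, width in LAYOUT:
        record[name] = text[offset:offset + width].strip()
        offset += width
    # Re-insert the implied decimal point from the V99 picture clause.
    record["balance"] = int(record["balance"]) / 100
    return record

# A sample record, encoded the way it might sit in a mainframe flat file.
sample = codecs.encode("C00042John Smith          000123456", "cp037")
print(decode_record(sample))
```

In practice the connector would stream many such records and handle packed-decimal (COMP-3) fields as well, but the core task is the same: apply the copybook layout, translate the character set, and emit rows Hadoop tools can consume.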

Familiar and Scalable

The zDoop software is the first commercial distribution of Hadoop for Linux running on mainframe servers. With support for the leading enterprise Linux distributions, zDoop creates a supported and familiar environment that bridges the mainframe and distributed talent pools. It reduces processing and analytics costs by shifting work to dedicated Linux processors and from a metered to a fixed software cost model. By leveraging zDoop on the mainframe’s scalable, on-demand processing capacity, customers gain a scalable private cloud for Big Data without additional footprint or deployment delays.
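Because zDoop is a standard Hadoop distribution, existing MapReduce jobs should run unchanged on Linux on the mainframe. As an illustration only (again, not Veristorm's code), here is a minimal word count in the Hadoop Streaming style, run here as plain Python to show the map and reduce phases such a job would execute:

```python
from itertools import groupby
from operator import itemgetter

def mapper(lines):
    """Map phase: emit (word, 1) pairs, as a streaming mapper would over stdin."""
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    """Reduce phase: sum counts per word; Hadoop delivers pairs grouped by key."""
    for word, group in groupby(sorted(pairs), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

if __name__ == "__main__":
    # Hypothetical transaction-log lines standing in for mainframe data.
    logs = ["payment accepted", "payment declined", "payment accepted"]
    print(dict(reducer(mapper(logs))))
```

On a real cluster the same mapper and reducer would be submitted via `hadoop jar hadoop-streaming.jar`, with Hadoop handling the sort, shuffle, and distribution across the Linux processors.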

About Veristorm

Veristorm helps enterprises take full advantage of their entire dataset, including their most sensitive enterprise data, for analytics solutions that leverage big data, reduce the cost of development, and ensure security. Through Veristorm’s virtual appliance data connector technology and zDoop, the industry’s first commercial Hadoop distribution for Linux on the mainframe, we’re bringing the best of mainframe and distributed technologies to bear on classic problems and new opportunities. Veristorm is an IBM Business Partner. For more information, visit www.veristorm.com.

Web: www.veristorm.com
Twitter: www.twitter.com/veristorminc


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
