Game-changing innovation for AI and big data from IBM Storage
Artificial intelligence (AI) is already a world-changing driver of business today and will increasingly be so in the future. IDC predicts that, by 2019, 40 percent of digital transformation (DX) initiatives will use AI services; by 2021, 75 percent of commercial enterprise apps will use AI, over 90 percent of consumers will interact with customer support bots, and over 50 percent of new industrial robots will leverage AI.
Ranked the #1 AI market revenue leader from 2016 to 2018 by IDC, IBM is announcing both new solutions and significant innovations to existing systems designed to help enterprises of all types and sizes prepare for the rewards of leveraging AI and the next generation of big data analytics.
Big data analytics solutions search for insights hidden in massive data sets. AI applications take this to a new level, learning and becoming more effective as they process more and more information. The potential rewards are substantial: better and faster fraud detection and data security, deeper understanding of complex market trends, greater manufacturing efficiency and accuracy, and more responsive and intelligent customer service, among many other benefits. But these new solutions place enormous strain on traditional IT infrastructure. Enterprises that cannot cost-efficiently transform legacy systems to handle, move, store and process these massive data streams may not thrive in such a business climate.
The simplest way to deploy our fastest, most scalable file system for AI and big data workloads
IBM has established an accelerated cadence of innovation and product development focused on providing market-leading solutions for the requirements of AI-driven organizations. The critical area of emphasis is the management and storage of unstructured data. Announced today, IBM Elastic Storage System 3000 (ESS 3000) is the latest in our line of high-performance, highly flexible scale-out file system solutions engineered to handle the toughest unstructured data challenges. ESS 3000 integrates into existing infrastructures to unify data and provide a foundation on which to build scalable, flexible AI data pipelines. ESS 3000 core technologies are used as the storage for the Summit and Sierra supercomputers at Oak Ridge National Laboratory and Lawrence Livermore National Laboratory, currently the two fastest supercomputers on the planet. The all-new ESS 3000 is now our simplest way to deploy the same world-class storage that powers them, enabling you to have a small (or big) slice of supercomputer storage for your own AI and big data workloads.
ESS 3000 leverages the ultra-low latency and massive throughput advantages offered by Non-Volatile Memory Express (NVMe) flash storage. The integrated scale-out data management solution is designed to be one of the simplest ways to deploy world-class file system technology. You can start with a single NVMe-powered node and grow capacity, performance, and resilience — as needed — with performance that scales nearly linearly with capacity. ESS 3000 utilizes containerized IBM Spectrum Scale for simplified management and upgrades. The efficient 2U building blocks start at under 25TB, with 40GB/sec performance and exabyte scalability.
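To make the near-linear scaling claim concrete, here is a back-of-the-envelope sketch in Python. It simply multiplies the published per-building-block figures (roughly 25TB and 40GB/sec per 2U node) by a node count, with an assumed scaling-efficiency factor that is purely illustrative; actual results depend on workload, network fabric, and configuration.

```python
# Illustrative projection only: applies the per-node figures stated above
# with a hypothetical scaling-efficiency factor. Not a sizing tool.

def cluster_estimate(nodes: int,
                     tb_per_node: float = 25.0,
                     gbps_per_node: float = 40.0,
                     scaling_efficiency: float = 0.95) -> dict:
    """Rough capacity/throughput estimate for a scale-out cluster."""
    return {
        "capacity_tb": nodes * tb_per_node,
        "throughput_gbps": nodes * gbps_per_node * scaling_efficiency,
    }

# A four-node cluster under these assumptions:
print(cluster_estimate(4))
# -> {'capacity_tb': 100.0, 'throughput_gbps': 152.0}
```

The point of the exercise is that capacity and throughput grow together as building blocks are added, rather than throughput plateauing behind a fixed controller.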
ESS 3000 is based on an NVMe-powered IBM Storage platform. It’s a preconfigured, tuned solution in a single box to accelerate time to value. The containerized installation and software upgrades help to minimize demands on IT staff while helping to maximize operational efficiency. IBM Spectrum Scale then optimizes data protection and system reliability.
“Flexible, easily scalable file system storage is a priority for many of our customers,” notes Raj Randhawa, CEO of IBM Business Partner Horizon Computer Solutions. “With the introduction of many new AI and big data analytics applications, our customers are being inundated with unstructured data. We need to be able to furnish solutions to them that can address this challenge, plus start small and then easily grow as large as necessary. The new IBM ESS 3000 will help meet these requirements very well, with NVMe performance and multicloud capabilities. We’re very pleased to see the announcement of this powerful new solution from IBM Storage.”
Expanding an open data ecosystem for AI and analytics
Along with the new ESS 3000 solution, IBM is also announcing a number of enhancements to existing systems and product lines. Unstructured data is the fuel powering AI and big data applications. It’s a key asset produced by business operations and research activities around the world. But gaining value from this asset requires technologies that unify data insights. It also requires technology designed for openness, interoperability, and the management and optimization of data up to exabyte scale. That technology is IBM Spectrum Discover.
IBM Spectrum Discover serves the important role of classifying and labeling data with custom metadata. This not only makes it easier to find and recall data for analysis but can actually increase the value of data by imbuing it with additional semantics and meaning.
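The value of custom metadata is easiest to see with a small sketch. The following Python example is not the IBM Spectrum Discover API; it is a minimal, hypothetical illustration of the underlying idea, attaching key/value tags to records so they can later be found by what they mean rather than by where they live.

```python
# Toy metadata catalog -- a hypothetical sketch of the concept, not an
# actual Spectrum Discover interface. Tags are plain key/value pairs.

catalog: dict[str, dict] = {}  # path -> custom metadata tags

def tag(path: str, **metadata) -> None:
    """Attach custom metadata (e.g. project, sensitivity) to a record."""
    catalog.setdefault(path, {}).update(metadata)

def find(**criteria) -> list[str]:
    """Return paths whose tags match every given criterion."""
    return [p for p, tags in catalog.items()
            if all(tags.get(k) == v for k, v in criteria.items())]

tag("/data/trial_042.csv", project="oncology", sensitivity="phi")
tag("/data/sensor_log.bin", project="factory", sensitivity="internal")

print(find(project="oncology"))  # -> ['/data/trial_042.csv']
```

Once data carries semantic tags like these, a researcher can recall every file from a given project or sensitivity class in one query, which is the discoverability gain described above.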
Consistent with our vision to provide insight into all unstructured data wherever it resides, IBM Spectrum Discover is expanding its support for data sources with a new data connector to IBM Spectrum Protect. The new connector enables IBM Spectrum Discover to harvest and extend metadata for records stored in backup and data protection systems managed by IBM Spectrum Protect. It provides exceptional insight, discoverability, and access to files in backup environments, making it possible to restore specific data to more accessible tiers of storage when it is needed for analytics or other purposes.
IBM is also announcing the IBM Spectrum Discover Application Catalog—a new community-supported repository of open-source Action Agents that extend the functionality of IBM Spectrum Discover. Action Agents can help users support new and proprietary file formats, create and extend workflows with custom behaviors, and invoke actions in external applications, among many other tasks. The catalog will help IBM Spectrum Discover users search for available extensions and install them from a command line interface (CLI). IBM has already seeded the IBM Spectrum Discover Application Catalog with extended attribute extractors for IBM Spectrum Scale and IBM Cloud Object Storage, and a sample agent as a starting point for developing your own.
“The latest enhancements to Spectrum Discover demonstrate to our clients that IBM Storage is making substantial investments and commitments to this new data tool,” adds Luca Tani, Storage Brand Manager of IBM Business Partner Var Group S.p.a. “This strengthens our conversations with our clients about how we can help them move toward AI. Spectrum Discover has a lot to offer.”
Other announcements in storage for AI and big data
IBM Spectrum Scale, the software-defined storage engine that powers IBM ESS 3000 and the data management complement to IBM Spectrum Discover, has not been standing still over the past months. IBM is announcing more enhancements, starting with a new IBM Spectrum Scale Erasure Code Edition that is designed to enable deployment on storage-rich commodity servers for greater flexibility and cost efficiency. The new edition offers all the capabilities of IBM Spectrum Scale Data Management Edition — scalability, performance, manageability, hybrid cloud connectivity and container support — plus the storage efficiency of powerful IBM-distributed erasure coding. It’s a superb choice for commodity server-based production environments requiring durability, reliability and performance. IBM is also releasing a no-charge IBM Spectrum Scale Developer Edition for educational and lab prototype environments. It provides full product functionality with no time limitation for capacities up to 12TB.
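For readers unfamiliar with erasure coding, the principle can be shown with a toy single-parity example: store k data blocks plus computed parity so that any one lost block can be rebuilt from the survivors. IBM Spectrum Scale Erasure Code Edition uses far more sophisticated distributed codes that tolerate multiple failures; this Python sketch only conveys the basic idea.

```python
# Toy illustration of erasure coding via XOR parity (RAID-5-style).
# Not IBM's implementation -- a minimal demonstration of the principle.

def xor_parity(blocks: list[bytes]) -> bytes:
    """XOR equal-length blocks together to produce a parity block."""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

def rebuild(surviving: list[bytes], parity: bytes) -> bytes:
    """Recover one lost data block from the surviving blocks plus parity."""
    return xor_parity(surviving + [parity])

data = [b"AAAA", b"BBBB", b"CCCC"]  # three data blocks on three servers
parity = xor_parity(data)           # parity stored on a fourth server

# The server holding data[1] fails; its block is reconstructed:
recovered = rebuild([data[0], data[2]], parity)
assert recovered == b"BBBB"
```

The storage-efficiency appeal is that parity adds a fraction of the raw capacity (here one extra block for three), whereas full replication would double or triple it.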
IBM Spectrum Scale added support for the industry-standard Container Storage Interface (CSI) across all IBM Spectrum Scale configurations and deployment models. This latest IBM Spectrum Scale version also provides improved SMB protocol fidelity for Apple clients, support for select IBM FlashSystem models with thin provisioning enabled, automated monitoring and reporting of an even wider range of system data, and seamless scaling of the monitoring capability to hundreds of nodes. There is also a new pre-tested AI/analytics blueprint that simplifies deployment of storage solutions optimized for AI and big data analytics workloads.
A new version of the IBM Elastic Storage Server (ESS) architecture demonstrates that this powerful storage platform is continuing to innovate within its own framework. This new version incorporates all the latest IBM Spectrum Scale upgrades, plus adds support for Red Hat Enterprise Linux 7.7 and support for deploying new ESS 3000 nodes alongside existing IBM Elastic Storage Server solutions within a single cluster.
IBM is moving forward with innovations and solutions that help your organization get the most from AI and big data analytics. AI has the power to transform business. Partner with IBM and let it transform yours.
IDC: Worldwide Storage for Cognitive/AI Workloads Forecast, 2018–2022, Doc #US43707918, April 9, 2018
IDC: Worldwide Artificial Intelligence Market Shares, 2018: Steady Growth – POCs Poised to Enter Full Blown Production, Doc #US45334719, July 2019
The post Game-changing innovation for AI and big data from IBM Storage appeared first on IBM IT Infrastructure Blog.