What if you could process and analyze all your data instantly? Learn about a new solution to address your data challenges with a unique modular design that scales easily and economically for businesses of any size.
Imagine, for a moment, the possibilities that would open up. In retail, you would be able to deliver the right offer to the right person—based on that person’s entire purchasing history—right at the point of transaction. Or, in manufacturing, real-time supply-and-demand matching could become a reality. For finance, consider dynamic portfolio risk estimation with up-to-the-minute market information. For the vast majority of organizations, becoming a business that can act in real time on insights derived from all their data is still a vision. For some, though, it is a journey they have already begun.
One way companies have started this journey is by moving their data environments to in-memory computing—to accelerate analytics and use more of their data. SAP, Oracle, Microsoft and other leading software vendors have been delivering in-memory databases for several years. SAP HANA, Oracle Database In-Memory and SQL Server with in-memory capabilities are all examples of this new class of database, and customer adoption is maturing as companies realize the value of processing data at the speed of memory.
Still, a large portion of data goes unused for analytics—in the range of 60 to 70 percent(1). The database layer alone can’t address these new data demands, with volume, velocity and variety at levels never before seen. What is required is a fundamental shift in compute: a new memory-centric infrastructure paradigm.
Going beyond Memory-Driven Computing
At HPE, we have been leading this innovation with Memory-Driven Computing, a revolutionary architecture that addresses the data dilemma with the promise of previously unobtainable scalability and flexibility. To truly harness the full value of in-memory computing—while addressing today’s giant data sets and tomorrow’s growth—the underlying infrastructure can’t be made up of conventional systems.
With data growing at an exponential—yet unpredictable—pace, the infrastructure supporting in-memory databases must be powerful enough to cope with very large data sets and flexible enough to grow with the demands of the business. In addition, for companies adopting in-memory databases, converging transactions and analytics for real-time insights becomes realistic. But this comes with a challenge: your critical transactional workloads demand a reliable platform delivering the highest levels of uptime.
Now announcing HPE Superdome Flex
To help you in this journey, I am excited to announce HPE Superdome Flex, the industry’s most scalable and modular in-memory computing platform, built to address your data challenges today and tomorrow. A major milestone in the Memory-Driven Computing innovation roadmap, this new platform will help you stay ahead of your competitors as you turn critical data into real-time business insights.
HPE Superdome Flex combines the proven mission-critical reliability of HPE Integrity Superdome X with the world-class scalable technology acquired from SGI to deliver an unprecedented combination of scale, modularity, flexibility and reliability, so you can turn those insights into action, and action into success—with peace of mind that your business will be always on.
The unique modularity of Superdome Flex allows you to start small, at 4 sockets, and grow at your own pace—without sacrificing performance. Scaling seamlessly up to 32 sockets, in 4-socket “building blocks,” it gives you the compute power you need, no matter how much data you have or how fast it is growing. The scalable modular architecture helps you avoid overprovisioning and disruptive upgrades, with all the cost and complexity those carry.
With the 4-socket modular building block you can scale up or out, and even convert a scale-out configuration to a scale-up platform, or vice versa. Each building block can be configured to match your specific workload requirements, with a wide variety of options for memory capacity, processor and core count, and amount and type of I/O. Superdome Flex can be configured with as little as 768 GB of memory and can grow to 48 TB—and even more once larger DIMM sizes come to market. You can rest easy, knowing that you have plenty of room for growth.
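To make the scaling math concrete, here is a minimal sketch of how the building-block model works. The block counts and capacities come from the figures above (4-socket blocks, 4 to 32 sockets, 768 GB to 48 TB); the helper function itself is purely illustrative and assumes each block is populated to its maximum memory configuration—it is not an HPE sizing tool.

```python
import math

# System envelope from the figures above: 4-socket building blocks,
# from 1 block (4 sockets) to 8 blocks (32 sockets), up to 48 TB total.
MIN_BLOCKS = 1
MAX_BLOCKS = 8
MAX_MEMORY_GB = 48 * 1024                           # 48 TB system maximum
MEMORY_PER_BLOCK_GB = MAX_MEMORY_GB // MAX_BLOCKS   # 6 TB per fully populated block

def blocks_needed(dataset_gb: int) -> int:
    """Estimate how many 4-socket blocks would hold an in-memory data set.

    Illustrative assumption: every block carries its maximum 6 TB of memory.
    """
    blocks = max(MIN_BLOCKS, math.ceil(dataset_gb / MEMORY_PER_BLOCK_GB))
    if blocks > MAX_BLOCKS:
        raise ValueError("data set exceeds a single 48 TB system")
    return blocks

# Example: a 20 TB in-memory database fits in 4 building blocks (16 sockets).
print(blocks_needed(20 * 1024))  # 4
```

The point of the model is the incremental growth path: capacity is added one 4-socket block at a time, rather than by replacing the system.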
Superdome Flex is designed for mission-critical availability and delivers proven Superdome reliability capabilities—not present in other standard x86 servers—to safeguard your most critical workloads and ensure they are continuously available. These include the Error Analysis Engine, which provides best-in-class predictive fault handling and initiates self-repair without operator assistance.
We also have a unique “Firmware First” approach to help you contain errors at the firmware level before any interruption can occur at the OS layer. And we deliver advanced and unique resiliency capabilities across every subsystem—memory, I/O, processor and fabric—for prompt error detection and system self-healing.
On top of all the platform capabilities, HPE’s broad range of consumption models and HPE Pointnext services, including Flexible Capacity, together with our partnerships and expertise, give you a complete solution for your critical data applications—whether you need a smaller 4-socket environment, a 32-socket powerhouse, or something in between.
With solutions based on Superdome Flex, you can be confident in your ability to keep up with the unprecedented data flows in your digital core—the set of processes, applications and data that are most critical to your business—while delivering business continuity and agility to respond quickly and efficiently to business change.
Learn more about Superdome Flex
For more details on how you can get the compute power you need to leave your limits behind, visit www.hpe.com/superdome
And stay tuned for future posts as we share more details on Superdome Flex solutions during HPE Discover Madrid.
(1)“Global Business Technographics Data and Analytics Survey,” Forrester Research, Inc., 2016