Hewlett Packard Enterprise has come a big step closer to launching a computer called The Machine that it has been talking about, researching, and developing since 2014. On Tuesday, it announced that it has a prototype of this computer, designed specifically for the big data era.
It uses a new kind of memory to store and instantly analyze mind-boggling, potentially near-limitless amounts of data. The prototype HPE is showing off today contains 160 terabytes (TB) of memory, which is enough to store and work with every book in the Library of Congress five times over, the company says.
But this new kind of memory can expand far beyond that. HPE expects to be able to build a machine that reaches up to 4,096 yottabytes … which is big enough to hold 250,000 times all the data currently stored in the world. For instance, The Machine can crunch through “every digital health record of every person on earth; every piece of data from Facebook; every trip of Google’s autonomous vehicles; and every data set from space exploration all at the same time,” HPE CEO Meg Whitman wrote in a blog post.
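The "250,000 times" figure roughly checks out as back-of-envelope arithmetic, assuming (our assumption, not HPE's) that all data stored in the world at the time totaled about 16 zettabytes, a common industry estimate for 2017:

```python
# Back-of-envelope check of HPE's claim, using decimal (SI) units.
YB = 10**24  # one yottabyte, in bytes
ZB = 10**21  # one zettabyte, in bytes

capacity = 4096 * YB   # HPE's projected memory ceiling
world_data = 16 * ZB   # assumed global data total, circa 2017 (our estimate)

print(capacity // world_data)  # 256000 -- in line with "250,000 times"
```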
She calls The Machine HPE’s moonshot device.
“No computer on Earth can manipulate that much data in a single place at once. And this is just our prototype,” she wrote.
Not only has the company invented a new kind of memory to build this computer; it is also breaking from its long-standing partnership with Microsoft by building a new operating system, based on Linux, to run it. And it is using ARM chips as the main processors, rather than Intel chips.
Here are the technical specs for this prototype:
- 160 TB of shared memory spread across 40 physical nodes, interconnected using a high-performance fabric protocol
- An optimized Linux-based operating system (OS) running on ThunderX2, Cavium’s flagship second-generation, dual-socket-capable, ARMv8-A workload-optimized System on a Chip
- Photonics/optical communication links, including the new X1 photonics module, online and operational
- Software programming tools designed to take advantage of abundant persistent memory
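The last item hints at the programming model this architecture encourages: treating a large pool of persistent, byte-addressable memory as directly load/store-accessible rather than going through file I/O. A minimal sketch of that idea, using ordinary POSIX memory mapping as a stand-in for real persistent memory (the file path and sizes are illustrative, not anything from HPE's toolchain):

```python
import mmap
import os
import tempfile

SIZE = 4096
path = os.path.join(tempfile.gettempdir(), "pmem_demo.bin")

# Create a fixed-size backing file. On real persistent-memory hardware
# this would instead be a region of byte-addressable NVM (e.g. a DAX mount).
with open(path, "wb") as f:
    f.truncate(SIZE)

# Map the region into the address space and update it with ordinary
# in-memory operations -- no read()/write() system calls on the hot path.
with open(path, "r+b") as f:
    mm = mmap.mmap(f.fileno(), SIZE)
    mm[0:5] = b"hello"
    mm.flush()   # on persistent memory, this is where the data becomes durable
    mm.close()

# The data now outlives the process that wrote it.
with open(path, "rb") as f:
    print(f.read(5))  # b'hello'
```

The point of "abundant persistent memory" is that the flush-to-durability step becomes cheap and fine-grained, so data structures can live permanently in memory instead of being serialized to storage.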