The new machine has 160 terabytes (TB) of memory and has been designed to work with big data (very large data sets). It is a computing monster capable of analysing the equivalent of 160 million books at the same time, HP says. "Data today is growing faster than our ability to compute it," explained Jason Pelc, who leads the project's photonics work… HP says that by 2020, 100 billion connected devices will generate far more demand than current infrastructure can handle.
Designed to work on big data, it could analyse the equivalent of 160 million books at the same time, HPE said.
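As a rough sanity check of that equivalence (assuming a plain-text book of about 1 MB, an assumption not stated in the article), the figures are at least consistent:

\[
\frac{160\ \text{TB}}{160 \times 10^{6}\ \text{books}} = \frac{1.6 \times 10^{14}\ \text{bytes}}{1.6 \times 10^{8}\ \text{books}} = 10^{6}\ \text{bytes per book} \approx 1\ \text{MB per book}.
\]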
The device, called The Machine, had a Linux-based operating system and prioritised memory rather than processing power, the company said.
HPE said its Memory Driven Computing research project could eventually lead to a “near-limitless” memory pool.
“The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day,” said HPE boss Meg Whitman.
“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”
Prof Les Carr, of the University of Southampton, told the BBC that The Machine would be fast, but that big data faced other challenges.
“The ultimate way to speed things up is to make sure you have all the data present in your computer as close to the processing as possible so this is a different way of trying to speed things up,” he said.
“However, we need to make our processing… not just faster but more insightful and business relevant.”
“There are many areas in life where quicker is not necessarily better.”
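A minimal sketch of the locality effect Carr describes in the quote above (illustrative only and not from the article; the NumPy-based workload, array size, and file path are assumptions): it times summing numbers that are already resident in memory against summing the same numbers after reloading them from disk.

```python
# Illustrative sketch (not from the article): keeping data "as close to the
# processing as possible" avoids the cost of fetching it from slower storage.
import os
import tempfile
import time

import numpy as np

# Hypothetical workload: 50 million float64 values (~400 MB).
data = np.random.rand(50_000_000)

# Write the data once to a temporary file to stand in for disk-resident data.
path = os.path.join(tempfile.gettempdir(), "demo_values.npy")
np.save(path, data)

# In-memory pass: the array is already in RAM, so only the sum is timed.
t0 = time.perf_counter()
total_in_memory = data.sum()
t1 = time.perf_counter()

# Disk-bound pass: the array must be reloaded from the file before summing.
t2 = time.perf_counter()
total_from_disk = np.load(path).sum()
t3 = time.perf_counter()

print(f"in-memory sum: {t1 - t0:.3f} s")
print(f"disk + sum:    {t3 - t2:.3f} s")

os.remove(path)
```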
Source: http://www.bbc.com