IBM's future Big Data plan could generate more data in one day than the entire Internet currently carries.
IBM and ASTRON, the Netherlands Institute for Radio Astronomy, have announced a five-year collaboration to research fast, efficient and low-power exascale computing for the world's largest radio telescope -- and to further research into space, matter and our origins.
The partners plan to research the 'exascale' computer systems required to run the Square Kilometre Array (SKA), the world's largest radio telescope.
The five-year, 32.9 million euro project is expected to culminate in the completion of the SKA in 2024. Once completed, the telescope will be used to explore evolving galaxies, dark matter and the origins of the universe -- looking back approximately 13 billion years.
It is estimated that the processing power required to operate the telescope will equal that of several million of today's fastest computers -- making this research essential if the radio telescope is to be usable at all.
Known as the DOME project, the effort will involve reading, analyzing and storing an exabyte of data every day -- twice the Internet's current daily traffic.
Big Data is a continual theme in industry and research departments -- but it requires extremely high-specification computer systems to collect and process such large streams of data. To tackle this problem, the partnership will aim to develop computing architectures and data transfer systems with capacities far beyond current technological levels.
Ton Engbersen of IBM Research said:
"This is Big Data Analytics to the extreme. With DOME we will embark on one of the most data-intensive science projects ever planned, which will eventually have much broader applications beyond radio astronomy research."
The SKA, currently in development, will have millions of antennas collecting radio signals from an area the size of the United States. It will be 50 times more sensitive than any previous radio instrument and able to survey the sky more than 10,000 times faster than today's instruments.
After processing the data it collects, the telescope is expected to store between 300 and 1,500 petabytes (314,572,800 to 1,572,864,000 GB) per day.
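The GB figures quoted above correspond to binary (1024-based) petabytes. A minimal sketch of that conversion, with the `petabytes_to_gigabytes` helper name being my own choice for illustration:

```python
def petabytes_to_gigabytes(pb: float) -> int:
    """Convert petabytes to gigabytes using binary (1024-based) units:
    1 PB = 1024 TB = 1024**2 GB."""
    return int(pb * 1024 ** 2)

# The article's daily storage range for the SKA:
low = petabytes_to_gigabytes(300)     # 314,572,800 GB
high = petabytes_to_gigabytes(1500)   # 1,572,864,000 GB
print(f"{low:,} to {high:,} GB per day")
```

Running this reproduces the article's range exactly, confirming the figures use binary rather than decimal (1000-based) prefixes.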
Image credit: NASA Goddard Space Flight Center
Apr 1, 2012
I submitted an abstract to use all the GPS time dilation data from cell towers, satellites, and so on... as a huge gravitational telescope... I want to take a picture of GOD.
It is said our reach should always exceed our grasp. Apparently we are in no danger of ever resting on our laurels. There is always a path further on to climb.
Great story, Charlie. To put it in perspective, though, while this amount of data boggles the mind in human terms, it is one grain of sand in a vast beach of data that the observable Universe has to offer!