GDT::Grid::Utilitarian::Archive::Year 2009

Grid Utilitarian
A 26 Billion Pixel Image
It's all relative, but I consider 26,031,250,000 a lot of pixels.

I tweeted the following on 2009.12.29 at 5:42am MST.

   297500 x 87500 pixel (26 gigapixel) image:

If you click the shortened hyperlink, be sure to click on the thumbnails to experience the zooming.
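For anyone checking the arithmetic, the pixel count works out exactly (a quick sketch in Python):

```python
# Image dimensions from the tweet above.
width, height = 297_500, 87_500
pixels = width * height
print(pixels)        # -> 26031250000
print(pixels / 1e9)  # -> 26.03125 (i.e., ~26 gigapixels)
```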

[30 December 2009, top]

ComputerWorld Says Exascale Computing By 2018
A ComputerWorld posting says we might have exascale computing by 2018.
   "The total capacity of the latest Top500 list of the most 
    powerful supercomputers, released at SC09, was 27.6 petaflops, 
    up from 22.6 petaflops in the previous list, released in June."

A 22% increase in six months isn't bad.

Source: "IT community await exascale computers"
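The 22% figure checks out against the quoted Top500 numbers (a quick sketch):

```python
# Aggregate Top500 capacity, in petaflops, from the quote above.
prev_pflops, curr_pflops = 22.6, 27.6
growth_pct = (curr_pflops - prev_pflops) / prev_pflops * 100
print(round(growth_pct, 1))  # -> 22.1
```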

[13 December 2009, top]

Cray Launches Exascale Research Initiative in Europe
One exaflops is one quintillion flops.
   "As part of a company-wide goal of reaching sustained exascale 
    performance by the end of the next decade, Cray Inc. announced 
    the launch of its Exascale Research Initiative. This research 
    initiative will explore new ideas and technologies for overcoming 
    the challenges of delivering a supercomputing system capable of 
    sustained application performance."

One exaflops is a million trillion flops.

   "Initially, Cray will assemble a team of researchers located at 
    EPCC, which is the supercomputing centre at The University of 
    Edinburgh, and at the Swiss National Supercomputing Centre (CSCS). 
    Cray will closely collaborate with these centers along with its 
    European software partners such as Allinea Software Ltd."

One exaflops is a billion billion flops.

Source: "Cray Launches Exascale Research Initiative in Europe"
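All three phrasings name the same number: a quintillion, a million trillion, and a billion billion are each 10^18.

```python
# 1 exaflops = 10^18 floating-point operations per second.
quintillion      = 10 ** 18
million_trillion = 10 ** 6 * 10 ** 12
billion_billion  = 10 ** 9 * 10 ** 9
assert quintillion == million_trillion == billion_billion
print(f"1 exaflops = {quintillion:,} flops")
```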

[02 December 2009, top]

Al Gore is Pro Supercomputing
The following was reported.
   "Supercomputers, he said, can be used to show the world how 
    climate change is affecting the earth in human terms. The 
    visualization capabilities of supercomputers can produce 
    a visceral reaction to the potential for catastrophe."

And they quote Al Gore as saying:

   "Supercomputing has given us the most powerful tool in 
    the history of civilization." --Al Gore

Source: "Al Gore says supercomputing can be killer app in climate change"

[30 November 2009, top]

Jaguar is World's Fastest Supercomputer
The computing roadmap is for 10 petaflops in early 2011; 20 petaflops in 2012; and 1000 petaflops in 2018-2020. On 17 November 2009 the following was reported.
   "Jaguar, which is located at the Department of Energy's 
    Oak Ridge Leadership Computing Facility and was upgraded 
    earlier this year, posted a 1.75 petaflop/s performance 
    speed running the Linpack benchmark. Jaguar roared ahead 
    with new processors bringing the theoretical peak capability 
    to 2.3 petaflop/s and nearly a quarter of a million cores."

Note: The world's three fastest supercomputers are in the United States, and two of them are in Tennessee. The 4th and 5th fastest systems are in Germany and China, respectively.

Source: "Jaguar Claws its Way to Number One, Leaving Reconfigured Roadrunner Behind in Newest TOP500 List of Fastest Supercomputer"
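From the quoted figures, Jaguar's Linpack run hit roughly 76% of its theoretical peak. The core count below uses the article's rough "quarter of a million", so the per-core number is only a ballpark:

```python
linpack_pflops = 1.75     # measured Linpack performance
peak_pflops    = 2.3      # theoretical peak
cores          = 250_000  # "nearly a quarter of a million" (approximate)

print(f"Linpack efficiency: {linpack_pflops / peak_pflops:.0%}")  # -> 76%
print(f"Peak per core: {peak_pflops * 1e15 / cores / 1e9:.1f} gigaflops")
```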

[30 November 2009, top]

Phones as Supercomputers
Eric Schmidt (CEO of Google) "knows" the future of computing. When Schmidt speaks, I listen.
   "A billion people on the planet are carrying supercomputers 
    in their hands. Now you think of them as mobile phones, but 
    that's not what they really are. They're video cameras. They're 
    GPS devices. They're powerful computers. They have powerful 
    screens. They can do many many different things, and oh, 
    by the way, you can talk on them too. That's what the 
    mobile phone of today is."--Eric Schmidt

The "location, location, location" edict of the 20th century is "mobile, mobile, mobile" in the 21st.

Another quote from Eric Schmidt...

   "It's [cloud computing] a bigger phenomena than, for example, 
    the PC industry, and probably the next big wave of computing."

Source: "Google CEO Imagines Era Of Mobile Supercomputers"

[28 October 2009, top]

Work Begins On FutureGrid
The Indiana University press release from 10 September 2009 started with the following.
   "A group of information technology researchers at Indiana 
    University has been chosen by the National Science Foundation 
    to lead a four-year, $15-million project to develop new software 
    to link together the supercomputers of tomorrow and enable new 
    approaches to scientific research for problems of massive scale. 
    $10.1 million will come from the NSF, with project partners 
    providing the balance."

The project is called FutureGrid, and the "network's processors will be located at IU, UC-San Diego, USC, Univ. of Chicago/Argonne National Labs, Univ. of Florida and Univ. of Texas at Austin."

Source: "IU to lead nationwide research network to expand supercomputer capabilities"

[11 September 2009, top]

KMA Buys 600+ Teraflops Cray
Korea Meteorological Administration (KMA) in South Korea is getting a 600 teraflops supercomputer from Cray.
   "Once completed, the Cray supercomputer at KMA will be the 
    largest integrated solution for operational numerical weather 
    prediction in the Asia Pacific region and one of the largest 
    in the world," said Peter Ungaro, Cray president and CEO.

The KMA must "identify the potential of severe weather systems in a more timely fashion" and this is why they need high-performance computing systems.

KMA's Cray will have a "multi-petabyte storage, archival and back-up system."

Source: "Cray Awarded Supercomputer Contract From the Korea Meteorological Administration Valued at More Than $40 Million"

[08 September 2009, top]

10 Petaflops Computing By Early 2011?
I posted this to my Facebook on 27 August 2009.
   Yesterday, one of the motivations I gave for learning 
   about numbers into the quadrillions was the fact that 
   the Roadrunner supercomputer can do 1.64 quadrillion 
   arithmetic calculations in one second. After yesterday's 
   classes, I came across an InfoWorld article about there 
   being a supercomputer in early 2011 that will be able to 
   perform 10 quadrillion calculations per second.

Source: "...aims for 10-petaflop supercomputer"

[27 August 2009, top]

Making Money in HPC is Not Easy
I believe it...
   "'How do you make a small fortune in high-performance computing?'
    There are several variations on the joke, but they all end with 
    the same punch line, 'Start with a large fortune and ship at least 
    one generation of product. You will be left with a small fortune.'"
    --Daniel Reed

And I agree with Reed that we're at an inflection point when it comes to HPC.

   "I believe we are at an inflection point, where new approaches 
    must both survive and flourish if we are to continue to deliver 
    higher performance in effective and reasonable ways."

Source: "Making a Small Fortune"

[15 August 2009, top]

Predicting Mother Nature is Hard
It remains tough to predict the mother of all mothers (i.e. Mother Nature).
   "Weather forecasting has been transformed by the advent of 
    Earth-observing satellites, leaps in computing power and 
    more advanced models of the atmosphere and oceans, but it 
    remains a business built on uncertainty."

In a nutshell: High-performance computing systems with high-performance visualization systems require high-performance algorithms and software. The algorithms and software are the bottlenecks.

Source: "...weather forecasting: Still uncertain despite leaps in technology"

[29 July 2009, top]

DARPA Says It Needs Extreme Supercomputing
From Government Computer News:
   "The Defense Department wants to take supercomputing to the next 
    level by funding the development of a new breed of supercomputers 
    that will be smarter and faster and yet smaller and require much 
    less power than today's massive machines."

The DoD is correct to think the following...

   "DOD officials believe such computers will be necessary to make 
    sense of the avalanche of data that will gush forth from 
    tomorrow's network-tethered sensor systems."

Network-tethered nanosensor systems...

Source: "DARPA investigates extreme supercomputing"

[09 July 2009, top]

From the Cray I to Roadrunner...
Great pictures!

From the Cray I to the Roadrunner...

Source: "...of the coolest and most powerful supercomputers of all time"

[11 June 2009, top]

Korea Meteorological Administration Picks Cray
Cray Inc. announced that the "Korea Meteorological Administration (KMA) has selected Cray as the preferred bidder for a multi-year contract to provide KMA with a next-generation supercomputer." Kudos to Cray!

The Cray press release stated the following.

   "Based in Seoul, Korea, KMA's mission is operational weather 
    forecasting and climate research for the benefit and welfare 
    of the Korean public and industry.  KMA will use the proposed 
    supercomputing products to provide more accurate weather forecasts 
    for the East-Asia Pacific region through increased model resolution, 
    new forecasting models, increased ensemble sizes and the 
    implementation of advanced data assimilation."

Being able to accurately predict Mother Nature can save lives.

[10 June 2009, top]

100 Petaflops by 2016?
I came across this while looking at the message board (which is usually nothing but noise).
   "If we calculate the performance predictions for the 
    TOP500 list, then we will see a 100 Petaflops system 
    most likely in the year 2016."


   "And hopefully Exascale Systems will be seen first in 2019."

Source: "...trends in High Performance Computing"
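Those dates are extrapolations of the TOP500 growth trend. Assuming performance grows by roughly 1.9x per year (my assumption based on the list's oft-cited historical trend, not a figure from the posting) and starting from Jaguar's 1.75 petaflops in 2009, a rough sketch lands in the same ballpark:

```python
import math

growth_per_year = 1.9                   # assumed TOP500 growth factor
start_year, start_pflops = 2009, 1.75   # Jaguar's Linpack number

def year_reached(target_pflops):
    """Year the trend line first reaches target_pflops."""
    years = math.log(target_pflops / start_pflops) / math.log(growth_per_year)
    return start_year + math.ceil(years)

print(year_reached(100))    # -> 2016 (100 petaflops)
print(year_reached(1000))   # -> 2019 (exascale)
```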

[05 June 2009, top]

Europe Doing Petaflops Computing
The sub-headline read: "Gauss Centre for Supercomputing gets the first European petaflop computer / Three new supercomputers for European research now in Julich."
   "The supercomputer JUGENE will secure Europe independent access 
    to a decisive key technology of the 21st century," said Prof. 
    Dr. Achim Bachem, Chairman of the Board of Directors of 
    Forschungszentrum Julich and Coordinator of the European 
    Supercomputing Alliance PRACE.

And Dr. Bachem isn't kidding...

Source: "Europe's Fastest Computer Unveiled in Julich"

[01 June 2009, top]

Supercomputers Require Supersoftware
Supercomputers keep getting more and more super, but the same cannot be said for the software that is executed by the supercomputers.
   "Science and engineering are advancing rapidly in part due 
    to ever more powerful computer simulations, yet the most 
    advanced supercomputers require programming skills that 
    all too few U.S. researchers possess. At the same time, 
    affordable computers and committed national programs 
    outside the U.S. are eroding American competitiveness 
    in a number of simulation-driven fields."

Source: "...In Computer Simulation Superiority"

[06 May 2009, top]

Univ. of Texas Scores Big HPC Win Over ASU
Wow... I've been asleep at the wheel.

On 4 May 2009, I learned that Dan Stanzione, Director of ASU's High Performance Computing Initiative (HPCI), is leaving ASU to become the Deputy Director of the Texas Advanced Computing Center (TACC) at the University of Texas at Austin. Kudos to Dan Stanzione. Texas is a true 21st century state when it comes to CSTEM and I am confident Stanzione is going to do great things in Texas.

Stanzione's split from ASU took me by surprise, but he is one of this world's leading computer scientists, and from an HPC perspective Arizona is a third-world state compared to Texas. I hope Stanzione maintains his connections with TGen and the Biodesign Institute at ASU.

Source: "Texas Advanced Computing Center Announces Dan C. Stanzione Jr. as Deputy Director"

[06 May 2009, top]

I Hope Rackable Regrets SGI Purchase
Wow... from the "too bad" department... SGI lives!

Source: "Penguin CEO on the SGI/Rackable deal"

[03 May 2009, top]

I Gave a HPC Lecture At SCC
On 1 April 2009, I gave a lecture at Scottsdale Community College on the topic of high-performance computing in the 21st century. About ten people attended the lecture.

GDT::AzGrid::20 Petaflops by 02012

[05 April 2009, top]

Cloud Computing is Happening
Cloud computing is happening.
   "One of the fastest-growing Web industries, cloud computing lets 
    you rent additional storage space and processing power over the 
    Internet without your IT guy having to wheel more million-dollar 
    machines into the server room. You pay as you go, hooking up as 
    many servers as you need for as long as you need them."

However, some are leery...

   "In geekspeak, you 'reach into the cloud' - provided you're 
    prepared to take the risk of leaving your data there."
    --Cindy Waxer

Source: "Cloud computing: Supercomputers for hire"

[29 March 2009, top]

Illinois Must Support the NCSA
The NCSA helps put Illinois on the high-tech map. The NCSA helps make Illinois an important state in the 21st century, but that can all change if Illinois turns its back on the NCSA.
   "When the University of Illinois won a $208 million federal grant 
    to build the nation's fastest supercomputer, Gov. Blagojevich 
    said the state would kick in money for a building to house it."

   "But 17 months later, the school is wondering if it will ever 
    see the money."

Let's hope Illinois politicians continue to support the NCSA.

Source: "...NCSA be stiffed on state funds to house Blue Waters?"

[03 March 2009, top]

Pentagon Buys Five Supercomputers
Government Computer News reported that the "Defense Department's High Performance Computing Modernization Program has awarded a $40 million contract to purchase five high-performance computers and support services for its research and development centers."
   "To achieve breakthroughs in designing and testing 
    next-generation materials and weapons systems, DOD 
    researchers require high-performance systems that 
    are scalable and reliable," said Cray Henry, 
    director of the HPCMP, in a statement. 

Hey Cray Henry... why didn't you buy Cray computers?

Source: "Pentagon purchases supercomputers"

[03 March 2009, top]

Coming in 2012 -- DOE's Sequoia
The Department of Energy's Sequoia supercomputer is scheduled to be installed at the Lawrence Livermore National Laboratory in 2012. The IBM Sequoia is going to be a 20-petaflop machine.
   "Every time you do predictive science, the next question is: 
    How confident are you in that prediction? It turns out that's 
    a very easy question to ask and a very profound question to 
    try to answer," said computer scientist Mark Seager of 
    Lawrence Livermore National Laboratory. "The way that 
    we do that is by running a whole bunch of simulations. 
     Instead of just one simulation, you do 4,000."

Source: "...See Your Petaflop and Raise You 19 More"
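Seager's 4,000-run ensemble has a simple statistical payoff: the uncertainty of an ensemble average shrinks with the square root of the number of runs (a textbook sketch, not LLNL's actual methodology):

```python
import math

# Standard error of an ensemble mean scales as 1/sqrt(N).
n_runs = 4_000
reduction = math.sqrt(n_runs)
print(f"Uncertainty is ~{reduction:.0f}x smaller than a single run")  # -> ~63x
```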

[03 February 2009, top]

Roland Piquepaille Died on 2009.01.05
Roland Piquepaille died on 5 January 2009. According to ZDNet, Roland "spent most of his career in software, mainly for high performance computing and visualization companies, working for example for Cray Research and Silicon Graphics. He left the corporate world in 2001 after 33 years immersed into it." Roland maintained a blog about "How new technologies are modifying our way of life."

Roland Piquepaille has been added to the GDT::DeadTeam.

Source: "Roland Piquepaille's Technology Trends"

[10 January 2009, top]

About the Grid Utilitarian
The Grid Utilitarian is a blog devoted to high-performance computing. This includes grid-based utility computing and 21st century Informatics. This blog was created on 3 October 2004 and it starts 2009 with 192 postings.

Grid Utilitarian Archives: 2008 | 2007 | 2006 | 2005 | 2004

[01 January 2009, top]

Creator: Gerald Thurman []
Last Modified: Saturday, 05-Jan-2013 11:17:33 MST

Thanks for Visiting