GDT::Grid::Utilitarian::Archive::Year 2005

Grid Utilitarian
2005 End of Year Clean Up

I liked the following quote from Fran Berman, director of the San Diego Supercomputer Center.

   "It's not easy for you and I to buy an Indy 500 car 
    and to maintain that. That's where it's important 
    to have government and large-scale investment in 
    these kinds of computers. ... And a real concern 
    from the scientific community right now is that 
    (U.S.) leadership is really falling behind."

Let's hope the U.S. Government will pump more money into supercomputing during 2006.

[28 December 2005, top]

Supercomputing Enabling Great Science and Technology
Supercomputers are being used across numerous domains, and supercomputing is going to enable remarkable advances in science and technology.
   "Supercomputers have allowed us to check our models against 
    our understanding of spin's effect on a reaction, and our 
    models have been closely checked by experiment. The results 
    suggest that our understanding of electron behavior is sufficient 
    to create virtual models of molecules that we can then 'react' with 
    one another in simulations that accurately predict what will happen 
    when they meet in the physical world."

Purdue.edu:: Purdue scientists see biochemistry's future - with quantum physics

Understanding of the complex chemical reactions involved in combustion processes is non-trivial and supercomputers are coming to the rescue.

   "The calculation--performed using 1,400 parallel processors--took 
    only 23 hours to complete and achieved a sustained efficiency 
    of 75 percent, compared to the 5 to 10 percent efficiency of 
    most codes. For comparison, the best one-processor desktop 
    computer would have required three and a half years and 
    2.5 terabytes of memory to run the calculation."

ScienceDaily.com:: High-performance Computing May Improve Combustion Efficiency
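As a back-of-the-envelope check, the quoted runtimes imply an enormous parallel speedup. A rough sketch (assuming "three and a half years" means serial wall-clock time and a 365-day year):

```python
# Rough check of the quoted combustion-simulation numbers.
# Assumption: "three and a half years" is serial wall-clock time.
processors = 1400
parallel_hours = 23
serial_hours = 3.5 * 365 * 24            # about 30,660 hours on one processor

speedup = serial_hours / parallel_hours  # how many times faster the cluster ran
scaling = speedup / processors           # fraction of ideal linear speedup

print(round(speedup))      # -> 1333
print(round(scaling, 2))   # -> 0.95
```

Note that the 75 percent in the quote is sustained efficiency relative to peak FLOPS, which is a different metric from the parallel-scaling fraction computed here.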

[28 December 2005, top]

Hewlett-Packard Does Utility Computing
Hewlett-Packard announced it was expanding its Utility Computing offerings. For example, HP Flexible Computing Services permits companies, for a price, to quickly acquire additional IT resources as needed. At HP, they "see a new world of computing where application services execute on shared IT resources." HP's vision of our "new world of computing" is not new, but we are rapidly moving into this new world and it looks as though HP wants to remain a player.

HP.com:: HP Opens its Data Centers to the Public via Utility Computing Services

[Extra] The headline says it all about the power of supercomputing... {News.Yahoo.com:: IBM's World Grid Draws Members, Fights AIDS }

[28 December 2005, top]

Build It and They Will Use It (Maybe)
Getting a computing grid up and running is difficult on many fronts as evidenced by Sun Microsystems.
   "It has been harder than we anticipated," said 
    Aisling MacRunnels, Sun's senior director of 
    utility computing in an interview. "It has been 
    really hard. All of this has been a massive learning 
     experience for us as a company. I am not embarrassed to 
    say this because we have been on the leading edge."

ChannelRegister.co.uk:: Sun's grid: lights on, no customers

[Extra] I know all about wasted (i.e. unused) computing power. {GlobalTechnology.com:: Supercomputers wasted without trained users }

[28 December 2005, top]

Ohio Professor Receives National Computational Science Award
Capital University Professor Ignatios Vakalis has received the "Undergraduate Computational Engineering and Sciences (UCES) Award." The award announcement was made at the Supercomputing 2005 conference.
   "Vakalis serves as project director for the Keck consortium 
    and is Capital University's principal investigator for the 
    grant awarded from the W. M. Keck Foundation. The consortium 
    includes 12 institutions across the country: Capital University, 
    College of the Holy Cross, Harvey Mudd College, The Ohio State 
    University, Pomona College, San Diego State University, San Diego 
    Supercomputer Center, Shodor Education Foundation, Skidmore College, 
    University of Wisconsin - Eau-Claire, Wittenberg University and 
    Wofford College."

The Keck consortium has an interesting mix of schools.

   "The consortium serves as a model for institutions to collaborate 
    and develop computational science materials and curricula. It also 
    serves as a catalyst in forming a new consortium of institutions. 
    Consortium goals are to develop class-tested educational materials 
    for a variety of computational science courses, infuse computational 
    science curricula to member institutions, and prepare the next 
    generation of scientists so they are equipped with the necessary 
    computational tools."

OSC.edu:: Ohio Professor Receives National Computational Science Award

[28 December 2005, top]

Supercomputing Requires Lots Of Power (Energy)
I know I'm constantly annoyed with the batteries for my digital camera (and batteries in general). I don't know why we haven't seen better advances in battery technology. It appears that if advances are not made, then that could limit tomorrow's computing capabilities.

CNET News.com:: Power could cost more than servers, Google warns

[24 December 2005, top]

Crow Promotes ASU's Fulton HPC Initiative
Last week, on 2 December 2005, I posted a comment to Michael Crow's blog suggesting ASU hire a couple of guru-level Computer Scientists. I also lamented about how the state of Arizona is "off" the map when it comes to HPC.

Dr. Crow did not process my comment personally, but he did delegate that responsibility to ASU's UTO (University Technology Officer) Adrian Sannier. Sannier's reply ignored Computer Science in general, but focused on HPC.

    [2005.12.05]
    "An ASU bright spot in this area is the work of 
     Dan Stanzione and company. Funded by Mr. Ira Fulton, 
     the Fulton High Performance Computing Initiative not 
     only established a new high performance research 
     cluster here, but has provided a focal point for 
     cluster operations and program development support 
     here at ASU."

Sannier's reply contained a hyperlink to the Fulton High Performance Computing Initiative. I was already aware of the initiative, but I clicked the hyperlink, and my visit to the website prompted me to post the following reply to Sannier's reply.

   gthurman Says:
   December 8th, 2005 at 6:43 am

   Thank You for the response. I, too, am currently reading
   "The Singularity is Near." Providing the hyperlink to the
   Fulton High Performance Computing Initiative website is
   appreciated, but the current state of the website needs
   addressing. Examples: The last news item on the homepage
   is dated 12 July 2005. I know this is not true, but it gives
   the appearance that nothing has been happening the last half
   of 2005. The homepage doesn't even have a title object defined
   and web usability guru Jakob Nielsen claims the title is the 
   most important webpage object. If you go to the 'personnel' 
   webpage, 'Stanzione, Dan' is the only person with a hyperlink 
   and it doesn't work (i.e. it is suffering from linkrot).

As of 10 December 2005, my reply had not generated a reply. Note: I had to establish an account in order to post my reply to Sannier.net.

[10 December 2005, top]

I Posted an HPC Related Comment To Crow's Blog
Michael Crow, the president of Arizona State University, started a blog on 1 December 2005 and I posted the following HPC related comment to his blog.
   Gerald Thurman Says:
   December 2nd, 2005 at 5:52 am

   Welcome to the blogosphere.

   It would be excellent news if ASU were to add a couple 
   guru-level Computer Scientists to its Computer Science 
   programs. Although we are currently transitioning into 
   the next era of high-performance computing (HPC) and 21st 
   century Informatics, Arizona is "off" the HPC map. Arizona 
   will be helped in this area with Google and eBay coming to 
   the state, but Arizona's universities have access to grant 
   monies that can help turn this situation around. Note: I am 
   fully aware that most of our 20th century politicians have 
   been neglecting the funding for long-term Computer Science 
   related research projects. Again, maybe ASU and the state 
   of Arizona can help these politicians "see" our 
   computing futures.

As of 3 December 2005 there had been no comment to my comment.

MichaelCrow.net:: A New Page

[03 December 2005, top]

Bill Gates On Supercomputing, Software in Science, And More
When Bill Gates speaks, people listen, and in this interview Gates says a lot. It is a worthy read.

InformationWeek.com:: Q&A: Bill Gates On Supercomputing, Software in Science, And More

[26 November 2005, top]

Supercomputing Used In a Myriad of Applications
The following are some hyperlinks that the Grid::Utilitarian has been sitting on. Supercomputing is being employed in a myriad of real-world applications.

[19 November 2005, top]

SC|05 Gateway to Discovery
SC|05 is an international conference on high performance computing, networking and storage. The 2005 supercomputing event started on 12 November 2005 and runs until the 18th. This year it is being held in Seattle, Washington, and the keynote speaker will be Bill Gates. The ACM and IEEE Computer Society are the SC|05 conference series sponsors.

Supercomputing.org:: SC|05 Gateway to Discovery

[14 November 2005, top]

Is SGI Going Out of Business?
From a press release perspective, Silicon Graphics Inc. (SGI) had an impressive October of 2005. Yet, Wall Street has SGI's stock priced as if the company is going to go out of business. Scientists and researchers like SGI's products and I'm confused about what is going on with the company.

GDT::Computing::Bit:: Is SGI Going Out of Business?

[05 November 2005, top]

From ScienceDaily.com--Supersizing Supercomputers
When I need to give a nutshell description of the next era of computing, I simply reply: infinite computing power, infinite storage, infinite bandwidth with information presented using high-performance visualization systems. The next era of computing is going to be the great enabler of biotech, nanotech, robotics, and space travel. The title caught my attention: "supersizing the supercomputers."
	"These supercomputers of the future will provide orders of magnitude 
	 more computing power, but their increasing complexity also requires 
	 experts in computational science, mathematics and computer science 
	 working together to develop the software needed for the science."

Supercomputing enables real-time 21st century Informatics that in turn will enable scientists to make amazing discoveries.

ScienceDaily.com:: Supersizing The Supercomputers: What's Next?

[Extra] Speaking of real-time 21st century Informatics... Google, Inc. is a world leader in this area. Google's mission is to "index" all of the information in the world. {GDT::BAB:: Google Says It Needs 300 Years}

[22 October 2005, top]

Linux Clustering Gaining In Popularity
Running Linux-based clusters is becoming a popular way to get affordable high-performance computing.
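The cluster idea, reduced to its essence, is splitting one job across many workers. Here is a minimal sketch using Python's standard multiprocessing module, with local processes standing in for cluster nodes (on a real Linux cluster the workers would be separate machines reached via MPI or a batch scheduler):

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Compute one chunk of the overall sum of squares."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n = 1_000_000
    # Split the range into four chunks, one per worker "node".
    chunks = [(i, i + 250_000) for i in range(0, n, 250_000)]
    with Pool(4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    # The parallel result matches the serial one.
    assert total == sum(i * i for i in range(n))
    print(total)
```

The pattern (partition, compute in parallel, combine) is the same one a Beowulf-style cluster applies at much larger scale.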

I may try to attend the LCI HPC Revolution 2006 conference being held during the first week of May, 2006, in Norman, Oklahoma.

[15 October 2005, top]

Supercomputing Used To Predict Flow of Toxic Waters
The floodwaters generated by Hurricane Katrina contained "organic and chemical pollutants such as sewage and oil." Supercomputers were used to run simulations that in turn provided timely inputs for predicting the flow of the toxic waters and for controlling and eliminating them.
	"In the immediate wake of Hurricane Katrina, scientists and 
	 research centers from across the country came together to 
	 generate information on the contaminated floodwaters and 
	 offer it to hazardous materials experts and public 
	 health officials."

	"In a matter of hours, the University of North Carolina at 
	 Chapel Hill's Marine Sciences Program and Renaissance Computing 
	 Institute (RENCI), together with the National Center for 
	 Supercomputing Applications (NCSA), played a key role in 
	 that effort by providing rapid-response computing and 
	 modeling capability."

UNC.edu:: UNC computer, marine scientists collaborate to predict flow of toxic waters from Katrina

[Extra] Supercomputing collaboration between the University of North Carolina, Duke University and North Carolina State University. {RenCI.org:: The Renaissance Computing Institute }

[08 October 2005, top]

SGI Systems Help Predict Weather, Build Buildings
Last week I wrote a short editorial paragraph about how the supercomputing industry should be benefiting from hurricanes Katrina and Rita. More computer power can only help improve weather forecasting capabilities. Supercomputers can only help architects and builders construct stronger, smarter buildings that do a better job of handling Mother Nature's wrath.

SGI.com:: SGI Technology Powers Real-Time Hurricane Forecasts and Weather Information

[Extra] The 3 September 2005 posting was SiliconValley.com:: Supercomputers aid weather forecasting .

[01 October 2005, top]

Supercomputers Needed, Yet Cray and SGI are Penny Stocks
Hurricane Katrina hit the U.S. on 8/29/2005 followed by Hurricane Rita on 9/24/2005. Supercomputing can help with weather prediction and you would think that this would be good for companies that do supercomputing; however, the stocks of Cray and SGI are currently penny stocks as they keep falling toward $0 per share. Granted, large computer companies are providing stiff competition--for example, IBM, Sun Microsystems, Dell, Oracle, Hewlett-Packard-- but supercomputing is the next era of computing and Cray and SGI make good supercomputers.

[24 September 2005, top]

U.S. Continues to Fall Behind in HPC
This title caught my attention: "Perfect Storms, Competitiveness, and the 'Gretzky Rule'"

Q: What is the 'Gretzky Rule'?
A: "Skate to where the puck will be."

The following quotes are by Dr. Fran Berman, SDSC Director and HPC Endowed Chair, UCSD. [UCSD is the University of California at San Diego and SDSC is the San Diego Supercomputer Center.]

	"These days, competitiveness in high performance computing 
	 is commonly measured by ranking on the Top500 list."

The Top500 List is in a constant state of flux.

	"The current top spot on the list is occupied by Livermore's 
	 Blue Gene, however the emergence several years ago of the 
	 Japanese Earth Simulator (now at spot 4) provided a 
	 'wake up call' (Jack Dongarra from the University of 
	 Tennessee, Knoxville called it 'computenik' in the 
	 New York Times) to the U.S."

Being number one today doesn't guarantee anything for tomorrow.

	"To create an environment in which U.S. scientists and engineers 
	 are competitive involves developing an environment where the best, 
	 the brightest, and the most creative can work, and over the long 
	 periods of time required for fundamental advances."

The next era of computing is about infinite power, infinite storage, infinite bandwidth and high-performance visualization systems, but advances in these areas take money (i.e. funding).

	"For many of us in academia, the increasing competitiveness of 
	 our colleagues in Europe and Asia through committed funding programs 
	 and resources, the drop in support in the U.S. for research, education, 
	 and information infrastructure, and the increased outsourcing of 
	 technology innovation and service outside of the U.S. are creating 
	 a 'perfect storm' that will batter U.S. leadership and competitiveness 
	 not just now, but over the next generation."

TaborCommunications.com:: Perfect Storms, Competitiveness, and the 'Gretzky Rule'

[17 September 2005, top]

With Funding, Weather Prediction Will Only Get Better
Supercomputers cannot prevent hurricanes (and other acts of Mother Nature), but they can help predict when and where hurricanes are going to hit. Weather forecasters did an okay job predicting Hurricane Katrina's behavior.

According to the Mercury News the National Oceanic and Atmospheric Administration (NOAA) estimates that "10 percent of the industries that contribute to the gross domestic product in the U.S. are weather- and climate-sensitive."

Mercury News also reports that the National Weather Service has a "large installation of dozens of IBM p690 Regatta servers, all networked together into a supercomputing cluster with high-speed IBM communication switches, based on Intel chips. The cluster system is located in an IBM facility in Gaithersburg, Maryland."

Lastly, Mercury News tells us that "sales of meteorology systems were about $231 million in 2004, or about 3.2 percent of the total supercomputer market of $7.25 billion."
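Those dollar figures are easy to sanity-check. A quick sketch, using only the numbers quoted above:

```python
# Meteorology's share of the 2004 supercomputer market.
meteorology_sales = 231e6   # $231 million in meteorology-system sales
total_market = 7.25e9       # $7.25 billion total supercomputer market
share_percent = meteorology_sales / total_market * 100
print(round(share_percent, 1))   # -> 3.2, matching the quoted 3.2 percent
```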

SiliconValley.com:: Supercomputers aid weather forecasting

[03 September 2005, top]

Big Ben at the Pittsburgh Supercomputing Center
The Pittsburgh Supercomputing Center, a joint effort of Carnegie Mellon University and the University of Pittsburgh, is highly respected, and it is probably happy with its Cray XT3 high-performance computer system named Big Ben.
   "Acquired via a $9.7 million grant from the National 
    Science Foundation (NSF) in September 2004, Big Ben 
    - the first XT3 system to ship from Cray - comprises 
    2,090 processors with an overall peak performance of 
    10 teraflops: 10 trillion calculations per second."

The PSC (Pittsburgh Supercomputing Center) reports the following to help us realize how powerful Big Ben is.

    "If every person on Earth, about 6.5 billion people, 
     held a calculator and did one calculation per second, 
     they would altogether still be 1,500 times slower 
     than Big Ben."

PSC.edu:: Pittsburgh Unveils Big Ben the Supercomputer
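PSC's calculator comparison checks out. A quick sketch using the figures quoted above:

```python
# Big Ben's 10 teraflops versus all of humanity with calculators.
big_ben_per_sec = 10e12     # 10 trillion calculations per second
humanity_per_sec = 6.5e9    # 6.5 billion people, one calculation per second each
ratio = big_ben_per_sec / humanity_per_sec
print(round(ratio))   # -> 1538, in line with PSC's "1,500 times" figure
```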

[27 August 2005, top]

IDG Interviews Don Becker
The IDG News Service interviewed HPC guru Don Becker and here are some quotes from the interview. Note: Becker is the co-founder of the Beowulf clustering project.

On Linux...

	"Linux has fulfilled the promise of Unix, going 
	 from [running on] a wristwatch to the fastest 
	 [high-end] machine."

On grids...

	"Grid tools have been primarily developed on Linux, so 
	 that's their platform of choice."

	"We at Scyld define a cluster as something you can administer 
	 from a single point and where you can install applications so 
	 they're immediately available. A grid has separate administration 
	 in the domain part of its definitions and you're trying to work 
	 with people across a company and across the world."

On open source licenses...

	"I think there's room in the space for about 10, maybe 
	 not even that many."

	"The GPL (General Public License) is clearly not perfect, 
	 but like any major standard you deal with in the technical 
	 area, it's working well enough. It's not as intuitive as 
	 it needs to be."

InfoWorld.com:: Becker on Linux, clustering, grid

[20 August 2005, top]

Supercomputing and Linux Do Mix
Cluster World Magazine has merged into Linux Magazine. By itself this is not news, but it does indicate that in the next era of computing all computers will be supercomputers and Linux will be a key operating system.

Hardware.NewsForge.com:: Linux lays groundwork for world's top supercomputers

[30 July 2005, top]

USC Moves Up In the Supercomputer Rankings
Kudos to the University of Southern California for taking a lead in the world of supercomputing. According to the USC News, "USC's supercomputer has been ranked the nation's fourth most powerful computer system in an academic setting." USC's cluster can do 7.291 teraflops (or 7.291 trillion calculations per second).

USC.edu:: Supercomputer Rises in Rankings

[23 July 2005, top]

NCI Receives 2005 Bio-IT World Best Practices Award
Silicon Graphics, Inc. (SGI) played a key role in helping the National Cancer Institute (NCI) become one of the "six Grand Prize winners of Bio-IT World magazine's third annual Best Practices Awards." NCI's award was in the category of "Knowledge Management."

Yesterday (08 July 2005), SGI's stock made a new 52-week low and many on Wall Street feel SGI is heading for a Chapter 11 filing. SGI makes great products, but great products do not ensure corporate success.

The NCI uses a Silicon Graphics Prism visualization system powered by Intel® Itanium® 2 processors and running the Linux® operating environment.

SGI.com:: NCI Receives Bio-IT World Best Practices Award in Knowledge Management

[09 July 2005, top]

HPC Enables Computational Science
If America wants to stay a superpower, then it must be a leader in computational science. Size alone will not ensure America's sustainability in the world.

Computational science is 'critical' "because it enables investigation of extremely complicated phenomena and processes - such as nuclear fusion, folding of proteins, the atomic organization of nanoscale materials, and the global spread of disease - that other methods cannot characterize fully if at all."

	"Computational science - the use of advanced computing capabilities 
	 to understand and solve complex problems - is now critical to 
	 scientific leadership, economic competitiveness, and national 
	 security." -- John H. Marburger III, Science Advisor to President 
	 Bush and Director, Office of Science and Technology Policy

TaborCommunications.com:: Dear Mr. President: Computational Science 'Critical'

[25 June 2005, top]

Japan Building Supercomputer For 2010
Japan wants to develop a supercomputer that will be able to do a quadrillion calculations per second. The supercomputer will be developed by NEC, Hitachi Ltd., the University of Tokyo and Kyushu University.

What's one quadrillion?

   10^15 = 1,000,000,000,000,000 = 1 peta-unit

Reuters.com:: Japan eyes advanced supercomputer as early as 2010
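The metric prefixes for calculation rates can be sketched with a small helper (hypothetical, purely for illustration):

```python
# SI prefixes for calculation rates (FLOPS = floating-point ops per second).
PREFIXES = {
    "giga": 10**9,
    "tera": 10**12,   # today's top machines are measured in teraflops
    "peta": 10**15,   # Japan's 2010 target: one petaflops
}

def rate(value, prefix):
    """Convert e.g. (1, 'peta') to raw calculations per second."""
    return value * PREFIXES[prefix]

print(rate(1, "peta"))                     # -> 1000000000000000
print(rate(1, "peta") // rate(1, "tera"))  # -> 1000: a petaflops is 1000 teraflops
```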

[11 June 2005, top]

Iomega Using Nanotechnology to Increase Storage Capacity
Iomega has been around for 25 years and it looks like they have some nanotechnology in the works that will help us realize "infinite storage." On April 12, 2005, Iomega was awarded U.S. Patent No. 6,879,556 titled "Method and Apparatus for Optical Data Storage." The patent is the first in a series of nanotech-based "subwavelength optical data storage patents sought by Iomega. The patent covers a novel technique of encoding data on the surface of a DVD by using reflective nano-structures to encode data in a highly multi-level format."

Corporate-IR.net:: Iomega Corporation Announces Two New Patents in the Fields of Nano-Technology and Compatibility of Digital Devices

[30 May 2005, top]

Are SGI and Cray Going Out of Business?
Silicon Graphics and Cray Computer are two companies involved in the world of supercomputing. Recently Cray had its stock delisted from the NASDAQ and now SGI is at risk of having its stock delisted from the NYSE.

On 09 May 2005, SGI announced that Paradigm(TM), a provider of petroleum geoscience and engineering software to the oil and gas exploration and production industry worldwide, has purchased five SGI® Altix® systems.

On 10 May 2005, Cray announced that Nevada's Desert Research Institute (DRI) selected the Cray XD1(TM) supercomputer as a computational platform for carrying out environmental research that involves high-resolution modeling and complex forecasting. The system will "help investigators find solutions to pressing issues related to the earth's atmosphere, soil, hydrology and ecosystems."

[14 May 2005, top]

One Terabit Per Square Inch
This week's posting about Nanochip, Inc. was also this week's posting to the GDT::Nanotech::SmallBlog. Nanotechnology is enabling infinite storage capabilities. ["Deletion" is one of the more difficult aspects of data management; infinite storage can eliminate the need for "deletion."]

Arrays are important structures for storing data in computer programs. Arrays are also important biotech and nanotech objects.

Nanochip Inc. has developed prototype "arrays of atomic-force probes, tiny instruments used to read and write information at the molecular level." These arrays can record up to one trillion bits of data -- known as a terabit -- in a single square inch. "It is roughly equivalent to putting the contents of 25 DVDs on a chip the size of a postage stamp."
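The DVD comparison is straightforward to verify. A quick sketch (assuming a standard 4.7 GB single-layer DVD):

```python
# One terabit per square inch versus single-layer DVD capacity.
terabit_bytes = 1e12 / 8    # 1 terabit = 125 gigabytes
dvd_bytes = 4.7e9           # 4.7 GB single-layer DVD (assumption)
dvds_per_square_inch = terabit_bytes / dvd_bytes
print(round(dvds_per_square_inch))   # -> 27, close to the quoted "25 DVDs"
```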

[side-bar] As of 01 May 2005, the latest "News" on the Nanochip, Inc. Home Page is dated 08 March 2004.

Nanochip.com:: Home Page

[01 May 2005, top]

Arizona Universities To Use CENIC
It appears as though Arizona State University and the University of Arizona will connect to "next generation Internet" via CENIC. From the CENIC website:
   "CENIC is charged with designing, provisioning and operating 
    robust, high capacity, next generation Internet communications 
    services through a cohesive infrastructure for its associates 
    and affiliates."

At a recent CENIC presentation, emphasis was placed on the word robust.

High-capacity implies stuff like "One Gigabit or Bust."

There is a chance that the Maricopa Community Colleges will be able to use CENIC, thereby gaining a "robust, high capacity" Internet connection to spots in California and Nevada.

CENIC.org:: Corporation for Education Network Initiative in California

[23 April 2005, top]

135.5 Teraflops and Counting
The IBM Blue Gene/L supercomputer at Lawrence Livermore National Laboratory has reached a processing speed of 135.5 trillion flops. This supercomputer is still under construction and in the end it may hit 360 trillion flops. [A flop is a floating-point operation and a trillion flops is a teraflop.] {BBC.co.uk:: Faster Supercomputer Gets Faster }

[07 April 2005, top]

World's Largest Computing Grid Gets Larger
The Large Hadron Collider Computing Grid (LCG) is the world's largest international scientific grid with over 100 sites in 31 countries. The Large Hadron Collider (LHC) is being built at CERN near Geneva, Switzerland. The LHC is a "particle accelerator used to study the fundamental properties of sub-atomic particles." Although it is the world's largest computing grid, CERN's IT group says current processing capacity is "estimated to be just 5% of the long-term needs of the LHC." {CERN.ch:: World's Largest Computing Grid Surpasses 100 Sites }

[26 March 2005, top]

Supercomputing Changing the Battle Fields of War
Supercomputing and advanced visualization systems are playing an increasingly important role in how future wars will be conducted. The Battle Command Battle Lab at Fort Huachuca, AZ, is "evaluating ways to fuse intelligence data from multiple sources into a cohesive combat picture with help from an array of technologies, including new server and visualization systems from Silicon Graphics." {SGI.com:: Press Release }

[19 March 2005, top]

Using Screen Savers for Distributed Computing
Nature.com had an article that detailed a "British distributed computing experiment that harnessed the processing power of 90,000 PCs in 150 countries to simulate future climatological trends"; the experiment predicts that global warming is more serious than currently estimated. {CommonDreams.org:: Global Warming is 'Twice as Bad as Previously Thought' }

The aforementioned global warming prediction was accomplished by having people download a special screen saver onto their computers. When the screen saver kicks in, the computer goes to work doing some form of computing. This is becoming an increasingly popular technique for obtaining distributed computing power.
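At its core, the screen-saver model is a fetch/compute/report loop. A hypothetical sketch (the function names and the in-memory "server" are made up for illustration; real projects speak a network protocol):

```python
import queue

def fetch_work_unit(server):
    """Stand-in for downloading the next work unit from the project server."""
    try:
        return server.get_nowait()
    except queue.Empty:
        return None   # no work left

def compute(work_unit):
    """Stand-in for the real science (e.g. one climate-model run)."""
    return sum(work_unit)

def report_result(results, result):
    """Stand-in for uploading a finished result to the project server."""
    results.append(result)

# Simulated server holding three pending work units.
server = queue.Queue()
for unit in ([1, 2], [3, 4], [5, 6]):
    server.put(unit)

# The loop a screen-saver client runs whenever the machine is idle.
results = []
while (unit := fetch_work_unit(server)) is not None:
    report_result(results, compute(unit))

print(results)   # -> [3, 7, 11]
```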

[12 March 2005, top]

Supercomputing Helping Surgeons Do Better Surgeries
Supercomputing and high-resolution visualization systems are allowing surgeons to go through simulations that mimic upcoming surgical procedures. {Wired.com:: No More Crash-Test Surgery }

[05 March 2005, top]

Foster and Tuecke on the Future in Grid Computing
BetaNews.com chatted about Grid computing with Ian Foster and Steve Tuecke on 21 February 2005. Foster is associate director of the mathematics and computer science division of Argonne National Laboratory and a professor of Computer Science at the University of Chicago. Foster is "considered one of the founders of the international Grid community." Tuecke cofounded the Globus Alliance and is a web services guru. {BetaNews.com:: The Future in Grid Computing }

[27 February 2005, top]

Cray Supercomputers Can Really Compute
Cray, Inc. is a maker of supercomputers, and at the end of January 2005 the company issued the following press releases.
  • Cray Tapped to Deliver One of World's Most Powerful Supercomputers for use by U.S. Army Corps of Engineers [2005-01-27] More...

  • Japan Science and Technology Agency Orders Cray XT3 Supercomputer for University of Tokyo Software Research [2005-01-28] More...

  • HPC Veteran and Former Hewlett Packard Executive Mamoru Nakano Joins Cray Inc. as President of Cray Japan [2005-01-25] More...

During mid-February of 2005 Cray reported it is creating a supercomputing center at the Department of Energy's (DOE) Oak Ridge National Laboratory (ORNL). The Cray Supercomputing Center of Excellence will "support the DOE's plans to build the world's most powerful supercomputer for open (non-classified) scientific research at ORNL." More...

[21 February 2005, top]

The Cell (supercomputer on a chip) by IBM, Sony, Toshiba
IBM issued a press release on 08 February 2005 that started off as follows.
   "A consortium of consumer electronics and computing 
    companies has unveiled the details of a jointly-developed 
    chip that promises high-speed entertainment and media 
    transmission uses. Code-named Cell, the new microprocessor 
    is widely rumored to be the core of Sony's PlayStation 3, 
    expected in 2006, and Toshiba will use the chip in high 
    definition televisions."

The Cell is called a "supercomputer on a chip" and it "features multi-core architecture and ultra high-speed communications capabilities."

The Cell will support multiple operating systems (i.e., it is "OS neutral").

{IBM.com:: Hard Cell: Details of New Chip Unveiled }

[12 February 2005, top]

The Globus Alliance -- Building an Open Grid
The following is from the Globus Alliance homepage.
   "The Globus Alliance is developing fundamental technologies 
    needed to build computational grids.  Grids are persistent 
    environments that enable software applications to integrate 
    instruments, displays, computational and information resources 
    that are managed by diverse organizations in widespread locations."

Globus is sponsored by governmental organizations such as DARPA, US DOE, NSF, and NASA, along with major corporations such as IBM, Microsoft and Cisco.

{Globus.org:: The Globus Alliance}

[Extra] The following was posted by EDUPAGE and they obtained it from the New York Times.

   "The Globus Consortium, which includes IBM, Intel, HP, Sun
    Microsystems, and Nortel Networks, will work to develop 
    grid computing tools geared specifically for corporations,
    as opposed to existing tools, which typically focus on the 
    needs of academic and research organizations."

The following comes from an eWeek posting by Lisa Vaas.

   "Ian Foster, a consortium board member 
    who led the original team that developed the tool kit, compares 
    Globus Consortium to the Open Source Development Labs in which 
    Linus Torvalds works, where the goal is to take Linux and make 
    it ready for enterprise use."
{eWeek.com:: Grid Computing Takes the Linux Route }

[29 January 2005, top]

Forbes Says Utilities To Rule in 2005
Forbes.com is predicting that a "big trend" for 2005 is "utilities in all forms." Some of the forms they highlighted included utility computing along with utility software. In addition, they mentioned that utilities themselves are going to be major players in 2005 as "broadband over powerlines moves from a science project to a business." {Forbes.com:: Quentin Hardy On Technology }

[14 January 2005, top]

UC-Davis Turns to Sun; Germany Turns to SGI
This week's posting is a collection of postings that didn't make it into last year's version of the Grid Utilitarian.

Forbes.com:: Atmospheric Research Center Plugs Into Grid Computing [2004-11-23]

[Sun Microsystems and UC Davis Partner]
University of California, Davis, is a leading research university in areas including computational science and engineering, physics, chemistry, and bioinformatics. Sun Microsystems announced its selection of UC-Davis as a Center of Excellence (COE) in Public Health and Safety. The COE utilizes Sun's grid computing environment to "increase the effectiveness and efficiency of computing resources and enable collaboration among four key research centers to advance experiments and programs."

UCdavis.edu:: Center to Support Advanced Computing in Public Safety [2004-10-26]

[SGI Usage Continues to Grow]
SGI Altix 350 Dominates Mid-Range Servers on Scientific and Engineering Benchmark [2004-12-15]

The Bavarian Academy of Sciences and Humanities (BAdW) selected SGI Altix systems to support Germany's new national supercomputing system (LRZ). The LRZ will have 3,328 dual-core Intel® Itanium® 2 processors capable of executing 69 trillion calculations per second. The LRZ is also using a 660-terabyte SGI® InfiniteStorage solution for data storage. More...

[07 January 2005, top]

Welcome to Year 2005
Happy New Year! Welcome to 2005. The 2004 Grid Utilitarian has been archived.

[01 January 2005, top]


Author: Gerald D. Thurman [deru@deru.com]
Last Modified: Saturday, 05-Jan-2013 11:17:33 MST

Thanks for Visiting