The Next Era of Computing
Computing in the 21st Century

This document was created to support a talk given by Gerald Thurman on 4 April 02007 at Scottsdale Community College (SCC). Thurman's talk was part of SCC's Kelly Lecture series and it followed a nanotechnology lecture given during February of 02007 by the Arizona Nanotechnology Cluster.

Updates since the talk was given.

The Present Era of Computing

Question: How many of you have heard of IBM?

IBM's early history in a nutshell...

   01888:  Tabulating Machine Co. (TMC) founded by Herman Hollerith
   01911:  TMC merged into the Computing-Tabulating-Recording Co. (CTR)
   01916:  CTR listed on the NYSE
   01924:  CTR re-named International Business Machines (IBM)

   01914-01956:  IBM was led by Thomas J. Watson

During 01943, Watson was quoted as saying: "I think there is a world market for maybe five computers." Watson's quote is used here because this lecture is about the future, and nobody can predict even the next femtosecond.

Fast forward to 01990 (one year before WWW)...

The following quote is from "The Analytical Engine: An Introduction To Computer Science Using HyperCard," written in 01990 by Rick Decker and Stuart Hirshfield.

   "There has never been a technology in the history of the 
    world that has progressed as fast as computer technology... 
    If automotive technology had progressed as fast as computer 
    technology between 1960 and today, the car of today would 
    have an engine less than a one tenth of an inch across; the 
    car would get 120,000 miles to a gallon of gas, have a top 
    speed of 240,000 miles per hour and it cost $4."

Fast forward to 02007 (the present)...

IBM's BlueGene/L is the top-ranked computer on the TOP500 Supercomputing Sites list. BlueGene/L in a nutshell...

   installed at Lawrence Livermore National Laboratory
   131,072 processors
   32,768 gigabytes of main memory
   280,600 gigaflops* sustained performance

   *flops -- floating-point operations per second
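
For scale, here is a bit of illustrative Python arithmetic that converts the figures above into more familiar units:

   # Back-of-the-envelope conversions of the BlueGene/L figures listed above.
   sustained_gigaflops = 280_600
   memory_gigabytes = 32_768
   processors = 131_072

   teraflops = sustained_gigaflops / 1_000              # 1 teraflops = 1,000 gigaflops
   terabytes = memory_gigabytes / 1_024
   per_cpu_gigaflops = sustained_gigaflops / processors

   print("Sustained performance: %.1f teraflops" % teraflops)                     # ~280.6
   print("Main memory:           %.0f terabytes" % terabytes)                     # 32
   print("Per processor:         ~%.1f gigaflops sustained" % per_cpu_gigaflops)  # ~2.1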

IBM is not alone in leading the world into the next era of computing.

Cray Inc...

Seymour Cray (01925 to 01996) is considered by many to be the "father" of supercomputing.

   "If you were plowing a field, which would you rather use? ...  
    Two strong oxen or 1024 chickens?  Anyone can build a fast CPU; 
    the trick is to build a fast system."

Although Seymour Cray died in 01996, the company that bears his name remains a supercomputing leader.

On 3 April 02007, Cray Inc. announced that Oak Ridge National Laboratory (ORNL) set a new performance record for the Weather Research and Forecasting (WRF) meteorological modeling software. ORNL's Cray XT4 system ran the advanced WRF code on a total of 12,500 processors, achieving sustained performance that reached an unprecedented 7.1 teraflops. At this level of performance, scientists can generate a one-day, 2.5-kilometer-resolution weather forecast covering the entire continental United States in as little as 18 minutes, compared to the several hours it would take on a less efficient system.
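
Here is a rough Python sketch of the arithmetic behind those numbers; the three-hour comparison is an assumed figure, since the announcement says only "several hours":

   sustained_teraflops = 7.1
   processors = 12_500

   per_cpu_gigaflops = sustained_teraflops * 1_000 / processors
   print("~%.2f gigaflops sustained per processor" % per_cpu_gigaflops)   # ~0.57

   # The announcement only says "several hours" for a less efficient system;
   # 3 hours is an assumed figure used here for illustration.
   assumed_minutes_elsewhere = 3 * 60
   print("Speedup vs. an assumed 3-hour run: %.0fx" % (assumed_minutes_elsewhere / 18))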

Computer scientists and engineers around the world are working hard to learn the "trick" mentioned by Seymour Cray.

Google Inc...

Many believe Google, Inc. has the world's most powerful computer, but Google does not have to share that information with the world. Mountain View, California-based Google is building facilities for server farms in The Dalles, Oregon, and Lenoir, North Carolina. Google selected these locations because they can provide a reliable, affordable, endless stream of power (energy) to Google's ever-growing clusters of computers.

Rob Pike, a Unix guru who is currently a Commander at Google, has been quoted as saying: "The Web is too large to fit on a single machine so it's no surprise that searching the Web requires the coordination of many machines, too. A single Google query may touch over a thousand machines before the results are returned to the user, all in a fraction of a second."
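
The pattern Pike describes, a query fanned out to many machines and the answers gathered back, can be sketched in a few lines of Python (a toy model only, not Google's actual design):

   # Toy scatter/gather: a query is fanned out to many "shards" (plain
   # functions standing in for separate machines) and the partial results
   # are merged. All names here are hypothetical.
   from concurrent.futures import ThreadPoolExecutor

   SHARD_IDS = range(8)

   def search_shard(shard_id, query):
       # A real shard would scan its slice of the index; we fake two hits.
       return [(query, shard_id, score) for score in (0.9, 0.5)]

   def search(query):
       with ThreadPoolExecutor(max_workers=len(SHARD_IDS)) as pool:
           partials = pool.map(lambda s: search_shard(s, query), SHARD_IDS)
       hits = [hit for partial in partials for hit in partial]
       return sorted(hits, key=lambda hit: hit[2], reverse=True)

   print(search("supercomputing")[:3])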

Arizona State University...

On 19 October 02006, it was announced that ASU researchers are "partners in a supercomputing project that has been awarded a five-year, $59 million NSF grant. The Texas Advanced Computing Center (TACC) at the University of Texas at Austin is the lead institution for the project, which will provide a high-performance computing system for the nation's research scientists and engineers." The supercomputer system is to "achieve a peak performance in excess of 400 trillion flops, more than 100 trillion bytes of memory and 1.7 quadrillion bytes of disk storage." [ASU.edu:: Fulton High Performance Computing Initiative]

Data Transmission...

For almost two decades Sun Microsystems has advocated "the Network is the Computer." During March of 02007, Alcatel-Lucent successfully transmitted a "world record 25.6 Terabits per second (Tb/s) of optical data over a single fiber strand." Also during March of 02007, Sun CEO Jonathan Schwartz gave a speech in which he asserted that "it was faster to send a petabyte of data from San Francisco to Hong Kong by sailboat, than by the Internet." [In a subsequent blog posting, Schwartz did the math to validate his assertion.]
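
Schwartz's blog posting has the authoritative numbers; the Python sketch below only shows the flavor of the calculation, using an assumed 100 Mb/s network path and an assumed 20-day sailing passage:

   petabyte_bits = 8 * 10**15                 # one petabyte, in bits
   link_bits_per_second = 100 * 10**6         # an assumed 100 Mb/s Internet path
   assumed_sailing_days = 20                  # an assumed SF -> Hong Kong passage

   network_days = petabyte_bits / link_bits_per_second / 86_400
   print("Network transfer: ~%.0f days" % network_days)            # ~926 days
   print("Sailboat:          %d days (assumed)" % assumed_sailing_days)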

U.S. Governmental Support Beyond DARPA?

On 12 March 02007, the High Performance Computing R&D Act (H.R. 1068) passed the House. A similar bill failed to pass the Senate in 02006, but optimism is high that the HPC R&D Act will be sent to President Bush before the end of 02007.

The Next Era of Computing

The next era of computing is the era of High-Performance (Productivity) Computing (HPC).

Looking ahead to 02008...

HPC is happening all over the world; therefore, the Defense Advanced Research Projects Agency (DARPA) is playing a critical role in advancing HPC in the United States. From autonomous vehicles (Grand Challenge; video) to autonomous robotic submarines, DARPA's involvement is helping push for a peta-scale computer sometime in 02008. Peta-scale computing implies computers executing one quadrillion floating-point operations per second (flops).

What's a quadrillion?

   one quadrillion = 1,000 trillion

   one quadrillion = 1,000,000,000,000,000

   one quadrillion = 10 raised to the 15th power (peta-)

   one quadrillion flops = one peta-flops
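
A few lines of Python make the scale concrete: how long one quadrillion floating-point operations would take on a (hypothetical) one-petaflops machine versus BlueGene/L's sustained rate:

   one_quadrillion = 10**15
   assert one_quadrillion == 1_000 * 10**12        # 1,000 trillion

   petaflop_machine = 10**15                       # a hypothetical 02008 peta-scale machine
   bluegene_l = 280.6 * 10**12                     # BlueGene/L's sustained 280.6 teraflops

   print("Peta-scale machine: %.1f seconds" % (one_quadrillion / petaflop_machine))   # 1.0
   print("BlueGene/L:         %.1f seconds" % (one_quadrillion / bluegene_l))         # ~3.6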

The following is my one-sentence description of the next era of computing: a grid-based cyber-infrastructure that provides infinite computational power, infinite storage, infinite bandwidth and infinite services (utilities). In other words, HPC is pervasive computing services enabled by supercomputers, computing clusters, and high-performance data storage and visualization systems.

                  +---------------+
   submit job --> | supercomputer | -->  no output (effect only)
                  +---------------+

                  +---------------+
   submit job --> | supercomputer | -->  no or yes (0 or 1)
                  +---------------+

                  +---------------+
   submit job --> | supercomputer | -->  a number or set of numbers
                  +---------------+

                  +---------------+
   submit job --> | supercomputer | -->  table of information
                  +---------------+

                  +---------------+
   submit job --> | supercomputer | -->  massive report
                  +---------------+

                  +---------------+     +----------------------+
   submit job --> | supercomputer | --> | visualization system | 
                  +---------------+     +----------------------+
                                            [infinite views]
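
The diagrams can be read as one front end with many kinds of output. Here is a minimal Python sketch of that idea (every name below is hypothetical; no real job scheduler's API is being described):

   def submit_job(program, inputs):
       # A real HPC environment would queue the job on a cluster; here the
       # program simply runs locally and its result is handed back.
       return program(inputs)

   # A job whose output is a yes/no answer (the "0 or 1" case)...
   is_sorted = lambda xs: int(all(a <= b for a, b in zip(xs, xs[1:])))
   print(submit_job(is_sorted, [1, 2, 3]))                    # 1

   # ...and a job that returns a small table of information.
   summarize = lambda xs: {"count": len(xs), "min": min(xs), "max": max(xs)}
   print(submit_job(summarize, [4, 8, 15, 16]))
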
Hardware...

Cluster computing can take advantage of COTS (Commodity-Off-The-Shelf) hardware. A single machine can contain multiple central processing units (CPUs). Multi-core microprocessors are chips containing two or more CPUs. Intel is creating a multi-core chip that can do one trillion calculations per second. Virtualization allows one machine to run multiple operating systems (e.g. Linux, BSD, proprietary Unixes, Microsoft Windows). In the past, if a system needed four different operating systems, four computers were needed; today, one machine with two dual-core chips could suffice.
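
As a quick check of the multi-core point, Python can ask the operating system how many CPUs (cores) an ordinary, off-the-shelf machine exposes (standard library only):

   # Report how many CPUs/cores the operating system makes visible.
   import multiprocessing

   print("CPUs/cores visible on this machine:", multiprocessing.cpu_count())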

Software...

Free/Libre and Open Source Software (FLOSS) can be the source of a never-ending supply of low-cost, usable, reliable, efficient and secure software tools. FLOSS adoption continues to expand worldwide.

The Free Software Foundation was established in 01985. The word free in Free Software implies freedom. Lots of Free Software can be obtained at zero cost, but Free Software does not imply zero-cost software. The Open Source movement was initiated around 01998. Open Source differs from Free Software with respect to approach, philosophy, values and the "criterion for which licenses are acceptable." Although the Free Software movement and the Open Source movement are separate movements, they can and do work together on some practical projects. To an extent the following is accurate: "Open Source is a development methodology; Free Software is a social movement."

HPC environments could enable programmers to think sequentially while the HPC environment determines how to parallelize and distribute that thinking. This would allow existing algorithms to take advantage of supercomputing-enabled utility computing without modification.
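
As a very small illustration of that idea (a sketch only, assuming the per-item work is independent), the same Python map can be run serially or handed to a process pool without rewriting the algorithm:

   # "Think sequentially, let the environment parallelize": the algorithm is
   # written once as a plain map over independent inputs; only the executor
   # changes. Real HPC runtimes are far more sophisticated than this.
   from concurrent.futures import ProcessPoolExecutor

   def simulate(parameter):
       return parameter ** 2          # stand-in for an expensive, independent task

   parameters = range(16)

   if __name__ == "__main__":
       serial = list(map(simulate, parameters))               # sequential thinking
       with ProcessPoolExecutor() as pool:                    # same algorithm, run in parallel
           parallel = list(pool.map(simulate, parameters))
       assert serial == parallel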

Computing in the 21st Century

Computational This, Computational That

Computational biology, computational chemistry, computational engineering, etc. We've been doing mathematical and scientific computing (number crunching) for many decades. HPC is enabling computational computing across and between all disciplines. Recall, in 02008, we might have computers that can execute one quadrillion floating-point operations per second. Computational computing allows us, in a timely manner, to better simulate, model, test and visualize the behavior of objects and functions as inputs and outputs get infinitely small and infinitely large. [ASU.edu::Decision Theater --Visualizing Possibilities, Realizing Solutions]

Thanks to HPC, FORTRAN (01957) remains an important programming language. More modern languages include the open-source R (GNU S) and Fortress. During April of 02007, Bioinformatics.org offered a course titled "R for Biologists."

This Informatics, That Informatics

Bioinformatics, biomedical informatics, chemical informatics, financial informatics, environmental informatics, social informatics (ning?), weather informatics, etc. We've been doing informatics (i.e. information science) for decades, but supercomputing is enabling 21st century informatics to occur across and between all disciplines. Data can take on many forms to effectively become infinitely reusable information. It has been estimated that 161 billion gigabytes, or 161 exabytes, of information was generated during 02006. During 02007, Sir Tim Berners-Lee, the "father" of the WWW, continues his work on creating the Semantic Web.

HTML is the HyperText Markup Language. HTML allows plain-old-text documents to contain "tags" that instruct browsers how to display webpages. XML is the eXtensible Markup Language, and it is used to give "semantics" to the text contained in plain-old-text documents. XML-based markup languages are used to support the various forms of informatics (e.g. MathML, CML [Chemical], RTML [Remote Telescope], DCML [Data Center]; YouTube.com:: The Machine is Us/ing Us).
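
Here is a tiny Python example of what "semantics via markup" buys; the tags below are invented for illustration and are not taken from any real markup language:

   # The same number means little as plain text, but a (made-up) XML tag
   # tells software what it is.
   import xml.etree.ElementTree as ET

   document = """
   <measurement>
     <quantity>meltingPoint</quantity>
     <value units="kelvin">273.15</value>
   </measurement>
   """

   root = ET.fromstring(document)
   value = root.find("value")
   print(root.find("quantity").text, "=", value.text, value.get("units"))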

ASU's Fulton School of Engineering now includes the School of Computing and Informatics, which will begin offering an M.S. in Biomedical Informatics in the fall of 02007.

This Thinking, That Thinking, Computational Thinking?

Critical thinking, logical thinking, radical thinking, long-term thinking, etc. In the computing world, we've been doing computational thinking for decades because it works extremely well. Jeannette Wing believes computational thinking can be effectively used in other disciplines. Wing is head of the Computer Science Department at Carnegie Mellon University, and she is leading the NSF's Computer and Information Science and Engineering Directorate. Dr. Wing says: "The ideas in computing, the abstractions we bring from CS, will pervade all other disciplines --not just other sciences and engineering--but also humanities, arts, social sciences, entertainment, and everything."

On 26 March 02007, Microsoft announced it is giving Carnegie Mellon $1.5 million over the next three years to establish the Microsoft-Carnegie Mellon Center for Computational Thinking. "Increasingly, scientists and researchers rely on computer science to enable them to sift through massive amounts of data and find breakthroughs that could provide new insights into the human body, the earth we live on and even the universe," said Rick Rashid, Microsoft Research senior vice president.

Some Final Comments

Approximately 317 of the world's 500 fastest computers are in the United States. Leading HPC geographical areas in the U.S. include Silicon Valley (and other parts of California), Texas, Pittsburgh and the East Coast. Arizona is not on the HPC map.

The need for HPC spans all disciplines (energy, healthcare, engineering, weather prediction, economic forecasting, visualization, etc.); however, in 02007, many disciplines have not yet demanded HPC.

HPC is expensive; however, powerful low-cost computing systems can be built using COTS hardware and FLOSS. And, thanks to the Internet, idle computers around the world can become computing cluster nodes (workers).

Most of the computers at SCC do minimal computing. For example, during the semester, from about 9:00 P.M. on Thursday to 7:30 A.M. on Monday, almost all of the computers at SCC do zero computing. These same computers go virtually unused during breaks and over the summer months. Despite their lack of use, SCC replaces its computers on a periodic basis.

This lecture avoided the sci-fi aspects of the next era of computing, but these aspects should not be ignored. Examples: cyber-warfare; brain-resident Google objects (augmented intelligence); supercomputing nanobots; smart robots; self-replicating supercomputers; and so on. [Wired.com:: Why the future doesn't need us by Bill Joy in 02000.]


Creator: Gerald D. Thurman [gthurman@gmail.com]
Created: 14 February 02007