Abstract: In this talk we examine how high performance computing has changed over the last ten years and look toward future trends. These changes have had, and will continue to have, a major impact on our software. A new generation of software libraries and algorithms is needed for the effective and reliable use of (wide area) dynamic, distributed, and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures.
Bio: Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Computer Science Department at the University of Tennessee, and holds the titles of Distinguished Research Staff in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow at the University of Manchester, and Adjunct Professor in the Computer Science Department at Rice University. He is the director of the Innovative Computing Laboratory at the University of Tennessee. He is also the director of the Center for Information Technology Research at the University of Tennessee, which coordinates and facilitates IT research efforts at the University.
He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing, and documentation of high-quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports, and technical memoranda, and he is coauthor of several books. He was awarded the IEEE Sidney Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches, and in 2008 he was the recipient of the first IEEE Medal of Excellence in Scalable Computing. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.
Can ICT help farmers and food crisis? --Seishi Ninomiya
National Agriculture and Food Research Organization (NARO), JP
In the 20th century, the human population grew roughly tenfold from that of 100 years earlier, from 0.6 billion to 6.5 billion. This is a typical chicken-or-egg issue, but in any case this drastic increase was supported by technological revolutions in agriculture such as the efficient production of fertilizers and the invention of pesticides, machinery, irrigation systems, plastic films, and cross breeding. Chemistry and engineering clearly played important roles in the agriculture of that century. Such agriculture not only secured enough food for the explosively increasing population but also freed farmers from hard labor. At the same time, however, it brought drawbacks such as tremendous energy consumption and serious impacts of agricultural chemicals on the environment.
In the 21st century, the demand for food is still growing and food scarcity is obvious. Several interacting causes are at work. In addition to continued population growth, demand is driven by the steeply increasing need for animal feed to support growing consumption of livestock products in economically growing countries such as the BRICs, whose combined population is almost half of the global total, and by the global trend of using some crops as feedstock for carbon-neutral bioenergy. Recent unstable and extreme weather conditions have also been making food production unstable.
In the 21st century, we cannot fulfill such food demand in the same way as in the 20th century. Sustainable agriculture with lower environmental impact, high quality, and safety must be realized together with high productivity. This means that we have to realize a completely new agriculture that fulfills these requirements simultaneously. Because of the complexity of the issues and the huge amounts of data behind them, we cannot find solutions and optimizations without harnessing ICT. In this talk, I will show some practical examples of how ICT can contribute to solving these issues, introducing the activities of the APAN Agriculture Working Group.
How Terascale Experience Will Shape Petascale Systems - William Kramer [Deputy Director of Blue Waters Project, National Center for Supercomputing Applications, USA]
The next generation of Petascale systems presents enormous challenges to effectively supporting science and engineering. This talk will focus on lessons learned fielding Terascale systems and discuss how these lessons can be applied in the Petascale era, drawing on experiences from NERSC, NCSA, and other systems, with Blue Waters as the example Petascale target platform.
Dr. Kramer joined NCSA in 2008 as the Deputy Director of the Blue Waters Project. Blue Waters is under development and will be the largest unclassified system in the US in 2011. Supported by the US National Science Foundation and the University of Illinois, Blue Waters will provide sustained petaflop/s performance for a wide range of important applications. As Deputy Director, Bill is responsible for all aspects of the Blue Waters project, from the system itself to a range of collaborations that provide value-added components to the system. Prior to this role, Bill was General Manager of the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Head of High Performance Computing for LBNL. Before that he was a Branch Chief for Computational Services at NASA's Numerical Aerodynamic Simulation (NAS) Facility. Bill is known for his experience in fielding early-release, very large HPC systems that provide very high quality services to a very broad range of scientific domains. Blue Waters is the 20th such system he has fielded. He has advanced degrees from Purdue University, the University of Delaware, and UC Berkeley.
Toward the global collaboration: EUAsiaGrid
Abstract: In the last few years Grids have made the transition from early prototypes to production infrastructures that are becoming an essential tool for collaborative e-Science communities. The EUAsiaGrid proposal contributes to the aims of the EU Research Infrastructures FP7 Programme by “promoting international interoperation between similar infrastructures with the aims of reinforcing the global relevance and impact of European e-Infrastructure”. The EUAsiaGrid project aims at empowering scientific collaborations throughout the Asia-Pacific region and, together with European scientific communities, contributing to the setup of a sustainable framework for the grid infrastructure by leveraging the work done in the EGEE project. The consortium, made up of 15 partners, will profit from the long experience in grid management and operations of the four European partners and of Academia Sinica, the ROC, in the Asia-Pacific region. It will cooperate with other regional projects such as EUIndiaGrid and EUChinaGrid, as well as with the Asia-Pacific federation of the EGEE-III project.
Networking is still an issue, as the provision of network connectivity differs widely between partner countries and can also differ between different partners in the same country. Eliciting information from potential users about potential uses, and following this up by providing a seed resource, has proven successful and is considered by the EUAsiaGrid project as a potential model for furthering wider adoption and developing new application areas. Targeted e-Science applications such as high-energy physics, computational chemistry, disaster mitigation, life science, e-Social science, and digital archives will be the drivers for realizing regional collaborations with the new generation of e-infrastructure and a new paradigm.