What is a supercomputer?
There has recently been much discussion of supercomputing centres and their importance, not only for companies engaged in commercial activities but above all for theoretical and applied research, where they execute sophisticated computations in various scientific fields and help solve multidisciplinary problems. Supercomputers are currently used primarily for highly calculation-intensive tasks such as genome research, physical simulations (e.g. nuclear explosion simulation, weather forecasting), cryptanalysis, modelling of chemical and biological processes, and analysis of large volumes of data (LHC-CERN, geophysics).
The term supercomputer describes a high-performance computer, computer system or computing centre. Some tasks require specialized supercomputers, e.g. the Earth Simulator used in weather forecasting. Present trends, however, point towards universal supercomputing centres whose performance and storage capacity are shared by many users from science, research and education as well as the commercial sphere.
History of supercomputers
The first supercomputers were designed in the 1960s. In the 1980s and early 1990s a large number of small companies entered the market, only to be soon driven out by big 'traditional' companies. Labelling any computer a supercomputer is meaningful only if we at the same time specify when it was designed and constructed. The relativity of this label is closely tied to rapid advances in information technology: within only a few years, contemporary personal computers match the performance of supercomputers that are a few years old.
Architecture of modern supercomputers
Supercomputer architecture has now stabilized. The ten most powerful supercomputers on the TOP500 list share the same basic architecture: each is a cluster of MIMD (Multiple Instruction stream, Multiple Data stream) multiprocessors, in which each processor has a SIMD (Single Instruction, Multiple Data) architecture. The supercomputers differ radically in the number of multiprocessors per cluster, the number of processors per multiprocessor, and the number of simultaneous instructions per SIMD processor.
- A computer cluster is a collection of computers interconnected via a high-speed network. Each node runs its own operating system instance and processes its own tasks.
- A multiprocessing computer runs a single operating system and uses more than one CPU; the number of user applications is independent of the number of processors. The processors share work using SMP (Symmetric Multiprocessing) or NUMA (Non-Uniform Memory Access) architectures.
- A SIMD processor executes the same instruction on more than one set of data at the same time (e.g. vector processors). A side benefit is higher efficiency and lower energy consumption while maintaining the same performance.
Supercomputing centres in the world
The present distribution of supercomputing centres (SPCs) in the world is clear-cut: a majority of the SPCs (51%) are located in the USA, whereas other states hold only single-digit percentages – the UK 5%, Germany 6%, France 5%, and the Czech Republic less than one percent. Supercomputing technology has recently been developing dramatically in China, which, with its 12% share of performance on the www.top500.org list, already ranks second behind the USA.
Applications of supercomputing centres
Present-day supercomputing centres play an important role mainly in research and development, which is now highly dependent on demanding computer simulations and calculations. High-performance supercomputing centres serve a wide range of classical and multidisciplinary fields of science. The current effort is to concentrate massive computing performance and extensive data storage into consolidated data centres, which individual companies, organizations, universities, and research centres then access for their high-performance computing and data capacity needs.
Statistical data are taken from www.top500.org.