The Industrial Revolution of the 21st century is the Information Revolution. The high-performance computing power of modern machines makes it possible to extract highly useful knowledge and enhance the efficiency of industries, service companies and public administrations.
In this context, supercomputing, thanks to the increased power of computers and their programming capabilities, has become a key instrument for the countries, businesses and entities that want to keep pace with the most developed and advanced ones. At its core, the Information Revolution rests on the capacity to run calculations and algorithms at scale.
The most popular industrial use of supercomputing is running experiments on supercomputers instead of with physical resources. Because such virtual experiments are faster and less costly, they reduce costs and accelerate product design processes; nowadays, they are also far more precise.
What do we mean by “reproducing an experiment in a supercomputer”? It means simulating the physics and chemistry of natural phenomena in a computer-based environment by solving the mathematical models, i.e. the equations, that describe these phenomena.
With the calculation capacity offered by today’s computers, phenomena that twenty years ago could not be simulated with enough precision now can be. Take combustion: total precision is still out of reach, but simulations are now accurate enough to help design different types of combustion devices, from a car engine to a gas turbine or an industrial combustor. A second example is the interaction between fluids and mechanical devices: we can now simulate, in minute detail, the functioning of a wind farm or a hydraulic pump. A third example is the exploration of fossil fuels: with current technology we can obtain not only images but also the physical properties of the subsoil with unprecedented precision, thanks to petaflop machines, which perform one quadrillion (10¹⁵) operations per second. With exaflop machines (10¹⁸ operations per second), due to appear at the beginning of the coming decade, the speed of calculations is likely to rocket.
The Barcelona Supercomputing Center – Centro Nacional de Supercomputación (BSC) hosts the supercomputer MareNostrum, ranked 16th on the Top500 list of the fastest supercomputers in the world. MareNostrum 4 has a computing power of 13.7 petaflops, 390 terabytes of main memory and 14 petabytes of disk storage.
But when talking about supercomputing for industry, we should bear in mind that we do not always need the fastest machine in the world at our disposal. In fact, the term “supercomputing” is closer in meaning to “parallel computing”. Some industrial problems require less sophisticated computers than MareNostrum, but they do require parallel ones. For this reason, it is important to develop software that can use parallel machines efficiently, and one of BSC’s main contributions to industry is helping develop this type of parallel software.
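To give a flavour of what parallel software means in practice, here is a minimal sketch (illustrative only, not BSC code) of an embarrassingly parallel computation split across processor cores, where each worker handles a chunk of independent cells and the results are combined at the end, much like a domain-decomposition solver. The function names and workload are assumptions for the example.

```python
from multiprocessing import Pool

def simulate_cell(i: int) -> float:
    """Stand-in for an expensive per-cell physics computation."""
    return (i * 0.001) ** 2

def run_parallel(n_cells: int, workers: int = 4) -> float:
    # Each worker processes a chunk of cells independently;
    # pool.map preserves order, and the partial results are
    # combined with a final reduction (here, a sum).
    with Pool(workers) as pool:
        return sum(pool.map(simulate_cell, range(n_cells), chunksize=256))

if __name__ == "__main__":
    total = run_parallel(10_000)
```

The computation itself is trivial here; the point is the structure: independent work units distributed over workers, then a reduction. Real industrial codes follow the same pattern at much larger scale, typically with MPI across thousands of nodes rather than a single machine's process pool.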
Some industrial problems cannot be simulated using standard methods, i.e. by solving a set of equations that model the physical process. There may be different reasons for this; usually the reason is uncertainty, as there is something we do not know that prevents us from solving the equations. For example, in a waste incineration plant, we may not know exactly what fuel we are burning, simply because we do not know the exact composition of the waste. When such situations occur, we can model the phenomenon directly from data, without equations. Nowadays, advances in electronics have made it possible to create gigantic neural networks that can be used to tackle real problems.
The procedure to create these networks is to compile the data of the problem and train a neural network on it. The trained network then predicts the behaviour of the system, acting as a substitute for an equation-based simulator. Neural networks were first proposed back in the mid-twentieth century, but twenty years ago it was not possible to obtain useful results for real problems, simply because the circuits of the time did not allow building large enough networks. Now, advances in electronics have made this possible.
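The idea of a data-driven surrogate can be sketched in a few lines. The example below (purely illustrative: the network size, learning rate and the stand-in "measurements" are all assumptions) trains a tiny one-hidden-layer network on sampled data, after which the network predicts the system's behaviour without any governing equation being written down.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Measured" data from a process we cannot model with equations;
# here y = x^2 stands in for real plant measurements.
x = rng.uniform(-1.0, 1.0, size=(200, 1))
y = x ** 2

# One hidden layer of 16 tanh units (sizes are illustrative).
w1 = rng.normal(0, 0.5, size=(1, 16)); b1 = np.zeros((1, 16))
w2 = rng.normal(0, 0.5, size=(16, 1)); b2 = np.zeros((1, 1))

lr = 0.1
for step in range(3000):
    h = np.tanh(x @ w1 + b1)          # hidden activations
    pred = h @ w2 + b2                # network prediction
    err = pred - y
    loss = float(np.mean(err ** 2))   # mean squared error
    # Backpropagation of the MSE gradient through both layers.
    g_pred = 2 * err / len(x)
    g_w2 = h.T @ g_pred
    g_b2 = g_pred.sum(axis=0, keepdims=True)
    g_h = (g_pred @ w2.T) * (1 - h ** 2)
    g_w1 = x.T @ g_h
    g_b1 = g_h.sum(axis=0, keepdims=True)
    w1 -= lr * g_w1; b1 -= lr * g_b1
    w2 -= lr * g_w2; b2 -= lr * g_b2

# The trained network now acts as a surrogate: it predicts the
# system's response from data alone.
surrogate = np.tanh(np.array([[0.5]]) @ w1 + b1) @ w2 + b2
```

Industrial-scale networks are of course vastly larger and trained on measured rather than synthetic data, but the workflow is the same: compile data, train, then query the network in place of a simulator.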
NATURAL LANGUAGE PROCESSING
Another new application of supercomputing is in the social sciences, for example in natural language processing, i.e. the automatic processing of human language. Through new statistical algorithms that exploit the computers’ large calculation capacity, we can now work with texts automatically, for example to extract or find information from an array of databases or documents. Without these tools, we would only be able to search for keywords. What is this useful for? For example, for processing legal information, all the clinical information of hospitals, or information for public administrations, whose multiple databases and documents are nowadays scattered and stored in different formats.
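One of the simplest statistical steps beyond raw keyword search is TF–IDF, which weights a term by how distinctive it is across a document collection rather than merely checking its presence. The toy corpus and function below are illustrative assumptions, not any BSC tool:

```python
import math

docs = [
    "the patient shows elevated blood pressure",
    "the turbine blade shows thermal stress",
    "blood samples were stored for analysis",
]

def tf_idf(term: str, doc: str, corpus: list[str]) -> float:
    # Term frequency: how often the term appears in this document.
    words = doc.split()
    tf = words.count(term) / len(words)
    # Inverse document frequency: rarer terms score higher.
    n_containing = sum(term in d.split() for d in corpus)
    idf = math.log(len(corpus) / n_containing) if n_containing else 0.0
    return tf * idf

# Rank documents by how relevant they are to the term "blood".
ranked = sorted(docs, key=lambda d: tf_idf("blood", d, docs), reverse=True)
```

Production text-processing systems use far richer statistical and neural models, but the principle is the same: large-scale computation over word statistics turns scattered documents into rankable, searchable information.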
In the social sciences, analysing large amounts of data raises the problem of Big Data. For example, when managing citizens’ complaints in a large city or analysing information from social media, the amount of data is so sizeable that we need supercomputers and special algorithms to process it and draw useful conclusions.
In sum, processing power and algorithms can be the key to improving the management of all types of processes that require optimization and efficiency. In the 21st century, these two allies will be a determining factor for the businesses, administrations and entities that aim to play an important role in their respective areas.