26th OPEN ACCESS COMPETITION RESULTS
The Allocation Commission decided on the allocations within the 26th Open Access Grant Competition as follows:
Researcher: Martin Mirbauer
Generating Sky Imagery with Parametric Cloud Cover for Photorealistic Rendering
Karolina GPU 4000, Karolina VIZ 40
Computer-generated imagery is omnipresent in our modern society, from visual effects (VFX) in movies and computer games to visualizations in architecture and product marketing. These images are synthesized using a light transport simulation within a virtual 3D scene. For increased realism, the scenes are usually surrounded by a 360° spherical image that serves both as a natural light source (from the Sun and sky) and as a realistic-looking background. To this day, these spherical images have either been photographed or computed using clear-sky models, optionally with manually painted cloud cover. This does not allow flexible editing of the sun position or cloud distribution. Our approach combines an accurate, state-of-the-art clear-sky model in the background with a novel machine-learned model that generates cloud cover, trained on real photographs and meteorological data. The outputs resemble realistic spherical panoramas directly usable in photorealistic rendering. For increased artistic freedom, our approach allows setting various parameters, including geolocation, date and time, and weather conditions. With this project, we aim to advance not only computer graphics but the wider field of machine learning techniques as well.
Researcher: Martin Fajčík
Karolina GPU 8000, Karolina VIZ 40
In open-domain automated fact-checking, the system is provided with a claim and a database of facts, e.g., Wikipedia or news media. It is then asked to find factual evidence supporting or refuting this claim. Finally, the system also needs to decide whether the claim is supported, refuted, or whether there is not enough information available to make the decision. Current cutting-edge systems for automated fact-checking (i) often focus on building a traditional fact-checking pipeline consisting of a retrieval system, relevant sentence selection, and fact veracity prediction, (ii) are trained and tested on datasets which require extensive sentence-level relevance annotation, and (iii) provide limited interpretability, which still requires extensive reading and human reasoning to understand. In this project, we will focus on building a system which can jointly identify relevant evidence in large quantities of retrieved documents and predict veracity. Furthermore, an early prototype of our system shows promising results in its interpretability, providing supporting and refuting evidence at all granularity levels, from articles and paragraphs down to sentences and words, while requiring only sentence-level supervision.
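The three pipeline stages listed in (i) can be sketched as follows; the function names and the word-overlap heuristic below are purely illustrative stand-ins for trained components, not the system described in this project:

```python
def retrieve(claim, corpus, k=2):
    """Stage 1: rank documents by naive word overlap with the claim."""
    words = set(claim.lower().split())
    return sorted(corpus, key=lambda d: -len(words & set(d.lower().split())))[:k]

def select_sentences(claim, docs, k=2):
    """Stage 2: pick the sentences most similar to the claim."""
    words = set(claim.lower().split())
    sents = [s.strip() for d in docs for s in d.split(".") if s.strip()]
    return sorted(sents, key=lambda s: -len(words & set(s.lower().split())))[:k]

def predict_veracity(claim, evidence):
    """Stage 3: placeholder verdict; a real system uses a trained
    classifier over the (claim, evidence) pair."""
    return "SUPPORTED" if evidence else "NOT ENOUGH INFO"

corpus = ["Prague is the capital of the Czech Republic.",
          "Cats are small domesticated mammals."]
claim = "Prague is the capital"
verdict = predict_veracity(claim, select_sentences(claim, retrieve(claim, corpus)))
```

A joint system, as proposed here, would replace the separate stages with a single model trained end-to-end, which is exactly what the staged design above makes difficult.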
Researcher: Antonín Vobecký
Weakly-Supervised Multi-Modal Learning for Scene Understanding
Karolina GPU 8000, Karolina VIZ 40
The objective of this research is to improve the accuracy of machine learning models under conditions of low amounts of annotated training/testing data and a lack of correctly defined data distributions, in the context of autonomous driving applications. The key open problems in these low-data regimes are low model accuracy in situations that are rare or missing in the training data, and overfitting to a training data distribution that inherently differs from that of testing and validation. On the other hand, large amounts of un-annotated data with multiple sensor modalities (multiple sensory captures of the same scene at the same time), as well as virtual data, are available in automotive setups. In this project, we will develop a new generation of large-scale multi-modal self-supervised learning techniques that will overcome the need for costly and hard-to-obtain annotations. The potential impact of this work is increased safety on roads through better, more accurate, and more robust car perception systems.
Researcher: Jakub Klinkovský
Development and optimization of the LBM solver using the TNL library
Karolina GPU 4000, Karolina VIZ 40
The lattice Boltzmann method (LBM) is a popular method for fluid flow simulation. It can be implemented efficiently for massively parallel architectures, it can be applied to highly turbulent flows, and it can be coupled with other methods, such as the mixed-hybrid finite element method (MHFEM), to allow multi-physics simulations. We have successfully applied LBM in various physical scenarios, including turbulent air flow in the atmospheric boundary layer, flow in an aortic valve phantom, and water vapor transport in the near-surface boundary layer above a moist soil. In this project, we will develop improvements to our implementation of LBM based on the open-source library TNL (https://tnl-project.org/). The main objectives are performance and scalability testing, code optimization, and the development of advanced boundary conditions.
Researcher: Roman Bushuiev
Interpretation of mass spectra with self-supervised machine learning
Karolina GPU 8000, Karolina VIZ 40
Metabolomics is a research field that studies the metabolism of biological systems in health and disease. Mass spectrometry (MS) is the most prominent experimental tool of metabolomics. It enables measuring the composition of biological samples in terms of their molecular masses, yet it does not allow unknown metabolites to be elucidated directly. State-of-the-art computational methods deduce the structures of molecules from their tandem mass spectra, relying on previously annotated spectral libraries. These libraries do not cover the diversity of metabolites and MS conditions, so only small fractions of MS data are annotated. As a significant consequence, less than 10% of the human metabolome has been discovered to date. We propose to break this limitation with self-supervised deep learning applied to a vast unannotated MS database containing billions of spectra. We implement a Transformer neural network and train it in a self-supervised regime by hiding fragments of mass spectra and training the model to restore them. To succeed in this task, the model must determine the patterns characteristic of spectra and therefore learn the structural properties of the underlying metabolites. Consequently, the representations extracted from the Transformer serve as a starting point for a wide range of metabolomic problems. In preliminary experiments, we demonstrated the potential of the proposed method, and we are currently advancing the neural network architecture and scaling the training procedure.
Researcher: Luboš Šmídl
Acoustic transformers for speech recognition #2
Karolina GPU 8000, Karolina VIZ 40
Motivated by the human brain and by how children learn new skills, deep neural networks have become very powerful at solving very hard NLP tasks (such as speech recognition) while being conceptually simple. The training of such large neural networks is possible only thanks to the huge amount of unlabeled data available on the Internet, together with the stunning computing power of modern GPU clusters. The aim of the project is to use the power of a node with 8 A100 GPUs for training acoustic transformers for speech recognition. Currently, the use of transformer technologies in the field of natural language processing is popular and challenging. Acoustic transformers are based on a similar principle: with the help of large computing power and a huge amount of untranscribed data, a representation in the latent (embedding) space is found. This representation can then be used in tasks such as speech recognition, speaker detection, diarization, keyword detection, query by example, etc.
Researcher: Vladan Stojnić
Learning Image Representations Using Limited Supervision
Karolina GPU 3000, Karolina VIZ 40
Until recently, training deep neural networks for computer vision applications mostly relied on big, annotated datasets. However, annotating these datasets is an extremely time-consuming process, which creates the need for methods that can learn powerful image representations from a small amount of labeled data, or preferably no labeled data at all. In recent years, self-supervised and semi-supervised learning methods have produced impressive results that are close to, or even surpass, those of supervised methods. Although the results of these methods look very promising, most of them were developed for images of natural scenes with coarse-grained labels, and they do not perform as well on more domain-specific datasets with fine-grained labels. In our work we are trying to develop methods that can tackle these domain shifts as well as fine-grained labels.
Researcher: Jan Lehecka
Text Transformers for NLP tasks #2
Karolina GPU 8000, Karolina VIZ 40
In the last few years, deep neural networks known as Transformers have dominated the research field of Natural Language Processing (NLP). These highly sophisticated models benefit from the combination of a huge amount of unlabeled text data available on the Internet, self-supervised training methods motivated by the learning skills of the human brain, and the still increasing computational power of high-end GPU and TPU clusters. The main idea behind text-based Transformers is to let the model read as much text as possible during the pretraining phase in order to learn high-level language representations of individual words while paying sophisticated attention to their context. This is done by pretraining the model on artificial tasks based on repairing corrupted or perturbed inputs. After that, the model is able to generate contextual embeddings encoding both the syntax and semantics of the input text. These embeddings can be easily fine-tuned to solve a large variety of NLP tasks, such as text classification, text generation, chatbots, etc. Our research is focused on three types of Transformers: (1) encoders, suitable for text classification tasks (DeBERTa); (2) decoders, suitable for text generation (GPT-2); and (3) encoder-decoder models, which solve text-to-text problems (T5). Our models have already scored state-of-the-art results in several Czech NLP tasks, including text classification, sentiment analysis, and post-processing of ASR output.
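The corrupted-input pretraining task mentioned above can be sketched in a few lines. The toy masking function below is an illustrative simplification (the names and masking rate are our own, not this project's implementation): it corrupts a token sequence and records the original tokens the model would have to restore.

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Corrupt a token sequence for masked-language-model pretraining.
    Selected positions are replaced by [MASK]; the original tokens at
    those positions become the targets the model must restore."""
    rng = random.Random(seed)
    corrupted, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            corrupted.append(MASK)
            targets[i] = tok  # the model is trained to predict this
        else:
            corrupted.append(tok)
    return corrupted, targets

tokens = "the model learns to repair corrupted inputs".split()
corrupted, targets = mask_tokens(tokens, mask_prob=0.3)
```

During pretraining the loss is computed only at the masked positions; BERT-style encoders additionally replace some selected tokens with random words rather than [MASK], a detail omitted here for brevity.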
Researcher: Valeria Butera
Anti-Cancer Drugs Loaded on Functionalized CuO NPs
Karolina GPU 5000, Karolina VIZ 40
Marine algae are a rich source of biologically active compounds that can be utilized in various food and pharmaceutical applications. In this study, we will use periodic boundary condition (PBC) calculations at the DFT level of theory to investigate the potential formation of a complex between the anti-cancer drug cisplatin and its derivatives nedaplatin and oxaliplatin, and CuO nanoparticles (NPs) synthesized using red algae extract. It was observed experimentally that the interaction of the anti-cancer drugs with the algae-synthesized CuO NPs greatly enhances their anticancer effect, prolonging drug release for up to 120 h. This study is part of an ongoing collaboration with the experimental group of Prof. Tamer Shoeib at the American University in Cairo. Obtaining access to GPUs will allow us to fulfill our duties in this experimental-computational project.
Researcher: Vladimir Petrik
FocalPose++: Enhanced Object Pose and Camera Model Estimation
Karolina GPU 8000, Karolina VIZ 40
In recent years, various image and video processing methods have achieved high accuracy in many tasks, including image-based object 6D pose estimation, robot pose estimation, and 3D scene reconstruction. These methods assume known camera intrinsic parameters, which limits their applicability in the wild, for example, for automatic learning from YouTube videos. In our recent work, we addressed the problem of simultaneous estimation of camera parameters together with object pose by utilizing the render-and-compare methodology. The goal of the proposed project is to further improve the performance of these neural network models. Progress on this challenge would allow estimation of the object 6D pose in uncalibrated setups with high accuracy. This would open up the possibility of learning robot manipulation skills from uncurated videos for industrial or home robotics.
Researcher: Jan Pichl
Alquist: An Empathetic Conversational Artificial Intelligence
Karolina GPU 2600, Karolina VIZ 40
The open-domain dialogue system Alquist aims to conduct a coherent and engaging conversation, which can be considered one of the benchmarks of social intelligence. One of the critical aspects of a successful conversation is to be empathetic and reflect on what a user is actually saying. Generative models are the most suitable way to effectively conduct a coherent empathetic conversation, as they can handle various user utterances and generate meaningful responses. The generative models are trained on extensive conversational data with the goal of generating the most suitable response given the conversation context. Optionally, additional input information can be provided, e.g., knowledge graphs and user profiles. Our research goal is to train conditioned generative models to have more control over the model's responses and reflect user preferences. We have had several successful experiments with generative models conditioned on dialogue acts. We plan to experiment further with additional inputs such as topics, sentiment, knowledge graphs, and user profiles to create more focused generative models.
Researcher: Pavlo Polishchuk
Comparison of de novo design approaches
Karolina GPU 1000, Karolina VIZ 40
The discovery of new biologically active agents is a non-trivial task due to the vast number of possible drug-like chemical structures, estimated at 10^33. It is impossible to enumerate the whole chemical space to perform exhaustive virtual screening. Therefore, de novo design methods have received great attention in recent years. They adaptively explore the accessible chemical space to enumerate compounds with preferable properties and discover primary hits which can then be further optimized. Here, we will evaluate different de novo design tools based on neural networks and compare their output with the fragment-based approach being developed in our group.
Researcher: Radim Špetlík
Weak Signal Analysis in RGB Images
Karolina GPU 4000, Karolina VIZ 40
Glass-reflection removal, or glass-glare removal, is a problem of significant practical importance, with applications ranging from license plate reading to digital cleaning of camera optics. Given a single photo, the task is to remove reflections, or glares, without affecting the background. Much research in recent years has focused on instances of reflection removal constrained by specific qualitative attributes, such as the requirement of a full-screen reflection [6,7,8]. Glass-glare removal research has assumed constraints imposed by the development environment [3,1] or requirements of additional specialized hardware [4,5,2]. We address the general problem of reflection, or glare, removal, aiming to reduce the heavy constraints required by previous work, with a focus on the whole spectrum of the problem: we consider reflections of all sizes and strengths, and we require only a single RGB image as input.
Researcher: Lukas Neuman
Architecture Search for Deep Learning
Karolina GPU 1700, Karolina VIZ 40
Thanks to deep learning techniques, the field of Artificial Intelligence has made tremendous progress in the past years and, as a result, has been successfully applied to many practical tasks in computer vision, natural language processing (NLP), speech recognition, etc. Whilst training deep networks relies on machine learning algorithms, which, given training data, optimize network parameters for a specific task, the network architecture (i.e. the network connectivity pattern as well as the operations used by individual network nodes) is almost always hand-crafted by humans using a trial-and-error approach. This also applies to the most popular network architectures such as ResNet or Transformers, including commercially exploited network architectures like GPT-2. Architecture search methods aim to overcome this limitation by proposing algorithms which systematically explore possible deep network architectures and automatically find the most promising ones. The main challenge for such algorithms is that the search space of all possible deep network architectures is exponentially large, which makes naïve methods such as exhaustive search impossible to use. An efficient search strategy therefore has to be applied to quickly discard architectures which do not lead to good accuracy and to focus only on the most promising network architectures.
Researcher: Vojtěch Pánek
Training Visual Features for Localization with Compact Environment Representations
Karolina GPU 5100, Karolina VIZ 40
The goal of visual localization is to estimate from which position and orientation a given image was taken. This method is often used for the localization of mobile robots (UGVs, UAVs, cars) to enable their autonomy, or for extended reality devices (headsets, cellphones) to blend real and virtual worlds. The cornerstone of a visual localization system is the data association between a model of the environment and the captured images. One possible way to represent the environment is with a 3D model created from images by photogrammetry. The data association is then done between the image from the camera and the rendered images. We want to extend the range of usable 3D models to models without any color information. This would enable the use of models captured by range scanners (LiDAR) or represented by neural implicit volumes. Our previous experiments show that some of the existing data association algorithms work surprisingly well for this task without any adaptation (they were originally developed for color images). The goal of the project is thus to explore how well the association algorithms work when adapted for the task. To this end, we will investigate multiple training strategies and evaluate the performance of the algorithms in our existing localization framework.
Researcher: Lukáš Soukup
Advanced recognition and extraction of handwritten texts using neural networks
Karolina GPU 3200, Karolina VIZ 40
The project is focused on hand-written text recognition (HTR). In the research we will follow the state-of-the-art approaches in HTR based on transformers. We will use an internal large-scale dataset of Czech historical documents. The outcome of the research shall help with the digitization of the hand-written documents.
Researcher: Tomas Soucek
Weakly supervised learning for video understanding
Karolina GPU 8000, Karolina VIZ 40
Building machines that can automatically understand complex visual inputs is one of the central problems in artificial intelligence, with applications in autonomous robotics, automatic manufacturing, or healthcare. The problem is difficult due to the large variability of the visual world. The recent successes are, in large part, due to a combination of learnable visual representations based on neural networks, supervised machine learning techniques, and large-scale Internet image collections. The next fundamental challenge lies in developing visual representations that do not require full supervision in the form of inputs and target outputs, but are instead learnable from only weak supervision, i.e., noisy and only partially annotated data. This project will address this challenge and develop video representations learned from only weak annotations. More details are available at http://impact.ciirc.cvut.cz/ and https://data.ciirc.cvut.cz/public/projects/2022LookForTheChange
Researcher: Oldřich Plchot
Training transformer based neural models for speech applications
Karolina GPU 8000, Karolina VIZ 40
In the last few years, the field of speech processing has benefited from the rapid progress of self-supervised learning (SSL) techniques and from leveraging large amounts of untranscribed data. Transformer-based models such as WavLM, HuBERT, and Wav2vec are pretrained on as much as 94 thousand hours of unsupervised speech data. They are widely used as powerful feature extractors for many different speech tasks, such as automatic speech recognition (ASR), speaker verification (SV), or language identification (LID). Nowadays, we have to rely on models pre-trained by third parties such as Microsoft, Google, or Meta and experiment with their finetuning for a specific task. We want to set up and test the training of such models, and of their modified variants, on data of our choosing, which would best fit the intended speech application or data domain. During this special testing and benchmarking call, we intend to optimize the available training pipelines and determine what research is achievable with the IT4I infrastructure.
Researcher: Varun Burde
Building and evaluating accurate 3D models from images for robotic manipulation
Karolina GPU 4800, Karolina VIZ 40
Manipulating objects is a core capability for many robots. A prerequisite for manipulation is object pose estimation, where the goal is to estimate the object's position and orientation relative to the robot, as this information tells the robot how to interact with the object (how to approach, how to grasp, etc.). Current state-of-the-art object pose estimation algorithms rely on detailed and accurate 3D models of the objects. In this project, we will investigate constructing such 3D models from images. We will evaluate a wide range of existing 3D reconstruction algorithms, ranging from classical approaches based on multiview geometry to modern approaches based on implicit neural representations (e.g., NeRF and its variants). The evaluation criteria are run-time efficiency and the accuracy of the resulting 3D model, measured as the accuracy of object pose estimation possible with the constructed 3D models. Based on the most promising existing approaches, we will develop novel, machine-learning-based algorithms for highly efficient 3D reconstruction from images. This will make it possible to build the 3D models required for object pose estimation on the fly, enabling robots to handle novel types of objects and thus to better deal with the complex real world they need to operate in.
Researcher: Maxime Pietrantoni
Robust Visual Localization and Image Retrieval under Changing Conditions
Karolina GPU 5000, Karolina VIZ 40
Visual localization is the problem of estimating the exact camera pose for a given image in a known scene, i.e., the exact position and orientation from which the image was taken. Localization algorithms are core components of systems such as self-driving cars, autonomous robots, and mixed reality applications. When deploying visual localization algorithms on real-world systems, five criteria must be taken into account: localization accuracy, runtime, memory consumption, robustness to extreme appearance and structure changes, and user privacy. With rising privacy concerns associated with cloud-based localization services, we want to find alternatives to image-based localization methods. The aim of this research project is to develop a more abstract scene representation that stores semantic information but not privacy-relevant details. To that end, we seek to design systems that reach a high level of scene understanding, as we believe that networks that can reason about geometry and semantics on a temporal scale would facilitate privacy preservation. In our current work, scene understanding is pursued by uncovering pseudo-semantic segmentations. We have shown that such a representation can lead to a memory-efficient scene representation that achieves some level of privacy at the cost of localization accuracy. Successfully investigating the trade-off between accuracy, robustness to scene changes, and privacy would further support the deployment of real-world localization systems.
Researcher: Tomas Jenicek
Day-night image retrieval
Karolina GPU 4000, Karolina VIZ 40
Image retrieval is an important and active area in computer vision. The task is to query-by-image in a large indexed collection of images, where the search is based purely on the image content. Applications include content-based browsing and search in large image collections, visual localization, image annotation, data collection for 3D reconstruction, and many others. The current state-of-the-art retrieval methods are based on Deep Neural Networks which are trained on a GPU. In our project, we address image retrieval under significant illumination changes, such as between day and night images, where the appearance changes dramatically. This is currently an active field of research because of its applicability for real-world tasks such as autonomous driving.
Researcher: Ondřej Dušek
Multilingual data-to-text generation with pretrained language models
Karolina GPU 1200, Karolina VIZ 40
This project will involve experiments in natural language generation based on pretrained language models [1,2], related to the NG-NLG ERC project on language generation, which is currently starting at Charles University under Ondřej Dušek's lead. We will examine several approaches to finetuning different pretrained language models to allow generating text from data in multiple languages. We will focus on the WebNLG dataset, which contains sets of knowledge-base triples from DBPedia coupled with corresponding natural language descriptions in English. The task is to generate a descriptive sentence given a set of triples. We will also experiment with machine-translated versions of the dataset: with Czech, as this is our particular focus, as well as with the low-resource languages Irish, Welsh, Breton, and Maltese, which are envisioned for a future shared task challenge. Additional techniques to explore are the use of machine-translation models, using additional large monolingual data for language modeling, or text editing approaches for postprocessing. We also plan to explore possibilities of conditioning the models with external data from knowledge graphs.
Researcher: Anton Bushuiev
Machine-learning-guided staphylokinase engineering
Karolina GPU 4000, Karolina VIZ 40
Stroke is one of the leading causes of death and disability worldwide and one of the most frequent causes of dementia and epilepsy. In this project, we leverage recently proposed machine-learning algorithms to identify thrombolytic drug candidates against stroke, which will be further tested in the laboratory within the STROKE program. In detail, the staphylokinase protein is a very attractive thrombolytic drug candidate that may be an inexpensive and safer alternative to the clinically used alteplase. Partnering with a microplasmin molecule, it promotes enzymatic degradation of fibrin clots inside blood vessels. However, the natural form of staphylokinase lacks the necessary binding affinity to its partner, limiting its clinical efficiency. Therefore, the goal of this project is to find mutations of staphylokinase that preserve its necessary properties while increasing its affinity to microplasmin. Screening the whole space of possible mutations is infeasible in a laboratory, but it can be achieved with appropriate computational methods. We will apply recently proposed machine-learning algorithms to explore the huge space of possible staphylokinase mutants and find the best drug candidates.
Researcher: Pavel Šuma
Metric learning for efficient inference at small image resolutions
Karolina GPU 2400, Karolina VIZ 40
The performance of a deep neural network typically increases with its size and computational complexity. Most research focuses on improving performance and therefore relies on models that are expensive to build and deploy. Yet there are many valid cases where the neural network has to run on a smaller device, such as a security camera or a mobile phone. The objective of our work is to alleviate the limitations of these systems while preserving their strengths. We focus primarily on the task of large-scale image search in a database based on common visual cues. Currently, our method decreases the number of model parameters by a factor of up to four while sacrificing only around 2% of the performance.
Researcher: Ivo Oprsal
Wave propagation and ambient vibrations of near-surface geological structures
Barbora CPU 2778, Barbora VIZ 40, Karolina CPU 2813, Karolina VIZ 40
Earthquakes manifest as strong ground motions that often affect structures and consequently have strong social and economic impacts. This is especially pronounced in urban areas, which every year suffer significant casualties and earthquake damage to infrastructure in many locations around the world. Strong ground motions are generated at the earthquake source, travel through the regional geology, and finally reach the local geological setting. The near-surface geology is the most critical factor for strong ground motions and damage to structures. The 3D effects of wave propagation modeled by finite differences (FD) are a predictable quantity in strong earthquake ground motion prediction (Figure 1). This allows a large number of numerically simulated ground-shaking scenarios to be conducted, representing all earthquakes potentially occurring in a given region. The synthetic data are then used to estimate earthquake impact on structures in a metropolitan area, and thus to efficiently mitigate seismic hazard via engineering anti-seismic structure design, reinforcement of existing structures, sophisticated urban planning, and disaster mitigation planning. Lower damage to structures and consequently fewer casualties are the direct economic and societal impact of these methods. Fig 1: Ground motion in Osaka due to the M7.1 1993/10/12 event: 0.00-1 Hz (left), 0.05-0.3 Hz (right). Black triangles are stations; the solid line depicts the bayshore.
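As a much-simplified illustration of the finite-difference modeling mentioned above, the standard explicit update for the one-dimensional scalar wave equation u_tt = c^2 u_xx looks like this (grid spacing, time step, and initial pulse are arbitrary placeholder values, far removed from a 3D production seismic solver):

```python
def fd_wave_step(u_prev, u_curr, c, dx, dt):
    """One explicit second-order time step of u_tt = c^2 u_xx
    with fixed (zero-displacement) end points."""
    r2 = (c * dt / dx) ** 2  # squared Courant number
    u_next = [0.0] * len(u_curr)
    for i in range(1, len(u_curr) - 1):
        # central differences in both time and space
        u_next[i] = (2.0 * u_curr[i] - u_prev[i]
                     + r2 * (u_curr[i + 1] - 2.0 * u_curr[i] + u_curr[i - 1]))
    return u_next

u_curr = [0.0, 0.0, 1.0, 0.0, 0.0]  # initial displacement pulse
u_prev = list(u_curr)               # zero initial velocity
u_next = fd_wave_step(u_prev, u_curr, c=1.0, dx=1.0, dt=0.5)
```

Stability of this explicit scheme requires the Courant condition c*dt/dx <= 1; 3D seismic codes build on the same idea with staggered grids, heterogeneous material parameters, and absorbing boundaries.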
Researcher: Katerina Ruzickova
Slope analysis from geologist view, in Poland
Barbora VIZ 40, Karolina CPU 59, Karolina VIZ 40
The project continues the earlier project Slope analysis from geologist view, which studied the relationship between soil type and terrain slope. That project confirmed the hypothesis that different soils in the Czech Republic have different compactness, which has an impact on the terrain relief. A study analogous to the previous one is now intended for Poland: this project should verify the relationship via slope analysis in geological districts.
Researcher: Dominik Legut
Magnetism at interfaces – from quantum to reality
Barbora CPU 50000, Barbora GPU 2000, Barbora VIZ 40, DGX-2 1000, Karolina CPU 30000, Karolina GPU 15000, Karolina VIZ 40
Permanent magnets are a key technology for modern society with applications in air conditioning, mobility, or power generation. In state-of-the-art permanent magnets the atomic-scale defects, like for instance in the grain boundary phase, have the most significant influence on the macroscopic properties (e.g. coercivity), but these effects are the least understood. In this project, we develop a quantitative theory of coercivity, taking into account the local atomic structure, the spatial variation of the intrinsic magnetic properties, and the physical microstructure of the magnet. To achieve this goal, we bridge the length scales between ab initio, atomistic spin dynamics, and continuum micromagnetic simulations. Atomic defects at interfaces and grain boundaries will be considered already at the smallest possible length scale and former assumptions based on bulk material properties can thus be avoided. A quantitative description of the effect of defects at interfaces and grain boundaries, inaccessible experimentally, becomes accessible via a validated multiscale coercivity model.
Researcher: Michal Hrabánek
Photo-realistic Architecture Visualization
Barbora CPU 2, Barbora GPU 26, Barbora VIZ 40, Karolina CPU 2, Karolina GPU 41, Karolina VIZ 40
An important part of showcasing a building design is a visualization of the designed building. Architecture visualization is the only way to show how the building will look, so it needs to be of high quality. A potential customer may decide whether to build the building based on its visualization. Within the project, computational resources will be used to create high-quality, photo-realistic architecture visualizations. Achieving such visualizations requires a very computationally demanding method called ray tracing. The created visualizations will be used in a bachelor thesis and as a showcase of the possible use of rendering on a supercomputer.
Researcher: Raul Chametla
Dust feedback effect on the migration of accreting low-mass planets in dusty and magnetized disks
Barbora VIZ 40, Karolina CPU 40000, Karolina FAT 0, Karolina GPU 6250, Karolina VIZ 40, LUMI-C 6250, LUMI-G 5000
The direction of planetary migration is governed by the total torque exerted on the planet by the disk of gas and dust. However, for models of isothermal disks with realistic profiles, the disk’s torque is generally negative, which leads to a decrease of the planet’s orbital radius with time. The main problem with a negative total torque is that low-mass planets should fall toward the central star in much less time than the lifetime of the disk. One of the most promising mechanisms to stop inward migration is thermal diffusion in a gaseous disk. The aim of this project is to investigate the feedback effect of the dust on the total torque felt by a planet embedded in a protoplanetary disk of gas and dust, using three-dimensional (3D) high-resolution hydrodynamic (HD) and magnetohydrodynamic (MHD) multifluid simulations. The novelty of our research is that, in addition to including thermal diffusion, it also includes the back-reaction of dust on gas and the density perturbations generated by a set of background spiral waves induced by a net vertical magnetic field in the disk. The results of our research can therefore serve to better understand the migration of low-mass planets, since our models are more realistic than those previously reported in the literature.
Researcher: Sergiu Arapan
Ab initio calculation of volume magnetostriction via magnetically constrained supercells
Barbora VIZ 40, Karolina CPU 24000, Karolina GPU 10000, Karolina VIZ 40
The spontaneous volume magnetostriction ωs is a fundamental property of magnetic materials, defined as the fractional volume change between the magnetically ordered and paramagnetic (PM) states below the Curie temperature. This property is responsible for interesting features such as anomalies in the thermal expansion coefficient of magnetic materials and is exploited in many commercial applications, including precision machine tools, lead frames for integrated circuits, thermostats, astronomical telescopes, seismographic devices, and laser light sources. The experimental and theoretical study of ωs is a nontrivial task due to the difficulties in characterizing the equilibrium volume of a hypothetical PM-like state. In this study, we estimate the equilibrium volume of the PM state of cubic Fe, Co, and Ni by performing magnetically constrained supercell calculations.
Researcher: Pierre Koleják
Highly efficient terahertz spintronic emitters
Barbora VIZ 40, Karolina CPU 1000, Karolina VIZ 40
Terahertz technologies find many applications across medical imaging, security screening for drugs, explosives, and weapons, ultrafast telecommunications, and quality diagnostics of integrated micro-circuits, food, and manufacturing. Nowadays, the main barrier to widespread application is the acquisition cost of such technologies. Although many terahertz sources have already been developed, recent progress in terahertz spintronics is changing the rules of the game thanks to high versatility, low-cost fabrication, and only slightly lower efficiency than standard sources. The efficiency of such a spintronic device can be drastically enhanced by integration with another structure. Here, we apply photonic and plasmonic structures designed to maximize the optical-to-terahertz conversion by engaging electrons and modifying the field distribution.
Researcher: Jaroslav Resler
PALM simulations for the project TURBAN
Barbora VIZ 40, Karolina CPU 90000, Karolina VIZ 40
The newly developed urban climate model (UCM) PALM (www.palm4u.org) is the first complex UCM based on the large-eddy simulation approach. It enables detailed simulations of conditions in urban areas, mainly with respect to thermal comfort and air quality. Our long-term, significant contributions to the model’s development and validation allow us to use it efficiently and reliably for assessing urban climate adaptation measures. The goals of the TURBAN project (Turbulent-resolving urban modelling of air quality and thermal comfort, Norway Grants) include further improvements of the model and its configurations, model validation, its application to the assessment of urban development scenarios, comparison with simpler models, and the fusion of modelled data with observations. As the model is computationally demanding, these simulations require an amount of parallel computing power that exceeds the usual in-house resources. The supercomputing facilities at IT4I and Sigma2 (Norway) allow us to meet this challenge efficiently. This multi-year proposal follows our previous IT4I standard project (OPEN-24-33), whose purpose was to prepare, test, and tune the needed model configurations in the Karolina environment.
Researcher: Kryštof Mráz
Fluid Flow Simulations in Complex Computational Domains
Barbora VIZ 40, Karolina CPU 5080, Karolina VIZ 40
Porous structures and products with complex inner geometry remain a great challenge for conventional CFD (computational fluid dynamics). Unlike classical CFD methods (e.g., the finite volume method), the lattice Boltzmann method has proven to be a promising option for such complex computational domains. The aim of this project is to utilize the lattice Boltzmann method for the numerical simulation of flow through hollow-fiber heat exchangers. These heat exchangers contain hundreds or thousands of hollow fibers with an outer diameter of approx. 1 mm. Such complex geometry has made it impossible to simulate the heat exchanger with conventional CFD. However, a comprehensive numerical simulation of the whole heat exchanger is highly desirable, because it would fill the gap between rather simplified analytical models and empirical experiments. The local and explicit nature of the lattice Boltzmann method makes it highly suitable for massive parallelization and high-performance computing.
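The locality the abstract highlights is easy to see in code. Below is a minimal D2Q9 BGK sketch in Python with NumPy, assuming a tiny periodic grid and no fiber walls; it illustrates the structure of the method only, not the project's solver.

```python
import numpy as np

# Minimal D2Q9 lattice Boltzmann (BGK) step on a periodic grid.
# Illustrative only: real hollow-fiber simulations need wall boundary
# conditions and a production-grade parallel implementation.

# D2Q9 lattice velocities and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for each of the 9 directions."""
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return rho * w[:, None, None] * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def lbm_step(f, tau=0.8):
    """One collision-and-streaming step; both are local, hence easy to parallelize."""
    rho = f.sum(axis=0)
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho
    f = f + (equilibrium(rho, ux, uy) - f) / tau          # BGK collision
    for i in range(9):                                    # streaming (periodic)
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    return f

# tiny 16x16 demo: start from a small density bump at rest
rho0 = np.ones((16, 16)); rho0[8, 8] += 0.1
f = equilibrium(rho0, np.zeros((16, 16)), np.zeros((16, 16)))
for _ in range(10):
    f = lbm_step(f)
print(abs(f.sum() - rho0.sum()) < 1e-10)  # total mass is conserved
```

Both collision and streaming touch only a cell and its immediate neighbors, which is exactly the property that makes domain partitioning across many cores straightforward.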
Researcher: Jan Brandejs
The role of quantum electrodynamics in heavy-element chemistry
Barbora VIZ 40, Karolina CPU 11328, Karolina FAT 50, Karolina GPU 3704, Karolina VIZ 40, LUMI-C 3079, LUMI-G 2920
Quantum chemical calculations are today in a position where they not only assist, but may also challenge, experiment, at least for molecules containing only light atoms. Once heavy atoms are present, achieving the same accuracy becomes challenging, largely due to relativistic effects. When surveying the physics that has to be included for a reliable description of such systems, quantum electrodynamics (QED) should be considered. Studies so far indicate that QED effects (electron self-energy and vacuum polarization) reduce relativistic effects by about 1%. However, such investigations have been limited to valence properties, since there are currently no reliable tools for general molecules to study the core region, where QED effects are generated. The goal of the project is to compute molecular properties accurately for the core region using recently implemented effective QED potentials.
Researcher: Martin Matys
Laser-driven ion acceleration using multi-layer targets
Barbora VIZ 40, Karolina CPU 5800, Karolina VIZ 40
Laser-plasma ion accelerators are currently receiving particular scientific attention as a promising source of accelerated charged particles, since they can generate much stronger electric fields than conventional accelerators. The use of multi-layer targets of heavy and light materials is often suggested to reach the light-ion energies required for several impressive applications, including proton therapy for cancer treatment, nuclear fusion, production of PET (positron emission tomography) medical isotopes, generation of ultrashort neutron pulses, and radioisotope sources. In this project, we will investigate the application of this type of target in interaction with current state-of-the-art PW-class laser systems, such as the L3 laser at ELI Beamlines, via computationally demanding 3D and 2D particle-in-cell (PIC) simulations. The application of a plasma shutter in this scenario, which shapes the laser pulse profile and consequently improves the properties of the accelerated ions, will also be investigated.
Researcher: Dominika Mašlárová
Multi-MeV ring-shaped electron beams from a laser wakefield accelerator
Barbora VIZ 40, Karolina CPU 6563, Karolina VIZ 40
Laser-based electron accelerators represent a promising concept for next-generation accelerators. Their main advantage is a remarkably short acceleration length. These accelerators therefore offer a more compact and cheaper option than conventional accelerators, improving access to electron accelerators in research, medical, and industrial facilities. In one of the most popular laser-based methods, called laser wakefield acceleration, electrons are injected into a plasma wave (wakefield) generated and dragged by a few-tens-of-fs, ultra-intense laser pulse in optically transparent plasmas. The plasma wave consists of several periods which, in the case of high-intensity lasers, form a train of ion cavities while propagating through the plasma. Each cavity contains a wave phase that is simultaneously accelerating and focusing. As a consequence, the electron beam can be well collimated on the axis of laser pulse propagation. In addition, however, the nonlinear features of laser-plasma interaction can generate and accelerate an electron beam with a transverse ring shape. Ring-shaped electron beams can be used to generate X-ray pulses with specific features or to collimate other on-axis particle beams of ions and positrons. Understanding the formation of such ring electron beams is therefore crucial for further applications, and it can be gained by carrying out an extensive computer simulation study.
Researcher: Sviatoslav Shekhanov
Nonlinear laser absorption under high-energy-density conditions
Barbora CPU 8000, Barbora VIZ 40, Karolina CPU 2100, Karolina VIZ 40
With the advent of a new generation of powerful lasers, the study of high-energy-density physics has attracted widespread interest. One of the applications that has contributed strongly to the development of high-energy-density physics is inertial confinement fusion (ICF). An important topic in ICF research is the effects caused by various parametric instabilities generated by the interaction of an intense electromagnetic wave with a hot dense plasma. It is important to control the interplay between scattering instabilities (stimulated Brillouin and Raman scattering) and laser absorption due to the two-plasmon decay (TPD) instability and particle collisions. Achieving efficient absorption of the laser spike in the shock ignition scheme is a serious and unresolved issue. The laser pulse propagates through a long, hot plasma corona created by the preceding compressing laser pulse; collisional absorption is very low in such plasmas, and collective absorption related to the excitation of parametric instabilities could be important. However, it is not known how much energy can be absorbed and reflected, and how the absorbed energy is distributed between thermal and suprathermal (hot) particles. The subject of this work is to investigate the processes of nonlinear laser propagation and absorption in a hot dense plasma under conditions relevant to ICF.
Researcher: Dagmar Zaoralová
Machine learning for vacancy formation probability prediction in nitrogen-doped graphene – data collection for algorithm training
Barbora VIZ 40, Karolina CPU 120000, Karolina VIZ 40
Nitrogen-doped graphenes (NGs) are promising materials for a variety of applications thanks to their high electric conductivity, large specific surface area, and good thermal and chemical stability. Namely, NGs are applicable in catalysis, supercapacitors, fuel cells, lithium-ion batteries, spintronics, electromagnetic devices, and molecular sensors. The properties of NGs are tunable by the number, composition, and structure of defects and vacancies. However, designing ‘by hand’ all shapes and sizes of vacancies that may occur in a real NG sample is extremely demanding, if not impossible. Fortunately, the current development of artificial intelligence, more precisely of machine learning, offers tools to tackle this issue. In order to make accurate predictions of NG vacancy stability, it is necessary to collect training data that include systems in their ground states as well as in states away from the energy minima. Therefore, in this first step, we plan to run density functional theory (DFT) calculations of a large number of NG supercells containing defects to collect a sufficient amount of training data. We will apply the Vienna Ab initio Simulation Package (VASP) for calculations under periodic boundary conditions. We believe that the developed algorithm will help with large-scale screening of the plethora of possible structures that may be present in real NG samples for further research.
Researcher: Thibault Derrien
FIrst-principLe Investigation of elliPtical excitatIon and plasmoNic properties of laser-irradiated materiAlS (FILIPINAS)
Barbora VIZ 40, Karolina CPU 105000, Karolina VIZ 40
This proposal will support the EU Horizon 2020 RISE project “ATLANTIC” No. 823897, which aims to combine several theoretical formalisms in order to improve prediction capabilities for the development of applications based on intense laser processing of solids. In this five-year EU project (2019-2024), the HiLASE Centre (FZU, Dolni Brezany) is driving the effort to describe the excitation of electrons in various laser-irradiated materials, along with the transient change of the optical properties of these materials. By transferring insights gained from available first-principles microscopic descriptions [acquired from time-dependent density functional theory (TDDFT)] to large-scale approaches, the project guides the invention of novel uses of intense laser light for modifying and functionalizing bulk and nano-materials. In this context, the present HPC project focuses on the excitation of electrons by elliptical states of light polarization in semiconductors and on the transient optical properties of nanomaterials. Within the framework of ATLANTIC, this project will also support the training of young researchers in advanced theoretical techniques adapted to problems encountered in the engineering field of laser processing.
Researcher: Jan Schee
Imprint of non-GR theories on gravitational waveform
Barbora CPU 4800, Barbora VIZ 40, Karolina CPU 5200, Karolina VIZ 40
Numerical relativity (NR) is one of the active areas of research in relativistic astrophysics, used to investigate a large variety of phenomena of astrophysical interest. General relativity (GR) has been spectacularly successful in describing many astrophysical events in the weak-field regime, but a strong-field test of gravity is yet to be performed. To contribute in this direction, we plan to investigate the signatures of theories of gravity other than Einstein’s GR on the observed gravitational waveform. In doing so, we will study extensively the outcomes of numerical relativity simulations incorporating different gravity theories. In particular, we will focus on the imprint of the various theories of gravity on the gravitational waveform.
Researcher: Jan Kotek
Numerical simulations above a sunspot
Barbora VIZ 40, Karolina CPU 8300, Karolina VIZ 40
Processes in the solar atmosphere are of high importance for the safety of spacecraft and astronauts and, in the case of severe storms, also for power grids and electrical appliances on Earth. One of the main sources of our knowledge about the solar atmosphere is the propagation of small disturbances, such as magnetohydrodynamic waves, within it. By comparing observations and theoretical calculations, we can learn about the nature of physical processes on our nearest star. The problem is that these disturbances are only small deviations from local equilibrium, and they can be difficult to model on a very dynamic background. To circumvent this problem, we use a novel 3D model of the solar atmosphere above a sunspot that is more realistic than contemporary stationary models.
Researcher: Ota Bludsky
Boron-containing catalysts for alkanes oxidative dehydrogenation
Barbora VIZ 40, Karolina CPU 25000, Karolina VIZ 40
The utilization of significant reserves of natural and shale gas to meet the ever-increasing demand for olefins has emerged as a sustainable alternative to oil cracking processes. A promising way to convert relatively inert light alkanes into the corresponding olefins is their oxidative dehydrogenation. Recently, high activity and selectivity of dispersed forms of boron oxide (BOx) formed on the oxidized surface of a hexagonal boron nitride catalyst were reported. However, the nature of the active centers and their stability on different inorganic supports is not yet well understood. The selectivity and stability of the individual types of dispersed boron oxide species on various inorganic supports will be investigated in the oxidative dehydrogenation of light alkanes, including ethane, propane, and n- and iso-butane. This will allow an improved design of selective catalysts for the conversion of light alkanes into the corresponding olefins.
Researcher: Dagmar Zaoralová
Nitrate to ammonia conversion using PCN-250 Fe3 MOF as a catalyst
Barbora VIZ 40, Karolina CPU 10000, Karolina VIZ 40
The overpopulation of Earth, the declining availability of fossil fuels, and increasing energy demands call for clean, secure, and renewable energy sources. Ammonia is a fundamental substance for the production of fertilizers and pharmaceuticals, for refrigeration, textiles, etc. It is also a green fuel with high energy density and high hydrogen capacity. Electrochemical reduction of nitrate is a prospective route to ammonia production that could minimize energy demand while converting nitrate, a harmful pollutant, into value-added products. The reduction of nitrate to ammonia (NRA) proceeds through multiple reaction pathways and in a potential region where competing processes, such as the evolution of hydrogen or nitrogen gas, may occur. Therefore, it is necessary to design a specific electrocatalyst that suppresses the competing processes and facilitates the NRA. Metal-organic frameworks (MOFs) have been reported as promising catalysts for NRA; however, their real potential is still rather unexplored. We therefore plan to explore the energy profiles of the NRA and of possible side reactions catalyzed by the PCN-250-Fe3 MOF to support experimental observations. To this end, we plan to utilize program packages for calculations on cluster models, namely Gaussian. We believe that this study will be a significant contribution to understanding the catalytic activity of MOF materials and that it will open new possibilities for tailoring the properties of MOFs applicable in electrocatalysis.
Researcher: Martin Jirka
Deciphering the role of an external magnetic field on laser-plasma interaction
Barbora VIZ 40, Karolina CPU 11100, Karolina VIZ 40
The production of safe and clean energy is one of the main challenges of the energy crisis and climate change. Inertial confinement fusion (ICF) driven by lasers could be a potentially viable solution. In August 2021, the National Ignition Facility in the USA ignited a fusion target for the first time, bringing fusion energy much closer to reality. However, the energy gain has to be increased further by about two orders of magnitude for feasible energy production. A strong external magnetic field has been proposed to reduce the requirements on fuel confinement and to allow increased fuel burn-up and higher energy gain. Such a field can already be realized, and the first experiments are being performed with promising results. Nevertheless, the influence of a strong external magnetic field on the interaction and absorption of an intense laser beam in a plasma target has not been fully explored yet. Our project aims to help decipher this influence via large-scale kinetic numerical simulations for realistic parameters of experiments foreseen at the PALS laser facility in Prague. We will investigate the dependence of the absorption process on the magnetic field direction and strength in the range up to 100 Tesla. The project is expected to shed new light on laser-plasma interaction in ICF-relevant magnetized plasmas and thus contribute significantly to the effort to increase the fusion gain and make it feasible for energy production.
Researcher: Zdeněk Mašín
Attosecond photoionization dynamics in molecules
Barbora CPU 15600, Barbora FAT 800, Barbora VIZ 40, Karolina CPU 28125, Karolina FAT 1440, Karolina VIZ 40
Photoionization of molecules is an ultrafast process which takes place on attosecond timescales, while the subsequent nuclear dynamics that it triggers unfolds on femtosecond timescales. Progress in the development of laser systems capable of taking “snapshots” of these processes allows us to obtain detailed insight into the role of electron-electron interaction and coupled electron-nuclear motion, and thus provides a pathway to their control. In our project, we will exploit our recently developed codes and theory to perform highly accurate ab initio calculations of attosecond time delays and time-resolved photoelectron spectroscopy to support our collaborations with experimental groups. Our ultimate goal is to resolve the fundamental components of molecular photoionization dynamics in order to understand the initial steps of photochemical reactions in the continuum. We will focus on molecules ranging from water and simple polyatomics to more complex halogenated compounds.
Researcher: Jan Boháček
Inverse Heat Conduction Problem in complex geometries
Barbora VIZ 40, Karolina CPU 2300, Karolina VIZ 40
Metallurgical processes rely on numerical simulations of solidification and heat transfer. A successful simulation necessitates knowledge of appropriate boundary conditions. Herein, the topic concentrates on the reconstruction of thermal boundary conditions, a task known as solving the inverse heat conduction problem (IHCP). Temperature is known from one or more thermocouples at specific locations in the solid. Using this information, the heat transfer coefficient (HTC) or the heat flux on the surface of the solid is recalculated. In the Heat Transfer and Fluid Flow Laboratory, well-established in-house software has been in use for many years. The algorithm combines the Downhill Simplex optimization method with the Sequential Function Specification Method. The line-by-line method is used for the calculation of heat conduction, which constitutes the direct part of the IHCP. In spite of the very good results achieved with the present software, important drawbacks have been identified. Models are restricted to structured orthogonal meshes and thus often suffer from high-aspect-ratio volume elements. The solver of the system of linear equations converges slowly due to the dimensional splitting employed. As a result, IHCPs with coupled heat transfer (lateral fluxes), more thermocouples, highly temperature-dependent properties, or phase change cannot be efficiently solved with the present software. Therefore, we propose to port it to OpenFOAM, an open-source CFD code.
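To illustrate the inverse idea in its simplest form, the following Python sketch recovers a constant surface heat flux in a 1D rod from a single interior "thermocouple" reading by brute-force search over candidate fluxes. The grid, material constants, and search strategy are invented for illustration and do not reflect the laboratory's SFSM/Downhill Simplex implementation.

```python
# Toy 1D inverse heat conduction: recover a constant surface heat flux
# from an interior "thermocouple" reading.

def forward(q, n=20, steps=200, alpha=0.1, dx=1.0, dt=1.0):
    """Explicit FD solution of 1D heat conduction with flux q at the left face.
    Returns the temperature at node 5 (the 'thermocouple' location)."""
    T = [0.0] * n
    r = alpha * dt / dx**2          # stability requires r <= 0.5
    for _ in range(steps):
        Tn = T[:]
        for i in range(1, n - 1):
            Tn[i] = T[i] + r * (T[i-1] - 2*T[i] + T[i+1])
        Tn[0] = Tn[1] + q * dx      # flux boundary condition (conductivity k = 1)
        Tn[-1] = 0.0                # far end held at ambient temperature
        T = Tn
    return T[5]

q_true = 2.5
T_meas = forward(q_true)            # synthetic "measured" temperature

# inverse step: scan candidate fluxes and keep the one with the best fit
q_est = min((x * 0.01 for x in range(0, 501)),
            key=lambda q: (forward(q) - T_meas) ** 2)
print(q_est)  # close to 2.5
```

A real solver replaces the brute-force scan with an optimizer (e.g. Downhill Simplex) and estimates a time-varying flux sequentially, but the structure — repeated forward solves driven by a misfit to measured temperatures — is the same.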
Researcher: Jakub Šístek
Multilevel domain decomposition for accelerated incompressible flow simulations
Barbora VIZ 40, Karolina CPU 5000, Karolina GPU 4000, Karolina VIZ 40
The main aim of the project is to perform high-resolution computational fluid dynamics simulations of prototype problems in incompressible viscous flow. The primary goal of these simulations is to generate high-resolution 3D data with vortical structures, which will subsequently be used for the development of new methods for flow-field analysis and vortex identification and visualization. Unsteady flows on very fine computational meshes are required for this purpose. The computations will be performed using an in-house parallel finite element solver based on the pressure correction method and multilevel domain decomposition, with the aid of an existing parallel implementation of the method in the open-source BDDCML library. A subsequent goal of the project is further development of the computational method and optimization of the BDDCML library for large numbers of computer cores combined with GPU accelerators.
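As a toy illustration of the domain-decomposition idea (not of BDDC itself, which is a non-overlapping multilevel method with coarse-space corrections), the following Python sketch solves a 1D Poisson problem by alternating exact solves on two overlapping subdomains; the problem size and subdomain split are arbitrary.

```python
import numpy as np

# Toy overlapping Schwarz iteration for -u'' = 1 on (0, 1), u(0) = u(1) = 0,
# discretized by central differences and split into two overlapping subdomains.

n = 39                             # interior grid points, h = 1/(n+1)
h = 1.0 / (n + 1)
A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2
f = np.ones(n)
u_direct = np.linalg.solve(A, f)   # reference global direct solve

dom1 = slice(0, 24)                # overlapping index sets
dom2 = slice(16, n)
u = np.zeros(n)
for _ in range(30):                # alternating Schwarz sweeps
    for d in (dom1, dom2):
        r = f - A @ u                            # global residual
        u[d] += np.linalg.solve(A[d, d], r[d])   # exact local correction
print(np.max(np.abs(u - u_direct)) < 1e-8)
```

Each local correction solves the subdomain problem exactly with Dirichlet data taken from the current iterate; convergence speed depends on the overlap, and scalable methods like BDDC add a coarse problem so the iteration count stays bounded as the number of subdomains grows.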
Researcher: Marketa Paloncyova
Interaction of bacterial membrane with carbon nanomaterials: a multiscale simulation study
Barbora VIZ 40, Karolina CPU 18600, Karolina GPU 2400, Karolina VIZ 40
Carbon nanomaterials (CNs) are promising tools for both medical therapy and diagnostics. They are largely biocompatible with low toxicity; however, the mechanism of their interaction with cell membranes is not fully resolved. Interaction with cell membranes, and their crossing, plays a role in all biological pathways. In this proposal, we will use atomistic and coarse-grained molecular dynamics (MD) simulations to describe the interactions of CNs with lipid membranes. We will pay special attention to carbon dots and graphene derivatives, and to the differences between their possible partner membranes, i.e., mammalian and bacterial membranes. Our simulations will show the thermodynamic effects of individual CN functional groups on the mutual CN-membrane interactions. Further, we will describe the global behavior of CNs during membrane permeation, focusing on endocytosis and mechanical membrane disruption. The proposed project will require developing coarse-grained parameters and simulation protocols, and the gained insight can be used for tuning CNs with desired properties, e.g., antimicrobial activity.
Researcher: Petr Valenta
Laser-driven electron acceleration at kHz repetition rate
Barbora VIZ 40, Karolina CPU 19000, Karolina VIZ 40
Laser-wakefield acceleration, proposed in 1979, is already a well-established technique for producing high-energy electron beams in a plasma medium using lasers. Within this concept, a relativistically intense laser pulse propagating in underdense plasma induces a strong longitudinal electric field which, in turn, accelerates duly injected electrons. One of the primary advantages of this concept lies in the fact that ionized plasmas can sustain electric fields several orders of magnitude larger than those in conventional radio-frequency accelerators, allowing one to substantially reduce the acceleration length and the total cost of the device. However, the quality of electron beams produced by present-day laser-wakefield accelerators has to be further improved in order to replace conventional accelerators in many interesting industrial and medical applications. A huge improvement in the application potential could be achieved by using high-repetition-rate (≈kHz) laser systems. One such laser, Allegra at ELI Beamlines, Czech Republic, has recently come into operation, and the first experiments on electron acceleration indicate very promising electron-beam parameters. Within the scope of this project, we plan to use the supercomputer for numerical simulations capturing the physical mechanisms behind these experiments. The simulation results will then help to optimize the parameters of electron beams for selected practical applications.
Researcher: Jan Mičan
Structural dynamics-based design of compounds targeting Alzheimer's Disease
Barbora VIZ 40, Karolina CPU 62500, Karolina GPU 168, Karolina VIZ 40
Alzheimer’s disease (AD) is the seventh leading cause of death, accounting for 355 billion USD in US healthcare costs alone; an additional 256 billion USD in care value is contributed by the families of the victims. As the number of AD victims grows, these figures are projected to quadruple by the year 2050. Many modes of therapy have failed because they target mere markers of the disease and not its cause: cytotoxic amyloid beta (Aβ) oligomers. Aβ belongs to the intrinsically disordered proteins, whose dynamic behavior challenges traditional structural biology and its methods of drug design. Here we propose a new combination of chemically and computationally accelerated, adaptively sampled molecular dynamics simulations. The simulations will be analysed by a state-of-the-art machine-learning-guided method, VAMPnet, which can provide information on conformational macrostates, their populations, and the kinetics of exchange. We have already successfully employed molecular dynamics and adaptive sampling in the study and design of various enzymes. Our aim is to screen halogenated derivatives of a small molecule named 10074-G5 to alter the dynamics of the Aβ peptide, in collaboration with the University of Cambridge (UK). This research will provide: (I) small molecules blocking Aβ oligomer formation, and thus one of the causes of AD, (II) a better understanding of the Aβ structure and behaviour, and (III) new horizons for structure-based drug design built on molecular dynamics.
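The macrostate populations and exchange kinetics mentioned above are the standard outputs of a Markov state model. The Python sketch below shows that pipeline in miniature: count transitions along a discretized trajectory, row-normalize to get a transition matrix, and read off populations from its stationary distribution. VAMPnet learns the state assignment itself; here the two-state model and trajectory are synthetic, invented for illustration.

```python
import numpy as np

# Toy Markov state model (MSM) estimation on a synthetic 2-state trajectory.
rng = np.random.default_rng(0)
T_true = np.array([[0.95, 0.05],     # state 0 is the more stable conformation
                   [0.20, 0.80]])

# generate a synthetic discrete trajectory from the true model
traj = [0]
for _ in range(20000):
    traj.append(rng.choice(2, p=T_true[traj[-1]]))

# estimate the transition matrix from transition counts
C = np.zeros((2, 2))
for a, b in zip(traj[:-1], traj[1:]):
    C[a, b] += 1
T_est = C / C.sum(axis=1, keepdims=True)

# stationary distribution = left eigenvector of T_est with eigenvalue 1
w, v = np.linalg.eig(T_est.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi /= pi.sum()
print(pi)  # close to the true populations [0.8, 0.2]
```

Relaxation timescales (the exchange kinetics) follow from the remaining eigenvalues of the transition matrix; production analyses do the same on hundreds of learned macrostates.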
Researcher: Alexander Molodozhentsev
Plasma channel formation and laser wake-field acceleration in a preformed channel
Barbora VIZ 40, Karolina CPU 15500, Karolina VIZ 40
A plasma-based acceleration scheme, in which particles are accelerated by a space charge wave, was proposed by Y. Fainberg in 1956. This approach makes it possible to overcome one of the most significant limitations of conventional accelerators: the limited electric field gradient. Extreme laser-plasma accelerating gradients, demonstrated experimentally by different teams, offer a path towards a compact laser-plasma accelerator (LPA), which can be used as the electron beam driver needed in a broad variety of applications, including free electron lasers (FEL) and even electron-positron colliders. The goal of the project is to model plasma formation in a sapphire capillary and then combine the preformed plasma with a laser pulse to perform laser wakefield acceleration (LWFA). Combining the two models will allow us to predict the optimum set of plasma and laser parameters needed to obtain an LWFA electron beam suitable for a laser-driven compact FEL. For this purpose, large-scale 3D magnetohydrodynamic (MHD) simulations in combination with large-scale 2D and 3D particle-in-cell (PIC) simulations must be performed, which requires significant computing power and CPU time available only on supercomputers. The simulation results will be used at ELI Beamlines (IoP CAS) during experimental campaigns and will form part of the ELI Beamlines contribution to the EuPRAXIA preparatory project, recently accepted by the EU (HORIZON-2022).
Researcher: Christian Sippl
Harvesting seismic waveform data for microseismicity with deep learning approaches - pt. 2: Benchmarking phase associators (Follow-up to OPEN-24-76)
Barbora VIZ 40, Karolina GPU 3000, Karolina VIZ 40
The strongest and most devastating earthquakes occur along subduction zones; a detailed understanding of the processes involved in the buildup of future large subduction earthquakes can therefore have high social and economic significance. Recent studies have shown that detailed observations of many thousands of small earthquakes (“microseismicity”) on the interplate contact as well as in the downgoing plate can yield critical insights into the current state of the subduction system. We plan to conduct a comparative study of four subduction regions by harvesting large amounts of available raw seismic data for microseismicity using an automated, deep-learning-based approach. For this, we will combine existing, recently published algorithms for seismic arrival-time picking and phase association into a single automated workflow. The retrieved large catalogs of small earthquakes (expected to contain hundreds of thousands of events) will form the basis for several lines of further analysis, including seismic tomography, statistical seismology, and combined inversion with GPS data. All of these lines of research are aimed at characterizing the different plate margins and at understanding which parameters (e.g., age and temperature of the incoming plate, subduction angle) may govern the differences in the observed characteristics between subduction zones.
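The phase-association step can be caricatured in a few lines of Python: back-project each arrival-time pick to a candidate origin time and group picks whose implied origins agree. The station travel times, tolerance, and picks below are invented for illustration; production associators that this project benchmarks work with 3D locations, velocity models, and noisy deep-learning picks.

```python
# Toy phase associator: group (station, arrival_time) picks into events
# by their implied origin times. Travel times here are hypothetical.

travel_time = {"ST01": 4.0, "ST02": 6.5, "ST03": 9.0}  # seconds, assumed known

def associate(picks, tol=1.0, min_picks=2):
    """Cluster picks whose back-projected origin times agree within tol."""
    origins = sorted((t - travel_time[sta], sta, t) for sta, t in picks)
    events, current = [], [origins[0]]
    for o in origins[1:]:
        if o[0] - current[-1][0] <= tol:
            current.append(o)          # same candidate event
        else:
            if len(current) >= min_picks:
                events.append(current) # enough picks: declare an event
            current = [o]
    if len(current) >= min_picks:
        events.append(current)
    return events

picks = [("ST01", 104.1), ("ST02", 106.4), ("ST03", 109.2),   # event near t = 100
         ("ST01", 204.0), ("ST03", 209.1),                    # event near t = 200
         ("ST02", 150.0)]                                     # lone noise pick
events = associate(picks)
print(len(events))  # 2 associated events; the lone pick is discarded
```

The `min_picks` threshold is what separates real events from isolated false picks, which is exactly the trade-off (catalog completeness versus false detections) that a benchmarking study of associators quantifies.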
Researcher: Armit Sarmah
Photoenergetic Materials Design on the Basis of Quantum Chemical Simulation and Materials Informatics
Barbora CPU 4200, Barbora VIZ 40, Karolina CPU 1600, Karolina VIZ 40
Now more than ever, the pressing need for secure, green, and sustainable energy calls for solutions incorporating renewable energy sources. Demand for solar power for electricity generation, catalysis, and other renewable-energy applications has grown exponentially and has become the driving force behind research on photo-driven processes and the development of light-harvesting materials. This project aims to develop a conceptual understanding of the electronic modulations of photo-active materials after light irradiation, including photo-induced chemical reactions, charge-carrier generation, and transport in semiconductors and metals. The primary goal is to optimize materials for catalytic and solar applications through a methodical approach in which each stage involves a different level of theory, maximizing the discovery rate while maintaining appropriate accuracy. Better models and a greater understanding of these processes, such as electronically excited states in conjunction with chemical reactivity, are essential for bringing current knowledge to a completely new level.
Researcher: Zdeněk Futera
Conductance of redox protein junctions in aqueous solution
Barbora CPU 12408, Barbora VIZ 40, Karolina CPU 13959, Karolina VIZ 40
The recent development of detailed single-molecule probe techniques such as electrochemical scanning tunneling microscopy (EC-STM) has enabled measurements of single redox-protein conductances at solvated metal-electrode interfaces. However, the measured currents are unexpectedly high, and the underlying charge-transport mechanism is not fully understood. The experimental data suggest that electronic charge tunnels through the protein junction between the electrode and the STM tip. Nevertheless, it is not clear how this can happen efficiently in macromolecules as large and flexible as proteins. To investigate the mechanism, we perform large-scale density-functional theory (DFT) simulations of redox-protein junctions in aqueous solution to elucidate the formation of conduction channels and the interaction of the protein with the metal contacts. In this way, we study the conductive properties of azurin and the small tetraheme cytochrome (STC), as examples of Cu- and Fe-containing redox proteins on which EC-STM measurements were recently performed.
Researcher: Martin Zelený
Ab initio study of exchange interactions at planar defects in Ni-Mn-Ga alloys
Barbora FAT 624, Barbora VIZ 40, Karolina CPU 24960, Karolina VIZ 40
Magnetic shape memory (MSM) alloys have large application potential in actuators, sensors, energy harvesters, and magnetic refrigeration systems thanks to the extraordinary properties of their multiferroic martensite structure. The macroscopic deformation of such materials in an external magnetic field is caused by the motion of highly mobile twin boundaries (TBs) in magnetically ordered martensite. The twin behavior can be further modified by the particular magnetic domain structure and the presence of antiphase boundaries (APBs). Understanding exchange interactions at the atomic level is key to revealing the underlying physics and chemistry behind the observed magnetic properties, which will allow further improvement of currently used materials. The values of the exchange-interaction parameters Jij can be obtained from first-principles calculations and subsequently used in simulations of extended systems and/or excited states at nonzero temperatures. Information about Jij can be found in the literature for many bulk magnetic materials; however, information about Jij in the vicinity of TBs and APBs is very limited. Within the current project, we propose calculating the parameters Jij for selected TBs and APBs in the Ni-Mn-Ga alloy, the most promising MSM material.
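The parameters Jij typically enter a classical Heisenberg Hamiltonian of the form commonly used in such atomistic spin simulations (a standard model, given here for illustration):

```latex
H = -\sum_{i \neq j} J_{ij}\, \mathbf{e}_i \cdot \mathbf{e}_j ,
```

where $\mathbf{e}_i$ is the unit vector along the magnetic moment at site $i$ and positive $J_{ij}$ favours ferromagnetic alignment. Mapping the $J_{ij}$ from first-principles total energies then allows finite-temperature simulations of extended systems containing the boundaries of interest.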
Researcher: Jiří Jaroš
Modeling of Low Intensity Focused Ultrasound Using Convolutional Networks II
Barbora CPU 5000, Barbora FAT 10, Barbora GPU 250, Barbora VIZ 40, Karolina CPU 1000, Karolina FAT 10, Karolina GPU 750,
Karolina VIZ 40, LUMI-C 100, LUMI-G 200
Transcranial low-intensity focused ultrasound (LIFU) therapy is increasingly used for the non-invasive treatment of brain disorders. However, conventional numerical wave solvers are currently too computationally expensive to be used online during treatments to predict the acoustic field passing through the skull. As a step towards real-time predictions, we developed a fast iterative solver for the heterogeneous Helmholtz equation in 3D using a fully learned optimizer. The lightweight network architecture is based on a modified UNet that includes a learned hidden state. The network is trained using a physics-based loss function and a set of idealized sound-speed distributions with fully unsupervised training (no knowledge of the true solution is required). The learned optimizer shows good performance on the test set and is capable of generalizing well outside the training examples, including to much larger computational domains and to more complex source and sound-speed distributions, for example those derived from X-ray computed tomography images of the skull. The aim of this project is to introduce a fully heterogeneous description of the skull into the model and to reduce the computational requirements, which still exceed real-time constraints for inference and reach days for training.
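The "physics-based loss" idea can be illustrated with a minimal sketch: the network output is penalized by the discrete residual of the Helmholtz equation, so training needs no reference solution. A 1D finite-difference version (NumPy, illustrative only; the project's solver is 3D and heterogeneous) might look like:

```python
import numpy as np

def helmholtz_residual(u, k, h):
    """Discrete residual of (d^2/dx^2 + k^2) u at the interior points of a
    uniform 1D grid with spacing h, using the 3-point Laplacian."""
    lap = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
    return lap + k**2 * u[1:-1]

def physics_loss(u, k, h, f):
    """Mean squared Helmholtz residual against a source term f; a network
    trained on this loss never sees the true solution."""
    r = helmholtz_residual(u, k, h) + f[1:-1]
    return float(np.mean(r**2))

# Sanity check: u = sin(kx) solves the homogeneous equation exactly, so its
# residual vanishes up to discretization error; a wrong field does not.
k = 2.0
x = np.linspace(0.0, np.pi, 401)
h = x[1] - x[0]
zero_src = np.zeros_like(x)
good = physics_loss(np.sin(k * x), k, h, zero_src)
bad = physics_loss(np.cos(3.0 * x), k, h, zero_src)
```

In the learned-optimizer setting, `u` would be the network's iterate and this residual (plus boundary terms) the training signal.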
Researcher: Jun Terasaki
Reliability of nuclear matrix element of neutrinoless double-β decay
Barbora VIZ 40, Karolina CPU 38000, Karolina VIZ 40
The subject of this project is a study of the nuclear matrix elements of neutrinoless double-β decay, a decay of a nucleus emitting electrons; it is called neutrinoless because the neutrino is involved but not emitted from the nucleus. The possibility of this decay was pointed out more than half a century ago. However, the decay has not yet been observed because it is extremely rare, if it occurs at all. Currently, a few tens of experimental projects continue their operations day and night around the world to observe this rare decay. If the decay is found, two important conclusions follow: one is the identity of the neutrino and the antineutrino, and the other is the breaking of lepton-number conservation. These remarkable properties are keys to developing fundamental physics. The nuclear matrix element is a physical quantity crucial for determining the probability of neutrinoless double-β decay. A problem has remained unsolved for more than 30 years: the calculated nuclear matrix elements differ considerably depending on the method used to calculate the nuclear wave functions. The reason is that the mechanism of this decay is special, so more accurate nuclear wave functions are necessary than for other nuclear studies. The aim of this project is to solve this discrepancy problem by improving the theoretical method.
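The role of the nuclear matrix element in the decay probability can be made explicit; for the standard light-neutrino-exchange mechanism, the half-life obeys the textbook relation (not quoted in the abstract itself)

```latex
\left( T_{1/2}^{0\nu} \right)^{-1} = G^{0\nu} \left| M^{0\nu} \right|^2 \left( \frac{\langle m_{\beta\beta} \rangle}{m_e} \right)^2 ,
```

where $G^{0\nu}$ is a calculable phase-space factor, $M^{0\nu}$ the nuclear matrix element, and $\langle m_{\beta\beta} \rangle$ the effective Majorana neutrino mass. Method-dependent spreads in $M^{0\nu}$ therefore translate directly into uncertainties on any neutrino mass extracted from an observed half-life.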
Researcher: Libor Veis
New efficient methods for strongly correlated molecules based on the density matrix renormalization group algorithm
Barbora VIZ 40, Karolina CPU 34000, Karolina VIZ 40
Despite huge progress in the development of novel efficient computational methods, the electronic structure problem of strongly correlated systems, such as catalysts or high-temperature superconductors, still represents a very hard task. In this project, we will develop and benchmark a new computational method for the treatment of molecules with strongly correlated electrons, which will combine the massively parallel density matrix renormalization group (DMRG) method with the adiabatic connection (AC) approach. The proposed method has the potential to efficiently solve the electronic structure problem of the most challenging strongly correlated systems.
Researcher: Igor Roncevic
Nonempirically tuned functionals for low-dimensional systems
Barbora CPU 55000, Barbora VIZ 40, Karolina VIZ 40
Low-dimensional systems such as organic polymer tapes (1D) or graphene sheets (2D) are very interesting due to their remarkable properties, such as high electrical conductivity, unusual magnetic behaviour and the possibility of catalysing chemical reactions. Computational modelling of these systems has recently become very popular (ranking as the #1 field of research at the IT4I National Supercomputing Center) due to the possibility of explaining and predicting their properties. However, because of their high computational cost, the most accurate methods (GW, coupled cluster) are quite limited in applicability (e.g. they cannot be used to model reactions or to screen large numbers of compounds). In this project, we will explore whether density functional theory, a much more efficient method that typically includes empirical parameters, can be used in place of the aforementioned expensive methods to describe 1D and 2D systems. This will be done by applying the concept of nonempirical tuning, developed for describing molecules, to 1D and 2D periodic systems.
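The nonempirical ("optimal") tuning mentioned above is usually formulated as choosing the range-separation parameter $\omega$ of the functional so that the ionization-potential theorem is satisfied (a standard condition in the tuning literature, stated here for illustration):

```latex
-\varepsilon_{\mathrm{HOMO}}(\omega) = \mathrm{IP}(\omega) = E(N-1;\omega) - E(N;\omega),
```

i.e. the HOMO eigenvalue is made to match the ΔSCF ionization potential computed with the same $\omega$. Extending this molecular condition to 1D and 2D periodic systems is the nontrivial step the project addresses.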