As a rule, an experiment carried out at school or in undergraduate study
courses is rather simple and not very informative. However, when the experiments
are to be performed using modern methods, they are often abstract and
difficult to understand. Here, we describe a quick and simple experiment,
namely the enzymatic characterization of ptyalin (human salivary amylase)
using a starch degradation assay. With the experimental setup presented here,
enzyme parameters, such as pH optimum, temperature optimum, chloride
dependence, and sensitivity to certain chemicals can be easily determined. This
experiment can serve as a good model for enzyme characterization in general,
as modern methods usually follow the same principle: determination of the
activity of the enzyme under different conditions. Because different alleles occur in
humans, a random selection of test subjects will show markedly different ptyalin
activities. Therefore, when students measure their own ptyalin activity, significant
differences will emerge, giving them an idea of the genetic diversity in human
populations. Our evaluation showed that the pupils gained a solid understanding of
the topic through this experiment.
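The evaluation step described above, determining an enzyme parameter such as the pH optimum from a series of activity measurements, can be sketched in a few lines. This is an independent illustration with invented example values, not data or code from the experiment:

```python
# Illustrative sketch: estimating the pH optimum of an enzyme from activity
# measurements at several pH values. The activity figures below are invented
# example data, not measured values from the described assay.

def ph_optimum(measurements):
    """Return the pH with the highest measured enzyme activity.

    measurements: list of (pH, relative activity) pairs, e.g. read out
    from a starch degradation assay at fixed time intervals.
    """
    best_ph, _ = max(measurements, key=lambda pair: pair[1])
    return best_ph

# Hypothetical activity profile of a salivary amylase sample:
profile = [(5.0, 0.31), (6.0, 0.74), (6.9, 1.00), (8.0, 0.62), (9.0, 0.18)]
print(ph_optimum(profile))  # the pH with maximal relative activity
```

The same pattern applies to the temperature optimum or chloride dependence: vary one condition, hold the others fixed, and locate the maximum of the activity curve.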
Stereo Camera Setup for 360° Digital Image Correlation to Reveal Smart Structures of Hakea Fruits
(2024)
About forty years after its first application, digital image correlation (DIC) has become an established method for measuring surface displacements and deformations of objects under stress. To date, DIC has been used in a variety of in vitro and in vivo studies to biomechanically characterise biological samples in order to reveal biomimetic principles. However, when surfaces of samples strongly deform or twist, they cannot be thoroughly traced. To overcome this challenge, different DIC setups have been developed to provide additional sensor perspectives and, thus, capture larger parts of an object’s surface. Herein, we discuss current solutions for this multi-perspective DIC, and we present our own approach to a 360° DIC system based on a single stereo-camera setup. Using this setup, we are able to characterise the desiccation-driven opening mechanism of two woody Hakea fruits over their entire surfaces. Both the breaking mechanism and the actuation of the two valves in predominantly dead plant material are models for smart materials. Based on these results, an evaluation of the setup for 360° DIC regarding its use in deducing biomimetic principles is given. Furthermore, we propose a way to improve and apply the method for future measurements.
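The core operation underlying DIC is matching a reference subset of a speckle image against candidate subsets of the deformed image. A minimal, hedged sketch of that matching criterion, zero-normalised cross-correlation (ZNCC), is shown below; real DIC software adds subpixel interpolation and subset shape functions, and the 3x3 patches here are toy data:

```python
# Zero-normalised cross-correlation (ZNCC) between a reference image subset
# and a candidate subset, the similarity measure at the heart of DIC.
# Patches are flat lists of pixel intensities; 1.0 means a perfect match.
import math

def zncc(f, g):
    """ZNCC of two equally sized patches; invariant to brightness offset."""
    n = len(f)
    mf, mg = sum(f) / n, sum(g) / n
    num = sum((a - mf) * (b - mg) for a, b in zip(f, g))
    den = math.sqrt(sum((a - mf) ** 2 for a in f)
                    * sum((b - mg) ** 2 for b in g))
    return num / den

ref = [10, 50, 10, 50, 90, 50, 10, 50, 10]      # toy reference speckle subset
shifted = [12, 52, 12, 52, 92, 52, 12, 52, 12]  # same pattern, uniformly brighter
print(zncc(ref, shifted))  # brightness offsets do not reduce the score
```

Multi-perspective setups such as the 360° system described above apply this matching per camera view and then fuse the per-view displacement fields into one surface description.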
An Augmented Multiphase Rail Launcher With a Modular Design: Extended Setup and Muzzle Fed Operation
(2024)
Unsupervised physics-informed deep learning can be used to solve computational physics problems by training neural networks to satisfy the underlying equations and boundary conditions without labeled data. Parameters such as network architecture and training method determine the training success. However, the best choice is unknown a priori as it is case specific. Here, we investigated network shapes, sizes, and types for unsupervised physics-informed deep learning of the two-dimensional Reynolds averaged flow around cylinders. We trained mixed-variable networks and compared them to traditional models. Several network architectures with different shape factors and sizes were evaluated. The models were trained to solve the Reynolds averaged Navier-Stokes equations incorporating Prandtl’s mixing length turbulence model. No training data were deployed to train the models. The superiority of the mixed-variable approach was confirmed for the investigated high Reynolds number flow. The mixed-variable models were sensitive to the network shape. For the two cylinders, differently deep networks showed superior performance. The best fitting models were able to capture important flow phenomena such as stagnation regions, boundary layers, flow separation, and recirculation. We also encountered difficulties when predicting high Reynolds number flows without training data.
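The Prandtl mixing-length closure mentioned above relates the eddy viscosity to the local velocity gradient via nu_t = (kappa*y)^2 * |du/dy|. As a hedged illustration of that model (using a standard textbook log-law profile and assumed flow values, not quantities from the study):

```python
# Sketch of Prandtl's mixing-length turbulence model: near a wall the eddy
# viscosity is nu_t = (kappa * y)^2 * |du/dy|. The log-law velocity profile
# and the numbers below are assumed textbook values, not data from the study.

KAPPA = 0.41   # von Karman constant
U_TAU = 0.05   # friction velocity in m/s (assumed example value)

def du_dy(y):
    """Velocity gradient of the log-law profile u(y) = (U_TAU/KAPPA)*ln(y) + C."""
    return U_TAU / (KAPPA * y)

def eddy_viscosity(y):
    """Mixing-length eddy viscosity at wall distance y (metres)."""
    mixing_length = KAPPA * y
    return mixing_length ** 2 * abs(du_dy(y))

# With the log-law gradient, nu_t grows linearly with wall distance:
for y in (0.001, 0.01, 0.1):
    print(y, eddy_viscosity(y))
```

In a physics-informed network, a closure like this enters the loss function as part of the Reynolds averaged Navier-Stokes residual that the network is trained to drive to zero, with no labeled flow data required.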
Advancements in hand-drawn chemical structure recognition through an enhanced DECIMER architecture
(2024)
Accurate recognition of hand-drawn chemical structures is crucial for digitising hand-written chemical information in traditional laboratory notebooks or facilitating stylus-based structure entry on tablets or smartphones. However, the inherent variability in hand-drawn structures poses challenges for existing Optical Chemical Structure Recognition (OCSR) software. To address this, we present an enhanced Deep lEarning for Chemical ImagE Recognition (DECIMER) architecture that leverages a combination of Convolutional Neural Networks (CNNs) and Transformers to improve the recognition of hand-drawn chemical structures. The model incorporates an EfficientNetV2 CNN encoder that extracts features from hand-drawn images, followed by a Transformer decoder that converts the extracted features into Simplified Molecular Input Line Entry System (SMILES) strings. Our models were trained using synthetic hand-drawn images generated by RanDepict, a tool for depicting chemical structures with different style elements. A benchmark was performed using a real-world dataset of hand-drawn chemical structures to evaluate the model's performance. The results indicate that our improved DECIMER architecture exhibits a significantly enhanced recognition accuracy compared to other approaches.
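The Transformer decoder described above predicts a SMILES string token by token. As an independent illustration of what those decoder-level tokens look like (this is not DECIMER code, and the token grammar is deliberately simplified), a regex-based SMILES tokenizer can be written as:

```python
# Simplified SMILES tokenizer: bracket atoms, two-letter organic-subset
# atoms, single-letter atoms, bonds, branches, and ring-closure digits.
# A sequence decoder such as the one described above predicts exactly this
# kind of token stream, one symbol at a time.
import re

SMILES_TOKEN = re.compile(
    r"(\[[^\]]+\]|Br|Cl|[BCNOPSFIbcnops]|=|#|/|\\|\(|\)|%\d{2}|\d)"
)

def tokenize(smiles):
    """Split a SMILES string into decoder-level tokens."""
    tokens = SMILES_TOKEN.findall(smiles)
    if "".join(tokens) != smiles:  # sanity check: nothing was dropped
        raise ValueError(f"unrecognised characters in {smiles!r}")
    return tokens

print(tokenize("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin
```

Token-level output also makes evaluation straightforward: a predicted structure is correct only if its token sequence reassembles into a valid SMILES string for the drawn molecule.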
Developing and implementing computational algorithms for the extraction of specific substructures from molecular graphs (in silico molecule fragmentation) is an iterative process. It involves repeated sequences of implementing a rule set, applying it to relevant structural data, checking the results, and adjusting the rules. This requires a computational workflow with data import, fragmentation algorithm integration, and result visualisation. The described workflow is normally unavailable for a new algorithm and must be set up individually. This work presents an open Java rich client Graphical User Interface (GUI) application to support the development of new in silico molecule fragmentation algorithms and make them readily available upon release. The MORTAR (MOlecule fRagmenTAtion fRamework) application visualises fragmentation results of a set of molecules in various ways and provides basic analysis features. Fragmentation algorithms can be integrated and developed within MORTAR by using a specific wrapper class. In addition, fragmentation pipelines with any combination of the available fragmentation methods can be executed. Upon release, three fragmentation algorithms are already integrated: ErtlFunctionalGroupsFinder, Sugar Removal Utility, and Scaffold Generator. These algorithms, as well as all cheminformatics functionalities in MORTAR, are implemented based on the Chemistry Development Kit (CDK).
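The pipeline idea described above, chaining fragmentation methods so that each step runs on the fragments produced by the previous one, can be sketched as follows. The two toy "fragmenters" only illustrate the wiring; MORTAR's real algorithms (ErtlFunctionalGroupsFinder, Sugar Removal Utility, Scaffold Generator) operate on CDK molecule graphs, not strings:

```python
# Hedged sketch of a fragmentation pipeline: each fragmenter maps one
# fragment to a list of fragments, and stages are chained over the whole set.

def split_on_dots(molecule):
    """Toy fragmenter: treat '.'-separated SMILES parts as fragments."""
    return molecule.split(".")

def strip_charges(fragment):
    """Toy fragmenter: drop '+'/'-' annotations, yielding one fragment."""
    return [fragment.replace("+", "").replace("-", "")]

def run_pipeline(molecules, fragmenters):
    """Apply each fragmenter to every fragment from the previous stage."""
    fragments = list(molecules)
    for fragmenter in fragmenters:
        fragments = [f for frag in fragments for f in fragmenter(frag)]
    return fragments

print(run_pipeline(["CCO.[Na+]", "c1ccccc1"], [split_on_dots, strip_charges]))
```

Wrapping each algorithm behind a single fragment-in, fragments-out interface is what lets arbitrary combinations of methods be composed into one pipeline, mirroring the wrapper-class design described for MORTAR.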
The influence of molecular fragmentation and parameter settings on a mesoscopic dissipative particle dynamics (DPD) simulation of lamellar bilayer formation for a C10E4/water mixture is studied. A “bottom-up” decomposition of C10E4 into the smallest fragment molecules (particles) that satisfy chemical intuition leads to convincing simulation results which agree with experimental findings for bilayer formation and thickness. For integration of the equations of motion, Shardlow’s S1 scheme proves to be a favorable choice with the best overall performance. Increasing the integration time step above the common setting of 0.04 DPD units leads to increasingly unphysical temperature drifts, but also to increasingly rapid formation of bilayer superstructures without significantly distorted particle distributions up to an integration time step of 0.12. A scaling of the mutual particle–particle repulsions that guide the dynamics has negligible influence within a considerable range of values but exhibits apparent lower thresholds beyond which a simulation fails. Repulsion parameter scaling and molecular particle decomposition show a mutual dependence. For mapping concentrations to molecule numbers in the simulation box, particle volume scaling should be taken into account. A repulsion parameter morphing investigation suggests that repulsion parameter accuracy should not be overstretched.
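The mapping of concentrations to molecule numbers mentioned above can be illustrated with a back-of-the-envelope calculation: given a target mole fraction and a box filled with a fixed total number of DPD particles, how many surfactant and water molecules fit? All numbers below are assumed example values (including the fragment count per C10E4 molecule), not those of the study:

```python
# Hedged sketch: mapping a surfactant mole fraction to molecule numbers in a
# DPD box with a fixed particle budget. Every constant here is an assumed
# example value, not a parameter taken from the study.

TOTAL_PARTICLES = 24000   # DPD particles in the simulation box (assumed)
PARTICLES_PER_C10E4 = 6   # fragment particles per surfactant molecule (assumed)
PARTICLES_PER_WATER = 1   # one particle per water bead (assumed)

def molecule_counts(surfactant_mole_fraction):
    """Return (n_surfactant, n_water) filling the particle budget."""
    x = surfactant_mole_fraction
    # average number of particles contributed per molecule at this composition
    avg = x * PARTICLES_PER_C10E4 + (1 - x) * PARTICLES_PER_WATER
    n_molecules = TOTAL_PARTICLES / avg
    return round(x * n_molecules), round((1 - x) * n_molecules)

print(molecule_counts(0.05))  # e.g. a 5 mol% C10E4/water mixture
```

Accounting additionally for unequal particle volumes (the volume scaling the abstract refers to) would weight each bead by its volume rather than counting beads equally, shifting these numbers accordingly.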
Recent years have seen a sharp increase in the development of deep learning and artificial intelligence-based molecular informatics. There has been a growing interest in applying deep learning to several subfields, including the digital transformation of synthetic chemistry, extraction of chemical information from the scientific literature, and AI in natural product-based drug discovery. The application of AI to molecular informatics is still constrained by the fact that most of the data used for training and testing deep learning models are not available as FAIR and open data. As open science practices continue to grow in popularity, initiatives which support FAIR and open data as well as open-source software have emerged. It is becoming increasingly important for researchers in the field of molecular informatics to embrace open science and to submit data and software in open repositories. With the advent of open-source deep learning frameworks and cloud computing platforms, academic researchers are now able to deploy and test their own deep learning models with ease. With the development of new and faster hardware for deep learning and the increasing number of initiatives towards digital research data management infrastructures, as well as a culture promoting open data, open source, and open science, AI-driven molecular informatics will continue to grow. This review examines the current state of open data and open algorithms in molecular informatics, as well as ways in which they could be improved in future.
The German supply chain law (Lieferkettensorgfaltspflichtengesetz, abbreviated: LkSG), which enters into force on 1 January 2023, is part of the developing legal framework for human rights in global supply chains. Like the French vigilance law, it represents a new generation of supply chain laws which impose mandatory human rights due diligence obligations. The LkSG requires enterprises to exercise a number of due diligence obligations – from conducting risk analysis to undertaking preventive measures or remedial actions. The law is based on public enforcement via a competent authority, the Federal Office for Economic Affairs and Export Control (BAFA). The BAFA monitors and enforces compliance with the due diligence obligations. Non-compliant enterprises can be fined up to 800,000 euros and, in some cases, up to 2% of their annual turnover. Whilst the LkSG is an important step towards achieving greater corporate sustainability, it also has limitations. It was a political compromise and, as such, does not include a new civil liability for non-compliance. Moreover, by default, it applies only to the enterprise’s own business area and its direct suppliers; indirect suppliers are included only where the enterprise has substantiated knowledge that an obligation has been violated.