In the modern Web, service providers often rely heavily on third parties to run their services. For example, they make use of ad networks to finance their services, externally hosted libraries to develop features quickly, and analytics providers to gain insights into visitor behavior.
For security and privacy, website owners need to be aware of the content they provide their users. However, in reality, they often do not know which third parties are embedded, for example, when these third parties request additional content, as is common in real-time ad auctions.
In this paper, we present a large-scale measurement study to analyze the magnitude of these new challenges. To better reflect the connectedness of third parties, we measured their relations in a model we call third party trees, which reflects an approximation of the loading dependencies of all third parties embedded into a given website. Using this concept, we show that including a single third party can lead to subsequent requests from up to eight additional services. Furthermore, our findings indicate that the third parties embedded on a page load are not always deterministic, as 50 % of the branches in the third party trees change between repeated visits. In addition, we found that 93 % of the analyzed websites embedded third parties that are located in regions that might not be in line with the current legal framework. Our study also replicates previous work that mostly focused on landing pages of websites. We show that this method is only able to measure a lower bound, as subsites show a significant increase in privacy-invasive techniques. For example, our results show an increase of about 36 % in the number of cookies used when crawling websites more deeply.
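The third party tree described above can be illustrated with a small sketch: from a crawl log of (initiator, requested domain) pairs, build the loading-dependency tree and measure the longest chain of dependent loads. The log entries and domain names below are invented for demonstration and are not data from the study.

```python
# Sketch: building a "third party tree" from a crawl log of
# (initiator_domain, requested_domain) pairs. All sample data
# is illustrative, not from the study itself.

def build_tree(requests):
    """Map each initiator to the third parties it caused to load."""
    children = {}
    for initiator, requested in requests:
        children.setdefault(initiator, []).append(requested)
    return children

def max_depth(children, node, seen=None):
    """Length of the longest loading chain below `node`."""
    seen = set() if seen is None else seen
    if node in seen:          # guard against cyclic request logs
        return 0
    seen.add(node)
    kids = children.get(node, [])
    if not kids:
        return 0
    return 1 + max(max_depth(children, k, seen) for k in kids)

log = [
    ("example.org", "ads.example-network.com"),
    ("ads.example-network.com", "bidder-a.example"),
    ("bidder-a.example", "cdn.tracker.example"),
]
tree = build_tree(log)
print(max_depth(tree, "example.org"))  # longest chain of dependent loads
```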
Advanced Persistent Threats (APTs) are one of the main challenges in modern computer security. They are planned and performed by well-funded, highly-trained and often state-based actors. The first step of such an attack is the reconnaissance of the target. In this phase, the adversary tries to gather as much intelligence on the victim as possible to prepare further actions. An essential part of this initial data collection phase is the identification of possible gateways to intrude the target.
In this paper, we aim to analyze the data that threat actors can use to plan their attacks. To do so, we first analyze 93 APT reports and find that most (80 %) of the attacks began with phishing emails sent to the victims. Based on this analysis, we measure the extent of openly available data on 30 entities to understand if and how much data they leak that an adversary could potentially use to craft sophisticated spear phishing emails. We then use this data to quantify how many employees are potential targets for such attacks. We show that 83 % of the analyzed entities leak several attributes of users, which can all be used to craft sophisticated phishing emails.
The European General Data Protection Regulation (GDPR), which went into effect in May 2018, brought new rules for the processing of personal data that affect many business models, including online advertising. The regulation’s definition of personal data applies to every company that collects data from European Internet users. This includes tracking services that, until then, argued that they were collecting anonymous information and data protection requirements would not apply to their businesses.
Previous studies have analyzed the impact of the GDPR on the prevalence of online tracking, with mixed results. In this paper, we go beyond the analysis of the number of third parties and focus on the underlying information sharing networks between online advertising companies in terms of client-side cookie syncing. Using graph analysis, our measurement shows that the number of ID syncing connections decreased by roughly 40 % around the time the GDPR went into effect, but a long-term analysis shows a slight rebound since then. While we can show a decrease in information sharing between third parties, which is likely related to the legislation, the data also shows that the amount of tracking, as well as the general structure of cooperation, was not affected. Consolidation in the ecosystem led to a more centralized infrastructure that might actually have negative effects on user privacy, as fewer companies perform tracking on more sites.
Because titanium and titanium alloys have poor fretting fatigue resistance and poor tribological properties, surface engineering methods are needed to improve the service characteristics of these materials. One may either apply surface treatment technologies or deposit overlay coatings by thermal spraying.
The present study focuses on the properties of ceramic coatings (Al2O3 + 13 wt.% TiO2) deposited onto a titanium substrate using high-velocity oxygen fuel (HVOF) spraying and atmospheric plasma spraying (APS), respectively.
The effect of the deposition method on the microstructure, phase constituents, and mechanical properties of the ceramic coatings was investigated by means of scanning electron microscopy (SEM), X-ray diffraction technique (XRD) and nanoindentation tests. The sliding wear performances of the Al2O3–TiO2 coatings were tested using a pin on disk wear tester.
Tape brazing constitutes a cost-effective alternative surface protection technology for complex-shaped surfaces. The study explores the characteristics of high-temperature brazed coatings using a cobalt-based powder deposited on a stainless-steel substrate in order to protect parts subjected to hot temperatures in a wear-exposed environment. Microstructural imaging corroborated with X-ray diffraction analysis showed a complex phased structure consisting of intermetallic Cr-Ni, C-Co-W Laves type, and chromium carbide phases. The surface properties of the coatings, targeting hot corrosion behavior, erosion, wear resistance, and microhardness, were evaluated. The high-temperature corrosion test was performed for 100 h at 750 °C in a salt mixture consisting of 25 wt.% NaCl + 75 wt.% Na2SO4. The degree of corrosion attack was closely connected with the exposure temperature, and the degradation of the material corresponded to the mechanisms of low-temperature hot corrosion. The erosion tests were carried out using alumina particles at a 90° impingement angle. The results, correlated with the microhardness measurements, have shown that Co-based coatings exhibited approximately 40% lower material loss compared to that of the steel substrate.
In this paper, we investigate the influence of different disease groups on the size of different anatomical structures. To this end, we first modify and improve an existing anatomical segmentation model. Then, we use this model to segment 104 anatomical structures from computed tomography (CT) scans and compute their volumes from the segmentation. After correlating the results with each other, we find no new significant correlations. After correlating the volume data with known diseases for each case, we find two weak correlations, one of which has not been described before and for which we present a possible explanation.
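The volume computation described above reduces to counting the voxels that carry a structure's label and multiplying by the physical voxel volume. The following sketch illustrates this; the tiny mask and the spacing values are made up for demonstration and are not taken from the study.

```python
# Illustrative sketch of computing a structure's volume from a
# labeled segmentation mask: count labeled voxels, multiply by
# the physical voxel volume. Mask and spacing are invented.

def structure_volume_ml(mask, label, spacing_mm):
    """Volume of `label` in millilitres (1 ml = 1000 mm^3)."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    count = sum(
        1
        for plane in mask
        for row in plane
        for voxel in row
        if voxel == label
    )
    return count * voxel_mm3 / 1000.0

# 2x2x2 mask, label 1 marks the structure, 1.5 mm isotropic spacing
mask = [[[1, 0], [1, 1]], [[0, 0], [1, 0]]]
print(structure_volume_ml(mask, 1, (1.5, 1.5, 1.5)))
```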
To address the question which neocortical layers and cell types are important for the perception of a sensory stimulus, we performed multielectrode recordings in the barrel cortex of head-fixed mice performing a single-whisker go/no-go detection task with vibrotactile stimuli of differing intensities. We found that behavioral detection probability decreased gradually over the course of each session, which was well explained by a signal detection theory-based model that posits stable psychometric sensitivity and a variable decision criterion updated after each reinforcement, reflecting decreasing motivation. Analysis of multiunit activity demonstrated highest neurometric sensitivity in layer 4, which was achieved within only 30 ms after stimulus onset. At the level of single neurons, we observed substantial heterogeneity of neurometric sensitivity within and across layers, ranging from nonresponsiveness to approaching or even exceeding psychometric sensitivity. In all cortical layers, putative inhibitory interneurons on average proffered higher neurometric sensitivity than putative excitatory neurons. In infragranular layers, neurons increasing firing rate in response to stimulation featured higher sensitivities than neurons decreasing firing rate. Offline machine-learning-based analysis of videos of behavioral sessions showed that mice performed better when not moving, which at the neuronal level, was reflected by increased stimulus-evoked firing rates.
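The signal-detection account described above, i.e., stable psychometric sensitivity with a decision criterion that drifts after each reinforcement, can be sketched in a few lines. The drift rule, all parameter values, and the session length below are illustrative assumptions, not the fitted model from the study.

```python
# Minimal sketch of a signal-detection-theory model with stable
# sensitivity d' and a criterion that drifts upward after each
# reward, lowering detection probability over the session.
# All parameters are illustrative assumptions.
import math
import random

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def simulate_session(d_prime=2.0, c0=0.5, drift=0.02, trials=200, seed=1):
    random.seed(seed)
    c = c0
    hit_rates = []
    for _ in range(trials):
        p_hit = phi(d_prime - c)   # P(respond | stimulus) under SDT
        hit = random.random() < p_hit
        if hit:                    # reinforcement shifts the criterion
            c += drift             # upward drift: falling motivation
        hit_rates.append(p_hit)
    return hit_rates

rates = simulate_session()
print(round(rates[0], 3), round(rates[-1], 3))  # detection declines
```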
As a rule, an experiment carried out at school or in undergraduate study courses is rather simple and not very informative. However, when the experiments are to be performed using modern methods, they are often abstract and difficult to understand. Here, we describe a quick and simple experiment, namely the enzymatic characterization of ptyalin (human salivary amylase) using a starch degradation assay. With the experimental setup presented here, enzyme parameters such as pH optimum, temperature optimum, chloride dependence, and sensitivity to certain chemicals can be easily determined. This experiment can serve as a good model for enzyme characterization in general, as modern methods usually follow the same principle: determination of the activity of the enzyme under different conditions. As different alleles occur in humans, a random selection of test subjects will be quite different with regard to ptyalin activities. Therefore, when the students measure their own ptyalin activity, significant differences will emerge, and this will give them an idea of the genetic diversity in human populations. The evaluation has shown that the pupils have gained a solid understanding of the topic through this experiment.
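The evaluation principle named above, measuring enzyme activity under different conditions and locating the optimum, can be sketched as follows. The pH values and relative activities are invented classroom-style numbers, not measurements from the text.

```python
# Sketch of the evaluation step: from (condition, measured activity)
# pairs, report the condition with maximal activity. Data invented.

def optimum(measurements):
    """Return the condition with the highest enzyme activity."""
    return max(measurements, key=lambda m: m[1])[0]

# relative starch-degradation activity of ptyalin at various pH values
ph_series = [(5.0, 0.35), (6.0, 0.80), (6.9, 1.00), (8.0, 0.55)]
print(optimum(ph_series))  # pH with maximal activity
```

The same helper works unchanged for the temperature series or the chloride-dependence series, which is exactly the "same principle" the text points out.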
Biomimetics is the interdisciplinary co-operation of various scientific disciplines and fields of innovation, and it aims to solve practical problems using biological models. Biomimetic research and its fields of application are manifold, and the community is made up of a wide range of disciplines, from biologists and engineers to designers. Guidelines and standards can build a common ground for understanding of the field, communication across disciplines, present and future projects, and implementation of biomimetic knowledge. Since 2015, three international standards have been published, defining terms and definitions as well as specific applications. The scientific literature and patents in several databases were searched for citations of the published standards. Standards or technical guidelines on biomimetics are represented both in the scientific literature and in patents. However, taking into account the increasing number of publications in biomimetics, the number of publications (52) citing the international standards is low. This shows that the perception of technical rules is still underrepresented in the academic field. Greater awareness and acceptance of the importance of standards for quality assurance, even in the academic environment, is discussed, and active participation in the corresponding International Organization for Standardization committee on biomimetics is called for.
Biomimetics is a well-known approach for technical innovation. However, most of its influence remains in the academic field. One option for increasing its application in the practice of technical design is to enhance the use of the biomimetic process with a step-by-step standard, building a bridge to common engineering procedures. This article presents the endeavor of an interdisciplinary expert panel from the fields of biology, engineering science, and industry to develop a standard that links biomimetics to the classical processes of product development and engineering design. This new standard, VDI 6220 Part 2, proposes a process description that is compatible and connectable to classical approaches in engineering design. The standard encompasses both the solution-based and the problem-driven process of biomimetics. It is intended to be used in any product development process for more biomimetic applications in the future.
The technology of polymer electrolyte membrane (PEM) electrolysis provides an efficient way to produce hydrogen. In combination with renewable energy sources, it promises to be one of the key factors towards a carbon-free energy infrastructure in the future. Today, PEM electrolyzers with a power consumption higher than 1 MW and a gas output pressure of 30 bar (or even higher) are already commercially available. Nevertheless, fundamental research and development for improved efficiency is far from complete and mostly takes place on a laboratory scale. Upscaling the laboratory prototypes to an industrial size usually cannot be achieved without facing further problems and/or losing efficiency. With our novel system design based on hydraulic cell compression, many commonly occurring problems, such as inhomogeneous temperature and current distributions, can be avoided. In this study, we present the first results of an upscaling by a factor of 30 in active cell area.
A Crypto-Token Based Charging Incentivization Scheme for Sustainable Light Electric Vehicle Sharing (2021)
The ecological impact of shared light electric vehicles (LEVs) such as kick scooters is still widely discussed. Especially the fact that the vehicles and batteries are collected using diesel vans in order to charge empty batteries with electricity of unclear origin is perceived as unsustainable. A better option could be to let the users charge the vehicles themselves whenever necessary. For this, a decentralized, flexible, and easy-to-install network of off-grid solar charging stations could bring renewable electricity where it is needed without sacrificing the convenience of a free-floating sharing system. Since the charging stations are powered by solar energy, the most efficient way to utilize them is to charge the vehicles when the sun is shining. To make users charge the vehicles, it is necessary to provide some form of benefit for doing so, such as a discount or free rides. A particularly robust and well-established mechanism is controlling incentives via blockchain-based crypto-tokens. This paper demonstrates a crypto-token-based scheme for incentivizing users to charge sharing vehicles during times of considerable solar irradiation in order to contribute to more sustainable mobility services.
Proof of Existence as a blockchain service was first published in 2013 as a public notary service on the Bitcoin network and can be used to verify the existence of a particular file at a specific point in time without sharing the file or its content itself. This service is also available on the Ethereum-based bloxberg network, a decentralized research infrastructure that is governed, operated, and developed by an international consortium of research facilities. Since it is desirable to integrate the creation of this proof tightly into the research workflow, namely the acquisition and processing of research data, we show a simple-to-integrate, MATLAB-extension-based solution, with the concept being applicable to other programming languages and environments as well.
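The client-side half of such a proof-of-existence workflow can be sketched generically: hash the research data and submit only the digest to the chain, so the file itself never leaves the researcher's machine. The on-chain submission to bloxberg is omitted here; only the standard hashing step is shown, and the sample data is invented.

```python
# Sketch of the hashing step in a proof-of-existence workflow:
# only the SHA-256 digest, never the data, would go on-chain.
# The bloxberg submission itself is omitted.
import hashlib

def file_digest(path):
    """SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def data_digest(data: bytes) -> str:
    """SHA-256 digest of in-memory data (e.g., a measurement array)."""
    return hashlib.sha256(data).hexdigest()

print(data_digest(b"measurement run 42"))  # digest to be notarized
```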
With ongoing developments in the field of smart cities and digitalization in general, data is becoming a driving factor and value stream for new and existing economies alike. However, there exists an increasing centralization and monopolization of data holders and service providers, especially in the form of the big US-based technology companies in the western world and central technology providers with close ties to the government in the Asian regions. Self Sovereign Identity (SSI) provides the technical building blocks to create decentralized data-driven systems, which bring data autonomy back to the users. In this paper, we propose a system in which the combination of SSI and token-economy-based incentivisation strategies makes it possible to unlock the potential value of data pools without compromising the data autonomy of the users.
The set of transactions that occurs on the public ledger of an Ethereum network in a specific time frame can be represented as a directed graph, with vertices representing addresses and an edge indicating the interaction between two addresses.
While there exists preliminary research on analyzing Ethereum networks by means of graph analysis, most existing work focuses either on the public Ethereum Mainnet or on analyzing the different semantic transaction layers using static graph analysis in order to carve out the network properties (such as interconnectivity, degrees of centrality, etc.) needed to characterize a blockchain network. By analyzing the consortium-run bloxberg Proof-of-Authority (PoA) Ethereum network, we show that we can identify suspicious and potentially malicious behaviour of network participants by employing statistical graph analysis. We thereby show that it is possible to identify the potentially malicious exploitation of an unmetered and weakly secured blockchain network resource. In addition, we show that Temporal Network Analysis is a promising technique to identify the occurrence of anomalies in a PoA Ethereum network.
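The kind of statistical graph analysis described above can be illustrated with a minimal sketch: build the directed transaction graph from (sender, receiver) edges and flag addresses whose out-degree deviates strongly from the mean. The addresses, the spam pattern, and the z-score threshold are illustrative assumptions, not the study's actual detection criteria.

```python
# Minimal sketch of degree-based anomaly detection on a directed
# transaction graph. Addresses and threshold are illustrative.
import math

def out_degrees(edges):
    """Out-degree per address; receivers with no sends get degree 0."""
    deg = {}
    for src, dst in edges:
        deg[src] = deg.get(src, 0) + 1
        deg.setdefault(dst, 0)
    return deg

def suspicious(edges, z_threshold=2.0):
    """Addresses whose out-degree z-score exceeds the threshold."""
    deg = out_degrees(edges)
    vals = list(deg.values())
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    std = math.sqrt(var) or 1.0
    return [a for a, d in deg.items() if (d - mean) / std > z_threshold]

# one address spamming transactions dominates the degree distribution
edges = [("0xSPAM", "0x%02d" % i) for i in range(20)]
edges += [("0x01", "0x02"), ("0x03", "0x04")]
print(suspicious(edges))
```

A temporal variant would compute the same statistic per time window, so that an address only anomalous in one window still stands out.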
A qualitative work‐flow analysis of a neurosurgical procedure indicates that the resolution of the image used to plan the intervention is the major source of inaccuracy. Quantitative experimental measurements confirm this observation. They fail, however, to explain the relationship between the accuracy of the frame components involved in a stereotactic procedure and the overall application accuracy. This investigation shows that the novel Gaussian approach is a flexible framework for the calculation of the application accuracy of frame systems. Therefore, the Gaussian approach provides a detailed understanding of the interplay between the various factors affecting accuracy. The basic ideas and limitations of the Gaussian approach are briefly explained. The effect of fiducial marker distribution and registration is investigated and shown to introduce a spatial dependence to the accuracy. The results of the Gaussian approach are compared with experimental data for three stereotactic frame devices: Leksell G, Cosman–Roberts–Wells, and Brown–Roberts–Wells. Although the Gaussian approach is an approximation, it reproduces the accuracy measured in the experiment within the statistical error of that experiment. Comp Aid Surg 4:77–86 (1999). © 1999 Wiley‐Liss, Inc.
Stereotactic frame systems are widely used in neurosurgery. The accuracy of frame devices is considered a gold standard to which the accuracy of new frameless stereotactic navigation systems is compared. The purpose of this study is to develop a general approach for the prediction of the application accuracy of stereotactic systems. The approach will be applied to the frame‐based biopsy performed with three frame devices: Leksell G, Cosman–Roberts–Wells (CRW), and Brown–Roberts–Wells (BRW). A work‐flow analysis will be carried out demonstrating that the accuracy relevant for a clinical application comprises several error sources, including imaging, target and entry point selection, image-to-frame coordinate registration, and the setting of mechanical parameters of the frame. These error sources will be postulated to obey a Gaussian probability density. Linear, i.e., Gaussian, error propagation will be used to link all error contributions and thus to calculate the cumulative accuracy of the frame used in the application. Although the Gaussian approach is an approximation, it allows for an analytical treatment of the accuracy. Both the accuracy at the target point and the accuracy of the probe needle guidance along the planned trajectory have been investigated. Of great significance is the relationship found between accuracy, pixel dimension, and image slice thickness, the latter being the dominant factor for slices of more than 1.5 mm thickness, yielding inaccuracies larger than 1.5 mm. For target points, the predictions for the application accuracy have been compared to the results of measurements, showing good agreement with the experimental data.
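The linear (Gaussian) error propagation named above amounts, for independent error sources, to adding their standard deviations in quadrature. The component values in this sketch are illustrative, with the slice-thickness term chosen as dominant in line with the text; they are not the study's measured figures.

```python
# Sketch of linear (Gaussian) error propagation: independent error
# sources add in quadrature to give the cumulative application
# accuracy. The component magnitudes below are illustrative.
import math

def cumulative_sigma(components_mm):
    """Root-sum-of-squares of independent Gaussian error sources."""
    return math.sqrt(sum(s * s for s in components_mm.values()))

errors = {
    "imaging (pixel size)": 0.5,
    "slice thickness": 1.5,     # dominant term for slices > 1.5 mm
    "frame registration": 0.4,
    "mechanical settings": 0.3,
}
print(round(cumulative_sigma(errors), 2))  # cumulative accuracy in mm
```

Note how the quadrature sum is dominated by the largest component, which mirrors the finding that slice thickness governs the overall accuracy for thick slices.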
We report on investigations that illustrate the interaction between the specific immune system and a young avascular tumor growing due to a diffusive nutrient supply. We formulate a hybrid cellular automata-partial differential equation (CA-PDE) model which includes cell cycle dynamics and allows for tracking the spatial and temporal evolution of this elaborate biological system. We present results of two dimensional numerical simulations that, specifically in this work, include special cases of the spherical and papillary tumor growth, the infiltration of immune system cells into the tumor and the escape of tumor cells from the regime of the immune cells.