The current work investigates the capability of a tailored multivariate curve resolution–alternating least squares (MCR-ALS) algorithm to analyse glucose, phosphate, ammonium and acetate dynamics simultaneously in an E. coli BL21 fed-batch fermentation. The high-cell-density (HCDC) process is monitored by ex situ online attenuated total reflection (ATR) Fourier transform infrared (FTIR) spectroscopy and several in situ online process sensors. This approach efficiently utilises automatically generated process data to reduce the time- and cost-consuming reference measurement effort for multivariate calibration. To determine metabolite concentrations with accuracies between ±0.19 and ±0.96 g L−1, the presented approach primarily needs, besides online sensor measurements, single FTIR measurements for each of the components of interest. The ambiguities in alternating least squares solutions for concentration estimation are reduced by inserting analytical process knowledge, primarily in the form of elementary carbon mass balances. In this way, the established idea of mass balance constraints in MCR is combined with the consistency check of measured data by carbon balances, as commonly applied in bioprocess engineering. The constraints are calculated from online process data and theoretical assumptions. This increased calculation effort replaces, to a large extent, the need for manually conducted quantitative chemical analysis, yields good estimates of the concentration profiles, and leads to a better process understanding.
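The alternating least squares core of MCR-ALS can be illustrated with a minimal sketch. This toy version uses only non-negativity constraints (the paper's carbon mass balance constraints are not reproduced here), and all data are synthetic:

```python
import numpy as np
from scipy.optimize import nnls

def mcr_als(D, C0, n_iter=50):
    """Minimal MCR-ALS: factor D (time x wavenumber) into concentration
    profiles C and pure-component spectra S under non-negativity,
    alternating between the two least squares subproblems."""
    C = C0.copy()
    for _ in range(n_iter):
        # S-step: for each wavenumber j, solve D[:, j] ~ C @ S[j] with S[j] >= 0
        S = np.array([nnls(C, D[:, j])[0] for j in range(D.shape[1])])
        # C-step: for each time i, solve D[i, :] ~ S @ C[i] with C[i] >= 0
        C = np.array([nnls(S, D[i, :])[0] for i in range(D.shape[0])])
    return C, S

# Synthetic two-component fed-batch-like data
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 40)
C_true = np.column_stack([np.exp(-3.0 * t), 1.0 - np.exp(-3.0 * t)])
S_true = rng.random((60, 2))                  # 60 "wavenumbers"
D = C_true @ S_true.T
C_est, S_est = mcr_als(D, C0=rng.random((40, 2)))
resid = np.linalg.norm(D - C_est @ S_est.T) / np.linalg.norm(D)
print(f"relative reconstruction error: {resid:.2e}")
```

Without additional constraints such as the carbon balances used in the paper, the recovered factors remain subject to rotational ambiguity even when the data matrix itself is reconstructed accurately.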
This study introduced an automated long-term fermentation process for fungi grown in pellet form. The goal was to reduce the overgrowth of bioreactor internals and sensors while achieving better rheological properties in the fermentation broth, such as improved oxygen transfer and shorter mixing times. Because this could not be accomplished with continuous culture or fed-batch fermentation, repeated-batch fermentation was implemented with the help of additional bioreactor internals (“sporulation supports”) intended to retain some biomass during fermentation. After harvesting the suspended biomass, intermediate cleaning was performed using a cleaning device. The biomass retained on the sporulation support then went through a sporulation phase, and the spores were subsequently used as inocula for the next batch. This approach was chosen because retained pellets could otherwise cause problems (e.g., overgrowth on sensors) in subsequent batches, as the fungus would then show undesirable hyphal growth. Various sporulation supports were tested for sufficient biomass fixation to start the next batch. A reproducible spore concentration within the required range was achieved by adjusting the sporulation support (design and construction material) and adapting the intermediate cleaning accordingly.
Additive manufacturing is an essential tool in innovative production processes. The extended degrees of freedom offer much potential in usage, construction, and product design. Rising raw material and energy costs, constantly increasing environmental requirements, and the growing demand for resource-saving products are driving a paradigm shift in classic production processes.
In addition to purely energetic evaluation, developing energy models is a method to determine energy consumption and reduce it in the long term. The specific energy consumption (SEC) model allows a quick estimation of energy consumption by multiplying the SEC by a reference quantity such as the mass of the workpiece, the manufacturing time, or the exposed area. A strong dependence on the machine used, the peripheral devices considered, and the part geometry is noticeable here.
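As a first-order illustration, the SEC estimate is a single multiplication; the SEC value below is a hypothetical placeholder, since real values are machine- and process-specific:

```python
# Hypothetical SEC value for illustration only; real SEC values are
# machine- and process-specific and must be determined by measurement.
SEC_KWH_PER_KG = 95.0   # assumed specific energy consumption, kWh per kg

def energy_estimate(part_mass_kg, sec=SEC_KWH_PER_KG):
    """First-order SEC estimate: E = SEC * reference quantity (here: mass)."""
    return sec * part_mass_kg

print(energy_estimate(0.250))  # 0.25 kg part -> 23.75 kWh
```

The same pattern applies when the reference quantity is manufacturing time or exposed area instead of mass; only the unit of the SEC changes.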
Previous studies, such as those by Kellens et al. and Baumers et al., have laid the basis for understanding the energy demands of PBF-LB/M processes. Various energy models have subsequently been proposed, including those by Paul and Anand, Yi et al., Lv et al., and Hui et al. These models are often limited by their specificity to sub-processes or subsystems, which restricts their applicability to other manufacturing machines or leads to inaccuracies in energy consumption predictions. The simulation accuracy (ACC) is mostly in the range of 90%, with the limitation of small sample sizes. Moreover, nearly all of these models rely heavily on process time information, making the accuracy of their simulations largely dependent on the quality of the underlying time model.
In the following study, two manufacturing machines of the PBF-LB/M process are analyzed and compared with other studies. The aim is to analyze the power and resource consumption and to use these data to build an improved energy model with high accuracy, which can serve as an additional parameter in the adapted design methodology. Furthermore, potential savings are derived from the load curves.
Background: In recent years, the volume of medical knowledge and health data has increased rapidly. For example, the increased availability of electronic health records (EHRs) provides accurate, up-to-date, and complete information about patients at the point of care and enables medical staff to access patient records quickly for more coordinated and efficient care. With this increase in knowledge, the complexity of accurate, evidence-based medicine continues to grow. Health care workers must deal with an increasing amount of data and documentation. Meanwhile, relevant patient data are frequently overshadowed by a layer of less relevant data, causing medical staff to miss important values or abnormal trends and their importance for the progression of the patient’s case.
Objective: The goal of this work is to analyze the current laboratory results for patients in the intensive care unit (ICU) and classify which of these lab values could be abnormal the next time the test is done. Detecting near-future abnormalities can support clinicians in their decision-making in the ICU by drawing their attention to the important values and helping them focus future lab testing, saving both time and money. Additionally, it gives doctors more time to spend with patients rather than skimming through a long list of lab values.
Methods: We used Structured Query Language to extract 25 lab values for mechanically ventilated patients in the ICU from the MIMIC-III and eICU data sets. Additionally, we applied time-windowed sampling and holding and a support vector machine to fill in the missing values in the sparse time series, as well as the Tukey range method to detect and delete anomalies. Then, we used the data to train 4 deep learning models for time series classification, as well as a gradient boosting–based algorithm, and compared their performance on both data sets.
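The anomaly deletion and gap filling steps can be sketched as follows; this is a simplified illustration (the SVM step is omitted), with hypothetical lab values:

```python
import numpy as np

def tukey_mask(x, k=1.5):
    """Flag values outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.nanpercentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def sample_and_hold(x):
    """Forward-fill NaN gaps with the last observed value."""
    x = x.copy()
    last = np.nan
    for i, v in enumerate(x):
        if np.isnan(v):
            x[i] = last
        else:
            last = v
    return x

# Hypothetical sparse lab series: one gross outlier (40.0) and two gaps
lab = np.array([4.1, 4.3, np.nan, 4.2, 40.0, np.nan, 4.4])
lab[tukey_mask(lab)] = np.nan   # delete anomalies (here: 40.0)
filled = sample_and_hold(lab)
print(filled)
```

In the study itself, sample-and-hold is applied within time windows and combined with a support vector machine before the series are fed to the classifiers.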
Results: The models tested in this work (deep neural networks and gradient boosting), combined with the preprocessing pipeline, achieved an accuracy of at least 80% on the multilabel classification task. Moreover, the model based on the multiple convolutional neural network outperformed the other algorithms on both data sets, with the accuracy exceeding 89%.
Conclusions: In this work, we show that using machine learning and deep neural networks to predict near-future abnormalities in lab values can achieve satisfactory results. Our system was trained, validated, and tested on 2 well-known data sets to ensure that our system bridged the reality gap as much as possible. Finally, the model can be used in combination with our preprocessing pipeline on real-life EHRs to improve patients’ diagnosis and treatment.
Hydrological variability is a key factor in structuring biotic and abiotic processes in river ecosystems and is of particular importance to fish populations. We used 171 hydrological indices (HI) and young-of-the-year (YOY) fish abundances as indicators of reproductive success to compare species' response patterns to high and low flows on short-, intermediate-, and long-term scales. Our study included 13 common fish species in headwater streams of North Rhine-Westphalia, Germany. Generalized linear models using YOY abundances and HI on high- and low-flow patterns explained, on average, 64% of the variability. HI calculated from long time series performed better than HI describing short- and intermediate-term high and low flows. Species' reproductive success response to low-flow HI depended on specific ecological traits, whereas high-flow HI differentially affected species according to their life history strategies. Equilibrium strategists responded negatively to high frequency and magnitude along with late timing of high flows, while periodic and opportunistic species mostly thrived under these conditions. We identified four species traits that mediated these differences between life history strategies. The reproductive success of species with low relative fecundity, large eggs and larvae, and long incubation periods was negatively impacted by the high frequency, high magnitude, and late timing of high flows. Conversely, the reproductive success of species with high relative fecundity, short incubation periods, and small eggs and larvae was fostered by strong, frequent, and late high flows. Considering the relationship between reproductive success, life history, and fish species traits over several years under a range of flows is a novel step towards implementing measures that mitigate negative impacts and enhance conditions for successful fish reproduction.
Science on ecosystems and people to support the Kunming-Montreal Global Biodiversity Framework
(2023)
In December 2022, members of the Convention on Biological Diversity adopted the new Kunming-Montreal Global Biodiversity Framework (GBF) to guide international biodiversity conservation efforts until 2030 in order to be able to live ‘in harmony with nature’ by 2050. This framework addresses the implementation gap left after the Aichi Biodiversity Targets, which were the previous global instrument for mainstreaming biodiversity conservation between 2010 and 2020.
The aim of this editorial is to draw attention to the GBF targets that are most relevant to our readership, with two objectives: First, to suggest how Ecosystems and People may be a venue for emerging research insights in support of the GBF. Second, to highlight examples of recent research in Ecosystems and People that can contribute to enrich, or even challenge, the evidence and development of the GBF Targets.
Species distribution models (SDMs) are key tools in biodiversity and conservation, but assessing their reliability in unsampled locations is difficult, especially where there are sampling biases. We present a spatially-explicit sensitivity analysis for SDMs – SDM profiling – which assesses the leverage that unsampled locations have on the overall model by exploring the interaction between the effect on the variable response curves and the prevalence of the affected environmental conditions. The method adds a ‘pseudo-presence’ and ‘pseudo-absence’ to unsampled locations, re-running the SDM for each, and measuring the difference between the probability surfaces of the original and new SDMs. When the standardised difference values are plotted against each other (a ‘profile plot’), each point's location can be summarized by four leverage measures, calculated as the distances to each corner. We explore several applications: visualization of model certainty; identification of optimal new sampling locations and redundant existing locations; and flagging potentially erroneous occurrence records.
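The corner-distance summary of the profile plot can be sketched as follows. This is a schematic illustration only: the standardised difference values `d_pres` and `d_abs` are hypothetical, and the exact leverage definitions follow the paper, not this toy code:

```python
import numpy as np

def corner_distances(d_presence, d_absence):
    """Distances of a profile-plot point (x = standardised effect of a
    pseudo-presence, y = standardised effect of a pseudo-absence) to the
    four corners of the unit square. A small distance to a corner means
    the location's leverage is dominated by that corner's behaviour."""
    corners = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
    return [float(np.hypot(d_presence - cx, d_absence - cy))
            for cx, cy in corners]

# Hypothetical standardised differences for two unsampled locations:
# location A barely changes the model, location B changes it strongly.
for d_pres, d_abs in [(0.05, 0.04), (0.60, 0.70)]:
    print([round(d, 2) for d in corner_distances(d_pres, d_abs)])
```

A location close to the (0, 0) corner leaves the model essentially unchanged whichever pseudo-record is added, indicating high model certainty there; locations far from that corner are candidates for new sampling.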
This study compares the environmental impacts of petrol, diesel, natural gas, and electric vehicles using a process-based attributional life cycle assessment (LCA) and the ReCiPe characterization method, which captures 18 impact categories and the single-score endpoints. Unlike common practice, we derive the cradle-to-grave inventories from an originally combustion-engine VW Caddy that was disassembled and electrified in our laboratory and whose energy consumption was measured on the road. Ecoinvent 2.2 and 3.0 emission inventories were contrasted, exhibiting basically insignificant impact deviations. The Ecoinvent 3.0 emission inventory for the diesel car was additionally updated with recent close-to-real-world emission values and revealed strong increases over four midpoint impact categories when matched against the standard Ecoinvent 3.0 emission inventory. Producing batteries with photovoltaic electricity instead of Chinese coal-based electricity decreases the climate impacts of battery production by 69%. Break-even mileages for the electric VW Caddy to outperform the combustion-engine models in terms of climate change impact under various conditions ranged from 17,000 to 310,000 km. Break-even mileages, when contrasting the VW Caddy and a mini car (SMART), which was electrified as well, did not show systematic differences. Also, CO2-eq emissions in terms of passenger kilometers travelled (54–158 g CO2-eq/PKT) are fairly similar, based on 1 person travelling in the mini car and 1.57 persons in the mid-sized car (VW Caddy). Additionally, under optimized conditions (battery production and use phase utilizing renewable electricity), the two electric cars can compete well in terms of CO2-eq emissions per passenger kilometer with other traffic modes (diesel bus, coach, trains) over their lifetime. Only electric buses were found to have lower life cycle carbon emissions (27–52 g CO2-eq/PKT) than the two electric passenger cars.
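The break-even logic can be illustrated with a back-of-the-envelope sketch; all numbers below are hypothetical placeholders, not the study's inventory data:

```python
# Hypothetical illustrative numbers (kg CO2-eq); the study's actual
# break-even mileages range from 17,000 to 310,000 km depending on
# conditions such as the electricity mix.
ev_production  = 8500.0   # EV incl. battery production
ice_production = 5500.0   # combustion-engine vehicle production
ev_per_km  = 0.095        # kg CO2-eq per km (electricity-mix dependent)
ice_per_km = 0.185        # kg CO2-eq per km (fuel production + tailpipe)

# The EV pays off its extra production "carbon debt" once cumulative
# use-phase savings cover the production difference:
break_even_km = (ev_production - ice_production) / (ice_per_km - ev_per_km)
print(round(break_even_km))  # ~33,333 km under these assumptions
```

Because both the numerator (battery production emissions) and the denominator (per-kilometre difference) depend on the electricity mix, the break-even mileage varies by more than an order of magnitude across scenarios.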
Ahmad et al. were the first to propose applying the sharp function to the classification of images. In continuation of their work, in this paper we investigate the use of the sharp function as an edge detector in well-known diffusion models. Further, we discuss the formulation of the weak solution of the nonlinear diffusion equation and prove the uniqueness of the weak solution of the nonlinear problem. The anisotropic generalization of sharp-operator-based diffusion has also been implemented and tested on various types of images.
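For context, a plain nonlinear diffusion sketch is shown below; it uses the classic Perona-Malik exponential edge-stopping function rather than the sharp-function edge detector investigated in the paper, and the test image is synthetic:

```python
import numpy as np

def perona_malik(u, n_iter=20, dt=0.2, kappa=0.1):
    """Classic Perona-Malik diffusion with the exponential edge-stopping
    function g(s) = exp(-(s/kappa)^2). The paper replaces such a g by a
    sharp-function-based edge detector; this sketch uses the textbook form."""
    u = u.astype(float).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        # differences to the four neighbours (periodic boundaries via roll)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        # flux is attenuated where gradients are large, preserving edges
        u += dt * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u

# Noisy step edge: diffusion smooths the flat regions but keeps the edge
rng = np.random.default_rng(1)
img = np.zeros((32, 32)); img[:, 16:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
out = perona_malik(noisy)
improved = np.abs(out - img).mean() < np.abs(noisy - img).mean()
print(improved)
```

The edge-stopping function is the only part that changes between variants: intra-region gradients (small relative to kappa) are diffused, while the large jump at the edge is left almost untouched.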
Social media data are transforming sustainability science. However, challenges from restrictions in data accessibility and ethical concerns regarding potential data misuse have threatened this nascent field. Here, we review the literature on the use of social media data in environmental and sustainability research. We find that they can play a novel and irreplaceable role in achieving the UN Sustainable Development Goals by allowing a nuanced understanding of human-nature interactions at scale, observing the dynamics of social-ecological change, and investigating the co-construction of nature values. We reveal threats to data access and highlight scientific responsibility to address trade-offs between research transparency and privacy protection, while promoting inclusivity. This contributes to a wider societal debate of social media data for sustainability science and for the common good.