Unintended nuclear war
(2021)
Carbon footprinting of universities worldwide: Part I — objective comparison by standardized metrics
(2021)
Background: Universities, as drivers of innovation in science and technology worldwide, should be leading the Great Transformation towards a carbon-neutral society, and many have indeed taken up the challenge. However, only a small number of universities worldwide collect and publish their carbon footprints, and only some of them have defined zero-emission targets. Unfortunately, there is limited consistency between the reported carbon footprints (CFs) because the respective universities use different analysis methods, different impact measures, and different target definitions.
Results: Comprehensive CF data of 20 universities from around the globe were collected and analysed, and the essential factors contributing to the university CF were identified. For the first time, CF data from universities were not only compared but also evaluated, partly corrected, and augmented with missing contributions to improve consistency and comparability. The CF performance of each university in the respective year was thus homogenized and measured by means of two metrics: CO2e emissions per capita and per m2 of constructed area. Both metrics vary by one order of magnitude across the universities in this study. However, we identified ten universities reaching a per capita carbon footprint of lower than or close to 1.0 t (metric tons) CO2e per person and year (normalized by the number of people associated with the university), independently of the university's size. In addition to these two metrics, we suggest a new metric expressing economic efficiency in terms of the CF per $ of expenditures and year. We then aggregated the results for all three impact measures, arriving at an overall carbon performance for the respective universities, which we found to be independent of geographical latitude. Instead, the per capita measure correlates with the national per capita CFs, reaching on average 23% of the national per capita impacts. The three top-performing universities are located in Switzerland, Chile, and Germany.
Conclusion: The usual reporting of CO2 emissions is categorized into Scopes 1–3 following the GHG Protocol Corporate Accounting Standard, which makes comparison across universities challenging. In this study, we attempted to standardize the CF metrics, allowing us to compare the CFs of several universities objectively. We observed that, almost 30 years after the Earth Summit in Rio de Janeiro (1992), the results are still limited: only one zero-emission university was identified, and hence the transformation needs to speed up globally.
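The standardized metrics described above (CO2e per capita, per m2 of constructed area, and per $ of expenditures, aggregated into an overall performance score) can be sketched as follows. This is a minimal illustration, not the study's actual computation: the university figures and the averaging scheme for aggregation are invented placeholders.

```python
def cf_metrics(co2e_t, people, area_m2, expenditures_usd):
    """Return the three standardized carbon-footprint metrics
    (t CO2e per capita, per m2, and per USD of expenditures)."""
    return {
        "t_per_capita": co2e_t / people,
        "t_per_m2": co2e_t / area_m2,
        "t_per_usd": co2e_t / expenditures_usd,
    }

def overall_score(metrics, reference):
    """Aggregate the three metrics into one score by normalizing each
    against a reference value and averaging -- one possible scheme,
    not necessarily the one used in the study. Lower is better."""
    ratios = [metrics[k] / reference[k] for k in metrics]
    return sum(ratios) / len(ratios)

# Invented example: 30,000 t CO2e, 25,000 people, 400,000 m2, $500M budget.
m = cf_metrics(co2e_t=30_000, people=25_000, area_m2=400_000,
               expenditures_usd=500_000_000)
print(m["t_per_capita"])  # 1.2 t CO2e per person and year
```

A score of 1.0 from `overall_score` would mean a university performs exactly at the reference level across all three measures.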
Laboratory protocols using magnetic beads have gained importance in the purification of mRNA for vaccines. After cell lysis, the produced mRNA hybridizes specifically to oligo(dT)-functionalized magnetic beads. The mRNA-loaded magnetic beads can then be selectively separated using a magnet; subsequently, impurities are removed by washing steps and the mRNA is eluted. Magnetic separation is utilized in each step, using different buffers such as the lysis/binding buffer. To reduce the time required for purifying larger amounts of mRNA vaccine for clinical trials, high-gradient magnetic separation (HGMS) is suitable: magnetic beads are selectively retained in a flow-through separation chamber. To meet the requirements of biopharmaceutical production, a disposable, single-use (SU) HGMS separation chamber made of a certified material (United States Pharmacopeia Class VI) was developed, which can be manufactured by 3D printing. Owing to its special design, the filter matrix itself is not in contact with the product. The separation chamber was tested with suspensions of oligo(dT)-functionalized Dynabeads MyOne loaded with synthetic mRNA. At a concentration of cB = 1.6–2.1 g·L–1 in lysis/binding buffer, more than 99.39% of these 1 μm magnetic particles are retained at volumetric flow rates of up to 150 mL·min–1 with the developed SU-HGMS separation chamber. At volumetric flow rates below 50 mL·min–1, the retained particle mass fraction even exceeds 99.99%.
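The retention figures quoted above follow from a simple mass balance over the separation chamber. As a hedged sketch (the inlet/outlet concentrations below are invented for illustration, not measurements from the study):

```python
def retention_fraction(c_in_g_per_l, c_out_g_per_l):
    """Fraction of magnetic-bead mass retained by an HGMS chamber,
    computed from inlet and outlet bead concentrations at equal
    volumetric flow (simple mass balance)."""
    return 1.0 - c_out_g_per_l / c_in_g_per_l

# Illustrative values: at c_in = 2.0 g/L, an outlet concentration of
# 0.01 g/L corresponds to 99.50 % retention.
print(f"{retention_fraction(2.0, 0.01):.2%}")
```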
Purification of mRNA with oligo(dT)-functionalized magnetic particles involves a series of magnetic separations for buffer exchange and washing. Magnetic particles interact and agglomerate when a magnetic field is applied, which can reduce the total surface area and thus the yield of mRNA. In addition, agglomeration may also be caused by mRNA loading on the magnetic particles. It is therefore of interest how the individual magnetic separation steps and the subsequent redispersion in the buffers used affect the particle size distribution. The lysis/binding buffer is the most important buffer for the separation of mRNA from the multicomponent suspension of cell lysate. Therefore, monodisperse magnetic particles loaded with mRNA were dispersed in the lysis/binding buffer and in deionized water as a reference system, and the particle size distributions were measured. A concentration-dependent agglomeration tendency was observed in deionized water, whereas no significant agglomeration was detected in the lysis/binding buffer. With regard to magnetic particle recycling, the influence of different storage and drying processes on the particle size distribution was investigated. Agglomeration occurred in all process alternatives. For de-agglomeration, ultrasonic treatment was examined and proved to be a suitable method for reproducibly restoring the original particle size distribution.
The implementation of single-use (SU) technologies offers several major advantages, e.g. the prevention of cross-contamination, especially when spore-forming microorganisms are present. This study investigated the application of a single-use bioreactor in the batch fermentation of the filamentous fungus Penicillium sp. (IBWF 040-09) from the Institute of Biotechnology and Drug Research (IBWF), which is capable of intracellular production of a protease inhibitor against parasitic proteases as a secondary metabolite. Several modifications to the SU bioreactor are suggested in this study to allow fermentation in which the fungus forms pellets. In parallel, fermentations in a conventional glass bioreactor were conducted as a reference. Although there are significant differences in the construction material and gassing system, the similarity of the two types of bioreactors in terms of fungal metabolic activity and the reproducibility of fermentations could be demonstrated using statistical methods. Under the selected cultivation conditions, the growth rate, yield coefficient, substrate uptake rate, and formation of the intracellular protease-inhibiting substance in the single-use bioreactor were similar to those in the glass bioreactor.
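The fermentation parameters compared above (specific growth rate and yield coefficient) have standard textbook definitions, which can be sketched as follows. The biomass and substrate values are invented placeholders, not data from the study, and the exponential-growth assumption is the usual simplification for the batch growth phase.

```python
import math

def specific_growth_rate(x0, x1, dt_h):
    """Specific growth rate mu [1/h] from biomass concentrations
    x0 and x1 [g/L] over an interval dt_h [h], assuming exponential
    growth: x1 = x0 * exp(mu * dt)."""
    return math.log(x1 / x0) / dt_h

def yield_coefficient(dx, ds):
    """Biomass yield Y_X/S [g biomass formed / g substrate consumed]."""
    return dx / ds

# Illustrative batch phase: biomass grows from 1.0 to 4.0 g/L in 24 h
# while 10 g/L of substrate is consumed.
mu = specific_growth_rate(x0=1.0, x1=4.0, dt_h=24.0)
y = yield_coefficient(dx=3.0, ds=10.0)
```

Comparing these two numbers between the SU and glass bioreactor runs is one way such similarity claims are quantified.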
Thailand's power system has been facing an energy transition due to the increasing amount of Renewable Energy (RE) integration, prosumers with self-consumption, and digitalization-based business models in a Local Energy Market (LEM). This paper introduces a decentralized business model and a possible trading platform for electricity trading in Thailand's micro-grid to deal with the power system transformation. The approach is Hybrid P2P, a market structure in which sellers and buyers either negotiate energy exchanges among themselves (Fully P2P trading) or trade through an algorithm on the market platform (Community-based trading). A combination of an Auction Mechanism (AM), Bill Sharing (BS), and a Traditional Mechanism (TM) is proposed as the decentralized price mechanism for Community-based trading. The approach is validated through a test case in which the community's daytime energy imports and exports are significantly reduced when 75 consumers and 25 PV-rooftop prosumers participate in this decentralized trading model. Furthermore, a comparative analysis confirms that the decentralized business model outperforms a centralized approach at both the community and individual levels.
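To make the community-based trading idea concrete, here is a toy clearing sketch. It is not the paper's AM/BS/TM combination: it uses a simple mid-market-rate price for internally matched energy and settles the residual with the grid, and all prices and quantities are invented placeholders.

```python
def community_clearing(generation_kwh, demand_kwh,
                       retail_price=4.0, feed_in_price=1.5):
    """Toy community clearing: energy matched inside the community
    trades at the mid-market rate between retail and feed-in prices;
    any surplus/deficit is settled with the grid. Prices (per kWh)
    are illustrative placeholders, not Thai tariffs."""
    internal = min(generation_kwh, demand_kwh)          # P2P-matched energy
    p2p_price = (retail_price + feed_in_price) / 2      # mid-market rate
    grid_import = max(demand_kwh - generation_kwh, 0.0)
    grid_export = max(generation_kwh - demand_kwh, 0.0)
    consumer_cost = internal * p2p_price + grid_import * retail_price
    prosumer_revenue = internal * p2p_price + grid_export * feed_in_price
    return p2p_price, consumer_cost, prosumer_revenue
```

Because the mid-market rate sits between the retail and feed-in tariffs, consumers pay less and prosumers earn more than they would trading with the grid alone, which is the intuition behind the reported community-level gains.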
As productive biofilms are gaining increasing interest in research, the quantitative monitoring of biofilm formation, online or offline, remains a challenge. Optical coherence tomography (OCT) is a fast and widely used method for scanning biofilms, but it has difficulty scanning through optically dense materials. X-ray microtomography (µCT) can measure biofilms in most geometries but is very time-consuming. By combining both methods for the first time, the weaknesses of each could be compensated. The phototrophic cyanobacterium Tolypothrix distorta was cultured in a moving-bed photobioreactor inside a biocarrier with a semi-enclosed geometry. An automated workflow was developed to process µCT scans of the biocarriers. This allowed quantification of biomass volume and biofilm coverage on the biocarrier, both globally and spatially resolved. At the beginning of the cultivation, a growth limitation was detected in the outer region of the carrier, presumably due to shear stress. In the later phase, light limitations were found inside the biocarrier. µCT data and biofilm thicknesses measured by OCT correlated well, so the latter can be used to rapidly measure biofilm formation during a process. The methods presented here can help gain a deeper understanding of biofilms inside a process and detect any limitations.
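The core quantities of such a µCT workflow (biomass volume and areal coverage) can be sketched as a simple voxel segmentation. This is a minimal stand-in for the automated workflow described above, assuming a global gray-value threshold; the threshold and voxel size are placeholders.

```python
import numpy as np

def quantify_biofilm(ct_volume, threshold, voxel_mm=0.01):
    """Segment a µCT gray-value volume (z, y, x) with a global
    threshold; return biofilm volume [mm^3] and areal coverage
    (fraction of (y, x) columns containing any biofilm voxel)."""
    mask = ct_volume > threshold
    volume_mm3 = float(mask.sum()) * voxel_mm**3
    coverage = float(mask.any(axis=0).mean())  # projected coverage
    return volume_mm3, coverage
```

Summing the mask within sub-regions of the carrier instead of globally would give the spatially resolved variant mentioned in the abstract.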
Introduction: The use of social marketing strategies to promote cognitive health has received little attention in research. The objective of this scoping review is twofold: (i) to identify the social marketing strategies that have been used in recent years to initiate and maintain health-promoting behaviour; (ii) to advance research in this area in order to inform policy and practice on how best to use these strategies to promote cognitive health.
Methods and analysis: We will use the five-stage methodological framework of Arksey and O'Malley. Articles in English published since 2010 will be searched in electronic databases (the Cochrane Library, DoPHER, the International Bibliography of the Social Sciences, PsycInfo, PubMed, ScienceDirect, Scopus). Quantitative and qualitative study designs as well as reviews will be considered. We will include articles that report the design, implementation, outcomes and evaluation of programmes and interventions concerning social marketing and/or health promotion and/or the promotion of cognitive health. Grey literature will not be searched. Two independent reviewers will assess the abstracts and full texts of selected citations in detail against the inclusion criteria. A Preferred Reporting Items for Systematic Reviews and Meta-Analyses flowchart for scoping reviews will be used to illustrate the process of article selection. We will use a data extraction form, present the results through narrative synthesis and discuss them in relation to the scoping review research questions.
Ethics and dissemination: Ethics approval is not required for conducting this scoping review. The results of the review will be the first step to advance a conceptual framework, which contributes to the development of interventions targeting the promotion of cognitive health. The results will be published in a peer-reviewed scientific journal. They will also be disseminated to key stakeholders in the field of the promotion of cognitive health.
The aim of this work was to develop and evaluate the reinforcement learning algorithm VentAI, which suggests a dynamically optimized mechanical ventilation regime for critically ill patients. We built, validated and tested its performance on 11,943 events of volume-controlled mechanical ventilation derived from 61,532 distinct ICU admissions, and tested it on an independent, secondary dataset (200,859 ICU stays; 25,086 mechanical ventilation events). A patient "data fingerprint" of 44 features was extracted as a multidimensional time series in 4-hour time steps. We used a Markov decision process, including a reward system and a Q-learning approach, to find the optimized settings for positive end-expiratory pressure (PEEP), fraction of inspired oxygen (FiO2) and ideal-body-weight-adjusted tidal volume (Vt). The observed outcome was in-hospital or 90-day mortality. VentAI reached a significantly increased estimated performance return of 83.3 (primary dataset) and 84.1 (secondary dataset) compared to physicians' standard clinical care (51.1). The number of recommended action changes per mechanically ventilated patient consistently exceeded that of the clinicians. VentAI chose ventilation regimes with lower Vt (5–7.5 mL/kg) 202.9% more frequently, but regimes with higher Vt (7.5–10 mL/kg) 50.8% less frequently. VentAI recommended PEEP levels of 5–7 cmH2O 29.3% more frequently and PEEP levels of 7–9 cmH2O 53.6% more frequently. VentAI avoided high (>55%) FiO2 values (59.8% decrease), while preferring the range of 50–55% (140.3% increase). In conclusion, VentAI provides reproducibly high performance by dynamically choosing an optimized, individualized ventilation strategy and thus might benefit critically ill patients.
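The MDP/Q-learning approach mentioned above can be illustrated with a minimal tabular sketch over logged transitions, as used in offline/batch reinforcement learning. This is a generic stand-in, not the study's implementation: the 44-feature state space, the mortality-based reward design, and the discretized PEEP/FiO2/Vt action grid are not reproduced, and the state/action indices below are placeholders.

```python
def q_learning(transitions, n_states, n_actions,
               alpha=0.1, gamma=0.99, sweeps=500):
    """Tabular Q-learning over logged (state, action, reward, next_state)
    transitions; next_state=None marks a terminal transition.
    Returns the learned Q-table (list of per-state action values)."""
    Q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(sweeps):
        for s, a, r, s_next in transitions:
            # Bellman target: immediate reward plus discounted best
            # next-state value (just the reward at terminal steps).
            target = r if s_next is None else r + gamma * max(Q[s_next])
            Q[s][a] += alpha * (target - Q[s][a])
    return Q
```

The recommended policy is then `argmax` over the actions of each state, which is how a Q-table translates into per-time-step ventilator-setting suggestions in this kind of setup.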
Big data is now poised to transform decision-making systems. Decisions are no longer based solely on the structured information that organizations have traditionally collected and stored, but also on unstructured data originating outside the corporate straitjacket. The cloud and the information it contains influence decisions, and the industry is witnessing the emergence of business intelligence 3.0. With the growth of the internet, social networks, connected objects and communications, information is now more abundant than ever before, and its production is growing rapidly and substantially. In 2012, 2.5 exabytes of data (one exabyte representing a million gigabytes) arrived every day to swell the ranks of big data (McAfee et al., 2012), which was expected to exceed 40 zettabytes from 2020 (Valduriez, 2014), fed by 30 billion connected devices (The Internet Of Nothings, 2014) and 50 billion sensors (Davenport & Soulard, 2014). One of the most critical aspects of this information flow is its impact on the way decisions are made. In an environment in which data was scarce and difficult to obtain, it was logical to let decision-making be guided by the intuition of the experienced decision-maker (Klein, Phillips, Rall, & Peluso, 2007). However, since information and knowledge are now available to everyone, the role of experts and decision-makers is gradually changing. Big data, in particular, makes it possible for analytical and decision-making systems to base their decisions on global models that consider all the dimensions of the situations encountered; until now, such systems were beyond human reach, constrained as humans are by bounded rationality (Simon & Newell, 1971). However, processing big data and unstructured data requires modifying the architecture of organizations' decision support systems (DSS).
This paper is an inventory of the developments undergone by decision-support systems under the pressure of big data. Finally, it opens the debate on the ethical questions raised by these new technologies, observing that the analysis of personal data has become more contentious than in the past.