Stratified care for low back pain (LBP) has been shown to be clinically and cost-effective in the UK, but its transferability to the German healthcare system is unknown. This study explores LBP patients' perspectives regarding future implementation of stratified care, through in-depth interviews (n = 12). The STarT-Back-Tool was completed by participants prior to the interviews. Interview data were analysed using Grounded Theory. The overarching theme identified from the data was 'treatment success', with subthemes of 'assessment and treatment planning', 'acceptance of the questionnaire' and 'contextual factors'. Patients identified the underlying cause of pain as being of great importance (whereas STarT-Back allocates treatment based on prognosis). The integration of the STarT-Back-Tool in consultations was considered helpful as long as it does not disrupt the therapeutic relationship, and was acceptable if the tool's results are handled confidentially. Results indicate that for patients to find STarT-Back acceptable, the shift from a focus on identifying a cause of pain and a subsequent diagnosis to prediction-orientated treatment planning must be made clear. Patient 'buy-in' is important for successful uptake of clinical interventions, and the findings can help to inform future strategies for implementing STarT-Back in Germany, as well as having potential implications for transferability to other, similar healthcare systems.
Deep brain stimulation (DBS) is an established therapy for movement disorders such as Parkinson's disease (PD) and essential tremor (ET). Adjusting the stimulation parameters, however, is a labour-intensive process and often requires several patient visits. Physicians prefer objective tools to improve (or maintain) the performance of DBS. Wearable motion sensors (WMS) are able to detect some manifestations of pathological signs, such as tremor in PD. However, the interpretation of sensor data is often highly technical, and methods to visualise tremor data of patients undergoing DBS in a clinical setting are lacking. This work aims to visualise the dynamics of tremor responses to DBS parameter changes with WMS while patients perform clinical hand movements. To this end, we attended the DBS programming sessions of two patients with the aim of visualising certain aspects of the clinical examination. PD tremor and ET were effectively quantified by acceleration amplitude and frequency. Tremor dynamics were analysed and visualised in terms of setpoints, movement transitions and stability aspects. Such methods have not yet been employed in this context, and the examples demonstrate how tremor dynamics can be visualised with simple analysis techniques. We thereby provide a basis for future research on visualisation tools to assist clinicians who frequently encounter patients undergoing DBS therapy. This could lead to benefits in terms of enhanced evaluation of treatment efficacy in the future.
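To illustrate how tremor can be quantified from wearable motion sensor data by acceleration amplitude and frequency, the following minimal Python sketch (not taken from the study; the sampling rate, tremor band and synthetic signal are assumptions) estimates the dominant tremor frequency and its amplitude from a single accelerometer axis using an FFT:

```python
import numpy as np

def tremor_metrics(accel, fs):
    """Estimate tremor amplitude and dominant frequency from one accelerometer axis.

    accel : 1-D array of acceleration samples (m/s^2); fs : sampling rate (Hz).
    """
    accel = accel - np.mean(accel)                 # remove gravity/DC offset
    spectrum = np.abs(np.fft.rfft(accel)) / len(accel)
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    band = (freqs >= 3.0) & (freqs <= 12.0)        # typical PD/ET tremor band (assumption)
    peak = np.argmax(spectrum[band])
    dominant_freq = freqs[band][peak]
    amplitude = 2.0 * spectrum[band][peak]         # peak amplitude of the tremor component
    return amplitude, dominant_freq

# Synthetic example: a 6 Hz tremor of 0.5 m/s^2 riding on noise, 10 s at 100 Hz
fs = 100.0
t = np.arange(0, 10, 1 / fs)
accel = 0.5 * np.sin(2 * np.pi * 6.0 * t) + 0.05 * np.random.randn(t.size)
amp, f = tremor_metrics(accel, fs)
print(f"amplitude ~{amp:.2f} m/s^2, dominant frequency ~{f:.1f} Hz")
```

Tracking these two numbers over successive stimulation settings is one simple way to visualise how the tremor responds to DBS parameter changes.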
A test environment was developed for the assessment of human reaction time. The system consists of an embedded device with organic light-emitting diode (OLED) displays and push buttons for the combined presentation of visual stimuli and registration of the haptic human reaction. The test leader can define the test sequence with the aid of a graphical user interface (GUI) on a personal computer (PC). The system was validated by measuring the latency times of the whole setup, which depend on the specific hardware and software configuration. By investigating the display's light emission with a photodiode and recording the current consumption, the latency times and their variance were determined. In the fastest mode, the system reaches an error limit of 60 μs.
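As a rough illustration of how a display latency can be derived from such a photodiode recording, the following sketch (hypothetical sampling rate, threshold and trace; not the actual measurement software) computes the delay between the logged stimulus command and the first photodiode sample exceeding a threshold:

```python
import numpy as np

def display_latency(trigger_time, photodiode, fs, threshold=0.5):
    """Estimate display latency: delay between the stimulus command (trigger_time, s)
    and the first photodiode sample exceeding `threshold` (normalized signal)."""
    onset_idx = np.argmax(photodiode >= threshold)   # index of first threshold crossing
    onset_time = onset_idx / fs
    return onset_time - trigger_time

# Synthetic example: photodiode sampled at 1 MHz, light turns on 180 µs after the trigger
fs = 1_000_000
signal = np.zeros(2000)
signal[1180:] = 1.0          # trigger at t = 1.0 ms, light onset at t = 1.18 ms
latency = display_latency(trigger_time=0.001, photodiode=signal, fs=fs)
print(f"latency ~{latency * 1e6:.0f} µs")
```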
Although hereditary building rights (ground leases) have seen renewed use in recent years, they still lead a niche existence. Under a ground lease, the ownership rights in the property are split. On the one hand, this creates additional monitoring and enforcement costs; on the other, it restricts the leaseholder's rights of disposal. Both lead to value discounts that do not burden full ownership. This weighs on returns as well as on the scope for providing affordable housing through ground leases. In addition, there are disadvantages regarding the saleability and mortgageability of ground leases. On the other hand, ground leases can be understood as an instrument for reallocating investment risks to the leaseholder. Provided the contract is priced in line with the market, the return requirements of the ground lessor fall more strongly than the return requirements of the leaseholder rise. This can give rise to considerable discounting gains that are not generated under full ownership and that may overcompensate for the disadvantages of the ground lease. However, the way ground leases are typically applied in Germany does not allow this potential added value to be realised. Models are presented that remedy these application problems in a simple way.
Background: Deficiency in musculoskeletal imaging (MI) education will pose a great challenge to physiotherapists in clinical decision making in this era of first-contact physiotherapy practices in many developed and developing countries. This study evaluated the nature and the level of MI training received by physiotherapists who graduate from Nigerian universities.
Methods: An online version of the previously validated Physiotherapist Musculoskeletal Imaging Profiling Questionnaire (PMIPQ) was administered to all eligible physiotherapists identified through the database of the Medical Rehabilitation Therapist Board of Nigeria. Data were obtained on demographics, nature, and level of training on MI procedures using the PMIPQ. Logistic regression, Friedman’s analysis of variance (ANOVA) and Kruskal-Wallis tests were used for the statistical analysis of collected data.
Results: The results (n = 400) showed that only 10.0% of the respondents had a stand-alone entry-level course in MI, 92.8% did not have any MI placement during their clinical internship, and 67.3% had never attended an MI workshop. There was a significant difference in the level of training received across MI procedures [χ2 (15) = 1285.899; p = 0.001]. However, there was no significant difference in the level of MI training across the institutions offering the entry-level programme (p = 0.36). Study participants with transitional Doctor of Physiotherapy education were better trained in MI than their counterparts with a bachelor's degree only (p = 0.047).
Conclusions: Most physiotherapy programmes in Nigeria did not include a specific MI module; imaging instructions were mainly provided through clinical science courses. The overall self-reported level of MI training among the respondents was deficient. It is recommended that stand-alone MI education should be introduced in the early part of the entry-level physiotherapy curriculum.
Unintended nuclear war (2021)
Carbon footprinting of universities worldwide: Part I — objective comparison by standardized metrics (2021)
Background: Universities, as innovation drivers in science and technology worldwide, should be leading the Great Transformation towards a carbon-neutral society, and many have indeed picked up the challenge. However, only a small number of universities worldwide are collecting and publishing their carbon footprints, and some of them have defined zero-emission targets. Unfortunately, there is limited consistency between the reported carbon footprints (CFs) because of different analysis methods, different impact measures, and different target definitions by the respective universities.
Results: Comprehensive CF data of 20 universities from around the globe were collected and analysed, and the essential factors contributing to a university's CF were identified. For the first time, CF data from universities were not only compared but also evaluated, partly corrected, and augmented with missing contributions to improve consistency and comparability. The CF performance of each university in the respective year is thus homogenized and measured by means of two metrics: CO2e emissions per capita and per m2 of constructed area. Both metrics vary by one order of magnitude across the different universities in this study. However, we identified ten universities reaching a per capita carbon footprint lower than or close to 1.0 Mt (metric tons) CO2e per person and year (normalized by the number of people associated with the university), independent of the university's size. In addition to these two metrics, we suggest a new metric expressing economic efficiency in terms of the CF per $ of expenditure and year. We then aggregated the results for all three impact measures, arriving at an overall carbon performance for the respective universities, which we found to be independent of geographical latitude. Instead, the per capita measure correlates with the national per capita CFs, reaching on average 23% of the national impact per capita. The three top-performing universities are located in Switzerland, Chile, and Germany.
Conclusion: The usual reporting of CO2 emissions is categorized into Scopes 1–3 following the GHG Protocol Corporate Accounting Standard, which makes comparison across universities challenging. In this study, we attempted to standardize the CF metrics, allowing us to objectively compare the CF of several universities. We observed that, almost 30 years after the Earth Summit in Rio de Janeiro (1992), progress is still limited: only one zero-emission university was identified, and hence the transformation should speed up globally.
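The normalized metrics described above can be reproduced with a few lines of code. The sketch below uses hypothetical figures (not the data of the 20 universities in the study) and computes CO2e per capita, per m2 of constructed area, and per $ of expenditure:

```python
# Hypothetical figures for two universities; the paper's actual data are not reproduced here.
universities = {
    "Uni A": {"co2e_t": 12_000, "people": 15_000, "area_m2": 180_000, "spend_usd": 3.0e8},
    "Uni B": {"co2e_t": 25_000, "people": 18_000, "area_m2": 250_000, "spend_usd": 4.5e8},
}

def cf_metrics(u):
    """Return the three normalized carbon-footprint metrics used for comparison."""
    return {
        "t_CO2e_per_capita": u["co2e_t"] / u["people"],
        "kg_CO2e_per_m2": 1000 * u["co2e_t"] / u["area_m2"],
        "kg_CO2e_per_usd": 1000 * u["co2e_t"] / u["spend_usd"],
    }

for name, data in universities.items():
    m = cf_metrics(data)
    print(name, {k: round(v, 3) for k, v in m.items()})
```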
Laboratory protocols using magnetic beads have gained importance in the purification of mRNA for vaccines. Here, the produced mRNA hybridizes specifically to oligo(dT)-functionalized magnetic beads after cell lysis. The mRNA-loaded magnetic beads can then be selectively separated using a magnet. Subsequently, impurities are removed by washing steps and the mRNA is eluted. Magnetic separation is utilized in each step, using different buffers such as the lysis/binding buffer. To reduce the time required for purifying larger amounts of mRNA vaccine for clinical trials, high-gradient magnetic separation (HGMS) is suitable: the magnetic beads are selectively retained in a flow-through separation chamber. To meet the requirements of biopharmaceutical production, a disposable, single-use (SU) HGMS separation chamber made of a certified material (United States Pharmacopeia Class VI) was developed which can be manufactured using 3D printing. Due to the special design, the filter matrix itself is not in contact with the product. The separation chamber was tested with suspensions of oligo(dT)-functionalized Dynabeads MyOne loaded with synthetic mRNA. At a concentration of cB = 1.6–2.1 g/L in lysis/binding buffer, these 1 μm magnetic particles are retained to more than 99.39% at volumetric flow rates of up to 150 mL/min with the developed SU-HGMS separation chamber. At volumetric flow rates below 50 mL/min, more than 99.99% of the particle mass is retained.
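The reported retention values follow directly from the inlet and outlet particle concentrations; a minimal sketch with hypothetical concentrations (not measurements from the study) is:

```python
def retention_efficiency(c_in, c_out):
    """Fraction of magnetic particles retained in the separation chamber,
    computed from inlet and outlet mass concentrations (same units, e.g. g/L)."""
    return 1.0 - c_out / c_in

# Hypothetical values: 1.8 g/L fed to the chamber, 0.011 g/L measured in the effluent
print(f"retention = {retention_efficiency(1.8, 0.011):.2%}")   # ~99.39 %
```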
Purification of mRNA with oligo(dT)-functionalized magnetic particles involves a series of magnetic separations for buffer exchange and washing. Magnetic particles interact and agglomerate with each other when a magnetic field is applied, which can result in a decreased total surface area and thus a decreased yield of mRNA. In addition, agglomeration may also be caused by mRNA loading on the magnetic particles. Therefore, it is of interest how the individual steps of magnetic separation and subsequent redispersion in the buffers used affect the particle size distribution. The lysis/binding buffer is the most important buffer for the separation of mRNA from the multicomponent suspension of cell lysate. Therefore, monodisperse magnetic particles loaded with mRNA were dispersed in the lysis/binding buffer and in the reference system deionized water, and the particle size distributions were measured. A concentration-dependent agglomeration tendency was observed in deionized water. In contrast, no significant agglomeration was detected in the lysis/binding buffer. With regard to magnetic particle recycling, the influence of different storage and drying processes on particle size distribution was investigated. Agglomeration occurred in all process alternatives. For de-agglomeration, ultrasonic treatment was examined. It represents a suitable method for reproducible restoration of the original particle size distribution.
The implementation of single-use (SU) technologies offers several major advantages, e.g. prevention of cross-contamination, especially when spore-forming microorganisms are present. This study investigated the application of a single-use bioreactor in batch fermentation of the filamentous fungus Penicillium sp. (IBWF 040-09) from the Institute of Biotechnology and Drug Research (IBWF), which is capable of intracellular production of a protease inhibitor against parasitic proteases as a secondary metabolite. Several modifications to the SU bioreactor were suggested in this study to allow fermentations in which the fungus forms pellets. In parallel, fermentations in a conventional glass bioreactor were conducted as a reference. Although there are significant differences in the construction material and gassing system, the similarity of the two types of bioreactors in terms of fungal metabolic activity and the reproducibility of the fermentations could be demonstrated using statistical methods. Under the selected cultivation conditions, the growth rate, yield coefficient, substrate uptake rate, and formation of the intracellular protease-inhibiting substance in the single-use bioreactor were similar to those in the glass bioreactor.
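For readers unfamiliar with the kinetic parameters compared here, the following sketch shows how a specific growth rate and a biomass yield coefficient are typically computed from two sampling points, assuming exponential growth; the numbers are hypothetical and not data from the study:

```python
import math

def specific_growth_rate(x0, x1, dt_h):
    """Specific growth rate µ (1/h) between two biomass concentrations x0, x1 (g/L)
    measured dt_h hours apart, assuming exponential growth."""
    return (math.log(x1) - math.log(x0)) / dt_h

def yield_coefficient(dx, ds):
    """Biomass yield Y_X/S (g biomass formed per g substrate consumed)."""
    return dx / ds

# Hypothetical sample points, purely for illustration
mu = specific_growth_rate(x0=1.2, x1=4.8, dt_h=24)      # ~0.058 1/h
y_xs = yield_coefficient(dx=4.8 - 1.2, ds=8.0)          # 0.45 g/g
print(f"µ = {mu:.3f} 1/h, Y_X/S = {y_xs:.2f} g/g")
```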
Thailand's power system is facing an energy transition due to the increasing integration of Renewable Energy (RE), prosumers with self-consumption, and digitalization-based business models in a Local Energy Market (LEM). This paper introduces a decentralized business model and a possible trading platform for electricity trading in Thailand's micro-grid to deal with this power system transformation. The approach is a hybrid P2P market structure in which sellers and buyers either negotiate energy exchanges among themselves (fully P2P trading) or trade through an algorithm on the market platform (community-based trading). A combination of an Auction Mechanism (AM), Bill Sharing (BS), and a Traditional Mechanism (TM) is the decentralized price mechanism proposed for community-based trading. The approach is validated through a test case in which, during the daytime, the energy import and export of the community are significantly reduced when 75 consumers and 25 PV rooftop prosumers participate in the decentralized trading model. Furthermore, a comparative analysis confirms that the decentralized business model outperforms a centralized approach at both the community and individual levels.
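As an illustration of how a community-based matching algorithm on such a platform might work, the following sketch implements a simple double auction in which bids and offers are sorted, matched, and cleared at the midpoint price. It is a generic example under assumed prices and quantities, not the AM/BS/TM mechanism proposed in the paper:

```python
def clear_double_auction(bids, offers):
    """Match buyer bids and seller offers inside a community market.

    bids   : list of (buyer_id, price, kWh); offers : list of (seller_id, price, kWh).
    Buyers are served highest price first, sellers cheapest first; each trade
    clears at the midpoint of the matched bid and offer prices.
    """
    bids = sorted(bids, key=lambda b: -b[1])
    offers = sorted(offers, key=lambda o: o[1])
    trades = []
    bi = si = 0
    while bi < len(bids) and si < len(offers) and bids[bi][1] >= offers[si][1]:
        qty = min(bids[bi][2], offers[si][2])
        price = (bids[bi][1] + offers[si][1]) / 2
        trades.append((bids[bi][0], offers[si][0], qty, price))
        bids[bi] = (bids[bi][0], bids[bi][1], bids[bi][2] - qty)
        offers[si] = (offers[si][0], offers[si][1], offers[si][2] - qty)
        if bids[bi][2] == 0:
            bi += 1
        if offers[si][2] == 0:
            si += 1
    return trades

# Hypothetical consumers and PV prosumers (prices in THB/kWh)
bids = [("c1", 4.2, 3.0), ("c2", 3.8, 2.0)]
offers = [("p1", 3.0, 2.5), ("p2", 3.5, 4.0)]
print(clear_double_auction(bids, offers))
```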
As productive biofilms are gaining increasing interest in research, quantitative monitoring of biofilm formation, either online or offline, remains a challenge. Optical coherence tomography (OCT) is a fast and frequently used method for scanning biofilms, but it has difficulty scanning through optically denser materials. X-ray microtomography (µCT) can measure biofilms in most geometries but is very time-consuming. By combining both methods for the first time, the weaknesses of each could be compensated for. The phototrophic cyanobacterium Tolypothrix distorta was cultured in a moving-bed photobioreactor inside a biocarrier with a semi-enclosed geometry. An automated workflow was developed to process µCT scans of the biocarriers. This allowed quantification of biomass volume and biofilm coverage on the biocarrier, both globally and spatially resolved. At the beginning of the cultivation, a growth limitation was detected in the outer region of the carrier, presumably due to shear stress. In the later phase, light limitations were found inside the biocarrier. µCT data and biofilm thicknesses measured by OCT showed good correlation; the latter could therefore be used to rapidly measure biofilm formation during a process. The methods presented here can help gain a deeper understanding of biofilms within a process and detect any limitations.
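A minimal version of this kind of µCT quantification can be sketched with NumPy alone: threshold the reconstructed grey-value volume, count biomass voxels, and normalize by the carrier surface area. The threshold, voxel size and synthetic volume below are assumptions for illustration, not parameters of the published workflow:

```python
import numpy as np

def quantify_biofilm(volume, voxel_edge_um, threshold, carrier_surface_mm2):
    """Quantify biofilm from a reconstructed µCT volume.

    volume             : 3-D numpy array of grey values
    voxel_edge_um      : isotropic voxel edge length in µm
    threshold          : grey value above which a voxel counts as biomass
    carrier_surface_mm2: carrier surface area used for the coverage metric
    """
    biomass_mask = volume > threshold
    voxel_vol_mm3 = (voxel_edge_um * 1e-3) ** 3
    biomass_volume_mm3 = biomass_mask.sum() * voxel_vol_mm3
    coverage_mm3_per_mm2 = biomass_volume_mm3 / carrier_surface_mm2
    return biomass_volume_mm3, coverage_mm3_per_mm2

# Synthetic 100^3 volume with a "biofilm" block of elevated grey values
vol = np.random.normal(50, 5, (100, 100, 100))
vol[20:40, :, :] += 100
v, c = quantify_biofilm(vol, voxel_edge_um=10, threshold=100, carrier_surface_mm2=600)
print(f"biomass volume ~{v:.2f} mm^3, coverage ~{c:.4f} mm^3/mm^2")
```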
Introduction: The use of social marketing strategies to induce the promotion of cognitive health has received little attention in research. The objective of this scoping review is twofold: (i) to identify the social marketing strategies that have been used in recent years to initiate and maintain health-promoting behaviour; (ii) to advance research in this area to inform policy and practice on how to best make use of these strategies to promote cognitive health.
Methods and analysis: We will use the five-stage methodological framework of Arksey and O'Malley. Articles in English published since 2010 will be searched in electronic databases (the Cochrane Library, DoPHER, the International Bibliography of the Social Sciences, PsycInfo, PubMed, ScienceDirect, Scopus). Quantitative and qualitative study designs as well as reviews will be considered. We will include those articles that report the design, implementation, outcomes and evaluation of programmes and interventions concerning social marketing and/or health promotion and/or promotion of cognitive health. Grey literature will not be searched. Two independent reviewers will assess in detail the abstracts and full text of selected citations against the inclusion criteria. A Preferred Reporting Items for Systematic Reviews and Meta-Analyses flowchart for Scoping Reviews will be used to illustrate the process of article selection. We will use a data extraction form, present the results through narrative synthesis and discuss them in relation to the scoping review research questions.
Ethics and dissemination: Ethics approval is not required for conducting this scoping review. The results of the review will be the first step to advance a conceptual framework, which contributes to the development of interventions targeting the promotion of cognitive health. The results will be published in a peer-reviewed scientific journal. They will also be disseminated to key stakeholders in the field of the promotion of cognitive health.
The aim of this work was to develop and evaluate the reinforcement learning algorithm VentAI, which is able to suggest a dynamically optimized mechanical ventilation regime for critically ill patients. We built, validated and tested it on 11,943 events of volume-controlled mechanical ventilation derived from 61,532 distinct ICU admissions, and additionally tested it on an independent, secondary dataset (200,859 ICU stays; 25,086 mechanical ventilation events). A patient "data fingerprint" of 44 features was extracted as a multidimensional time series in 4-hour time steps. We used a Markov decision process, including a reward system and a Q-learning approach, to find the optimized settings for positive end-expiratory pressure (PEEP), fraction of inspired oxygen (FiO2) and ideal body weight-adjusted tidal volume (Vt). The observed outcome was in-hospital or 90-day mortality. VentAI reached a significantly increased estimated performance return of 83.3 (primary dataset) and 84.1 (secondary dataset) compared to physicians' standard clinical care (51.1). The number of recommended action changes per mechanically ventilated patient consistently exceeded that of the clinicians. VentAI chose ventilation regimes with lower Vt (5–7.5 mL/kg) 202.9% more frequently, but regimes with higher Vt (7.5–10 mL/kg) 50.8% less frequently. VentAI recommended PEEP levels of 5–7 cmH2O 29.3% more frequently and PEEP levels of 7–9 cmH2O 53.6% more frequently. VentAI avoided high (>55%) FiO2 values (59.8% decrease), while preferring the range of 50–55% (140.3% increase). In conclusion, VentAI provides reproducible high performance by dynamically choosing an optimized, individualized ventilation strategy and may thus benefit critically ill patients.
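To make the Q-learning idea concrete, the following toy sketch shows an off-policy tabular Q-learning update over discretized states and ventilation actions, plus a greedy recommendation step. The state/action sizes, reward values and transitions are illustrative assumptions, not the VentAI implementation:

```python
import numpy as np

# Toy discretization: states are clusters of the patient "data fingerprint";
# actions are combinations of discretized PEEP, FiO2 and Vt settings.
N_STATES, N_ACTIONS = 100, 7 * 7 * 7          # e.g. 7 bins per ventilation parameter (assumed)
GAMMA, ALPHA = 0.99, 0.1                      # discount factor and learning rate (assumed)

Q = np.zeros((N_STATES, N_ACTIONS))

def q_update(state, action, reward, next_state, terminal):
    """One off-policy Q-learning update on a logged 4-hour transition."""
    target = reward if terminal else reward + GAMMA * Q[next_state].max()
    Q[state, action] += ALPHA * (target - Q[state, action])

def recommend(state):
    """Greedy policy: the ventilation-setting combination with the highest Q value."""
    return int(Q[state].argmax())

# Hypothetical transitions: intermediate reward 0, survival bonus at the terminal step
q_update(state=12, action=85, reward=0.0, next_state=13, terminal=False)
q_update(state=13, action=85, reward=100.0, next_state=13, terminal=True)
print(recommend(13))
```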
Big data is now poised to transform decision-making systems. The decision is no longer based solely on the structured information that organizations have traditionally collected and stored, but also on unstructured data from outside the corporate straitjacket. The cloud and the information it contains impact decisions, and the industry is witnessing the emergence of business intelligence 3.0. With the growth of the internet, social networks and connected objects, communication and information are now more abundant than ever before, and their production continues to grow rapidly and substantially. In 2012, 2.5 exabytes of data (one exabyte representing a billion gigabytes) arrived every day to swell the ranks of big data (McAfee et al., 2012), which was expected to exceed 40 zettabytes from 2020 (Valduriez, 2014), with 30 billion connected devices (The Internet Of Nothings, 2014) and 50 billion sensors (Davenport & Soulard, 2014). One of the most critical aspects of this information flow is its impact on the way decisions are made. In an environment in which data was scarce and difficult to obtain, it was logical to let decision-making be conditioned by the intuition of the experienced decision-maker (Klein, Phillips, Rall, & Peluso, 2007). However, since information and knowledge are now available to everyone, the role of experts and decision-makers is gradually changing. Big data, in particular, makes it possible for analytical and decision-making systems to base their decisions on global models that consider all the dimensions of the situations encountered; until now, such models were out of human reach, as human rationality is bounded (Simon & Newell, 1971). Big data, and in particular the processing of unstructured data, requires modifying the architecture of organizations' decision support systems (DSS). This paper takes stock of the developments undergone by decision support systems under the pressure of big data. Finally, it opens the debate on the ethical questions raised by these new technologies, observing that the analysis of personal data has become more contentious than in the past.
The integration of genetic algorithms to optimize value chain networks could substantially improve the performance of supply chains. For this reason, this paper describes in more detail the application of genetic algorithms in the value chains of the automotive industry. For this purpose, a theoretical model is built in order to evaluate whether its application can optimize the value chain. This approach is described and analyzed, and its restrictions are shown. Instead of looking at the entire network, individual finished goods and their bills of material are used as the basis for optimization, which greatly reduces the complexity of the original supply chain network problem.
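A minimal sketch of the idea, using a hypothetical bill of material: each chromosome assigns one supplier to every BOM component, fitness is total cost, and selection, one-point crossover and mutation search for a cheaper assignment. This is a generic illustration, not the paper's model:

```python
import random

# Hypothetical bill of material: per component, candidate supplier costs per unit
bom = {
    "engine":  {"qty": 1, "suppliers": [420.0, 400.0, 455.0]},
    "gearbox": {"qty": 1, "suppliers": [310.0, 295.0, 330.0]},
    "wheel":   {"qty": 4, "suppliers": [45.0, 52.0, 48.0]},
}
components = list(bom)

def cost(chromosome):
    """Total cost of one supplier assignment (one gene = supplier index per component)."""
    return sum(bom[c]["qty"] * bom[c]["suppliers"][g] for c, g in zip(components, chromosome))

def genetic_search(pop_size=20, generations=50, mutation_rate=0.1):
    pop = [[random.randrange(len(bom[c]["suppliers"])) for c in components] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                                       # lower cost = fitter
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(components))           # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:                  # mutate: random supplier swap
                i = random.randrange(len(components))
                child[i] = random.randrange(len(bom[components[i]]["suppliers"]))
            children.append(child)
        pop = parents + children
    best = min(pop, key=cost)
    return dict(zip(components, best)), cost(best)

print(genetic_search())
```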
As a contribution to a detailed discussion of process mining, the objective of this paper is to analyse the successful implementation of process mining in practical fields of supply chain management. The research investigates use cases in companies that are already actively using process mining.
Purpose: This research aims to highlight the applicability of process mining in the supply chain management business field.
Research Methodology: In order to examine the applicability of process mining in supply chain management, a study was conducted among experts in this business field. Theoretical findings were then compared with the results and evaluated.
Results: Process mining can be applied very well in the SCM area. The advantages that arise primarily reflect significant potential benefits and improved process throughput times. The information that can be gained from the operational areas supported by process mining is a suitable basis for reliable decisions, in both the tactical and strategic areas.
Limitations: The results on the application of process mining are somewhat general and have to be adapted and adjusted to the respective application case.
Contribution: This study is useful, especially for the purchasing and logistics business area.
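As a minimal illustration of the kind of event-log analysis that process mining builds on, the following pandas sketch derives process variants and throughput times from a small, hypothetical purchase-order log (the case IDs, activities and timestamps are invented for illustration):

```python
import pandas as pd

# Hypothetical purchase-order event log (case id, activity, timestamp)
log = pd.DataFrame({
    "case_id":   ["PO1", "PO1", "PO1", "PO2", "PO2", "PO2", "PO2"],
    "activity":  ["create order", "approve", "goods receipt",
                  "create order", "change price", "approve", "goods receipt"],
    "timestamp": pd.to_datetime([
        "2021-03-01", "2021-03-02", "2021-03-05",
        "2021-03-01", "2021-03-03", "2021-03-04", "2021-03-10"]),
})

log = log.sort_values(["case_id", "timestamp"])

# Process variants: the ordered activity sequence per case
variants = log.groupby("case_id")["activity"].agg(" -> ".join)
print(variants.value_counts())

# Throughput time per case: first to last event
throughput = log.groupby("case_id")["timestamp"].agg(lambda s: s.max() - s.min())
print(throughput)
```

Dedicated process mining tools add discovery and conformance checking on top of exactly this kind of case/activity/timestamp data.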
This paper presents a study of Industry 4.0 technologies at Deere and Company (John Deere) and their impact on company operations. The focus of the research was Deere and Company's implementation of Industry 4.0 in its factories and the factors involved. The study uses a systematic literature review as well as a comprehensive review of Deere and Company's current developments; it also relied on freely available information from the company website, with public and investor relations materials serving as further credible sources of information. The analysis found that adopting Industry 4.0 technologies in agricultural manufacturing results in higher-quality products, increased productivity and safety, and wider acceptance among stakeholders. The study assumes full implementation of these technologies in all agricultural manufacturing companies and emphasizes up-to-date technologies. Studying this topic can be useful for engineers in the mechanical and agricultural fields, managers in business, and marketers.
The aim of the study is to find out how SMEs used social media during the Corona pandemic and how customers received it, in order to determine what SMEs should continue or avoid in the future. An interpretivist approach was adopted through problem-centred interviews with three SMEs. The second part of the study used an objectivist approach, in which an online-based survey with purposive sampling was conducted. The results were evaluated by means of thematic analysis. The SMEs interviewed considered social media essential during Corona, owing to limited resources and the feeling of being overwhelmed by the situation. Customers likewise consider a social media presence indispensable, and their followership is based on the desire for the latest information. However, it also became clear that the survey participants do not trust the information on social media and prefer information on the website or at the location itself. No conclusions could be drawn about how the experts would have answered before or after Corona. Furthermore, due to anonymisation efforts, it was not possible to clarify the attitude of the survey participants towards the individual SMEs specifically.