Unintended nuclear war
(2021)
Carbon footprinting of universities worldwide: Part I — objective comparison by standardized metrics
(2021)
Background: Universities, as innovation drivers in science and technology worldwide, should be leading the Great Transformation towards a carbon-neutral society, and many have indeed taken up the challenge. However, only a small number of universities worldwide collect and publish their carbon footprints, and some of them have defined zero-emission targets. Unfortunately, there is limited consistency between the reported carbon footprints (CFs) because of different analysis methods, different impact measures, and different target definitions by the respective universities.
Results: Comprehensive CF data of 20 universities from around the globe were collected and analysed, and the essential factors contributing to the university CF were identified. For the first time, CF data from universities were not only compared but also evaluated, partly corrected, and augmented by missing contributions to improve consistency and comparability. The CF performance of each university in the respective year was thus homogenized and measured by means of two metrics: CO2e emissions per capita and per m2 of constructed area. Both metrics vary by one order of magnitude across the universities in this study. However, we identified ten universities reaching a per capita carbon footprint lower than or close to 1.0 t (metric tons) CO2e/person and year (normalized by the number of people associated with the university), independent of the university's size. In addition to these two metrics, we suggested a new metric expressing economic efficiency in terms of the CF per $ of expenditures and year. We then aggregated the results for all three impact measures, arriving at an overall carbon performance for each university, which we found to be independent of geographical latitude. Instead, the per capita measure correlates with the national per capita CFs, reaching on average 23% of the national impacts per capita. The three top-performing universities are located in Switzerland, Chile, and Germany.
Conclusion: The usual reporting of CO2 emissions is categorized into Scopes 1–3 following the GHG Protocol Corporate Accounting Standard, which makes comparison across universities challenging. In this study, we attempted to standardize the CF metrics, allowing us to objectively compare the CFs of several universities. We observed that, almost 30 years after the Earth Summit in Rio de Janeiro (1992), the results are still limited: only one zero-emission university was identified, and hence the transformation should speed up globally.
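The homogenization described above, normalizing emissions per capita, per m2, and per $ of expenditures, and aggregating them into one overall carbon performance, can be sketched in a few lines. The university names, totals, and the equal-weight aggregation below are illustrative assumptions, not data or the exact method from the study:

```python
# Hypothetical input: total t CO2e, people, constructed area (m2), expenditures ($)
universities = {
    "Uni A": (12_000, 10_000, 150_000, 500e6),
    "Uni B": (30_000, 20_000, 400_000, 900e6),
    "Uni C": (8_000, 12_000, 100_000, 300e6),
}

def metrics(tco2e, people, area_m2, usd):
    return (tco2e / people,    # t CO2e per capita and year
            tco2e / area_m2,   # t CO2e per m2 of constructed area
            tco2e / usd)       # t CO2e per $ expenditures (economic efficiency)

per_uni = {name: metrics(*vals) for name, vals in universities.items()}

# Normalize each metric to the best (lowest) performer, then average the three
# normalized scores into one overall carbon-performance indicator (lower = better).
best = [min(m[i] for m in per_uni.values()) for i in range(3)]
overall = {name: sum(m[i] / best[i] for i in range(3)) / 3
           for name, m in per_uni.items()}
ranking = sorted(overall, key=overall.get)
```

With these made-up numbers, the smallest university ranks first, illustrating that the per capita metric is independent of university size.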
Laboratory protocols using magnetic beads have gained importance in the purification of mRNA for vaccines. Here, the produced mRNA hybridizes specifically to oligo(dT)-functionalized magnetic beads after cell lysis. The mRNA-loaded magnetic beads can then be selectively separated using a magnet; subsequently, impurities are removed by washing steps and the mRNA is eluted. Magnetic separation is utilized in each step, using different buffers such as the lysis/binding buffer. To reduce the time required for the purification of larger amounts of mRNA vaccine for clinical trials, high-gradient magnetic separation (HGMS) is suitable, in which magnetic beads are selectively retained in a flow-through separation chamber. To meet the requirements of biopharmaceutical production, a disposable single-use HGMS (SU-HGMS) separation chamber made of a certified material (United States Pharmacopeia Class VI) was developed, which can be manufactured using 3D printing. Due to its special design, the filter matrix itself is not in contact with the product. The separation chamber was tested with suspensions of oligo(dT)-functionalized Dynabeads MyOne loaded with synthetic mRNA. At a concentration of cB = 1.6–2.1 g·L–1 in lysis/binding buffer, more than 99.39% of these 1 μm magnetic particles are retained at volumetric flow rates of up to 150 mL·min–1 with the developed SU-HGMS separation chamber. At volumetric flow rates below 50 mL·min–1, the retained particle mass even exceeds 99.99%.
Purification of mRNA with oligo(dT)-functionalized magnetic particles involves a series of magnetic separations for buffer exchange and washing. Magnetic particles interact and agglomerate with each other when a magnetic field is applied, which can result in a decreased total surface area and thus a decreased yield of mRNA. In addition, agglomeration may also be caused by mRNA loading on the magnetic particles. Therefore, it is of interest how the individual steps of magnetic separation and subsequent redispersion in the buffers used affect the particle size distribution. The lysis/binding buffer is the most important buffer for the separation of mRNA from the multicomponent suspension of cell lysate. Therefore, monodisperse magnetic particles loaded with mRNA were dispersed in the lysis/binding buffer and in the reference system deionized water, and the particle size distributions were measured. A concentration-dependent agglomeration tendency was observed in deionized water. In contrast, no significant agglomeration was detected in the lysis/binding buffer. With regard to magnetic particle recycling, the influence of different storage and drying processes on particle size distribution was investigated. Agglomeration occurred in all process alternatives. For de-agglomeration, ultrasonic treatment was examined. It represents a suitable method for reproducible restoration of the original particle size distribution.
The implementation of single-use technologies offers several major advantages, e.g. the prevention of cross-contamination, especially when spore-forming microorganisms are present. This study investigated the application of a single-use bioreactor in batch fermentation of the filamentous fungus Penicillium sp. (IBWF 040-09) from the Institute of Biotechnology and Drug Research (IBWF), which is capable of intracellular production of a protease inhibitor against parasitic proteases as a secondary metabolite. Several modifications to the single-use bioreactor were suggested in this study to allow fermentation in which the fungus forms pellets. Fermentations in a conventional glass bioreactor were also conducted as a reference. Although there are significant differences in construction material and gassing system, the similarity of the two types of bioreactors in terms of fungal metabolic activity and the reproducibility of fermentations could be demonstrated using statistical methods. Under the selected cultivation conditions, the growth rate, yield coefficient, substrate uptake rate, and formation of the intracellular protease-inhibiting substance in the single-use bioreactor were similar to those in the glass bioreactor.
Thailand’s power system is undergoing an energy transition due to the increasing integration of Renewable Energy (RE), prosumers with self-consumption, and digitalization-based business models in a Local Energy Market (LEM). This paper introduces a decentralized business model and a possible trading platform for electricity trading in Thailand’s Micro-Grid to deal with this power system transformation. The approach is Hybrid P2P, a market structure in which sellers and buyers either negotiate energy exchange among themselves (Fully P2P trading) or trade through an algorithm on the market platform (Community-based trading). A combination of an Auction Mechanism (AM), Bill Sharing (BS), and a Traditional Mechanism (TM) is proposed as the decentralized price mechanism for Community-based trading. The approach is validated through a test case in which the community's daytime energy imports and exports are significantly reduced when 75 consumers and 25 PV rooftop prosumers participate in this decentralized trading model. Furthermore, a comparative analysis confirms that the decentralized business model outperforms a centralized approach at both community and individual levels.
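One way to picture the community-based settlement is a minimal bill-sharing sketch: internal supply and demand are netted first at a community price, and only the residual is settled with the grid. The tariffs, the mid-market internal price, and the member positions below are assumptions for illustration, not values or rules from the paper:

```python
GRID_BUY = 4.0                        # assumed utility import price per kWh
GRID_SELL = 1.5                       # assumed feed-in price per kWh
P_INT = (GRID_BUY + GRID_SELL) / 2    # assumed mid-market internal price

def settle(net_kwh):
    """net_kwh maps member -> net demand (+) or surplus (-) in one interval.
    Matched energy is traded internally at P_INT; the residual goes to the grid.
    Returns each member's bill (positive = pays, negative = receives)."""
    demand = sum(v for v in net_kwh.values() if v > 0)
    surplus = -sum(v for v in net_kwh.values() if v < 0)
    traded = min(demand, surplus)     # internally matched energy
    bills = {}
    for m, v in net_kwh.items():
        if v > 0:   # consumer: matched share internally, remainder from grid
            share = v / demand * traded if demand else 0.0
            bills[m] = share * P_INT + (v - share) * GRID_BUY
        else:       # prosumer: matched share internally, remainder as feed-in
            share = -v / surplus * traded if surplus else 0.0
            bills[m] = -(share * P_INT + (-v - share) * GRID_SELL)
    return bills

bills = settle({"c1": 3.0, "c2": 1.0, "p1": -2.0})
```

Because the internal price sits between the two grid tariffs, buyers pay less and sellers earn more than with the grid alone, while the bills still balance against the residual grid settlement.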
As productive biofilms are gaining increasing interest in research, the quantitative monitoring of biofilm formation, online or offline, remains a challenge for the process. Optical coherence tomography (OCT) is a fast and widely used method for scanning biofilms, but it has difficulty scanning through optically denser materials. X-ray microtomography (µCT) can measure biofilms in most geometries but is very time-consuming. By combining both methods for the first time, the weaknesses of each could be compensated for. The phototrophic cyanobacterium Tolypothrix distorta was cultured in a moving-bed photobioreactor inside a biocarrier with a semi-enclosed geometry. An automated workflow was developed to process µCT scans of the biocarriers. This allowed quantification of biomass volume and biofilm coverage on the biocarrier, both globally and spatially resolved. At the beginning of the cultivation, a growth limitation was detected in the outer region of the carrier, presumably due to shear stress. In the later phase, light limitations could be found inside the biocarrier. µCT data and biofilm thicknesses measured by OCT showed good correlation; the latter could therefore be used to rapidly measure biofilm formation during a process. The methods presented here can help gain a deeper understanding of biofilms inside a process and detect any limitations.
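The global quantities such a workflow extracts, biomass volume and surface coverage, amount to counting voxels above an intensity threshold. The following sketch runs on a tiny synthetic stack; the threshold, voxel size, and array contents are invented for illustration and have nothing to do with the actual scan data:

```python
VOXEL = 0.01   # assumed voxel edge length in mm
THRESH = 0.5   # assumed intensity threshold separating biofilm from background

# Synthetic 4x4x3 intensity stack indexed as stack[x][y][z]: a checkerboard
# of "biofilm" voxels (0.8) in the layer closest to the carrier surface (z=0).
stack = [[[0.8 if z == 0 and (x + y) % 2 == 0 else 0.1 for z in range(3)]
          for y in range(4)] for x in range(4)]

# Biomass volume: count voxels above threshold, scale by voxel volume.
biofilm_voxels = sum(v > THRESH for plane in stack for column in plane for v in column)
volume_mm3 = biofilm_voxels * VOXEL ** 3

# Coverage: fraction of carrier-surface positions (x, y) with any biofilm above them.
covered = sum(any(v > THRESH for v in stack[x][y]) for x in range(4) for y in range(4))
coverage = covered / 16
```

A spatially resolved analysis would apply the same counting per region of the carrier instead of over the whole stack.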
Introduction: The use of social marketing strategies to promote cognitive health has received little attention in research. The objective of this scoping review is twofold: (i) to identify the social marketing strategies that have been used in recent years to initiate and maintain health-promoting behaviour; (ii) to advance research in this area to inform policy and practice on how best to use these strategies to promote cognitive health.
Methods and analysis: We will use the five-stage methodological framework of Arksey and O'Malley. Articles in English published since 2010 will be searched in electronic databases (the Cochrane Library, DoPHER, the International Bibliography of the Social Sciences, PsycInfo, PubMed, ScienceDirect, Scopus). Quantitative and qualitative study designs as well as reviews will be considered. We will include those articles that report the design, implementation, outcomes and evaluation of programmes and interventions concerning social marketing and/or health promotion and/or promotion of cognitive health. Grey literature will not be searched. Two independent reviewers will assess in detail the abstracts and full text of selected citations against the inclusion criteria. A Preferred Reporting Items for Systematic Reviews and Meta-Analyses flowchart for Scoping Reviews will be used to illustrate the process of article selection. We will use a data extraction form, present the results through narrative synthesis and discuss them in relation to the scoping review research questions.
Ethics and dissemination: Ethics approval is not required for conducting this scoping review. The results of the review will be the first step to advance a conceptual framework, which contributes to the development of interventions targeting the promotion of cognitive health. The results will be published in a peer-reviewed scientific journal. They will also be disseminated to key stakeholders in the field of the promotion of cognitive health.
The aim of this work was to develop and evaluate the reinforcement learning algorithm VentAI, which is able to suggest a dynamically optimized mechanical ventilation regime for critically ill patients. We built, validated, and tested its performance on 11,943 events of volume-controlled mechanical ventilation derived from 61,532 distinct ICU admissions, and additionally tested it on an independent, secondary dataset (200,859 ICU stays; 25,086 mechanical ventilation events). A patient “data fingerprint” of 44 features was extracted as a multidimensional time series in 4-hour time steps. We used a Markov decision process, including a reward system and a Q-learning approach, to find the optimized settings for positive end-expiratory pressure (PEEP), fraction of inspired oxygen (FiO2), and ideal body weight-adjusted tidal volume (Vt). The observed outcome was in-hospital or 90-day mortality. VentAI reached a significantly increased estimated performance return of 83.3 (primary dataset) and 84.1 (secondary dataset) compared to physicians’ standard clinical care (51.1). The number of recommended action changes per mechanically ventilated patient consistently exceeded that of the clinicians. VentAI chose ventilation regimes with lower Vt (5–7.5 mL/kg) 202.9% more frequently, but regimes with higher Vt (7.5–10 mL/kg) 50.8% less frequently. VentAI recommended PEEP levels of 5–7 cmH2O 29.3% more frequently and PEEP levels of 7–9 cmH2O 53.6% more frequently. VentAI avoided high (>55%) FiO2 values (59.8% decrease) while preferring the range of 50–55% (140.3% increase). In conclusion, VentAI provides reproducibly high performance by dynamically choosing an optimized, individualized ventilation strategy and thus might be of benefit for critically ill patients.
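The Q-learning core of such an approach, learning which discretized setting to recommend in each state from observed transitions and rewards, can be sketched on a toy MDP. The states, actions, dynamics, and rewards below are stand-ins for illustration, not the paper's 44-feature clinical state space or its reward system:

```python
import random

random.seed(0)
N_STATES, N_ACTIONS = 4, 3        # toy condition states, toy ventilation settings
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2

def step(state, action):
    """Toy dynamics: action 1 plays the role of a protective setting that moves
    the state toward the rewarded state 0; the other actions drift upward."""
    if action == 1:
        return max(state - 1, 0), (1.0 if state == 0 else 0.0)
    return min(state + 1, N_STATES - 1), (-1.0 if state == N_STATES - 1 else 0.0)

Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
for _ in range(2000):                             # episodes
    s = random.randrange(N_STATES)
    for _ in range(20):                           # discrete time steps
        if random.random() < EPS:                 # epsilon-greedy exploration
            a = random.randrange(N_ACTIONS)
        else:
            a = max(range(N_ACTIONS), key=lambda x: Q[s][x])
        s2, r = step(s, a)
        # Q-learning update toward reward plus discounted best next-state value
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max(range(N_ACTIONS), key=lambda a: Q[s][a]) for s in range(N_STATES)]
```

After training, the greedy policy recommends the "protective" action in the starting state, which is the tabular analogue of VentAI suggesting the setting with the highest learned return.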
Big Data is now poised to transform decision-making systems. Decisions are no longer based solely on the structured information that was hitherto collected and stored by the organization, but also on all the unstructured data outside the corporate straitjacket. The cloud and the information it contains impact decisions, and the industry is witnessing the emergence of business intelligence 3.0. With the growth of the internet, social networks, and connected objects, communication and information are now more abundant than ever before, and their production is growing rapidly and substantially. In 2012, 2.5 exabytes of data (one exabyte representing a million gigabytes) arrived every day to swell the ranks of big data (McAfee et al., 2012), which was expected to exceed 40 zettabytes by 2020 (Valduriez, 2014), with 30 billion connected devices (The Internet Of Nothings, 2014) and 50 billion sensors (Davenport & Soulard, 2014). One of the most critical aspects of this information flow is its impact on the way decisions are made. In an environment in which data was scarce and difficult to obtain, it was logical to let decision-making be guided by the intuition of the experienced decision-maker (Klein, Phillips, Rall, & Peluso, 2007). However, since information and knowledge are now available to everyone, the role of experts and decision-makers is gradually changing. Big data, in particular, makes it possible for analytical and decision-making systems to base their decisions on global models that consider all the dimensions of the situations encountered, something that was until now out of reach of boundedly rational humans (Simon & Newell, 1971). However, the processing of unstructured big data requires modifying the architecture of organizations' decision support systems (DSS).
This paper is an inventory of the developments undergone by decision-support systems under the pressure of big data. Finally, it opens the debate on the ethical questions raised by these new technologies, observing that the analysis of personal data has now become more contentious than in the past.
Integrating genetic algorithms to optimize value chain networks could greatly improve supply chain performance. This paper therefore describes in more detail the application of genetic algorithms to value chains in the automotive industry. For this purpose, a theoretical model is built to evaluate whether its application can optimize the value chain; this option is described and analyzed, and its restrictions are shown. Instead of considering the entire network, individual finished goods and their bills of material are used as the basis for optimization, which greatly reduces the complexity of the original supply chain network problem.
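A minimal genetic algorithm over one bill of material, in the spirit of the reduction described above, might look as follows. Each gene selects a supplier for one BOM component and fitness is total cost; the cost table and GA parameters are illustrative assumptions, not data from the paper:

```python
import random

random.seed(1)
COSTS = [            # COSTS[component][supplier]: hypothetical unit costs
    [4.0, 2.5, 3.0],
    [1.0, 1.2, 0.8],
    [5.0, 6.0, 4.5],
    [2.0, 1.5, 2.2],
]
N_COMP, N_SUP = len(COSTS), len(COSTS[0])

def fitness(chrom):              # lower total cost = better
    return sum(COSTS[c][s] for c, s in enumerate(chrom))

def evolve(pop_size=30, generations=60, mut_rate=0.2):
    pop = [[random.randrange(N_SUP) for _ in range(N_COMP)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_COMP)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:     # point mutation
                child[random.randrange(N_COMP)] = random.randrange(N_SUP)
            children.append(child)
        pop = elite + children
    return min(pop, key=fitness)

best = evolve()
```

On this four-component BOM the search space is only 3^4 = 81 assignments, so the GA quickly finds the cheapest supplier per component; the point of the BOM-level reduction is that the chromosome length stays proportional to one product's components rather than to the whole network.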
Toward a detailed discussion of process mining, the objective of this paper is to analyze successful implementations of process mining in practical fields of supply chain management. The research comprises the investigation of use cases in companies that are already actively using process mining.
Purpose: This research aims to highlight the applicability of process mining in the supply chain management business field.
Research Methodology: In order to examine the applicability of process mining in supply chain management, a research study was conducted among experts in this business field. The theoretical findings were then compared with the results and evaluated.
Results: Process mining can be applied very well in the SCM area. The resulting advantages lie primarily in significant potential benefits and improved process throughput times. The information that can be gained from the operational areas supported by process mining is suitable for reliable decisions in both tactical and strategic areas.
Limitations: The results on the application of process mining are somewhat generalized and have to be adapted and adjusted to the respective application case.
Contribution: This study is useful, especially for the purchasing and logistics business area.
A study of Industry 4.0 technologies at Deere and Company (John Deere) and their impact on company operations is presented in this paper. The research focused on Deere and Company's implementation of Industry 4.0 in its factories and the factors involved. The current study uses a systematic literature review as well as a comprehensive review of Deere and Company's recent developments, relying on freely available information on the company website; public and investor relations materials have also been used as credible sources of information. The analysis found that adopting Industry 4.0 technologies in agricultural manufacturing results in higher-quality products, increased productivity, improved safety, and wider acceptance among stakeholders. The study assumes full implementation of these technologies in all agricultural manufacturing companies and emphasizes up-to-date technologies. Studying this topic can be useful for engineers in mechanical and agricultural fields, managers in business, and marketers.
The aim of the study is to find out how SMEs used Social Media during the Corona pandemic and how customers received it, in order to determine what SMEs should continue or avoid in the future. An interpretivist approach was adopted through problem-centred interviews with three SMEs. The second part of the study used an objectivist approach, in which an online survey with purposive sampling was conducted. The results were evaluated by means of thematic analysis. The SMEs interviewed considered Social Media essential during Corona, due to limited resources and the feeling of being overwhelmed by the situation. Customers likewise considered a Social Media presence indispensable, and their followership was based on the desire for the latest information. However, it also became clear that the survey participants do not trust the information on Social Media and prefer information on the website or at the location itself. No answers could be found as to how the experts would have responded before or after Corona. Furthermore, due to anonymisation efforts, it was not possible to clarify the survey participants' attitudes toward the individual SMEs specifically.
Both brick-and-mortar retail and online pure players are undergoing a transformation driven by increasing digitalization and dynamically changing trends. The furniture industry in particular is required to adapt to these developments. The intensification of market and competitive landscapes, together with changes in consumer behaviour regarding the demanded merging of shopping channels, calls for and necessitates a rethink. This necessity is made more urgent by current circumstances such as the Covid-19 pandemic. The omnichannel strategy and its establishment hold both opportunities and threats for the furniture industry, especially with regard to logistical challenges; these must be recognized, exploited, and resolved.
Agility and digital trends go hand in hand, but the advantages of digitalization put high pressure on established automotive companies. For years now, automotive groups have no longer been the innovation drivers of the industry; that status is reserved for radical companies like Tesla. But is there any chance that conservative companies will reinvent themselves, establish leaner structures, and thus regain market dominance and innovation leadership?
E-commerce live streaming - An emerging industry in China and a potential future trend in the world
(2021)
With the widespread use of the Internet, many industries have developed rapidly. The Internet-based economy poses a significant threat to the traditional economy. Live streaming plus e-commerce, now a prominent feature of the global economy, is the result of combining live streaming with various industries through the Internet. E-commerce live streaming is one of the most essential types of online live streaming. In this article, it is defined as live streaming on an e-commerce platform, used by Key Opinion Leaders or product sellers through the platform's built-in live streaming function to promote goods, brands, events, etc., with the goals of brand exposure and product sales. Compared with the traditional economic model, the combined model of e-commerce and live streaming has its own advantages and characteristics, and this kind of marketing tool is now prevalent. However, e-commerce live streaming still has many deficiencies that need to be improved, since its development is immature and supervision of Internet use is still evolving.
This study describes how the robotics industry has evolved, how a new phase of advanced robotics has emerged, and the relation between humans and robots sharing the same workplace. Designing safer robots for human-machine interaction systems is an urgent research topic in the field of industrial robotics. Many of the problems in industrial robotics are related not just to technological issues but also to human-robot collaboration; the invention of collaborative robots is discussed as an effective method to tackle this issue.
When developing software and solutions, companies often rely on the know-how of external service providers. At the same time, modern forms of work and collaboration are changing the development of products and services. How do these trends influence the collaboration and cooperation between companies and their external agile service providers? The aim of this scientific work is to find out which steps companies must take to implement agile working and collaboration with external service providers. A case study, including a qualitative survey, was therefore used to identify and demonstrate the measures and actions companies must take to achieve the goal of effectively implementing agile collaboration and cooperation. Three core topics were identified, on the basis of which the research questions on these measures are answered: first, what options companies have for implementing an internal agile setup in order to work with agile service providers at eye level; second, which contract variants can support and improve agile collaboration; and third, which agile techniques and methods should be used in agile collaboration. The results of the case studies confirm the assumption that the three identified core topics are essential for effective collaboration in an agile environment. While it was shown, on the one hand, that contract requirements changed with regard to their flexibility and adaptability, it was also shown, on the other hand, that the internal setup requires agile drivers, techniques, and methods to enable effective collaboration with agile service providers. This article provides an overview of the most important content within the three core topics and also gives companies guidance on how to create a basis for effective collaboration.
With the fast development of e-commerce and the rapidly increasing number of parcels, the urban logistics sector is facing the challenge of sustainability. Last-mile delivery in particular, as the final step of goods transport, affects customer satisfaction, the cost efficiency of logistics companies, and growing public expectations regarding the sustainability of urban logistics. To handle the complexity of urban logistics conditions, governments and logistics companies should develop a cooperative strategy for sustainable urban last-mile delivery. Based on data from long-term empirical research and a survey of e-commerce users in Germany and China, this paper develops a sustainable concept for urban last-mile delivery. The key to this concept is creating a balance among the requirements of customers, the competitiveness of logistics companies, and the public interest.
Optimal mental workload plays a key role in driving performance. Thus, driver-assisting systems that automatically adapt to a driver's current mental workload via brain-computer interfacing might greatly contribute to traffic safety. To design economical brain-computer interfaces that do not compromise driver comfort, it is necessary to identify the brain areas that are most sensitive to changes in mental workload. In this study, we used functional near-infrared spectroscopy and subjective ratings to measure mental workload in two virtual driving environments with distinct demands. We found that demanding city environments induced both higher subjective workload ratings and higher bilateral middle frontal gyrus activation than less demanding country environments. A further analysis with higher spatial resolution revealed a center of activation in the right anterior dorsolateral prefrontal cortex, an area highly involved in spatial working memory processing. Thus, a main component of drivers' mental workload in complex surroundings might stem from the fact that large amounts of spatial information about the course of the road and other road users have to be constantly maintained, processed, and updated. We propose that the right middle frontal gyrus might be a suitable region for the application of powerful small-area brain-computer interfaces.
The present work aimed at investigating an extraction protocol based on consecutive steps of isoelectric-point (pH ~ 4.25) mediated gum swelling and deproteinisation as an alternative method to produce flaxseed gum extracts with enhanced techno-functional characteristics. The osidic and proximate composition, structural conformation, flow behaviour, and dynamic rheological and thermal properties of gums isolated from brown and golden flaxseeds were assessed. Gum extraction under near-isoelectric-point conditions did not impair the extraction yield, residual protein, or ash content, and resulted in only minor changes in the sugar composition of the flaxseed gum extracts. Deconvolution of the GPC/SEC chromatograms revealed the presence of four major polysaccharidic populations corresponding to arabinoxylans, rhamnogalacturonan-I, and two AX-RG-I composite fractions. The latter appeared to minimise the intra- and interchain non-covalent polymer interactions (hydrogen bonding), leading to better solvation affinity in water and lyotropic solvents. Golden flaxseed gums exhibited higher molecular weights (Mw = 1.34–1.15 × 106 Da) and intrinsic viscosities (6.63–5.13 dL g−1) as well as better thickening and viscoelastic performance than the brown flaxseed gum samples. Golden flaxseed gums also exhibited better thermal stability than their brown counterparts and are therefore suitable for product applications involving severe heat treatments.
Cryotropic gelation is one of the most common approaches to design novel hydrogels with multifaceted technological and biological functionalities. In the present paper, we studied the ability of highly galactosyl-substituted galactomannans, i.e. fenugreek and alfalfa gum, to form physically crosslinked hydrogels via cryogenic processing. Cycling of the galactomannan solutions (0.25 to 4% wt) from 25 to −20 to 25 °C induced the physical crosslinking of the galactomannan chains leading to the formation of different cryogel structures, i.e. filamentous aggregates (c* < c < 1%), cellular-like gel networks (1 ≤ c < 4%) or a homogeneously swollen gel (c ≥ 4%), depending on the total biopolymer content. Alfalfa gum-based cryogels exhibited higher elasticity and stiffness, better uniformity of the structure and a lower macropore size than their fenugreek counterparts. The physical blending of alfalfa or fenugreek gum with locust bean gum (2% total biopolymer) led to the reinforcement of the mechanical properties of the cryogels without significantly altering their microstructural aspects.
Understanding and modulating CNS function in physiological as well as pathophysiological contexts remains a significant ambition in research and clinical applications. The investigation of the multifaceted CNS cell types, including their interactions and contributions to neural function, requires a combination of state-of-the-art in vivo electrophysiology and imaging techniques. We developed a novel type of liquid crystal polymer (LCP) surface micro-electrode manufactured in three customized designs with up to 16 channels for recording and stimulation of brain activity. All designs include spare central spaces for simultaneous 2P imaging. Nanoporous platinum-plated contact sites ensure a low impedance and high current transfer. The epidural implantation of the LCP micro-electrodes could be combined with standard cranial window surgery. The epidurally positioned electrodes not only displayed long-term biocompatibility; we also observed an additional stabilization of the underlying CNS tissue. We demonstrate the electrode's versatility in combination with in vivo 2P imaging by monitoring anesthesia-awake cycles of transgenic mice with GCaMP3 expression in neurons or astrocytes. Cortical stimulation and simultaneous 2P Ca2+ imaging in neurons or astrocytes highlighted the astrocytes' integrative character in neuronal activity processing. Furthermore, we confirmed that spontaneous astroglial Ca2+ signals are dampened under anesthesia, while evoked signals in neurons and astrocytes showed stronger dependency on stimulation intensity than on various levels of anesthesia. Finally, we show that the electrodes provide recordings of the electrocorticogram (ECoG) with a high signal-to-noise ratio and spatial signal differences, which help to decipher brain activity states during experimental procedures. In summary, the novel LCP surface micro-electrode is a versatile, convenient, and reliable tool to investigate brain function in vivo.
Intraspecific diet specialization, usually driven by resource availability, competition and predation, is common in natural populations. However, the role of parasites in the diet specialization of their hosts has rarely been studied. Eye flukes can impair their hosts’ vision and have been associated with alterations of fish feeding behavior. Here, we assessed whether European perch (Perca fluviatilis) alter their diet composition as a consequence of infection with eye flukes. Young-of-the-year (YOY) perch from temperate Lake Müggelsee (Berlin, Germany) were sampled in two years, eye flukes were counted, and fish diet was evaluated using both stomach content and stable isotope analyses. Perch diet was dominated by zooplankton and benthic macroinvertebrates. Both methods indicated that fish with higher eye fluke infection intensity had a more selective diet, feeding mainly on the benthic macroinvertebrate Dikerogammarus villosus, while less intensively infected fish appeared to be generalist feeders showing no preference for any particular prey type. Our results show that infection with eye flukes can indirectly affect the host’s interactions with lower trophic levels by altering diet composition, and they highlight the underestimated role of parasites in food web studies.
Modeling and executing knowledge-intensive processes (KiPs) are challenging with state-of-the-art approaches, and the specific demands of KiPs are the subject of ongoing research. In this context, little attention has been paid to the ontology-driven combination of data-centric and semantic business process modeling, which finds additional motivation by enabling the division of labor between humans and artificial intelligence. Such approaches have characteristics that could support KiPs based on the inferencing capabilities of reasoners. We confirm this by showing that reasoners can infer the executability of tasks based on a currently researched ontology- and data-driven business process model (ODD-BP model). Further support for KiPs by the proposed inference mechanism results from its ability to infer the relevance of tasks, depending on the extent to which their execution would contribute to process progress. Besides these contributions along the execution perspective (start-to-end direction), we also show how our approach can help to reach specific process goals by inferring the relevance of process elements with regard to their support in achieving such goals (end-to-start direction). The elements contributing the most valuable process progress can be identified in the intersection of the execution and the goal perspective. This paper introduces this new approach and verifies its practicability with an evaluation of a KiP in the field of emergency call centers.
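The executability inference described above can be sketched in plain Python rather than with an OWL reasoner; the ODD-BP model itself operates on an ontology, and all task and data-object names below are invented for illustration:

```python
# Minimal sketch: a task is inferred executable once all the data objects
# it requires are present. Names are hypothetical, loosely inspired by the
# emergency call center evaluation scenario.

tasks = {
    "assess_emergency": {"requires": {"caller_location", "incident_type"}},
    "dispatch_unit":    {"requires": {"caller_location", "unit_available"}},
}

def executable(task, available_data):
    """Infer executability: all required data objects must be available."""
    return tasks[task]["requires"] <= available_data

available = {"caller_location", "incident_type"}
print([t for t in tasks if executable(t, available)])  # → ['assess_emergency']
```

A reasoner generalizes this set-inclusion check: the same subsumption test is performed over ontology classes, so newly asserted data can make further tasks executable without any procedural code.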
Purpose: Grounded in the theoretical concepts of utilitarianism and deontology, this paper aims to evaluate the issue of child labour from an ethics perspective. By linking utilitarianism with normative stakeholder theory, relevant stakeholder groups are identified in order to examine their influence on and role in the occurrence of child labour, allowing for a practical reference. The findings may serve companies in particular as a basis for decision-making in the development of their value chains.
Design/Methodology/Approach: The author conducts a literature review in order to analyze the findings of existing research on child labour in an ethics context, drawing on literature indexed in Web of Science and Google Scholar and employing forward and backward citation analysis.
Findings: The investigation of child labour in terms of ethics yields conflicting results. From a deontological perspective, child labour can never be ethical and should always be rejected, as it cannot be willed to become a universal law. In contrast, from a utilitarian perspective, child labour is ethically justifiable as long as the beneficiaries of the labour are greater in number than the children working or suffering.
Originality/Value: The examination of child labour from the perspective of deontology and utilitarianism in conjunction with normative stakeholder theory constitutes a novelty in the ethics literature. The integration of theoretical findings into a practical business context provides additional value for managers and global supply chain managers.
Objective: The article highlights the significance of culture in the entrepreneurial landscape and provides entrepreneurs and (project) managers with a guidance tool to overcome previously unconsidered stumbling blocks while operating in an intercultural setting.
Research Design & Methods: The article is based on a critical review of existing approaches to intercultural impact in business life, using the archival technique and covering the period 1990–2020. The review identifies gaps in the existing literature on the subcultural business environment and addresses them by designing an appropriate model to bypass, where possible, the apparent pitfalls of intercultural business communication and coexistence.
Findings: Culture impacts diverse parts of society and business, including entrepreneurship. This article identifies the pitfalls that should be considered when encountering an intercultural, entrepreneurship-driven workplace.
Implications & Recommendations: Based on the study review, startups as well as large corporations running projects of a creational nature are advised to reconsider their perception and handling of culture by applying The Building of Cultural and Entrepreneurial Force.
Contribution & Value Added: The added value of this article lies in its solid analysis of cultural essentialism and anti-essentialism, and of the implications to beware of in the managerial and entrepreneurial context, related to The Building of Intercultural and Entrepreneurial Force, which intends to ease the collaboration of intercultural teams.
This text explains what role “green bonds” play in financing projects and how the green factor is weighted. It discusses how the label “green” can change the price of a bond, whether there is a “green premium”, and for which groups of investors this type of bond is interesting. We discuss ways issuers can reduce their cost of capital, also considering the risks, and ways to improve their conditions. The sustainable and eco-friendly aspects, which might become crucial in future investing, are also highlighted and give the bond an interesting role.
The dark side of Samsung’s value chain: The human costs of cobalt mining “BLOOD, SWEAT AND COBALT”
(2021)
Samsung has been implicated in human rights abuses and wider social downgrading propagated within the Democratic Republic of Congo (DRC). Several studies have shown artisanal cobalt mines (ASM) to exploit child labour and to subject workers to perilous conditions. The IT multinational depends on Congolese cobalt as a key element of the lithium-ion batteries used to produce its array of electronics. However, irresponsible cobalt sourcing practices undertaken by the Tier 1 suppliers Glencore and Huayou have resulted in ASM operations being incorporated into Samsung’s global value chain as Tier 2 suppliers. Analysis of the relationships underpinning Samsung’s cobalt value chain within a theoretical framework highlights the presence of a relational governance structure with captive elements among upstream Tier 1 and Tier 2 suppliers. Samsung is thereby reliant upon both Glencore and Huayou to transmit and enforce private codes of conduct down the value chain in order to eliminate human rights abuses. In conjunction, the DRC’s weak and unstable institutional environment has facilitated corruption and the improper enforcement of laws across the ASM industry. It is therefore imperative that Samsung take ownership of the issues present within its value chain, as both the Tier 1 suppliers and the Congolese government have failed to ensure responsible cobalt sourcing practices to date. This report recommends that Samsung adopt a holistic action plan, utilising not only its own resources and capabilities but also those of critical stakeholders, including Tier 1 suppliers, NGOs, and the DRC and South Korean governments. Most prominently, this report suggests that supply chain transparency can be improved using certificates of origin and blockchain technology.
Furthermore, it is recommended that poverty alleviation be targeted as a key measure through “Cobalt for Development”, an action plan designed to instigate both social and economic upgrading within ASM operations and the wider community. By employing a multi-scalar approach and addressing the issues inherent across multiple governance levels, Samsung can ensure that a responsible supply of cobalt is sustained.
Reasons and potential solution approaches for the shortage of nursing staff in German hospitals
(2021)
The aim of this paper was to identify the reasons for the shortage of nursing staff in German hospitals and to provide potential solution approaches. Over recent years, the shortage of nursing staff has become an increasingly important topic in the news, not only due to the growing number of unfilled nursing positions, but also due to Germany’s ageing population, which leads to a growing number of patients in German hospitals. To reach this aim, two surveys were conducted: one for nursing staff only and one for people from all occupational groups, intended to create comparative values. The surveys were carried out from March to April 2019 and analysed afterwards. The detailed analysis of the survey results shows that the reasons for the shortage of nursing staff in German hospitals are very diverse: low salaries, working conditions in need of improvement (for example, shift work and high physical and psychological stress), the difficult compatibility of family and job, and the unattractive image of nursing as a profession in society. It can be concluded that solving the shortage of nursing staff is very difficult. The future will show whether governmental support will help to make nursing more attractive, not only for current nurses but also for potential future nurses.
Aim: The aim of this paper was to examine important trends and developments influencing nursing care in order to forecast future opportunities and challenges and to determine how to deal with them in the best possible way.
Background: The coronavirus pandemic demonstrated the importance of nursing care around the world and drew attention to the need for sufficient, well-trained nursing staff. Nursing care will face opportunities and challenges due to current trends and developments, which it is important to examine in order to provide the best possible care.
Methods: To reach the above-mentioned aim, intensive research was conducted using secondary sources and surveys.
Results/Findings: A detailed analysis of the research shows that three important topics influence nursing care. First, demographic development: increasing life expectancy leads to a growing number of older people in need of care, while decreasing birth rates lead to fewer working people. Second, cultural transformation and diversity imply many opportunities, because the labour market can fill gaps with foreign workers and immigration can compensate for decreasing birth rates; nevertheless, this also implies challenges and potential problems that need to be solved by society and by the immigrants themselves. Furthermore, changing gender roles may lead to more men becoming nurses, which could have a significant impact on the staff shortage. The third important topic is technological trends, which can help to decrease physical stress by facilitating the nurses’ work and by taking over some of it.
Conclusion: It can be concluded that the trends and developments influencing nursing care are very diverse and imply many different opportunities as well as challenges.
E-commerce has been growing rapidly worldwide since the beginning of the 21st century, and parcel shipping has become a booming business as a result. However, handling the many hard-to-solve sustainability issues of transport in urban areas is becoming a serious challenge for the urban logistics sector and its numerous stakeholders. These sustainability issues include air pollution, congestion, and the situation of subcontractors. This paper reports on these issues in the context of e-commerce growth and analyses their effects on sustainable urban logistics development.
Construction projects are usually complex undertakings. They are managed with the help of project management and its procedures, processes, and techniques. Nevertheless, German construction projects are frequently affected by cost and schedule overruns. The aim of this work is to identify possible fields of optimization in a company’s planning and control process for industrial construction projects and to develop improvement approaches based on them. To pursue these goals, qualitative social research was conducted by means of expert interviews. The expert statements reveal optimization potential in both the planning and the control process. Selected techniques (mainly from classical project management) serve to increase the effectiveness and efficiency of the planning process. Within the control process, it becomes apparent that many areas of optimization in control are rooted in deficits in planning.
A radar operating in the 24 GHz ISM band in frequency-modulated continuous-wave (FMCW) mode monitors the major vital signs, heartbeat and respiration rate. The observation is contactless, with the patient sitting upright at a distance of 1–2 m from the radar. The radar and the sampling platform are components developed in-house at the university institution. Communication with the radar is handled with MATLAB via TCP/IP, and the signal processing and real-time visualization are developed in MATLAB as well. The cornerstones of this publication are the wavelet packet transformation and a spectral frequency estimation for vital sign calculation. The wavelet transformation allows fine tuning of frequency subspaces, separating the heartbeat signal from the respiration and, more importantly, from noise and other movement. Heartbeat and respiration are monitored independently and compared to ECG data recorded in parallel.
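The separation of respiration and heartbeat into different frequency subspaces can be illustrated with a simplified spectral sketch. The publication itself uses a wavelet packet transformation in MATLAB; the Python snippet below uses invented sampling parameters and vital sign frequencies and only demonstrates the underlying band-separation idea:

```python
import numpy as np

fs = 100.0                    # sampling rate in Hz (illustrative)
t = np.arange(0, 40, 1 / fs)  # 40 s observation window
# Synthetic displacement signal: respiration (0.25 Hz) dominates heartbeat (1.2 Hz)
x = 5.0 * np.sin(2 * np.pi * 0.25 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)

def peak_frequency(sig, fs, f_lo, f_hi):
    """Estimate the dominant frequency inside the band [f_lo, f_hi]."""
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(len(sig), 1 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return freqs[band][np.argmax(spec[band])]

resp_hz = peak_frequency(x, fs, 0.1, 0.7)   # respiration subspace
heart_hz = peak_frequency(x, fs, 0.8, 3.0)  # heartbeat subspace
print(round(resp_hz * 60), round(heart_hz * 60))  # breaths/min, beats/min
```

A wavelet packet decomposition refines this idea: instead of one global spectrum, the signal is split recursively into narrow subbands, so the weak heartbeat component can be isolated from respiration, noise, and body movement in the time domain as well.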
Radar signal processing is a promising tool for vital sign monitoring. Contactless observation of breathing and heart rate requires a precise measurement of the distance between the radar antenna and the patient’s skin, which results in the need to detect small movements in the range of 0.5 mm and below. Such small changes in distance are hard to measure with a limited radar bandwidth when relying on frequency-based range detection alone. To enhance the relative distance resolution, a precise measurement of the observed signal’s phase is required. Due to radar reflections from surfaces in close proximity to the main area of interest, the desired reflection signal can become superposed. For superposed signals with little separation in the frequency domain, the main lobes of their discrete Fourier transform (DFT) merge into a single lobe, so that their peaks cannot be differentiated. This paper evaluates a method for reconstructing the phase and amplitude of such superimposed signals.
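The merging of DFT main lobes described above can be reproduced numerically. In the sketch below (frequencies, amplitudes, and window lengths are illustrative, not the paper's measurement parameters), two reflections only 0.4 Hz apart appear as a single spectral lobe when the observation window provides just 1 Hz of DFT resolution:

```python
import numpy as np

def resolvable_peaks(duration_s, fs=1000.0):
    """Count spectral peaks of two reflections 0.4 Hz apart for a given window."""
    n = int(duration_s * fs)
    t = np.arange(n) / fs
    # Two superposed reflections at 50.0 Hz and 50.4 Hz (illustrative values)
    x = np.cos(2 * np.pi * 50.0 * t) + 0.8 * np.cos(2 * np.pi * 50.4 * t)
    spec = np.abs(np.fft.rfft(x))
    thr = 0.3 * spec.max()  # ignore sidelobes well below the main lobe
    return [k for k in range(1, len(spec) - 1)
            if spec[k] > thr and spec[k] > spec[k - 1] and spec[k] > spec[k + 1]]

print(len(resolvable_peaks(1.0)))   # 1: main lobes merge at 1 Hz resolution
print(len(resolvable_peaks(10.0)))  # 2: 0.1 Hz resolution separates the peaks
```

This is why amplitude and phase of the superimposed components cannot simply be read off the merged DFT bin and instead have to be reconstructed by a dedicated method, as the paper evaluates.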
Multimodal meaning making: The annotation of nonverbal elements in multimodal corpus transcription
(2021)
The article discusses how to integrate annotation for nonverbal elements (NVE) from multimodal raw data as part of a standardized corpus transcription. We argue that it is essential to include multimodal elements when investigating conversational data, and that in order to integrate these elements, a structured approach to complex multimodal data is needed. We discuss how to formulate a structured corpus-suitable standard syntax and taxonomy for nonverbal features such as gesture, facial expressions, and physical stance, and how to integrate it in a corpus. Using corpus examples, the article describes the development of a robust annotation system for spoken language in the corpus of Video-mediated English as a Lingua Franca Conversations (ViMELF 2018) and illustrates how the system can be used for the study of spoken discourse. The system takes into account previous research on multimodality, transcribes salient nonverbal features in a concise manner, and uses a standard syntax. While such an approach introduces a degree of subjectivity through the criteria of salience and conciseness, the system also offers considerable advantages: it is versatile and adaptable, flexible enough to work with a wide range of multimodal data, and it allows both quantitative and qualitative research on the pragmatics of interaction.
Vibroarthrography measures joint sounds caused by the sliding of the joint surfaces over each other, which can be affected by joint health, load and type of movement. Since both warm-up and muscle fatigue lead to local changes in the knee joint (e.g., temperature increase, lubrication of the joint, muscle activation), these may impact knee joint sounds. Therefore, this study investigates the effects of warm-up and muscle fatiguing exercise on knee joint sounds during an activity of daily living. Seventeen healthy, physically active volunteers (25.7 ± 2 years, 7 males) performed a control and an intervention session with a wash-out phase of one week. The control session consisted of sitting on a chair, while the intervention session contained a warm-up (walking on a treadmill) followed by a fatiguing exercise (modified sit-to-stand) protocol. Knee sounds were recorded by vibroarthrography (at the medial tibia plateau and at the patella) at three time points in each session during a sit-to-stand movement. The primary outcome was the mean signal amplitude (MSA, dB). Differences between sessions were determined by repeated measures ANOVA with intra-individual pre-post differences for the warm-up and for the muscle fatigue effect. We found a significant difference for MSA at the medial tibia plateau (intervention: mean 1.51 dB, SD 2.51 dB; control: mean −1.28 dB, SD 2.61 dB; F = 9.5; p = .007; η2 = .37) during extension (from sit to stand) after the warm-up. There was no significant difference for any parameter after the muscle fatiguing exercise (p > .05). The increase in MSA may mostly be explained by an increase in internal knee load and joint friction; however, neuromuscular changes may also have played a role. It appears that muscle fatiguing exercise has no impact on knee joint sounds in young, active, symptom-free participants during sit-to-stand.
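The primary outcome, MSA in dB, lends itself to a small numerical sketch. The abstract does not spell out the exact computation, so the definition below (mean absolute signal amplitude converted to decibels) is an assumption for illustration only:

```python
import numpy as np

def mean_signal_amplitude_db(x, eps=1e-12):
    """Illustrative MSA: mean absolute amplitude in dB.
    Assumed definition; the study's exact reference amplitude is not given."""
    return 20 * np.log10(np.mean(np.abs(x)) + eps)

rng = np.random.default_rng(0)
quiet = 0.01 * rng.standard_normal(5000)  # low-amplitude joint sound
loud = 0.02 * rng.standard_normal(5000)   # e.g., higher internal knee load
diff = mean_signal_amplitude_db(loud) - mean_signal_amplitude_db(quiet)
print(f"{diff:.1f} dB")  # roughly 6 dB: doubling the amplitude adds ~6 dB
```

On this scale, the reported intervention-versus-control difference of roughly 2.8 dB in mean MSA corresponds to a moderate change in average signal amplitude rather than a doubling.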
During the low-interest-rate phase that has persisted since 2009, the share of capital income in national income declined. Since the share of the labour factor did not change substantially, the production factor land gained in importance. This is not reflected in the national accounts, however, because land is not reported separately there. With the help of the Henry George theorem, an attempt is made to develop a method that quantifies the share of the factor land. The explicit reporting of land income in the national accounts could help to correct the role of this neglected factor.