In this paper, the radio-frequency (RF) behavior of mechanically stressed coaxial and, for the first time, also twisted-pair transmission lines is investigated over their service life. The main goal is to enable predictive maintenance for cables in moving applications and to avoid preventive replacement, which also reduces the use of high-cost resources. For this purpose, stranded and solid-core variants of coaxial and twisted-pair cables are mechanically loaded on the two-pulley apparatus according to EN 50396. Their RF transmission (S21) behavior is measured using a vector network analyzer and presented over bending cycles. For the first time, the phase response of mechanically loaded transmission lines is evaluated with respect to their service life. Two significant causes of the increasing attenuation and altered phase response are identified: breakage in the foil screen and increasing surface roughness of the copper conductors. The identified causes are supported with literature evidence. Through measurements and theoretical calculations, it is shown that the phase is much more suitable than the amplitude for assessing the remaining service life. The findings can be used to implement a cable monitoring system in industrial environments that monitors the lines in situ and reminds the user to replace them whenever a certain wear level is reached.
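The two candidate wear indicators compared in this abstract can be sketched in a few lines: given complex S21 samples from a vector network analyzer, the magnitude in dB gives the attenuation and the unwrapped argument gives the phase response. The function name and the synthetic lossless-line data below are illustrative assumptions, not the paper's measurement setup.

```python
import numpy as np

def s21_indicators(s21: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Return attenuation in dB and unwrapped phase in radians for S21 samples."""
    attenuation_db = 20.0 * np.log10(np.abs(s21))   # amplitude indicator
    phase_rad = np.unwrap(np.angle(s21))            # phase indicator, jump-free
    return attenuation_db, phase_rad

# Toy example: an ideal lossless line shows 0 dB attenuation but a
# steadily decreasing phase across the sweep.
s21 = np.exp(-1j * np.linspace(0.0, 4.0 * np.pi, 8))
att, ph = s21_indicators(s21)
```

Unwrapping matters here: the raw `np.angle` output wraps into (−π, π], so phase drift over a cable's life would otherwise appear as discontinuous jumps.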
Research in global change ecology relies heavily on global climatic grids derived from estimates of air temperature in open areas at around 2 m above the ground. These climatic grids do not reflect conditions below vegetation canopies and near the ground surface, where critical ecosystem functions occur and most terrestrial species reside. Here, we provide global maps of soil temperature and bioclimatic variables at a 1-km² resolution for 0–5 and 5–15 cm soil depth. These maps were created by calculating the difference (i.e. offset) between in situ soil temperature measurements, based on time series from over 1200 1-km² pixels (summarized from 8519 unique temperature sensors) across all the world's major terrestrial biomes, and coarse-grained air temperature estimates from ERA5-Land (an atmospheric reanalysis by the European Centre for Medium-Range Weather Forecasts). We show that mean annual soil temperature differs markedly from the corresponding gridded air temperature, by up to 10°C (mean = 3.0 ± 2.1°C), with substantial variation across biomes and seasons. Over the year, soils in cold and/or dry biomes are substantially warmer (+3.6 ± 2.3°C) than gridded air temperature, whereas soils in warm and humid environments are on average slightly cooler (−0.7 ± 2.3°C). The observed substantial and biome-specific offsets emphasize that the projected impacts of climate and climate change on near-surface biodiversity and ecosystem functioning are inaccurately assessed when air rather than soil temperature is used, especially in cold environments. The global soil-related bioclimatic variables provided here are an important step forward for any application in ecology and related disciplines. Nevertheless, we highlight the need to fill remaining geographic gaps by collecting more in situ measurements of microclimate conditions to further enhance the spatiotemporal resolution of global soil temperature products for ecological applications.
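The offset calculation described above can be sketched for a single pixel: subtract the gridded air temperature from the in situ soil temperature and average over time. The variable names and toy seasonal means below are illustrative assumptions, not the study's data.

```python
import numpy as np

def soil_air_offset(soil_t: np.ndarray, air_t: np.ndarray) -> float:
    """Mean offset (soil minus air temperature) for one pixel, in degC."""
    return float(np.mean(soil_t - air_t))

# Toy example: a cold-biome pixel where, e.g., snow insulation keeps
# the soil warmer than the gridded air temperature in winter.
soil = np.array([-2.0, -1.0, 5.0, 12.0])   # seasonal soil means, degC
air  = np.array([-8.0, -5.0, 4.0, 11.0])   # gridded air means, degC
offset = soil_air_offset(soil, air)        # positive: soil warmer than air
```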
Gait analysis is the systematic study of human movement. Combining wearable foot pressure sensors with machine learning (ML) solutions for high-fidelity body pose tracking from RGB video frames could reveal more insights into gait abnormalities. However, accurate detection of heel strike (HS) and toe-off (TO) events is crucial to compute interpretable gait parameters. In this work, we present an experimental platform to study the timing of gait events using a new wearable foot pressure sensor (ActiSense System, IEE S.A., Luxembourg) and Google's open-source ML solution MediaPipe Pose. For this purpose, two StereoPi systems were built to capture stereoscopic videos and images in real time. MediaPipe Pose was applied to the synchronized StereoPi cameras, and two algorithms (ALs) were developed to detect HS and TO events for gait analysis. Preliminary results from a healthy subject walking on a treadmill show a mean relative deviation across all time spans of less than 4% for the ActiSense device and less than 16% for AL2 (33% for AL1) employing MediaPipe Pose on StereoPi videos. Finally, this work offers a platform for the development of sensor- and video-based ALs to automatically identify the timing of gait events in healthy individuals and those with gait disorders.
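As a rough illustration of how a video-based gait-event detector of this kind might work (a sketch only; the paper's AL1 and AL2 are not specified here), local minima of the heel keypoint's vertical trajectory can be treated as heel-strike candidates:

```python
import numpy as np

def detect_heel_strikes(heel_y: np.ndarray) -> list[int]:
    """Return frame indices where the heel height reaches a local minimum.

    heel_y: per-frame vertical coordinate of the heel landmark
    (e.g., as produced by a pose estimator such as MediaPipe Pose).
    """
    events = []
    for i in range(1, len(heel_y) - 1):
        if heel_y[i] < heel_y[i - 1] and heel_y[i] <= heel_y[i + 1]:
            events.append(i)
    return events

# Toy trajectory covering two gait cycles, with minima at frames 2 and 6
y = np.array([5.0, 3.0, 1.0, 3.0, 5.0, 3.0, 1.0, 3.0])
hs_frames = detect_heel_strikes(y)  # → [2, 6]
```

A practical detector would additionally smooth the trajectory and enforce a minimum spacing between events; both are omitted here for brevity.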
Since tangible assets of companies are becoming increasingly insignificant, emphasis should rather be placed on human capital as an essential source of competitive edge. This paper accordingly pursues the purpose of shedding light on the major demands that Millennials place on their prospective employers. In consequence, the work aims to identify attractiveness factors that German retailers should particularly promote in order to succeed in the war for talent and attract the most promising candidates among the German Gen Y. This work is based on a mixed-methods approach. First, interviews with German retail experts as well as generational keynote speakers were conducted in order to obtain a deep understanding and assessment of the German retail landscape from a professional perspective. The insights gained were subsequently used to design a questionnaire, the distribution of which led to a final sample of 216 usable responses by Millennials. Furthermore, the data obtained from the expert interviews and the survey were compared in order to evaluate to what extent the expectations of the Millennials correspond to the experts' assessment. This study reveals Millennials to be driven by growth needs, such as a wide offer of development opportunities or scope for decision-making, when choosing an employer. Among the relatedness needs, a harmonious working environment is particularly important, whereas a weekend off ranks first among the existential needs. Moreover, male Millennials consider Media Markt to be the most popular employer in the German retail sector, while dm is preferred from a female perspective. Overall, employers in the German retail sector provide the majority of the factors required by Millennials, yet the sector is only considered the fourth most popular industry, behind the automotive, IT, and art and entertainment industries.
Our findings provide valuable practical implications, as the research results might serve companies in building up a target-group-specific employer brand. Marketing strategies can be aligned with the identified attractiveness factors to efficiently and cost-effectively attract Millennials and bind them to the company. Customized recruiting campaigns enhance the appeal and attractiveness of an employer, driving the likelihood of obtaining the coveted status of Employer of Choice. To the best of the author's knowledge, no study has yet dealt specifically with the attractiveness factors demanded by Millennials in the context of the German retail sector, or with their most aspired-to employers in this industry. Furthermore, the attractiveness factors identified in the literature were embedded in Alderfer's ERG theory. This work also offers a bilateral perspective through the widely conducted survey carried out among Millennials, which was additionally expanded through the lens of experts.
Purpose: Grounded in the theoretical concepts of utilitarianism and deontology, this paper aims to evaluate the issue of child labour from an ethics perspective. By linking utilitarianism with normative stakeholder theory, relevant stakeholder groups are identified in order to examine their influence on, and role in, the occurrence of child labour, allowing for a practical reference. The findings may serve companies in particular as a basis for decision-making in the development of their value chains.
Design/Methodology/Approach: The author uses a literature review to analyze the findings of existing literature on the topic of child labour in an ethics context, drawing on literature indexed in Web of Science and Google Scholar and employing forward and backward citation analysis.
Findings: The investigation of child labour in terms of ethics yields conflicting results. From a deontological perspective, child labour can never be ethical and should always be rejected, as it cannot be willed to become a universal law. In contrast, according to a utilitarian sentiment, child labour is ethically justifiable as long as the beneficiaries of the labour are greater in number than the children working or suffering.
Originality/Value: The examination of child labour from the perspective of deontology and utilitarianism in conjunction with normative stakeholder theory constitutes a novelty in the ethics literature. The integration of theoretical findings into a practical business context provides additional value for managers and global supply chain managers.
The dark side of Samsung’s value chain: The human costs of cobalt mining “BLOOD, SWEAT AND COBALT”
(2021)
Samsung has been implicated in human rights abuses and wider social downgrading propagated within the Democratic Republic of Congo (DRC). Reports by different studies have shown artisanal cobalt mines (ASM) to exploit child labour and subject workers to perilous conditions. The IT multinational is dependent upon Congolese cobalt as a key element in the lithium-ion batteries used to produce its array of electronics. However, irresponsible cobalt sourcing practices undertaken by the Tier 1 suppliers, Glencore and Huayou, have resulted in ASM operations being incorporated into Samsung's global value chain as Tier 2 suppliers. Analysis of the relationships underpinning Samsung's cobalt value chain within a theoretical framework highlights the presence of a relational governance structure, with captive elements among upstream Tier 1 and Tier 2 suppliers. Samsung is thereby reliant upon both Glencore and Huayou to transmit and enforce private codes of conduct down the value chain to eliminate human rights abuses. In conjunction, the DRC's weak and unstable institutional environment has facilitated corruption and the improper enforcement of laws across the ASM industry. It is thereby imperative that Samsung takes ownership of the issues present within its value chain, as both the Tier 1 suppliers and the Congolese government have failed to ensure responsible cobalt sourcing practices to date. This report recommends that Samsung adopt a holistic action plan, utilising not only its own resources and capabilities but also those of critical stakeholders, including Tier 1 suppliers, NGOs, and the DRC and South Korean governments. Most prominently, this report suggests that supply chain transparency can be improved using certificates of origin and blockchain technology.
Furthermore, it is recommended that poverty alleviation is targeted as a key measure through “Cobalt for Development”, an action plan designed to instigate both social and economic upgrading within ASM operations and the wider community. By employing a multi-scalar approach and addressing the issues inherent across multiple governance levels, Samsung can ensure a responsible source of cobalt be sustained.
Containerization is one of the most important topics for modern data centers and web developers. Since the number of containers on single- and multi-node systems is growing, knowledge about the energy consumption behavior of single web-service containers is essential in order to save energy and, of course, money. In this article, we show how the energy consumption behavior of single containerized web services/web apps changes when replicas of the service are created in order to scale and load-balance it.
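One simple way to reason about per-replica energy when scaling a service is an idle-baseline-plus-dynamic-power model: subtract the host's idle power from the measured total and split the remainder across replicas. This model and the numbers below are illustrative assumptions, not the article's measurement method.

```python
def energy_per_replica(total_power_w: float, idle_power_w: float,
                       replicas: int) -> float:
    """Approximate dynamic power attributable to each replica, in watts.

    Assumes the host's idle power is a shared baseline and the dynamic
    part is split evenly across identical replicas (a simplification).
    """
    if replicas < 1:
        raise ValueError("need at least one replica")
    return (total_power_w - idle_power_w) / replicas

# Example: 50 W measured total, 20 W idle baseline, 3 replicas
share = energy_per_replica(total_power_w=50.0, idle_power_w=20.0, replicas=3)
# → 10.0 W of dynamic power per replica
```

In practice the split is rarely even (replicas share caches, the kernel, and the load balancer itself), which is precisely the kind of behavior the measurements in the article characterize.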
Context: In the framework of studying cosmic microwave background polarization and characterizing its Galactic foregrounds, the angular power spectrum analysis of the thermal dust polarization map has led to intriguing evidence of an E/B asymmetry and a positive TE correlation. The interpretation of these observations is the subject of theoretical and simulation-driven studies in which the correlation between the density structure of the interstellar medium (ISM) and the magnetic field appears to be a key aspect. In this context, and when the magnetized ISM structures are modeled in three dimensions, dust clouds are generally considered to be filamentary structures only, but both filamentary and sheet-like shapes are supported by observational and theoretical evidence.
Aims: We aim to study the influence of the cloud shape and its connection to the local magnetic field, as well as the influence from the viewing angle, on the angular power spectra measured on thermal dust polarization maps; we specifically focus on the dependence of the E/B power asymmetry and TE correlation.
Methods: To this end, we simulated realistic interstellar clouds with both filament-like and sheet-like shapes using the software ASTERION, which also allowed us to generate synthetic maps of thermal dust polarized emission with an area of 400 square degrees. Then, we computed their polarization power spectra in the multipole range ℓ ∈ [100, 500] and focused on the E/B power asymmetry, quantified through the ℛEB ratio, and the correlation coefficient rTE between T and E modes. We quantified the dependence of the ℛEB and rTE values on the offset angle (between the longest cloud axis and local magnetic field lines) and the inclination angle (between the line of sight and the magnetic field) for both types of cloud shapes, either embedded in a regular magnetic field or coupled to a nonregular field to mimic turbulence.
Results: We find that both types of cloud shapes cover the same regions of the (ℛEB, rTE) parameter space. The dependence on the inclination and offset angles is similar for both shapes, although sheet-like structures generally show larger scatter than filamentary structures. In addition to the known dependence on the offset angle, we find a strong dependence of ℛEB and rTE on the inclination angle.
Conclusions: The very fact that filament-like and sheet-like structures may lead to polarization power spectra with similar (ℛEB,rTE) values complicates their interpretation. We argue that interpreting them solely in terms of filament characteristics is risky, and in future analyses, this degeneracy should be accounted for, as should the connection to the magnetic field geometry. Our results based on maps of 400 square degrees clarify that the overall geometrical arrangement of the magnetized ISM surrounding the observer leaves its marks on polarization power spectra.
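The two spectral diagnostics used throughout this study follow standard definitions and can be sketched directly from binned power spectra: the E/B asymmetry is the ratio of band-averaged EE to BB power, and the TE correlation normalizes the TE cross-spectrum by the TT and EE auto-spectra. The input numbers below are made up for illustration.

```python
import numpy as np

def r_eb(cl_ee: np.ndarray, cl_bb: np.ndarray) -> float:
    """E/B power asymmetry: band-averaged EE power over BB power."""
    return float(np.mean(cl_ee) / np.mean(cl_bb))

def r_te(cl_te: np.ndarray, cl_tt: np.ndarray, cl_ee: np.ndarray) -> float:
    """Band-averaged TE correlation coefficient."""
    return float(np.mean(cl_te) / np.sqrt(np.mean(cl_tt) * np.mean(cl_ee)))

# Hypothetical binned spectra over some multipole band
cl_tt = np.array([4.0, 4.0])
cl_ee = np.array([2.0, 2.0])
cl_bb = np.array([1.0, 1.0])
cl_te = np.array([1.4, 1.4])

asym = r_eb(cl_ee, cl_bb)           # EE twice the BB power
corr = r_te(cl_te, cl_tt, cl_ee)    # positive TE correlation
```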
For the assessment of human reaction time, a test environment was developed. This system consists of an embedded device with organic light-emitting diode (OLED) displays and push buttons for the combined presentation of visual stimuli and registration of the haptic human reaction. The test leader can define the test sequence with the aid of a graphical user interface (GUI) on a personal computer (PC). The system was validated by measuring the latency times of the whole system, which are determined by the specific hardware and software constellation. By investigating the display's light emission with a photodiode and recording the current consumption, the latency times and their variance were specified. In the fastest mode, the system can reach an error limit of 60 μs.
The photo-Dember effect is a source of impulsive THz emission following femtosecond pulsed optical excitation. This emission results from the ultrafast spatial separation of electron-hole pairs in strong carrier gradients due to their different diffusion coefficients. The associated time-dependent polarization is oriented perpendicular to the excited surface, which is unfavorable for efficient outcoupling of THz radiation. We propose a scheme for generating strong carrier gradients parallel to the excited surface. The resulting photo-Dember currents are oriented in the same direction and emit THz radiation in the favorable direction perpendicular to the surface. This effect is demonstrated for GaAs and In(0.53)Ga(0.47)As. Surprisingly, the photo-Dember THz emitters provide higher bandwidth than photoconductive emitters. Multiplexing of phase-coherent photo-Dember currents by periodically tailoring the photoexcited spatial carrier distribution gives rise to strongly enhanced THz emission, which reaches electric field amplitudes comparable to a high-efficiency externally biased photoconductive emitter.
Globally networked supply chains (SCs) lower companies' costs, but at the same time they increase dependence on suppliers and make the SCs more vulnerable to disruption. In recent years, uncertainties affecting SCs have also risen sharply, driven among other things by Brexit, trade conflicts, and the COVID-19 pandemic. Against this background, companies are increasingly focusing on developing new SC strategies, with strong attention to improving SC resilience in order to reduce, among other things, the risks to the SCs. This work examines the effects of rising uncertainty on the design and resilience of SCs and aims to determine whether changes to the SC are needed in response to these effects, and how resilience can be ensured in the future (trade-off between resilience and cost efficiency). The investigation used qualitative research in the form of expert interviews, since these allow, among other things, specific opinions, justifications, and attitudes of particular individuals on the topics at hand to be elicited. The results show that costs usually remain the decisive aspect and that more transparency, more flexibility, and better risk management are needed. In the future, greater consideration of uncertainties, higher safety stocks, reduced SC complexity, and possibly more local sourcing will also be required. A further investigation into the costs that can arise both from resilience instruments and from a lack of resilience is recommended.
Companies have made considerable progress in assessing the sustainability of their processes and products, including in the information and communication technology (ICT) sector. However, it is surprising how little attention has been given to the sustainability performance of software products. For this article, we chose a case study approach to explore the extent to which software manufacturers have considered sustainability criteria for their products. We selected a manufacturer of sustainability management software on the assumption that it would be more likely to integrate elements of sustainability performance in its products. In the case study, we applied a previously developed set of criteria for sustainable software (SCSS), using a questionnaire and experiments, to assess a web-based sustainability management software product regarding its sustainability performance. The assessment finds that, despite a sustainability-conscious manufacturer, a systematic assessment of the sustainability of software products is missing in the case study. This implies that sustainability assessment for software products is still novel, corresponding knowledge is missing, and suitable tools are not yet widely applied in the industry. The SCSS presents a suitable approach to close this gap, but it requires further refinement, for example regarding its applicability to web-based software on external servers.
Sustainable software products - Towards assessment criteria for resource and energy efficiency
(2018)
Many authors have proposed criteria to assess the “environmental friendliness” or “sustainability” of software products. However, a causal model that links observable properties of a software product to conditions of it being green or (more general) sustainable is still missing. Such a causal model is necessary because software products are intangible goods and, as such, only have indirect effects on the physical world. In particular, software products are not subject to any wear and tear, they can be copied without great effort, and generate no waste or emissions when being disposed of. Viewed in isolation, software seems to be a perfectly sustainable type of product. In real life, however, software products with the same or similar functionality can differ substantially in the burden they place on natural resources, especially if the sequence of released versions and resulting hardware obsolescence is taken into account. In this article, we present a model describing the causal chains from software products to their impacts on natural resources, including energy sources, from a life-cycle perspective. We focus on (i) the demands of software for hardware capacities (local, remote, and in the connecting network) and the resulting hardware energy demand, (ii) the expectations of users regarding such demands and how these affect hardware operating life, and (iii) the autonomy of users in managing their software use with regard to resource efficiency. We propose a hierarchical set of criteria and indicators to assess these impacts. We demonstrate the application of this set of criteria, including the definition of standard usage scenarios for chosen categories of software products. We further discuss the practicability of this type of assessment, its acceptability for several stakeholders and potential consequences for the eco-labeling of software products and sustainable software design.
The number of additive manufacturing methods and materials is growing rapidly, leaving gaps in the knowledge of specific material properties. A relatively recent addition is metal-filled filament, printed similarly to the fused filament fabrication (FFF) technology used for plastic materials but with additional debinding and sintering steps. While the tensile, bending, and shear properties of metals manufactured this way have been studied thoroughly, their fatigue properties remain unexplored. Thus, this paper aims to determine the tensile, fatigue, and impact strengths of Markforged 17-4 PH and BASF Ultrafuse 316L stainless steel to answer whether metal FFF can be used safely for structural parts with the current state of technology. They are compared to two 316L variants manufactured via selective laser melting (SLM) and to literature results. For the extrusion-based additive manufacturing methods, a significant decrease in tensile and fatigue strength is observed compared to specimens manufactured via SLM. Defects created during extrusion and by the pathing scheme, which cause a rough surface and internal voids acting as local stress risers, account for the strength decrease. The findings cast doubt on whether the metal FFF technique can be safely used for structural components; therefore, further developments are needed to reduce internal material defects.
Railroads, roads, rivers, and airways are the most common modes of transportation for people and commodities. The cost of different modes of transportation varies according to distance, luxury, size, fragility, and other factors, and depending on these factors, transportation can become prohibitively expensive for many individuals. A new means of conveyance has been developed: Elon Musk initially proposed it as the fifth mode of transportation in 2012. For commuters and goods, the Hyperloop offers a quick and cost-effective way of transportation. The Hyperloop is essentially a vacuum tube train that transports people or products at very high speeds while operating efficiently. Compared to traditional forms of transportation, the Hyperloop is attractive since it is highly energy-efficient, quiet, and self-contained. Increased cargo delivery speed will be the most evident benefit of this idea to the industry. The Hyperloop also has the potential to make a significant contribution to green supply chains: it is a carbon-free form of transportation that could transform inland freight transportation as well as maritime and air freight transit, and it can move freight above ground, below ground, and underwater. The aim of this paper is to explain this new, innovative technology as a development for logistics concepts.
Background: The STarT-MSK-Tool is an adaptation of the well-established STarT-Back-Tool, used to risk-stratify patients with a wider range of musculoskeletal presentations.
Objective: To formally translate and cross-culturally adapt the Keele STarT-MSK risk stratification tool into German (STarT-MSKG) and to establish its reliability and validity.
Methods: A formal, multi-step, forward and backward translation approach was used. To assess validity, patients aged ≥ 18 years with acute, subacute or chronic musculoskeletal presentations in the lumbar spine, hip, knee, shoulder, or neck were included. A prospective cohort design was used, with initial data collected electronically at the point of consultation. Retest and 6-month follow-up questionnaires were sent by email. Test-retest reliability, construct validity, discriminative ability, predictive ability and floor or ceiling effects were analysed using the intraclass correlation coefficient (ICC) and comparisons with a reference standard (Örebro Musculoskeletal Pain Questionnaire, OMPQ), using correlations, ROC curves and regression models.
Results: The participants’ (n = 287) mean age was 47 (SD = 15.8) years, 51% were female, with 48.8% at low, 43.6% at medium, and 7.7% at high risk. Test-retest reliability was good, with ICC = 0.75 (95% CI 0.69, 0.81). Construct validity was good, with a correlation of the STarT-MSKG-Tool against the OMPQ of rs = 0.74 (95% CI 0.68, 0.79). The ability of the tool [comparison OMPQ] to predict 6-month pain and disability was acceptable, with AUC = 0.77 (95% CI 0.71, 0.83) [OMPQ = 0.74] and 0.76 (95% CI 0.69, 0.82) [OMPQ = 0.72] respectively. However, the explained variance (linear/logistic regression) for predicting 6-month pain (linear = 21% [OMPQ = 17%], logistic = 29%) and disability (linear = 20% [OMPQ = 19%], logistic = 26%), whilst comparable to the existing OMPQ reference standard, fell short of the a priori target of ≥ 30%.
Conclusions: The German version of the STarT-MSK-Tool is a valid instrument for use across multiple musculoskeletal conditions and is available for use in clinical practice. Comparison with the OMPQ suggests it is a good alternative.
Stratified care for low back pain (LBP) has been shown to be clinically- and cost-effective in the UK, but its transferability to the German healthcare system is unknown. This study explores LBP patients’ perspectives regarding future implementation of stratified care, through in-depth interviews (n = 12). The STarT-Back-Tool was completed by participants prior to interviews. Interview data were analysed using Grounded Theory. The overarching theme identified from the data was ‘treatment success’, with subthemes of ‘assessment and treatment planning’, ‘acceptance of the questionnaire’ and ‘contextual factors’. Patients identified the underlying cause of pain as being of great importance (whereas STarT-Back allocates treatment based on prognosis). The integration of the STarT-Back-Tool in consultations was considered helpful as long as it does not disrupt the therapeutic relationship, and was acceptable if tool results are handled confidentially. Results indicate that for patients to find STarT-Back acceptable, the shift from a focus on identifying a cause of pain and subsequent diagnosis, to prediction-orientated treatment planning, must be made clear. Patient ‘buy-in’ is important for successful uptake of clinical interventions, and findings can help to inform future strategies for implementing STarT-Back in Germany, as well as having potential implications for transferability to other similar healthcare systems.
Background: The STarT-Back-Approach (STarT: Subgroups for Targeted Treatment) was developed in the UK and has demonstrated clinical and cost effectiveness. Based on the results of a brief questionnaire, patients with low back pain are stratified into three treatment groups. Since the organisation of physiotherapy differs between Germany and the UK, the aim of this study is to explore German physiotherapists’ views and perceptions about implementing the STarT-Back-Approach.
Methods: Three two-hour think-tank workshops with physiotherapists were conducted. Focus groups, using a semi-structured interview guideline, followed a presentation of the STarT-Back-Approach, with discussions audio recorded, transcribed and qualitatively analysed using content analysis.
Results: Nineteen physiotherapists participated (15 female, mean age 41.2 (SD 8.6) years). Three main themes emerged, each with multiple subthemes: 1) the intervention (15 subthemes), 2) the healthcare context (26 subthemes) and 3) individual characteristics (8 subthemes). Therapists’ perceptions of the extent to which the STarT-Back intervention would require changes to their normal clinical practice varied considerably. They felt that within their current healthcare context, there were significant financial disincentives that would discourage German physiotherapists from providing the STarT-Back treatment pathways, such as the early discharge of low-risk patients with supported self-management materials. They also discussed the need for appropriate standardised graduate and post-graduate skills training for German physiotherapists to treat high-risk patients with a combined physical and psychological approach (e.g., communication skills).
Conclusions: Whilst many German physiotherapists are positive about the STarT-Back-Approach, there are a number of substantial barriers to implementing the matched treatment pathways in Germany. These include financial disincentives within the healthcare system to early discharge of low-risk patients. Therapists also highlighted the need for solutions in respect of scalable physiotherapy training to gain skills in combined physical and psychological approaches.
Background: Stratified care is an up-to-date treatment approach suggested for patients with back pain in several guidelines. A comprehensively studied stratification instrument is the STarT Back Tool (SBT). It was developed to stratify patients with back pain into three subgroups, according to their risk of persistent disabling symptoms. The primary aim was to analyse the disability differences in patients with back pain 12 months after inclusion according to the subgroups determined at baseline using the German version of the SBT (STarT-G). Moreover, the potential to improve prognosis for disability by adding further predictor variables, an analysis for differences in pain intensity according to the STarT-Classification, and discriminative ability were investigated.
Methods: Data from the control group of a randomized controlled trial were analysed. Trial participants were members of a private medical insurance with a minimum age of 18 and indicated as having persistent back pain. Measurements were made for the risk of back pain chronification using the STarT-G, disability (as primary outcome) and back pain intensity with the Chronic Pain Grade Scale (CPGS), health-related quality of life with the SF-12, psychological distress with the Patient Health Questionnaire-4 (PHQ-4) and physical activity. Analysis of variance (ANOVA), multiple linear regression, and area under the curve (AUC) analysis were conducted.
Results: The mean age of the 294 participants was 53.5 (SD 8.7) years, and 38% were female. The ANOVA for disability and pain showed significant differences (p < 0.01) among the risk groups at 12 months. Post hoc Tukey tests revealed significant differences among all three risk groups for every comparison for both outcomes. The AUC for the STarT-G’s ability to discriminate reference standard ‘cases’ for chronic pain status at 12 months was 0.79. A prognostic model including the STarT-Classification and the variables global health and disability at baseline explained 45% of the variance in disability at 12 months.
Conclusions: Disability differences in patients with back pain after a period of 12 months are in accordance with the subgroups determined using the STarT-G at baseline. Results should be confirmed in a study developed with the primary aim to investigate those differences.
The integration of genetic algorithms to optimize the networks of value chains could greatly improve the performance of supply chains. For this reason, this paper describes in more detail the application of genetic algorithms to the value chains of the automotive industry. For this purpose, a theoretical model is built to evaluate whether its application can optimize the value chain. This option is described and analyzed, and its restrictions are shown. Instead of looking at the entire network, individual finished goods and their bills of materials are used as the basis for optimization, which greatly reduces the complexity of the original problem: the supply chain network can thus be considered piecewise on the basis of the bill of materials.
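A toy sketch of the approach described above, assuming a single finished good whose bill of materials lists three components, each sourceable from one of three suppliers. The cost matrix, the minimization of pure sourcing cost, and all GA settings are illustrative assumptions, not the paper's model.

```python
import random

random.seed(42)

# COST[i][j]: cost of sourcing BOM component i from supplier j (assumed data)
COST = [[4.0, 2.0, 5.0],
        [3.0, 6.0, 1.0],
        [2.0, 2.5, 2.2]]

def fitness(genome):
    """Total sourcing cost of one supplier assignment (lower is better)."""
    return sum(COST[i][j] for i, j in enumerate(genome))

def evolve(pop_size=20, generations=60, mutation_rate=0.2):
    """Evolve supplier assignments: elitist selection, one-point crossover,
    random-reset mutation."""
    n_parts, n_sup = len(COST), len(COST[0])
    pop = [[random.randrange(n_sup) for _ in range(n_parts)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # keep the cheaper half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, n_parts)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:   # occasional mutation
                child[random.randrange(n_parts)] = random.randrange(n_sup)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()  # cheapest supplier-per-component assignment found
```

Working per bill of materials keeps the genome as short as the component list, which is exactly the complexity reduction the paper argues for compared with encoding the entire network.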