Driven by falling photovoltaic (PV) installation costs and potential support policies, rooftop PV is expected to expand rapidly in Thailand. As a result, the relevant stakeholders, especially utilities, have concerns about the net economic impacts of high PV adoption. Using a cost–benefit analysis, this study quantifies the net economic impacts of rooftop PV systems on three utilities and on ratepayers in Thailand by applying nine PV adoption scenarios with various buyback rates and annual rates of PV cost reduction. Under Thailand’s current electricity tariff structure, Thai utilities are well protected and able to pass all PV-related costs on to ratepayers through changes in retail rates. We find that when PV adoption is low, the net economic impacts on both the utilities and retail rates are small, and the impact on each utility depends on its specific characteristics. When PV adoption reaches 9–14% on an energy basis, however, five-year retail rate impacts become noticeable, ranging between 6% and 11% of the projected 2036 retail rates depending on the adoption level. Thailand therefore needs to make trade-offs among the stakeholders to maximize the benefits of rooftop PV adoption.
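The pass-through mechanism described above can be sketched as a simple cost-shift calculation: the utility's revenue requirement, adjusted for buyback payments and avoided costs, must be recovered from the remaining (non-PV) sales. The function and all numbers below are hypothetical illustrations, not the study's actual model.

```python
def retail_rate_impact(total_costs, total_sales_kwh, pv_share,
                       buyback_rate, avoided_cost_rate):
    """Relative change in the retail rate when net PV costs are passed
    through to ratepayers (illustrative sketch, not the study's model)."""
    base_rate = total_costs / total_sales_kwh
    pv_kwh = pv_share * total_sales_kwh            # PV energy on an energy basis
    # Utility pays PV owners the buyback rate but avoids some generation cost
    revenue_requirement = (total_costs
                           + pv_kwh * buyback_rate
                           - pv_kwh * avoided_cost_rate)
    # The unchanged requirement is spread over the shrunken (non-PV) sales
    new_rate = revenue_requirement / (total_sales_kwh - pv_kwh)
    return (new_rate - base_rate) / base_rate

# Hypothetical figures: 10 % PV adoption, buyback above avoided cost
impact = retail_rate_impact(100.0, 1000.0, 0.10, 0.12, 0.05)
```

With zero PV adoption the impact is exactly zero, and it grows with the adoption share and with the gap between buyback rate and avoided cost, mirroring the scenario logic described in the abstract.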
Background: As electric kick scooters, three-wheelers, and passenger cars enter the streets, efficiency trade-offs across vehicle types gain practical relevance for consumers and policy makers. Here, we compile a comprehensive dataset of 428 electric vehicles, including seven vehicle types and information on certified and real-world energy consumption. Regression analysis is applied to quantify trade-offs between energy consumption and other vehicle attributes.
Results: Certified and real-world energy consumption of electric vehicles increase by 60% and 40%, respectively, with each doubling of vehicle mass, but only by 5% with each doubling of rated motor power. These findings also hold, roughly, for passenger cars, whose energy consumption tends to increase by 0.6 ± 0.1 kWh/100 km with each 100 kg of vehicle mass. Battery capacity and vehicle mass are closely related. A 10 kWh increase in battery capacity increases the mass of electric cars by 15 kg, their driving range by 40–50 km, and their energy consumption by 0.7–1.0 kWh/100 km. Mass-produced state-of-the-art electric passenger cars are 2.1 ± 0.8 kWh/100 km more efficient than first-generation vehicles produced at small scale.
Conclusion: Efficiency trade-offs in electric vehicles differ from those in conventional cars, where fuel consumption depends strongly on rated engine power. Mass-related efficiency trade-offs in electric vehicles are large and could be tapped by stimulating a mode shift from passenger cars to light electric road vehicles. Electric passenger cars still offer potential for further efficiency improvements, which could be exploited through a dedicated energy label with battery capacity as the utility parameter.
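A "percent change per doubling" of an attribute corresponds to the coefficient of a regression of log-consumption on log-mass and log-power. The sketch below illustrates this on synthetic data; the dataset and noise model are invented, and only the elasticity values (+60 % per mass doubling, +5 % per power doubling) are taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 428                                      # same sample size as the study; data are synthetic
mass = rng.uniform(100, 2500, n)             # vehicle mass in kg (scooters to cars)
power = rng.uniform(1, 300, n)               # rated motor power in kW

# Synthetic consumption built with the elasticities reported in the abstract
cons = 2.0 * (mass / 1000) ** np.log2(1.60) * (power / 100) ** np.log2(1.05)
cons *= np.exp(rng.normal(0, 0.05, n))       # multiplicative measurement noise

# Regress log2(consumption) on log2(mass) and log2(power):
# a coefficient b means consumption scales by 2**b per doubling
X = np.column_stack([np.ones(n), np.log2(mass), np.log2(power)])
beta, *_ = np.linalg.lstsq(X, np.log2(cons), rcond=None)
pct_per_doubling_mass = 2 ** beta[1] - 1     # recovers about 0.60
pct_per_doubling_power = 2 ** beta[2] - 1    # recovers about 0.05
```

The same log-log formulation is a standard way to estimate such elasticities; the study's actual specification and covariates are not reproduced here.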
Electric drive systems are increasingly used in automobiles. However, the combination of comfort, dynamics, and safety requirements places high demands on torque accuracy. The complex interplay of battery, inverter, and electrical machine introduces numerous system uncertainties, arising from parameter fluctuations and measurement errors, that influence system performance. In this paper, these influences on the closed-loop torque control are analyzed and quantified using a variance-based sensitivity analysis. The method makes it possible to attribute the variance of the torque accuracy to the parameter uncertainties causing it. Moreover, it quantifies the influence of each parameter independently of the complexity of the analyzed system. In addition, two methods to ensure convergence of the estimated variance-based sensitivity measures are proposed. The results of the analysis are presented for 19 static operating points of a battery electric drive system.
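A variance-based (Sobol) sensitivity analysis of this kind can be sketched with a Monte Carlo pick-freeze estimator. The torque model below is a deliberately simple stand-in with known analytic indices; the paper's actual drive-system model and its convergence methods are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

def torque_error(x):
    # Toy stand-in for the closed-loop torque model: the torque deviation
    # responds linearly to two uncertain parameters (hypothetical weights).
    return 2.0 * x[:, 0] + 1.0 * x[:, 1]

n, d = 200_000, 2
A = rng.normal(size=(n, d))        # base sample of the uncertain parameters
B = rng.normal(size=(n, d))        # independent second sample
fA, fB = torque_error(A), torque_error(B)
var_y = np.var(np.concatenate([fA, fB]))

S1 = []                            # first-order Sobol indices
for i in range(d):
    AB = A.copy()
    AB[:, i] = B[:, i]             # "pick-freeze": column i taken from B
    S1.append(np.mean(fB * (torque_error(AB) - fA)) / var_y)
# For this linear model the analytic indices are S1 = [0.8, 0.2]:
# each index is the share of output variance explained by one parameter
```

Each index attributes a share of the output variance to one parameter, which is exactly the kind of decomposition the abstract describes for the torque accuracy.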
Since operational managers often monitor large numbers of wind turbines (WTs), they depend on a toolset that provides highly condensed information to identify and prioritize low-performing WTs or to schedule preventive maintenance measures. Power curves are a frequently used tool to assess the performance of WTs. The power curve health value (HV) used in this work is intended to detect power curve anomalies, since small deviations in the power curve are not easy to identify. It evaluates deviations in the linear region of the power curve by performing a principal component analysis: to calculate the HV, the standard deviation in the direction of the second principal component of a reference data set is compared to that of a combined data set consisting of the reference data and the data of the evaluated period. This article examines the applicability of this HV for different purposes as well as its sensitivities, and provides a modified HV approach that makes it more robust and suitable for heterogeneous data sets. The modified HV was tested on ENGIE's open-data wind farm and on data from on- and offshore WTs in the WInD-Pool. It detected anomalies in the linear region of the power curve reliably and sensitively and could also detect long-term power curve degradation. In addition, about 7 % of all corrective maintenance measures were preceded by high HVs, with a median alarm horizon of three days. Overall, the HV proved to be a promising tool for various applications.
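The HV construction described above can be sketched in a few lines: project the wind-speed/power points onto the second principal component of the data cloud and compare the scatter of the combined set to that of the reference set. This is an illustrative reading of the abstract, not the authors' implementation (which includes further modifications for heterogeneous data).

```python
import numpy as np

def second_pc_std(data):
    """Standard deviation of the projections onto the second principal
    component (the direction orthogonal to the main power-curve trend)."""
    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return (centered @ vt[1]).std()

def health_value(reference, evaluated):
    """Sketch of the power-curve HV: scatter orthogonal to the linear
    power-curve region, combined data set vs reference data set."""
    combined = np.vstack([reference, evaluated])
    return second_pc_std(combined) / second_pc_std(reference)
```

A period whose points fall on the reference curve yields an HV near 1, while a derated or anomalous period inflates the orthogonal scatter and pushes the HV well above 1.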
Deep brain stimulation (DBS) is an established therapy for movement disorders such as Parkinson's disease (PD) and essential tremor (ET). Adjusting the stimulation parameters, however, is a labour-intensive process and often requires several patient visits. Physicians therefore prefer objective tools to improve (or maintain) the performance of DBS. Wearable motion sensors (WMS) can detect some manifestations of pathological signs, such as tremor in PD. However, the interpretation of sensor data is often highly technical, and methods to visualise tremor data of patients undergoing DBS in a clinical setting are lacking. This work aims to visualise the dynamics of tremor responses to DBS parameter changes with WMS while patients perform clinical hand movements. To this end, we attended the DBS programming sessions of two patients with the aim of visualising certain aspects of the clinical examination. PD tremor and ET were effectively quantified by acceleration amplitude and frequency. Tremor dynamics were analysed and visualised based on setpoints, movement transitions, and stability aspects. These methods have not been employed before, and the examples demonstrate how tremor dynamics can be visualised with simple analysis techniques. We thereby provide a basis for future research on visualisation tools to assist clinicians who frequently see patients for DBS therapy, which could enhance the evaluation of treatment efficacy in the future.
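Quantifying tremor by acceleration amplitude and frequency, as mentioned above, typically amounts to a spectral analysis of the accelerometer trace. A minimal sketch follows; the function and the band limits are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def tremor_metrics(accel, fs):
    """Estimate the dominant tremor frequency (Hz) and its acceleration
    amplitude from a 1-D sensor trace sampled at fs (illustrative only)."""
    accel = accel - accel.mean()                  # remove DC/gravity offset
    amp = np.abs(np.fft.rfft(accel)) * 2 / len(accel)   # single-sided amplitude
    freqs = np.fft.rfftfreq(len(accel), d=1 / fs)
    band = (freqs >= 3) & (freqs <= 12)           # typical PD/ET tremor band
    peak = np.argmax(amp[band])
    return freqs[band][peak], amp[band][peak]
```

Tracking these two numbers over consecutive recordings is one simple way to visualise setpoints and transitions as stimulation parameters change.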
For the assessment of human reaction time, a test environment was developed. The system consists of an embedded device with organic light-emitting diode (OLED) displays and push buttons for the combined presentation of visual stimuli and registration of the haptic human reaction. The test leader can define the test sequence with the aid of a graphical user interface (GUI) on a personal computer (PC). The system was validated by measuring the latency times of the whole setup, which are determined by the specific hardware and software configuration. By examining the display's light emission with a photodiode and recording the current consumption, the latency times and their variance were determined. In its fastest mode, the system reaches an error limit of 60 μs.
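Determining latency from a photodiode trace, as described above, can be reduced to comparing threshold crossings of the trigger signal and the measured light emission. The sketch below uses a simple relative threshold; real traces would need filtering and debouncing, and the function is an illustrative assumption, not the authors' measurement software.

```python
import numpy as np

def onset_latency(trigger, photodiode, fs, rel_threshold=0.5):
    """Latency (s) between the electrical trigger and the photodiode
    registering display light, both sampled at rate fs (sketch only)."""
    def first_crossing(sig):
        # index of the first sample exceeding half the signal's maximum
        return np.argmax(sig >= rel_threshold * sig.max())
    return (first_crossing(photodiode) - first_crossing(trigger)) / fs
```

At a hypothetical 1 MHz sampling rate, the resolution of this estimate (1 µs) would be well below the 60 µs error limit stated in the abstract.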
Carbon footprinting of universities worldwide: Part I — objective comparison by standardized metrics
(2021)
Background: Universities, as innovation drivers in science and technology worldwide, should be leading the Great Transformation towards a carbon-neutral society, and many have indeed taken up the challenge. However, only a small number of universities worldwide collect and publish their carbon footprints, and some of them have defined zero-emission targets. Unfortunately, there is limited consistency between the reported carbon footprints (CFs) because of different analysis methods, different impact measures, and different target definitions by the respective universities.
Results: Comprehensive CF data of 20 universities from around the globe were collected and analysed, and the essential factors contributing to the university CF were identified. For the first time, CF data from universities were not only compared but also evaluated, partly corrected, and augmented with missing contributions to improve consistency and comparability. The CF performance of each university in the respective year is thus homogenized and measured by means of two metrics: CO2e emissions per capita and per m² of constructed area. Both metrics vary by one order of magnitude across the universities in this study. However, we identified ten universities that reach a per capita carbon footprint lower than or close to 1.0 Mt (metric tons) CO2e/person and year (normalized by the number of people associated with the university), independent of the university's size. In addition to these two metrics, we suggest a new metric expressing economic efficiency in terms of the CF per $ of expenditures and year. We then aggregated the results for all three impact measures, arriving at an overall carbon performance for each university, which we found to be independent of geographical latitude. Instead, the per capita measure correlates with the national per capita CFs, reaching on average 23% of the national impacts per capita. The three top-performing universities are located in Switzerland, Chile, and Germany.
Conclusion: The usual reporting of CO2 emissions is categorized into Scopes 1–3 following the GHG Protocol Corporate Accounting Standard, which makes comparison across universities challenging. In this study, we attempted to standardize the CF metrics, allowing us to objectively compare the CFs of several universities. We observed that, almost 30 years after the Earth Summit in Rio de Janeiro (1992), progress is still limited: only one zero-emission university was identified, and hence the transformation should speed up globally.
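The three normalization metrics discussed in the Results can be expressed directly. The function and all input figures below are illustrative assumptions, not data from the study.

```python
def cf_metrics(co2e_tons, people, area_m2, expenditures_usd):
    """The three normalized footprint metrics used for comparison:
    per capita, per constructed area, and per dollar of expenditures
    (all inputs here are hypothetical annual figures)."""
    return {
        "per_capita": co2e_tons / people,            # t CO2e / person / yr
        "per_area": co2e_tons / area_m2,             # t CO2e / m² / yr
        "per_dollar": co2e_tons / expenditures_usd,  # t CO2e / $ / yr
    }

# Hypothetical university: 50 kt CO2e, 40 000 people, 500 000 m², $1 bn budget
metrics = cf_metrics(50_000, 40_000, 500_000, 1e9)
```

Normalizing each metric across the sample and averaging (or ranking) them would then give a single overall carbon performance score per university, in the spirit of the aggregation described above.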
As productive biofilms gain increasing interest in research, the quantitative monitoring of biofilm formation, whether online or offline, remains a challenge for the process. Optical coherence tomography (OCT) is a fast and frequently used method for scanning biofilms, but it has difficulty penetrating optically denser materials. X-ray microtomography (µCT) can measure biofilms in most geometries but is very time-consuming. By combining both methods for the first time, the weaknesses of each can be compensated for. The phototrophic cyanobacterium Tolypothrix distorta was cultured in a moving-bed photobioreactor on a biocarrier with a semi-enclosed geometry. An automated workflow was developed to process µCT scans of the biocarriers, allowing quantification of biomass volume and biofilm coverage on the biocarrier, both globally and spatially resolved. At the beginning of the cultivation, a growth limitation was detected in the outer region of the carrier, presumably due to shear stress. In the later phase, light limitation was found inside the biocarrier. µCT data and biofilm thicknesses measured by OCT correlated well, so the latter can be used to rapidly measure biofilm formation in a process. The methods presented here can help gain a deeper understanding of biofilms inside a process and detect limitations.
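At its core, µCT-based quantification of this kind reduces to thresholding the reconstructed volume into a binary biomass mask and aggregating it. A minimal sketch follows; the array layout, voxel units, and threshold are assumptions, not the authors' automated workflow.

```python
import numpy as np

def biofilm_stats(volume, voxel_size_um, threshold):
    """Quantify biomass in a reconstructed tomography volume (sketch):
    threshold the grey values into a binary mask, then report total
    biomass volume and the covered fraction of the carrier footprint."""
    biofilm = volume > threshold                  # binary biomass mask
    biomass_volume_um3 = biofilm.sum() * voxel_size_um ** 3
    # Fraction of the footprint with at least one biofilm voxel above it
    coverage = biofilm.any(axis=0).mean()
    return biomass_volume_um3, coverage
```

Evaluating the same statistics per region of the carrier (outer rim vs interior) would give the spatially resolved view used above to diagnose shear-stress and light limitations.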
This paper presents a study of Industry 4.0 technologies at Deere and Company (John Deere) and their impact on company operations. The research focused on Deere and Company's implementation of Industry 4.0 in its factories and the factors involved. The study combines a systematic literature review with a comprehensive review of John Deere's current developments, relying on freely available information from the company website; public and investor relations materials served as additional credible sources. The analysis found that adopting Industry 4.0 technologies in agricultural manufacturing results in higher-quality products, increased productivity and safety, and wider acceptance among stakeholders. The study assumes full implementation of these technologies across agricultural manufacturing companies and emphasizes up-to-date technologies. The topic is relevant to engineers in mechanical and agricultural fields, business managers, and marketers.
This study describes how the robotics industry has evolved, how a new phase of advanced robotics has emerged, and how humans and robots relate when sharing the same workplace. Designing safer robots for human-machine interaction systems is an urgent research topic in industrial robotics, and many of the field's problems are related not only to technological issues. Human-robot collaboration is therefore also discussed, with collaborative robots presented as an effective way to address these challenges.