

Guest Editorial

ASME J. Risk Uncertainty Part B. 2017;4(3):030301-030301-1. doi:10.1115/1.4038467.

The importance of safety, security, and risk management has been recognized in nuclear multiscale systems modeling, simulation, and analysis applications. Since the 2011 earthquake and tsunami that led to the nuclear accident at Fukushima Daiichi, Japan, nuclear energy facilities have been under massive pressure to enhance safety and security. A large body of research has been conducted in the area of nuclear safety and security, including cybersecurity, stress testing, and resilience analysis, along with risk management.

Commentary by Dr. Valentin Fuster

Special Section Papers

ASME J. Risk Uncertainty Part B. 2017;4(3):030901-030901-10. doi:10.1115/1.4037878.

Severe accident facilities for European safety targets (SAFEST) is a European project networking the European experimental laboratories focused on the investigation of a nuclear power plant (NPP) severe accident (SA) with reactor core melting and formation of a hazardous material known as corium. The main objective of the project is to establish coordinated activities, enabling the development of a common vision and severe accident research roadmaps for the coming years, and of the management structure to achieve these goals. Within this framework, a European roadmap on severe accident experimental research has been developed to define research challenges that contribute to further reinforcement of Gen II and III NPP safety. The roadmap takes into account different SA phenomena and issues identified and prioritized in the analyses of severe accidents at commercial NPPs and in the results of the recent European stress tests carried out after the Fukushima accident. Nineteen relevant issues related to reactor core meltdown accidents have been selected during these efforts. These issues have been compared to a survey of the European SA research experimental facilities and corium analysis laboratories. Finally, the coherence between European infrastructures and R&D needs has been assessed and a table linking issues and infrastructures has been derived. The comparison shows certain important gaps in SA research infrastructures in Europe, especially in the domains of core late reflooding impact on source term, reactor pressure vessel failure and molten core release modes, spent fuel pool (SFP) accidents, as well as the need for a large-scale experimental facility operating with up to 500 kg of chemically prototypic corium melt.

ASME J. Risk Uncertainty Part B. 2017;4(3):030902-030902-9. doi:10.1115/1.4037877.

The objective of this paper is to develop a probabilistic risk assessment (PRA) methodology against volcanic eruption for the decay heat removal function of sodium-cooled fast reactors (SFRs). In the volcanic PRA methodology development, only the effect of volcanic tephra (pulverized magma) is taken into account, because there is a great distance between the plant site assumed in this study and the volcanoes. The volcanic tephra (ash) could potentially clog the air filters of air-intakes that are essential for decay heat removal. The degree of filter clogging can be calculated from the atmospheric concentration of ash, the tephra fallout duration, and the suction flow rate of each component. This study evaluated the volcanic hazard using a combination of tephra fragment size, layer thickness, and duration. In this paper, the functional failure probability of each component is defined as the probability of failing to replace a filter within the grace period before filter failure. Finally, based on an event tree, a core damage frequency has been estimated by multiplying discrete hazard frequencies by conditional decay heat removal failure probabilities. A dominant sequence has been identified as well. In addition, sensitivity analyses have investigated the effects of a tephra arrival reduction factor and prefilter covering.
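The final quantification step the abstract describes, multiplying discrete hazard frequencies by conditional failure probabilities and summing over the event tree, can be sketched as follows. All frequencies and probabilities below are illustrative placeholders, not values from the paper.

```python
# Sketch (hypothetical numbers): combining discrete volcanic hazard
# frequencies with conditional decay-heat-removal failure probabilities.

# Hazard bins: (annual frequency of the tephra scenario,
#               conditional probability that decay heat removal fails,
#               e.g. filters cannot be replaced within the grace period)
hazard_bins = [
    (1e-4, 1e-3),   # thin ash layer, short fallout duration
    (1e-5, 1e-2),   # moderate scenario
    (1e-6, 1e-1),   # thick layer, long fallout duration
]

# Core damage frequency: sum over bins of frequency x conditional failure prob.
cdf_per_year = sum(f * p for f, p in hazard_bins)
print(f"Core damage frequency: {cdf_per_year:.2e} /year")
```

With these placeholder numbers each bin happens to contribute equally, which also illustrates how a dominant sequence would be identified in a real event tree.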

ASME J. Risk Uncertainty Part B. 2017;4(3):030903-030903-7. doi:10.1115/1.4037879.

This study presents an assessment of RELAP5/MOD3.3 using experimental work on the rewetting mechanism during bottom flooding of a vertical annular water flow inside a channel concentrically enclosing a heated rod. The experiments were carried out in experimental rig 1 of the Nuclear Engineering Department of the National Technical University of Athens (NTUA-NED-ER1), in which the dryout and rewetting of a hot vertical rod can be simulated. Experiments were conducted at atmospheric conditions with a liquid coolant flow rate in the range of 0.008–0.050 kg·s−1 and two levels of subcooling, 25 and 50 K. The initial average surface temperature of the rod for each experiment was set at approximately 823 K. The rod surface temperatures during rewetting predicted by the RELAP5/MOD3.3 calculations were compared against the experimental values. The results presented in this study show that RELAP5/MOD3.3 provides temperature estimates of the reflooding mechanism within an acceptable margin of error. However, larger deviations between predicted and experimental values were observed when subcooled water was used instead of saturated water.


Research Papers

ASME J. Risk Uncertainty Part B. 2017;4(3):031001-031001-9. doi:10.1115/1.4037725.

Vehicle door latch performance testing presently utilizes uniaxial quasi-static loading conditions. Current technology enables sophisticated virtual testing of a broad range of systems. Door latch failures have been observed in vehicles under a variety of conditions, typically involving multi-axis loading. The loading conditions imposed on passenger vehicle side door latches during rollovers have not been published. Rollover crash test results, rollover crashes, and physical Federal Motor Vehicle Safety Standard (FMVSS) 206 latch testing results are reviewed. The creation and validation of a passenger vehicle door latch model is described. The multi-axis loading conditions observed in virtual rollover testing at the latch location are characterized and applied to the virtual testing of a latch in the secondary latch position. The results are then compared with crash test and real-world rollover results for the same latch. The results indicate that a door latch that meets the secondary latch position requirements may fail at loads substantially below the FMVSS 206 uniaxial failure loads. In the side impact mode, risks associated with door handle designs and the potential for inertial release can be considered prior to manufacturing with virtual testing. An example case showing the effects of material and spring selection illustrates the potential issues that can be detected in advance of manufacturing. The findings suggest the need to re-examine the relevance of existing door latch testing practices in light of the prevalence of rollover impacts and other impact conditions in today's vehicle fleet.

Topics: Doors, Stress, Testing, Failure, Vehicles
ASME J. Risk Uncertainty Part B. 2017;4(3):031002-031002-14. doi:10.1115/1.4038318.

Time-dependent system reliability is computed as the probability that the responses of a system do not exceed prescribed failure thresholds over a time duration of interest. In this work, an efficient time-dependent reliability analysis method is proposed for systems with bivariate responses which are general functions of random variables and stochastic processes. Analytical expressions are derived first for the single and joint upcrossing rates based on the first-order reliability method (FORM). Time-dependent system failure probability is then estimated with the computed single and joint upcrossing rates. The method can efficiently and accurately estimate different types of upcrossing rates for the systems with bivariate responses when FORM is applicable. In addition, the developed method is applicable to general problems with random variables, stationary, and nonstationary stochastic processes. As the general system reliability can be approximated with the results from reliability analyses for individual responses and bivariate responses, the proposed method can be extended to reliability analysis of general systems with more than two responses. Three examples, including a parallel system, a series system, and a hydrokinetic turbine blade application, are used to demonstrate the effectiveness of the proposed method.
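The quantity the paper computes analytically via upcrossing rates, the probability that either of two time-varying responses exceeds its threshold anywhere in the time window (a series system), can also be estimated by brute-force Monte Carlo, which is the usual benchmark for such methods. The response functions, thresholds, and time window below are illustrative stand-ins, not the paper's examples.

```python
import numpy as np

# Brute-force Monte Carlo estimate of time-dependent series-system
# failure probability: sample the random variables, evaluate each
# response over a discretized time grid, and count samples for which
# either response upcrosses its threshold at any instant.
rng = np.random.default_rng(42)
n, t = 20_000, np.linspace(0.0, 5.0, 200)

x1 = rng.normal(0.0, 1.0, size=(n, 1))   # random variables
x2 = rng.normal(0.0, 1.0, size=(n, 1))

y1 = x1 + 0.5 * np.sin(2.0 * t)          # response 1 over the time grid
y2 = x2 + 0.5 * np.cos(2.0 * t)          # response 2 over the time grid

# series system: failure if either response ever exceeds its threshold
fail = (y1.max(axis=1) > 3.0) | (y2.max(axis=1) > 3.0)
print(f"time-dependent system failure probability ~ {fail.mean():.4f}")
```

The analytical upcrossing-rate approach in the paper targets the same probability at a small fraction of this sampling cost.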

ASME J. Risk Uncertainty Part B. 2017;4(3):031003-031003-7. doi:10.1115/1.4038340.

Prestress applied on bridges affects the dynamic interaction between bridges and vehicles traveling over them. In this paper, the prestressed bridge is modeled as a beam subjected to eccentric prestress force at the two ends, and a half-vehicle model with four degrees-of-freedom is used to represent the vehicle passing the bridge. A new bridge–vehicle interaction model considering the effect of prestress with eccentricity is developed through the principle of virtual work. The correctness and accuracy of the model are validated with literature results. Based on the developed model, numerical simulations have been conducted using Newmark's β method to study the effects of vehicle speed, eccentricity and amplitude of the prestress, and presence of multiple vehicles. It is shown that prestress has an important effect on the maximum vertical acceleration of vehicles, which may provide a good index for detecting the change of prestress. It is also interesting to find that the later-entering vehicle on the prestressed bridge will largely reduce the maximum vertical acceleration of the vehicle ahead of it.
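Newmark's β method, used for the numerical simulations above, can be sketched for a single-degree-of-freedom system m·u'' + c·u' + k·u = f(t) with the average-acceleration parameters β = 1/4, γ = 1/2; the coupled bridge–vehicle equations in the paper have the same structure with matrices in place of scalars. The system parameters and step load in the usage lines are illustrative.

```python
import numpy as np

# Minimal Newmark-beta time stepper (average acceleration form) for a
# single-DOF system m*u'' + c*u' + k*u = f(t). Returns displacement,
# velocity, and acceleration histories at the load's time points.
def newmark(m, c, k, f, dt, beta=0.25, gamma=0.5):
    n = len(f)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (f[0] - c * v[0] - k * u[0]) / m          # consistent initial accel.
    keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    for i in range(n - 1):
        rhs = (f[i + 1]
               + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                      + (0.5 / beta - 1.0) * a[i])
               + c * (gamma / (beta * dt) * u[i]
                      + (gamma / beta - 1.0) * v[i]
                      + dt * (0.5 * gamma / beta - 1.0) * a[i]))
        u[i + 1] = rhs / keff
        v[i + 1] = (gamma / (beta * dt) * (u[i + 1] - u[i])
                    + (1.0 - gamma / beta) * v[i]
                    + dt * (1.0 - 0.5 * gamma / beta) * a[i])
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
    return u, v, a

# usage: undamped oscillator under a suddenly applied unit load;
# the exact peak is twice the static deflection, 2*f/k = 0.02
u, v, a = newmark(m=1.0, c=0.0, k=100.0, f=np.full(5001, 1.0), dt=0.001)
print(f"peak displacement: {u.max():.4f}")
```

The average-acceleration variant is unconditionally stable and introduces no algorithmic damping, which is why it is a common default for bridge–vehicle interaction problems.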

ASME J. Risk Uncertainty Part B. 2017;4(3):031004-031004-12. doi:10.1115/1.4038170.

The influence of component errors on the final error is a key issue in error modeling of computer numerical control (CNC) machine tools. Nevertheless, the mechanism by which the errors in mechanical parts accumulate into the component errors and then affect the final error of a CNC machine tool has not been identified; identifying this mechanism is highly relevant to the precision design of CNC machines. In this study, error modeling based on the Jacobian-torsor theory is applied to determine how the fundamental errors in mechanical parts accumulate into the comprehensive error of a single-axis assembly. First, a brief introduction to the Jacobian-torsor theory is provided. Next, the Jacobian-torsor model is applied to the error modeling of a single-axis assembly in a three-axis machining center. Furthermore, the comprehensive errors of the single-axis assembly are evaluated by Monte Carlo simulation based on the synthesized error model. The accuracy and efficiency of the Jacobian-torsor model are verified through a comparison between the simulation results and measured data from a batch of similar vertical machining centers. Based on the Jacobian-torsor model, quantitative sensitivity analysis of the single-axis assembly is investigated, along with the contributions of key error sources to the synthesized error ranges of the assembly. This model provides a comprehensive method to identify the key error sources of the single-axis assembly and has the potential to enhance the tolerance/error allocation of the single axis and the whole machine tool.
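The Monte Carlo evaluation step, propagating fundamental part errors through the assembly to a comprehensive error, can be sketched in simplified scalar form: each part contributes a small translation plus a small angular error acting through its lever arm. The tolerances, lever arms, and joint names below are illustrative assumptions, not a Jacobian-torsor model of an actual machine.

```python
import numpy as np

# Monte Carlo sketch of error accumulation along a single-axis assembly.
# Each stacked part contributes a random translation error and a random
# angular error amplified by its lever arm; the total is their sum.
rng = np.random.default_rng(0)
n = 200_000

parts = [  # (std of translation [mm], std of angle [rad], lever arm [mm])
    (0.002, 1e-5, 300.0),   # hypothetical base-to-column joint
    (0.001, 2e-5, 200.0),   # hypothetical column-to-carriage guideway
    (0.001, 1e-5, 100.0),   # hypothetical carriage-to-table interface
]

total = np.zeros(n)
for s_t, s_a, arm in parts:
    total += rng.normal(0.0, s_t, n) + rng.normal(0.0, s_a, n) * arm

print(f"std = {total.std():.4f} mm, 99.7% range = +/-{3 * total.std():.4f} mm")
```

Ranking each part's variance contribution in such a simulation is the scalar analogue of the quantitative sensitivity analysis described in the abstract.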

ASME J. Risk Uncertainty Part B. 2017;4(3):031005-031005-11. doi:10.1115/1.4038372.

In the present study, a general probabilistic design framework is developed for cyclic fatigue life prediction of metallic hardware using methods that address uncertainty in experimental data and computational model. The methodology involves: (i) fatigue test data conducted on coupons of Ti6Al4V material, (ii) continuum damage mechanics (CDM) based material constitutive models to simulate cyclic fatigue behavior of material, (iii) variance-based global sensitivity analysis, (iv) Bayesian framework for model calibration and uncertainty quantification, and (v) computational life prediction and probabilistic design decision making under uncertainty. The outcomes of computational analyses using the experimental data prove the feasibility of the probabilistic design methods for model calibration in the presence of incomplete and noisy data. Moreover, using probabilistic design methods results in assessment of reliability of fatigue life predicted by computational models.
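Step (iv), Bayesian calibration from noisy data, can be sketched with a minimal random-walk Metropolis sampler. The "model" here is a simple Basquin-type power law for S-N fatigue data rather than the paper's CDM constitutive model, and all data, priors, and proposal widths are synthetic placeholders, not the Ti6Al4V measurements.

```python
import numpy as np

# Bayesian calibration sketch: fit log10(N) = a - m * (log10(S) - mean)
# to synthetic noisy S-N data with a random-walk Metropolis sampler.
rng = np.random.default_rng(1)

log_S = np.array([2.70, 2.78, 2.85, 2.90, 2.95])      # log10 stress amplitude
x = log_S - log_S.mean()                              # centered predictor
sigma = 0.05                                          # known noise std
log_N = (12.0 - 3.0 * log_S) + rng.normal(0.0, sigma, log_S.size)

def log_post(theta):
    a, m = theta
    if not (0.0 < a < 10.0 and 0.0 < m < 10.0):       # flat priors
        return -np.inf
    resid = log_N - (a - m * x)
    return -0.5 * np.sum(resid**2) / sigma**2         # Gaussian likelihood

theta = np.array([3.0, 2.0])
lp = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0.0, [0.05, 0.3])       # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:          # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

post = np.array(samples[5_000:])                      # discard burn-in
print("posterior mean (a, m):", post.mean(axis=0))
```

The spread of the retained samples quantifies parameter uncertainty, which is what the paper then propagates into probabilistic life predictions.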

ASME J. Risk Uncertainty Part B. 2018;4(3):031006-031006-12. doi:10.1115/1.4039148.

Cyber-physical systems (CPS) are physical systems whose individual components have functional identities in both physical and cyber spaces. Given the vastly diversified CPS components in dynamically evolving networks, designing an open and resilient architecture with flexibility and adaptability is thus important. To enable a resilience engineering approach to systems design, quantitative measures of resilience have been proposed by researchers. Yet, domain-dependent system performance metrics are required to quantify resilience. In this paper, generic system performance metrics for CPS are proposed: entropy, conditional entropy, and mutual information associated with the probabilities of successful prediction and communication. A new probabilistic design framework for CPS network architecture is also proposed for resilience engineering, where several information fusion rules can be applied for data processing at the nodes. Sensitivities of the metrics with respect to the probabilistic measurements are studied. Fine-grained discrete-event simulation models of communication networks are used to demonstrate the applicability of the proposed metrics.
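The three proposed information-theoretic metrics can be computed directly from a joint probability table. A minimal sketch, where the joint distribution of prediction success and communication success at a node is an illustrative placeholder:

```python
import numpy as np

# Entropy H(X), conditional entropy H(X|Y), and mutual information I(X;Y)
# from a joint probability table over two binary events.
p_xy = np.array([[0.40, 0.10],    # rows: X (prediction ok / failed)
                 [0.05, 0.45]])   # cols: Y (communication ok / failed)

p_x = p_xy.sum(axis=1)            # marginal of X
p_y = p_xy.sum(axis=0)            # marginal of Y

def H(p):
    """Shannon entropy in bits; 0*log(0) treated as 0."""
    logp = np.log2(p, where=p > 0, out=np.zeros_like(p))
    return -np.sum(p * logp)

H_x, H_y, H_xy = H(p_x), H(p_y), H(p_xy)
H_x_given_y = H_xy - H_y          # chain rule: H(X|Y) = H(X,Y) - H(Y)
I_xy = H_x + H_y - H_xy           # mutual information
print(f"H(X)={H_x:.3f}  H(X|Y)={H_x_given_y:.3f}  I(X;Y)={I_xy:.3f} bits")
```

High mutual information between prediction and communication outcomes indicates statistical dependence between the two functions, which is the kind of coupling the proposed framework tracks across network nodes.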

Select Articles from Part A: Civil Engineering

Technical Papers

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000969.
This paper examines the task of approximating or generating samples according to a target probability distribution when this distribution is expressed as a function of the response of an engineering system. Frequently, such approximation is performed in a sequential manner, using a series of intermediate densities that converge to the target density, and may require a large number of evaluations of the system response, which for applications involving complex numerical models creates a significant computational burden. To alleviate this burden, an adaptive Kriging stochastic sampling and density approximation framework (AK-SSD) is developed in this work. The metamodel approximates the system response vector, whereas the adaptive characteristics are established through an iterative approach. At the end of each iteration, the target density, approximated through the current metamodel, is compared to the density established at the previous iteration, using the Hellinger distance as a comparison metric. If convergence has not been achieved, then additional simulation experiments are performed to inform the metamodel development, through a sample-based design of experiments that balances between the improvement of the metamodel accuracy and the addition of experiments in regions of importance for the stochastic sampling. These regions are defined by considering both the target density and any intermediate densities. The process then moves to the next iteration, with an improved metamodel developed using all the available simulation experiments. Although the theoretical discussions are general, the emphasis is placed on rare-event simulation. For this application, once the target density is approximated (first stage), it is used (second stage) as an importance sampling density for estimating the rare-event likelihood. For the second stage, use of either the metamodel or the exact numerical model is examined.
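The Hellinger distance used as the iteration-convergence metric has a simple closed form for discrete densities, H(p, q) = sqrt(1 - sum(sqrt(p*q))). A minimal sketch with toy densities standing in for successive density approximations:

```python
import numpy as np

# Hellinger distance between two discrete probability densities,
# via the Bhattacharyya-coefficient form H = sqrt(1 - sum(sqrt(p*q))).
def hellinger(p, q):
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return np.sqrt(max(0.0, 1.0 - np.sum(np.sqrt(p * q))))

prev = [0.20, 0.30, 0.50]     # density from the previous iteration (toy)
curr = [0.25, 0.30, 0.45]     # density from the current metamodel (toy)
print(f"Hellinger distance: {hellinger(prev, curr):.4f}")
# an adaptive loop would iterate until this falls below a tolerance
```

The distance is bounded in [0, 1] and is zero only when the densities coincide, which makes it a convenient stopping criterion for the iteration.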

Topics: Density, Approximation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000983.
Monte Carlo simulation is the most versatile solution method for problems in stochastic computational mechanics but suffers from a slow convergence rate. The number of simulations required to produce an acceptable accuracy is often impractical for complex and time-consuming numerical models. In this paper, an element-based control variate approach is developed to improve the efficiency of Monte Carlo simulation in stochastic finite-element analysis, with particular reference to high-dimensional and nonlinear geotechnical problems. The method uses a low-order element to form an inexpensive approximation to the output of an expensive, high-order model. By keeping the mesh constant, a high correlation between low-order and high-order models is ensured, enabling a large variance reduction to be achieved. The approach is demonstrated by application to the bearing capacity of a strip footing on a spatially variable soil. The problem requires 300 input random variables to represent the spatial variability by random fields, and would be difficult to solve by methods other than Monte Carlo simulation. Using an element-based control variate reduces the standard deviation of the mean bearing capacity by approximately half. In addition, two methods for estimating the cumulative distribution function as a complement to the improved mean estimator are presented.
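The core of the control variate idea, pairing each expensive high-order run with a cheap low-order run on the same input and correcting the high-order mean by the low-order model's sampling error, can be sketched as follows. Both model functions are illustrative stand-ins for the low/high-order finite-element pair in the paper.

```python
import numpy as np

# Control-variate Monte Carlo sketch: h(x) plays the expensive high-order
# model, g(x) a cheap correlated low-order approximation whose mean is
# pinned down by many extra cheap runs.
rng = np.random.default_rng(7)

h = lambda x: np.exp(0.3 * x) + 0.1 * x**2     # "expensive" model (stand-in)
g = lambda x: 1.0 + 0.3 * x + 0.1 * x**2       # cheap correlated approximation

x = rng.normal(size=2000)                      # paired (same-input) runs
hx, gx = h(x), g(x)
mu_g = g(rng.normal(size=500_000)).mean()      # cheap estimate of E[g]

beta = np.cov(hx, gx)[0, 1] / gx.var(ddof=1)   # near-optimal control coefficient
cv = hx.mean() - beta * (gx.mean() - mu_g)     # control-variate estimator

print(f"plain MC: {hx.mean():.4f}, control variate: {cv:.4f}")
```

Because h and g are highly correlated, the corrected samples have far smaller variance than the raw ones, which is the same mechanism the constant-mesh low/high-order element pairing exploits.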

Topics: Finite element analysis
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000980.
Extreme precipitation is one of the most important climate hazards, posing a significant threat to human life and property. Understanding extreme precipitation events helps to manage their risk to society and hence reduce potential losses. This paper provides two new stochastic methods to analyze and predict various extreme precipitation events based on nonstationary models with or without the consideration of serial dependency across days. These methods, together with Monte Carlo simulation and dynamic optimization, bridge nonextreme precipitation and extreme precipitation so that abundant nonextreme precipitation data can be used for extreme precipitation analysis. On an annual basis, the analysis produces distributions for the maximum daily precipitation, the number of days with heavy rainfall, and the maximum number of consecutive days with heavy rainfall. The accuracy of the new methods is examined using 10 decades of empirical data from the Washington, DC metropolitan area. Based on the new methods, predictions of various extreme events are provided under different assumptions. Finally, the impact of serial dependency on the results is discussed. The results show that, for the area studied, accounting for serial dependency further improves the analysis.
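The "bridge" idea, obtaining annual extreme-value statistics by Monte Carlo simulation from a model fitted to abundant daily (mostly nonextreme) data, can be sketched as follows. The wet-day probability and gamma parameters are illustrative, not fitted to the Washington, DC data, and no serial dependency is modeled here.

```python
import numpy as np

# Monte Carlo bridge from daily precipitation to annual extremes:
# simulate many synthetic years of daily rainfall from a fitted daily
# model, then read off annual extreme statistics empirically.
rng = np.random.default_rng(3)

p_wet, shape, scale = 0.30, 0.7, 12.0     # wet-day prob., gamma params (mm)
years, days = 10_000, 365

wet = rng.random((years, days)) < p_wet
rain = np.where(wet, rng.gamma(shape, scale, (years, days)), 0.0)

annual_max = rain.max(axis=1)             # annual maximum daily precipitation
heavy_days = (rain > 25.0).sum(axis=1)    # days per year above a 25 mm threshold

print(f"median annual max: {np.median(annual_max):.1f} mm")
print(f"mean heavy-rain days/year: {heavy_days.mean():.1f}")
```

Adding serial dependency, as one of the paper's two methods does, would replace the independent daily draws with a dependent sequence (e.g. a Markov chain for wet/dry states) while keeping the same extreme-value readout.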

Topics: Climate, Precipitation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000981.
This paper compiles a database consisting of 16 tunnel excavation projects in clayey soils using the earth pressure balance (EPB) tunneling method. Statistical analysis of the data is performed and the probability distributions for the ground loss ratio η and fitting parameter n, the two major parameters in Peck's formula for transverse surface settlement, are identified based on 78 and 29 points of field data, respectively. The correlation between η and n is fitted based on the data, and its influence on the failure probability is evaluated. A serviceability performance function is modeled based on Peck's settlement formula, and the serviceability failure probability is defined and analyzed with Monte Carlo simulations. A series of design charts, in terms of tunnel diameter, cover-to-diameter ratio, η, and n, are generated for practical use without running Monte Carlo simulations.
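Peck's formula, on which the serviceability performance function is based, predicts a Gaussian settlement trough S(x) = S_max * exp(-x^2 / (2 i^2)), with S_max = eta * A / (sqrt(2*pi) * i) following from the ground loss ratio. A minimal sketch in which the tunnel geometry, eta, and the trough-width relation i = n * z0 are illustrative assumptions:

```python
import math
import numpy as np

# Peck's transverse settlement trough for a single tunnel.
D, z0 = 6.0, 15.0              # tunnel diameter and axis depth (m), assumed
eta, n = 0.01, 0.5             # ground loss ratio and fitting parameter, assumed
A = math.pi * D**2 / 4.0       # excavated face area (m^2)

i = n * z0                                     # trough width parameter (m)
S_max = eta * A / (math.sqrt(2.0 * math.pi) * i)

x = np.linspace(-30.0, 30.0, 7)                # transverse offsets (m)
S = S_max * np.exp(-x**2 / (2.0 * i**2))       # settlement, positive downward
print(f"S_max = {1000.0 * S_max:.1f} mm at the tunnel centerline")
```

In a Monte Carlo serviceability analysis like the paper's, eta and n would be sampled from their fitted (correlated) distributions and S_max compared against an allowable settlement in each trial.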

Topics: Tunnel construction, Failure, Probability, Soil

Case Studies

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000963.
The Houston Ship Channel (HSC) is one of the busiest waterway corridors in the United States. Since the channel's expansion in 2005, concerns have arisen about design deficiencies in the HSC in the area of the Bayport Ship Channel (BSC), especially north of the turn at Five Mile Cut. A mental models expert elicitation exercise was conducted in order to identify safety concerns arising from these design deficiencies and provide qualitative data that can structure analysis of technical data like those from automatic identification system (AIS) databases, which can better connect possible design deficiencies to incident outcomes. The elicitation produced an influence diagram to enable later causal reasoning and Bayesian analysis for the HSC and BSC confluence and nearby areas on the HSC, and helped to prime a comprehensive study of the feasibility of safety and performance modifications on this reach of the HSC.

Topics: Safety, Navigation, Risk management, Ships
