

Guest Editorial

ASME J. Risk Uncertainty Part B. 2017;4(3):030301-030301-1. doi:10.1115/1.4038467.

The importance of safety, security, and risk management has been recognized in nuclear multiscale systems modeling, simulation, and analysis applications. Since the 2011 earthquake and tsunami that led to the nuclear accident at Fukushima Daiichi, Japan, nuclear energy facilities have been under massive pressure to enhance safety and security. A large body of research has been conducted in the areas of nuclear safety and security, including cybersecurity, stress testing, and resilience analysis, along with risk management.


Special Section Papers

ASME J. Risk Uncertainty Part B. 2017;4(3):030901-030901-10. doi:10.1115/1.4037878.

Severe Accident Facilities for European Safety Targets (SAFEST) is a European project networking the European experimental laboratories focused on the investigation of nuclear power plant (NPP) severe accidents (SA) involving reactor core melting and the formation of a hazardous material system known as corium. The main objective of the project is to establish coordinated activities, enabling the development of a common vision and severe accident research roadmaps for the coming years, and of the management structure needed to achieve these goals. In this frame, a European roadmap on severe accident experimental research has been developed to define research challenges that contribute to the further reinforcement of Gen II and III NPP safety. The roadmap takes into account the different SA phenomena and issues identified and prioritized in the analyses of severe accidents at commercial NPPs and in the results of the recent European stress tests carried out after the Fukushima accident. Nineteen relevant issues related to reactor core meltdown accidents have been selected during these efforts. These issues have been compared to a survey of the European SA research experimental facilities and corium analysis laboratories. Finally, the coherence between European infrastructures and R&D needs has been assessed and a table linking issues and infrastructures has been derived. The comparison shows certain important gaps in SA research infrastructures in Europe, especially in the domains of the impact of late core reflooding on the source term, reactor pressure vessel failure and molten core release modes, and spent fuel pool (SFP) accidents, as well as the need for a large-scale experimental facility operating with up to 500 kg of chemically prototypic corium melt.

ASME J. Risk Uncertainty Part B. 2017;4(3):030902-030902-9. doi:10.1115/1.4037877.

The objective of this paper is to develop a probabilistic risk assessment (PRA) methodology against volcanic eruption for the decay heat removal function of sodium-cooled fast reactors (SFRs). In the volcanic PRA methodology development, only the effect of volcanic tephra (pulverized magma) is taken into account, because of the great distance between the plant site assumed in this study and the volcanoes. The volcanic tephra (ash) could potentially clog the air filters of air intakes that are essential for decay heat removal. The degree of filter clogging can be calculated from the atmospheric concentration of ash, the tephra fallout duration, and the suction flow rate of each component. This study evaluated the volcanic hazard using combinations of tephra fragment size, layer thickness, and duration. In this paper, the functional failure probability of each component is defined as a failure probability of filter replacement, obtained by using the grace period to filter failure. Finally, based on an event tree, a core damage frequency has been estimated by multiplying discrete hazard frequencies by conditional decay heat removal failure probabilities. A dominant sequence has been identified as well. In addition, sensitivity analyses have investigated the effects of a tephra arrival reduction factor and prefilter covering.
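As a rough illustration of the final aggregation step described in this abstract, the sketch below sums discrete hazard-bin frequencies multiplied by conditional failure probabilities to obtain a core damage frequency; the bin definitions, frequencies, and probabilities are hypothetical placeholders, not the paper's values.

```python
# Sketch: core damage frequency from discretized volcanic-ash hazard bins.
# All bin frequencies and conditional failure probabilities below are
# illustrative placeholders, not values from the paper.

hazard_bins = [
    # (description, exceedance frequency [1/yr], conditional DHR failure probability)
    ("thin ash layer, short duration",    1.0e-4, 1.0e-3),
    ("moderate layer, moderate duration", 3.0e-5, 1.0e-2),
    ("thick layer, long duration",        5.0e-6, 2.0e-1),
]

# Core damage frequency: sum over bins of frequency x conditional failure probability.
cdf = sum(freq * p_fail for _, freq, p_fail in hazard_bins)
print(f"Estimated core damage frequency: {cdf:.2e} /yr")
```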

ASME J. Risk Uncertainty Part B. 2017;4(3):030903-030903-7. doi:10.1115/1.4037879.

This study presents an assessment of RELAP5/MOD3.3 using experimental work on the rewetting mechanism during bottom flooding of a vertical annular water flow inside a channel concentrically enclosing a heated rod. The experiments were carried out in experimental rig 1 of the Nuclear Engineering Department of the National Technical University of Athens (NTUA-NED-ER1), in which the dryout and rewetting process of a hot vertical rod can be simulated. Experiments were conducted at atmospheric conditions with liquid coolant flow rates in the range of 0.008–0.050 kg·s−1 and two levels of subcooling, 25 and 50 K. The initial average surface temperature of the rod for each experiment was set at approximately 823 K. The rod surface temperatures during rewetting predicted by the RELAP5/MOD3.3 calculations were compared against the experimental values. The results presented in this study show that RELAP5/MOD3.3 provides temperature estimations of the reflooding mechanism within an acceptable margin of error. However, larger deviations between predicted and experimental values were observed when subcooled water was used instead of saturated water.


Research Papers

ASME J. Risk Uncertainty Part B. 2017;4(3):031001-031001-9. doi:10.1115/1.4037725.

Vehicle door latch performance testing presently utilizes uniaxial quasi-static loading conditions. Current technology enables sophisticated virtual testing of a broad range of systems. Door latch failures have been observed in vehicles under a variety of conditions, which typically involve multi-axis loading. The loading conditions imposed on passenger vehicle side door latches during rollovers have not been published. Rollover crash test results, rollover crashes, and physical Federal Motor Vehicle Safety Standard (FMVSS) 206 latch testing results are reviewed. The creation and validation of a passenger vehicle door latch model is described. The multi-axis loading conditions observed in virtual rollover testing at the latch location are characterized and applied to the virtual testing of a latch in the secondary latch position. The results are then compared with crash test and real-world rollover results for the same latch. The results indicate that a door latch that meets the secondary latch position requirements may fail at loads substantially below the FMVSS 206 uniaxial failure loads. In the side impact mode, risks associated with door handle designs and the potential for inertial release can be considered prior to manufacturing with virtual testing. An example case showing the effects of material and spring selection illustrates the potential issues that can be detected in advance of manufacturing. The findings suggest the need to re-examine the relevance of existing door latch testing practices in light of the prevalence of rollover impacts and other impact conditions in today's vehicle fleet.

Topics: Doors, Stress, Testing, Failure, Vehicles
ASME J. Risk Uncertainty Part B. 2017;4(3):031002-031002-14. doi:10.1115/1.4038318.

Time-dependent system reliability is computed as the probability that the responses of a system do not exceed prescribed failure thresholds over a time duration of interest. In this work, an efficient time-dependent reliability analysis method is proposed for systems with bivariate responses that are general functions of random variables and stochastic processes. Analytical expressions are derived first for the single and joint upcrossing rates based on the first-order reliability method (FORM). The time-dependent system failure probability is then estimated with the computed single and joint upcrossing rates. The method can efficiently and accurately estimate different types of upcrossing rates for systems with bivariate responses when FORM is applicable. In addition, the developed method is applicable to general problems involving random variables and both stationary and nonstationary stochastic processes. As the general system reliability can be approximated with the results from reliability analyses for individual responses and bivariate responses, the proposed method can be extended to the reliability analysis of general systems with more than two responses. Three examples, including a parallel system, a series system, and a hydrokinetic turbine blade application, are used to demonstrate the effectiveness of the proposed method.
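For readers unfamiliar with upcrossing-rate-based estimates, the following minimal sketch shows how a time-dependent failure probability can be bounded once an upcrossing-rate history is available; the rate history and initial failure probability are assumed for illustration and are not the FORM-based results of the paper.

```python
import numpy as np

# Sketch: bounding the time-dependent failure probability with the classic
# outcrossing-rate bound P_f(0, T) <= P_f(0) + integral_0^T nu(t) dt, plus the
# Poisson-crossing estimate. The upcrossing-rate history and P_f(0) below are
# illustrative placeholders, not the paper's FORM-based results.

T = 20.0                                      # service period [years]
t = np.linspace(0.0, T, 2001)                 # time grid
nu = 1.0e-4 * (1.0 + 0.5 * np.sin(0.3 * t))   # assumed upcrossing rate [1/yr]
pf0 = 1.0e-5                                  # assumed instantaneous failure prob. at t = 0

integral = np.sum(0.5 * (nu[1:] + nu[:-1]) * np.diff(t))   # trapezoidal integration
pf_bound = pf0 + integral                                  # upper bound on P_f(0, T)
pf_poisson = 1.0 - (1.0 - pf0) * np.exp(-integral)         # Poisson-crossing estimate

print(f"Upper bound on P_f(0, T): {pf_bound:.3e}")
print(f"Poisson estimate:         {pf_poisson:.3e}")
```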

ASME J. Risk Uncertainty Part B. 2017;4(3):031003-031003-7. doi:10.1115/1.4038340.

Prestress applied to bridges affects the dynamic interaction between bridges and the vehicles traveling over them. In this paper, the prestressed bridge is modeled as a beam subjected to eccentric prestress forces at its two ends, and a half-vehicle model with four degrees of freedom is used to represent the vehicle passing over the bridge. A new bridge–vehicle interaction model considering the effect of prestress with eccentricity is developed through the principle of virtual work. The correctness and accuracy of the model are validated against literature results. Based on the developed model, numerical simulations have been conducted using Newmark's β method to study the effects of vehicle speed, eccentricity and amplitude of the prestress, and the presence of multiple vehicles. It is shown that prestress has an important effect on the maximum vertical acceleration of vehicles, which may provide a good index for detecting changes in prestress. It is also interesting to find that a later-entering vehicle on the prestressed bridge largely reduces the maximum vertical acceleration of the vehicle ahead of it.
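As context for the numerical scheme named in the abstract, the sketch below integrates a single-degree-of-freedom oscillator with the average-acceleration Newmark-β method; the mass, damping, stiffness, and forcing are illustrative assumptions, not the paper's bridge–vehicle model.

```python
import numpy as np

# Sketch: Newmark-beta (average acceleration) integration of a single-DOF
# oscillator under a sinusoidal forcing term. Parameters are illustrative
# placeholders, not the bridge-vehicle system of the paper.

m, c, k = 1000.0, 50.0, 4.0e5            # mass [kg], damping [N·s/m], stiffness [N/m]
beta, gamma = 0.25, 0.5                  # Newmark parameters (unconditionally stable)
dt, n = 0.001, 5000
t = np.arange(n + 1) * dt
f = 1.0e3 * np.sin(2 * np.pi * 2.0 * t)  # assumed forcing history [N]

u = np.zeros(n + 1); v = np.zeros(n + 1); a = np.zeros(n + 1)
a[0] = (f[0] - c * v[0] - k * u[0]) / m  # initial acceleration from equilibrium

k_hat = k + gamma / (beta * dt) * c + m / (beta * dt**2)
for i in range(n):
    # Effective load for the next step (linear-system Newmark formulation).
    p_hat = (f[i + 1]
             + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt) + (1 / (2 * beta) - 1) * a[i])
             + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                    + dt * (gamma / (2 * beta) - 1) * a[i]))
    u[i + 1] = p_hat / k_hat
    a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
    v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])

print(f"Peak acceleration: {np.max(np.abs(a)):.3f} m/s^2")
```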

ASME J. Risk Uncertainty Part B. 2017;4(3):031004-031004-12. doi:10.1115/1.4038170.

The influence of component errors on the final error is a key point in the error modeling of computer numerical control (CNC) machine tools. Nevertheless, the mechanism by which the errors in mechanical parts accumulate into the component errors and then impact the final error of the CNC machine tool has not been identified; identifying this mechanism is highly relevant to the precision design of CNC machines. In this study, error modeling based on the Jacobian-torsor theory is applied to determine how the fundamental errors in mechanical parts influence and accumulate into the comprehensive error of a single-axis assembly. First, a brief introduction to the Jacobian-torsor theory is provided. Next, the Jacobian-torsor model is applied to the error modeling of a single-axis assembly in a three-axis machining center. Furthermore, the comprehensive errors of the single-axis assembly are evaluated by Monte Carlo simulation based on the synthesized error model. The accuracy and efficiency of the Jacobian-torsor model are verified through a comparison between the simulation results and measured data from a batch of similar vertical machining centers. Based on the Jacobian-torsor model, the application of quantitative sensitivity analysis to the single-axis assembly is investigated, along with the contribution of key error sources to the synthesized error ranges of the single-axis assembly. This model provides a comprehensive method to identify the key error sources of the single-axis assembly and has the potential to enhance the tolerance/error allocation of the single axis and the whole machine tool.
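To illustrate the Monte Carlo evaluation step in general terms, the sketch below propagates a few assumed component error sources through a linear stack-up and reports variance shares as a simple sensitivity measure; the sensitivity coefficients merely stand in for a Jacobian-torsor mapping, and all numbers are hypothetical.

```python
import numpy as np

# Sketch: Monte Carlo evaluation of a synthesized assembly error as a linear
# stack-up of component errors. The sensitivity coefficients stand in for the
# rows of a Jacobian-torsor mapping; all values are illustrative assumptions.

rng = np.random.default_rng(0)
n = 100_000

# Assumed component error sources: (name, tolerance half-width [mm], sensitivity)
sources = [("guideway straightness",   0.008, 1.00),
           ("bearing runout",          0.005, 0.60),
           ("mounting-face flatness",  0.010, 0.35)]

samples = {name: rng.uniform(-tol, tol, n) for name, tol, _ in sources}
total = sum(sens * samples[name] for name, _, sens in sources)

print(f"Synthesized error range (99%): [{np.percentile(total, 0.5):+.4f}, "
      f"{np.percentile(total, 99.5):+.4f}] mm")
for name, _, sens in sources:
    # Independent sources: variance shares sum to one.
    share = (sens**2 * samples[name].var()) / total.var()
    print(f"  variance share of {name}: {share:.1%}")
```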

ASME J. Risk Uncertainty Part B. 2017;4(3):031005-031005-11. doi:10.1115/1.4038372.

In the present study, a general probabilistic design framework is developed for cyclic fatigue life prediction of metallic hardware using methods that address uncertainty in both the experimental data and the computational model. The methodology involves: (i) fatigue test data from coupons of Ti6Al4V material, (ii) continuum damage mechanics (CDM) based material constitutive models to simulate the cyclic fatigue behavior of the material, (iii) variance-based global sensitivity analysis, (iv) a Bayesian framework for model calibration and uncertainty quantification, and (v) computational life prediction and probabilistic design decision making under uncertainty. The outcomes of computational analyses using the experimental data demonstrate the feasibility of the probabilistic design methods for model calibration in the presence of incomplete and noisy data. Moreover, using probabilistic design methods enables assessment of the reliability of the fatigue life predicted by the computational models.
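A minimal sketch of Bayesian model calibration with noisy data is given below; it calibrates a single parameter of a Basquin-type S-N relation on a grid using synthetic coupon data, and is not the paper's CDM constitutive model or its Ti6Al4V dataset.

```python
import numpy as np

# Sketch: grid-based Bayesian calibration of a single fatigue-model parameter
# from noisy coupon data. The Basquin-type S-N model, the "true" parameter
# values, and the noise level are illustrative assumptions only.

rng = np.random.default_rng(1)

# Synthetic coupon data: log10(N_f) = a - b * log10(stress amplitude) + noise
a_true, b_true, sigma = 12.0, 3.0, 0.15
log_s = rng.uniform(2.4, 2.8, 20)                      # log10 of stress amplitude
log_n = a_true - b_true * log_s + rng.normal(0.0, sigma, 20)

# Posterior of the exponent b on a grid with a flat prior (a and sigma held fixed).
b_grid = np.linspace(2.0, 4.0, 401)
db = b_grid[1] - b_grid[0]
log_like = np.array([-0.5 * np.sum((log_n - (a_true - b * log_s)) ** 2) / sigma ** 2
                     for b in b_grid])
post = np.exp(log_like - log_like.max())
post /= post.sum() * db                                # normalize the density

b_mean = np.sum(b_grid * post) * db
b_std = np.sqrt(np.sum((b_grid - b_mean) ** 2 * post) * db)
print(f"Posterior of b: mean = {b_mean:.3f}, std = {b_std:.3f} "
      f"(data generated with b = {b_true})")
```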

ASME J. Risk Uncertainty Part B. 2018;4(3):031006-031006-12. doi:10.1115/1.4039148.

Cyber-physical systems (CPS) are physical systems whose individual components have functional identities in both physical and cyber spaces. Given the vastly diversified CPS components in dynamically evolving networks, designing an open and resilient architecture with flexibility and adaptability is thus important. To enable a resilience engineering approach to systems design, quantitative measures of resilience have been proposed by researchers. Yet, domain-dependent system performance metrics are required to quantify resilience. In this paper, generic system performance metrics for CPS are proposed, namely entropy, conditional entropy, and mutual information associated with the probabilities of successful prediction and communication. A new probabilistic design framework for CPS network architecture is also proposed for resilience engineering, in which several information fusion rules can be applied for data processing at the nodes. Sensitivities of the metrics with respect to the probabilistic measurements are studied. Fine-grained discrete-event simulation models of communication networks are used to demonstrate the applicability of the proposed metrics.
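The proposed metrics are standard information-theoretic quantities; the sketch below computes entropy, conditional entropy, and mutual information from an assumed joint probability table for a prediction/communication outcome (the values are illustrative only).

```python
import numpy as np

# Sketch: entropy, conditional entropy, and mutual information for a
# two-outcome prediction/communication event, using an assumed joint
# probability table (illustrative values, not the paper's network model).

p_xy = np.array([[0.45, 0.05],     # P(X = x, Y = y): rows = sent/true state,
                 [0.10, 0.40]])    # columns = received/predicted state

p_x = p_xy.sum(axis=1)
p_y = p_xy.sum(axis=0)

def entropy(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

h_x, h_y, h_xy = entropy(p_x), entropy(p_y), entropy(p_xy.ravel())
h_y_given_x = h_xy - h_x            # conditional entropy H(Y|X)
mi = h_x + h_y - h_xy               # mutual information I(X;Y)

print(f"H(X) = {h_x:.3f} bits, H(Y|X) = {h_y_given_x:.3f} bits, I(X;Y) = {mi:.3f} bits")
```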

Select Articles from Part A: Civil Engineering

Technical Papers

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000965.
Stochastic soil modeling aims to provide reasonable estimates of the mean, variance, and spatial correlation of soil properties with quantified uncertainty. Because of difficulties in integrating limited and imperfect prior knowledge (i.e., epistemic uncertainty) with observed site-specific information from tests (i.e., aleatoric uncertainty), obtaining a reasonably accurate estimate of the spatial correlation is significantly challenging. Possible reasons include (1) only sparse data being available (i.e., one-dimensional observations are collected at selected locations); and (2) from a physical point of view, the formation process of soil layers being considerably complex. This paper develops a Gaussian Markov random field (GMRF)-based modeling framework to describe the spatial correlation of soil properties conditional on observed electric cone penetration test (CPT) soundings at multiple locations. The model parameters are estimated using a novel stochastic partial differential equation (SPDE) approach and a fast Bayesian algorithm using the integrated nested Laplace approximation (INLA). An existing software library is used to implement the SPDE approach and the Bayesian estimation. A real-world example using 185 CPT soundings from Alameda County, California, is provided to demonstrate the developed method and examine its performance. The results from the proposed model framework are compared with the widely accepted covariance-based kriging method. The results indicate that the new approach generally outperforms the kriging method in predicting the long-range variability. In addition, a better understanding of the fine-scale variability along the depth is achieved by investigating one-dimensional residual processes at multiple locations.

Topics: Modeling, Soil
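For comparison purposes, the covariance-based kriging baseline mentioned in the abstract can be sketched in a few lines; the example below conditions a one-dimensional Gaussian field on a handful of depth observations using an assumed exponential covariance, and is not the paper's GMRF/SPDE/INLA implementation.

```python
import numpy as np

# Sketch: the covariance-based (simple kriging) baseline mentioned in the
# abstract, conditioning a 1D Gaussian field of a soil property on a few
# depth observations. The covariance model and data are assumed, not the
# Alameda County CPT dataset.

def exp_cov(d, sill=1.0, scale=2.0):
    """Exponential covariance (Matern with nu = 1/2), distance d in meters."""
    return sill * np.exp(-np.abs(d) / scale)

z_obs = np.array([1.0, 3.0, 6.0, 9.0])          # observation depths [m]
y_obs = np.array([0.4, 0.9, 0.3, -0.2])         # zero-mean property values (assumed)
z_new = np.linspace(0.0, 10.0, 101)             # prediction depths

K_oo = exp_cov(z_obs[:, None] - z_obs[None, :]) + 1e-6 * np.eye(len(z_obs))
K_no = exp_cov(z_new[:, None] - z_obs[None, :])

weights = np.linalg.solve(K_oo, y_obs)
y_pred = K_no @ weights                          # simple-kriging mean profile
print(f"Predicted value at 5 m depth: {y_pred[50]:+.3f}")
```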
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000950.
Metamodeling techniques have been widely used as substitutes for high-fidelity and time-consuming models in various engineering applications. Examples include polynomial chaos expansions, neural networks, kriging, and support vector regression (SVR). This paper attempts to compare the latter two in different case studies so as to assess their relative efficiency in simulation-based analyses. Similarities are drawn between these two metamodel types, leading to the use of anisotropy for SVR, a feature not commonly used in the SVR-related literature. Special care was given to proper automatic calibration of the model hyperparameters by using an efficient global search algorithm, namely the covariance matrix adaptation–evolution scheme. Variants of these two metamodels, associated with various kernel and autocorrelation functions, were first compared on analytical functions and then on finite element–based models. From the comprehensive comparison, it was concluded that anisotropy in the two metamodels clearly improves their accuracy. In general, anisotropic L2-SVR with the Matérn kernels was shown to be the most effective metamodel.

Topics: Structural engineering, Support vector machines
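One simple way to realize the anisotropy discussed in the abstract is to rescale each input dimension by its own length scale before applying an isotropic RBF kernel; the sketch below does this with scikit-learn's SVR on an assumed test function, and does not reproduce the paper's case studies or its CMA-ES hyperparameter calibration.

```python
import numpy as np
from sklearn.svm import SVR

# Sketch: emulating anisotropy in an RBF-kernel SVR by rescaling each input
# dimension with its own length scale before the isotropic kernel is applied.
# The test function, length scales, and hyperparameters are illustrative
# assumptions, not the paper's benchmarks or its automatic calibration.

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = np.sin(8.0 * X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0.0, 0.02, 200)

scales = np.array([0.1, 1.0])                   # assumed per-dimension length scales
svr = SVR(kernel="rbf", gamma=1.0, C=10.0, epsilon=0.01)
svr.fit(X / scales, y)                          # anisotropy via per-dimension rescaling

X_test = rng.uniform(0.0, 1.0, size=(50, 2))
y_true = np.sin(8.0 * X_test[:, 0]) + 0.1 * X_test[:, 1]
rmse = np.sqrt(np.mean((svr.predict(X_test / scales) - y_true) ** 2))
print(f"RMSE on held-out points: {rmse:.4f}")
```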
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000956.
Corrosion is one of the main causes of pipeline failure, which can have large social, economic, and environmental consequences. To mitigate this risk, pipeline operators perform regular inspections and repairs. The results of the inspections aid decision makers in determining the optimal maintenance strategy. However, there are many possible maintenance strategies and a large degree of uncertainty, leading to difficult decision making. This paper develops a framework to inform the decision of whether it is better over the long term to continuously repair defects as they become critical or to replace entire segments of the pipeline. The method uses a probabilistic analysis to determine the expected number of failures for each pipeline segment, which in turn informs the optimal decision. The proposed framework is tailored toward large volumes of in-line inspection data and multiple pipeline segments. A numerical example of a corroding upstream pipeline illustrates the method.

Topics: Maintenance, Pipeline systems, Decision making
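A minimal sketch of the repair-versus-replace comparison driven by expected failure counts is shown below; the defect population, failure probabilities, criticality threshold, and costs are all illustrative assumptions rather than the paper's numerical example.

```python
import numpy as np

# Sketch: comparing "repair defects as they become critical" against
# "replace the whole segment" using the expected number of failures over a
# planning horizon. Defect counts, per-defect failure probabilities, the
# criticality threshold, and costs are all illustrative assumptions.

rng = np.random.default_rng(3)
n_defects = 120                                   # defects reported by in-line inspection
p_fail = rng.beta(1.0, 60.0, n_defects)           # per-defect failure prob. over the horizon

critical = p_fail > 0.05                          # defects assumed to be repaired in time
exp_failures_repair = p_fail[~critical].sum()     # residual expected failures if repairing

cost_failure, cost_repair, cost_replace = 2.0e6, 5.0e4, 4.0e6
cost_repair_strategy = critical.sum() * cost_repair + exp_failures_repair * cost_failure
cost_replace_strategy = cost_replace              # replacement removes all defects

print(f"Expected failures without any intervention: {p_fail.sum():.2f}")
print(f"Repair-as-needed expected cost: ${cost_repair_strategy:,.0f}")
print(f"Segment replacement cost:       ${cost_replace_strategy:,.0f}")
```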
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000960.
This paper focuses on the development of an efficient system-level reliability-based design optimization strategy for uncertain wind-excited building systems characterized by high-dimensional design variable vectors (on the order of hundreds). Although a number of methods have been proposed over the last 15 years for the system-level reliability-based design optimization of building systems subject to stochastic excitation, few have treated problems characterized by more than a handful of design variables. This limits their applicability to practical problems of interest, such as the design optimization of high-rise buildings. To overcome this limitation, a simulation-based method is proposed in this work that is capable of solving reliability-based design optimization problems characterized by high-dimensional design variable vectors while considering system-level performance constraints. The framework is based on approximately decoupling the reliability analysis from the optimization loop through the definition of a system-level subproblem that can be fully defined from the results of a single simulation carried out at the current design point. To demonstrate the efficiency, practicality, and strong convergence properties of the proposed framework, a 40-story uncertain planar frame defined by 200 design variables is optimized under stochastic wind excitation.

Topics: Reliability-based optimization, Wind
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000964.
Fragility functions define the probability of meeting or exceeding some damage measure (DM) for a given level of engineering demand (e.g., base shear) or hazard intensity measure (IM; e.g., wind speed or peak ground acceleration). Empirical fragility functions specifically refer to fragility functions developed from posthazard damage assessments, and, as such, they define the performance of structures or systems as they exist in use and under true natural hazard loading. This paper describes major sources of epistemic uncertainty in empirical fragility functions for building performance under natural hazard loading, and develops and demonstrates methods for quantifying these uncertainties using Monte Carlo simulation. The uncertainties are demonstrated using a dataset of 1,241 residential structures damaged in the May 22, 2011, Joplin, Missouri, tornado. Uncertainties in the intensity measure (wind speed) estimates were the largest contributors to the overall uncertainty in the empirical fragility functions. With a sufficient number of samples, uncertainties due to potential misclassification of the observed damage levels and to sampling error were relatively small. The methods for quantifying uncertainty in empirical fragility functions are demonstrated using tornado damage observations, but are applicable to any other natural hazard as well.

Topics: Uncertainty, Disasters, Damage assessment
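A minimal sketch of constructing an empirical fragility function and quantifying its sampling uncertainty is given below; it fits a lognormal fragility curve to synthetic binary damage observations by maximum likelihood and bootstraps the fit, and does not use the Joplin dataset analyzed in the paper.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Sketch: fitting a lognormal fragility function
#   P(DM >= dm | IM) = Phi( ln(IM / theta) / beta )
# to binary damage observations by maximum likelihood, with a simple bootstrap
# to show sampling uncertainty. Wind-speed assignments, "true" parameters, and
# sample size are assumptions, not the paper's tornado damage survey.

rng = np.random.default_rng(4)
theta_true, beta_true = 60.0, 0.3                 # median capacity [m/s], log. dispersion
im = rng.uniform(30.0, 100.0, 500)                # assigned wind-speed estimates [m/s]
damaged = rng.random(500) < norm.cdf(np.log(im / theta_true) / beta_true)

def neg_log_like(params, im, y):
    theta, beta = params
    p = norm.cdf(np.log(im / theta) / beta).clip(1e-9, 1.0 - 1e-9)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1.0 - p))

def fit(im, y):
    res = minimize(neg_log_like, x0=[50.0, 0.5], args=(im, y),
                   bounds=[(10.0, 200.0), (0.05, 2.0)])
    return res.x

theta_hat, beta_hat = fit(im, damaged)

# Bootstrap resampling of the damage survey to quantify sampling uncertainty.
boot = []
for _ in range(200):
    idx = rng.integers(0, len(im), len(im))
    boot.append(fit(im[idx], damaged[idx]))
boot = np.array(boot)
lo, hi = np.percentile(boot[:, 0], [2.5, 97.5])

print(f"Median capacity theta: {theta_hat:.1f} m/s (95% bootstrap CI: {lo:.1f}-{hi:.1f} m/s)")
print(f"Lognormal dispersion beta: {beta_hat:.2f}")
```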
