
IN THIS ISSUE

Research Papers

ASME J. Risk Uncertainty Part B. 2018;4(4):041001-041001-7. doi:10.1115/1.4039243.

As a common system type, the multistate weighted k-out-of-n system is of great importance in reliability engineering. Its components are usually treated as mutually independent, but in real applications this is often not the case and the components are dependent. Moreover, component performance degrades over time, so the components' weights change as well. This paper therefore provides a method to evaluate the dynamic reliability of a multistate weighted k-out-of-n:G system with s-dependent components. The degradation of the components follows a Markov process, and the components are nonrepairable. A copula function is used to model the s-dependence of the components. Combining this with the LZ-transform for a discrete-state continuous-time Markov process, explicit expressions for the survival function and the mean time to failure (MTTF) of the system are obtained. A small electricity generating system is studied in the illustration, and a detailed comparison is made between the dependent and independent cases. Dynamic reliability with varied levels of electricity generation, conforming to the actual situation for this generating system, is also calculated.
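The copula-based dependence idea can be sketched numerically. Below is a minimal Monte Carlo illustration assuming a 1-out-of-2:G system, exponential component lifetimes, and a Clayton copula with hypothetical parameters (the paper's system is multistate and weighted, which this sketch does not capture):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
lam = np.array([0.10, 0.08])   # hypothetical component failure rates (1/h)
theta = 2.0                    # Clayton parameter (theta > 0: positive dependence)

# Sample (u1, u2) from a Clayton copula via the conditional-distribution method.
u1 = rng.uniform(size=n)
v = rng.uniform(size=n)
u2 = (u1**(-theta) * (v**(-theta / (1.0 + theta)) - 1.0) + 1.0)**(-1.0 / theta)

# Map uniform marginals to exponential component lifetimes.
t1 = -np.log(1.0 - u1) / lam[0]
t2 = -np.log(1.0 - u2) / lam[1]

# A 1-out-of-2:G system works while at least one component works.
mttf_dep = np.mean(np.maximum(t1, t2))

# Independent baseline for comparison.
t1i = rng.exponential(1.0 / lam[0], size=n)
t2i = rng.exponential(1.0 / lam[1], size=n)
mttf_ind = np.mean(np.maximum(t1i, t2i))

print(f"MTTF dependent:   {mttf_dep:.1f} h")
print(f"MTTF independent: {mttf_ind:.1f} h")
```

Positive dependence raises the chance that both components fail close together, so the dependent MTTF comes out below the independent one, mirroring the comparison reported in the paper.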

ASME J. Risk Uncertainty Part B. 2018;4(4):041002-041002-12. doi:10.1115/1.4039149.

Bayesian networks (BNs) have been studied in recent years for system diagnosis, reliability analysis, and design of complex engineered systems. In many practical applications, BNs must be learned from available data before being used for design or other purposes. Current BN learning algorithms are mainly developed for networks with only discrete variables, whereas engineering design problems often involve both discrete and continuous variables. This paper develops a framework to handle continuous variables in BN learning by integrating learning algorithms for discrete BNs with Gaussian mixture models (GMMs). We first make the topology learning more robust by optimizing the number of Gaussian components in the univariate GMMs currently available in the literature. Building on the learned topology, a new multivariate Gaussian mixture (MGM) strategy is developed to improve the accuracy of conditional probability learning in the BN. A method is proposed to address the difficulty of MGM modeling with mixed discrete and continuous data by mapping the data for discrete variables into data for a standard normal variable. The proposed framework is capable of learning BNs without discretizing the continuous variables or making assumptions about their conditional probability densities (CPDs). The applications of the learned BN to uncertainty quantification and model calibration are also investigated. The results of a mathematical example and an engineering application example demonstrate the effectiveness of the proposed framework.
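The mapping of discrete variables into standard-normal data can be illustrated with a toy helper. `discrete_to_normal` and its midpoint-quantile rule are assumptions for illustration, not the paper's exact algorithm:

```python
from statistics import NormalDist
from collections import Counter

def discrete_to_normal(samples):
    """Map each discrete category to the standard-normal value at the
    midpoint of its empirical-CDF interval (hypothetical helper that
    illustrates the mapping idea, not the paper's exact algorithm)."""
    n = len(samples)
    counts = Counter(samples)
    z, lo = {}, 0.0
    for cat in sorted(counts):
        hi = lo + counts[cat] / n
        z[cat] = NormalDist().inv_cdf((lo + hi) / 2.0)  # midpoint quantile
        lo = hi
    return [z[s] for s in samples], z

samples = [0] * 50 + [1] * 30 + [2] * 20       # hypothetical discrete data
mapped, table = discrete_to_normal(samples)
print({k: round(v, 4) for k, v in table.items()})
```

After the mapping, all variables live on a continuous scale, so a multivariate Gaussian mixture can be fitted to the joint data without discretizing the continuous variables.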

ASME J. Risk Uncertainty Part B. 2018;4(4):041003-041003-8. doi:10.1115/1.4039941.

The gear door lock system (GDLS) is a hydraulic and mechatronic system with a high degree of complexity and uncertainty, making performance assessment of the system especially intractable. We develop copula models to estimate the reliability of the GDLS with dependent failure modes. Based on the working principle of the GDLS, a kinematic and dynamic model with imperfect joints is built, in which Latin hypercube sampling (LHS) and kernel smoothing density estimation are utilized to obtain the marginal failure probabilities. Copula models are then utilized to describe the dependence between the two functional failure modes. Furthermore, for greater accuracy, mixed copula models are developed. The squared Euclidean distance is adopted to estimate the parameters of the above reliability models. Finally, Monte Carlo simulation is conducted to evaluate the different reliability models.

ASME J. Risk Uncertainty Part B. 2018;4(4):041004-041004-7. doi:10.1115/1.4039464.

A series of pedestrian sideswipe impacts was computationally reconstructed; a fast-walking pedestrian collided laterally with the side of a vehicle moving at 25 km/h or 40 km/h, which rotated the pedestrian's body axially. The potential severity of traumatic brain injury (TBI) was assessed using linear and rotational acceleration pulses applied to the head and by measuring intracranial brain tissue deformation. We found that the TBI risk due to the secondary head strike with the ground can be much greater than that due to the primary head strike with the vehicle. Further, an "effective" head mass, meff, was computed from the impulse and vertical velocity change involved in the secondary head strike; it mostly exceeded the mass of the adult head-form impactor (4.5 kg) commonly used in current regulatory impact tests for pedestrian safety assessment. Our results demonstrated that a sport utility vehicle (SUV) is more aggressive than a sedan due to differences in frontal shape. Additionally, the results highlight that the striking vehicle's velocity should be below 25 km/h at the moment of impact to exclude the potential risk of sustaining TBI, a risk that could be mitigated by actively controlling meff, because meff is closely associated with the rotational acceleration pulse applied to the head in the final event of ground contact.
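The "effective" head mass follows from the impulse-momentum relation m_eff = J / Δv. A minimal sketch with hypothetical numbers (not values from the study):

```python
# Effective head mass from the head-to-ground impulse and the vertical
# velocity change of the head. Both inputs are illustrative assumptions.
impulse_Ns = 27.0        # hypothetical vertical impulse during ground strike (N*s)
delta_v_ms = 5.0         # hypothetical vertical velocity change of the head (m/s)

m_eff = impulse_Ns / delta_v_ms
print(f"m_eff = {m_eff:.2f} kg (regulatory head-form impactor: 4.5 kg)")
```

With these assumed numbers, m_eff exceeds the 4.5 kg head-form mass, which is the kind of discrepancy the abstract points to for the ground-contact phase.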

ASME J. Risk Uncertainty Part B. 2018;4(4):041005-041005-7. doi:10.1115/1.4039465.

Valuation of transactive energy (TE) systems should be supported by a structured and systematic approach to uncertainty identification, assessment, and treatment in the interest of risk-informed decision making. The proposed approach, a variation of fault tree analysis, is intended to support valuation analysts in analyzing conventional and transactive system scenarios. It allows the entire tree to be expanded to the level of minute details or collapsed to a level sufficient for an overview of the problem. The quantification scheme for the described approach lends itself to valuation. The method complements value exchange analysis, simulation, and field demonstration studies. The practicality of the proposed approach is demonstrated through an uncertainty assessment of the Smart Grid Interoperability Panel peak heat day scenario.
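Generic fault tree quantification of the kind the proposed variation builds on can be sketched as follows, assuming independent basic events with hypothetical probabilities and a toy two-cut-set tree (not the peak heat day scenario itself):

```python
from itertools import combinations

# Basic-event probabilities (hypothetical values for illustration).
p = {"A": 0.02, "B": 0.05, "C": 0.01}

# Minimal cut sets of a small fault tree: top = A OR (B AND C).
cut_sets = [("A",), ("B", "C")]

def cut_prob(cs):
    prob = 1.0
    for e in cs:
        prob *= p[e]
    return prob

# Exact top-event probability by inclusion-exclusion over cut sets,
# assuming independent basic events.
top = 0.0
for k in range(1, len(cut_sets) + 1):
    for combo in combinations(cut_sets, k):
        events = set().union(*[set(c) for c in combo])
        prob = 1.0
        for e in events:
            prob *= p[e]
        top += (-1) ** (k + 1) * prob

rare = sum(cut_prob(cs) for cs in cut_sets)   # rare-event upper bound
print(f"top event: {top:.6f}  (rare-event approx: {rare:.6f})")
```

The rare-event sum is the common conservative shortcut; inclusion-exclusion gives the exact value when basic events are independent.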

ASME J. Risk Uncertainty Part B. 2018;4(4):041006-041006-17. doi:10.1115/1.4039558.

This paper examines the variability of predicted responses when multiple stress–strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (QOIs) (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity's behavior varies according to the particular stress–strain curves used for the materials in the model. We desire to estimate or bound response variation when only a few stress–strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the presiding random-function source. A simple classical statistical method, tolerance intervals (TIs), is tested for effectively treating sparse stress–strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-normal response distributions in the can crush problem. The results and discussion in this paper support a proposition that the method will apply similarly well for other sparsely sampled random variable or function data, whether from experiments or models. The simple TI method is also demonstrated to be very economical.
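One classical TI variant, the distribution-free interval based on the sample minimum and maximum, has a closed-form confidence. This sketch (not necessarily the specific TI construction used in the paper) shows how many replicate samples a 90%/90% two-sided interval requires:

```python
def minmax_ti_confidence(n, p):
    """Confidence that the [min, max] of n i.i.d. draws covers at least
    a proportion p of the population (distribution-free result based on
    the Beta(n-1, 2) distribution of the sample-range coverage)."""
    return 1.0 - n * p**(n - 1) + (n - 1) * p**n

# Smallest n for which the sample envelope is a 90%/90% tolerance interval.
n = 2
while minmax_ti_confidence(n, 0.90) < 0.90:
    n += 1
print(n, round(minmax_ti_confidence(n, 0.90), 4))
```

The required n = 38 illustrates the paper's point: with only a few stress-strain curves, a naive sample envelope badly understates the response spread, which is exactly what TI corrections are designed to account for.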

ASME J. Risk Uncertainty Part B. 2018;4(4):041007-041007-8. doi:10.1115/1.4039467.

The development of robust and adaptable methods of grasping force optimization (GFO) is an important consideration for robotic devices, especially those which are designed to interact naturally with a variety of objects. Along with considerations for the computational efficiency of such methods, it is also important to ensure that a GFO approach chooses forces which can produce a stable grasp even in the presence of uncertainty. This paper examines the robustness of three methods of GFO in the presence of variability in the contact locations and in the coefficients of friction between the hand and the object. A Monte Carlo simulation is used to determine the resulting probability of failure and sensitivity levels when variability is introduced. Two numerical examples representing two common grasps performed by the human hand are used to demonstrate the performance of the optimization methods. Additionally, the method which yields the best overall performance is also tested to determine its consistency when force is applied to the object's center of mass in different directions. The results show that both the nonlinear and the linear matrix inequality (LMI) methods of GFO produce acceptable results, whereas the linear method produces unacceptably high probabilities of failure. Further, the nonlinear method continues to produce acceptable results even when the direction of the applied force is changed. Based on these results, the nonlinear method of GFO is considered to be robust in the presence of variability in the contact locations and coefficients of friction.
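The Monte Carlo robustness check can be sketched for a single contact: sample the friction coefficient and count how often a planned contact force leaves the friction cone. The single-contact simplification and all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical planned contact force at one fingertip (contact frame):
# normal component f_n and tangential magnitude f_t, in newtons.
f_n, f_t = 10.0, 3.0

# Variability in the friction coefficient (illustrative distribution).
mu = rng.normal(0.4, 0.05, size=n)

# The grasp fails at this contact when the force leaves the friction cone,
# i.e., when the tangential force exceeds the available friction.
fails = f_t > mu * f_n
print(f"P(slip) ~ {fails.mean():.4f}")
```

A full GFO robustness study repeats this over all contacts and also perturbs the contact locations, then reports the joint probability of failure for each optimization method.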

ASME J. Risk Uncertainty Part B. 2018;4(4):041008-041008-13. doi:10.1115/1.4039784.

Optimal sizing of peak loads has proven to be an important factor affecting the overall energy consumption of heating, ventilation, and air-conditioning (HVAC) systems. Uncertainty quantification of peak loads enables optimal configuration of the system by opting for a suitable size factor. However, the representation of uncertainty in HVAC sizing has been limited to probabilistic analysis and scenario-based cases, which may limit and bias the results. This study provides a framework for uncertainty representation in building energy modeling due to both random factors and imprecise knowledge. The framework is demonstrated through a numerical case study of sizing cooling loads, in which uncertain climatic data are represented by probability distributions and human-driven activities are described by possibility distributions. Cooling loads obtained from the hybrid probabilistic–possibilistic propagation of uncertainty are compared to those obtained by pure probabilistic and pure possibilistic approaches. Results indicate that a pure possibilistic representation may not provide detailed information on the peak cooling loads, whereas a pure probabilistic approach may underestimate the effect of uncertain human behavior. The proposed hybrid representation and propagation of uncertainty can overcome these issues by properly handling both random and imprecise data.
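Hybrid propagation can be sketched by pairing random samples with alpha-cut intervals of a possibilistic variable. The additive load model and all numbers below are illustrative assumptions, not the paper's building model:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Random part: climate-driven load (hypothetical aleatory model, kW).
x = rng.normal(20.0, 2.0, size=n)

# Possibilistic part: occupant-driven load as a triangular possibility
# distribution (lo, mode, hi), illustrative numbers in kW.
lo, mode, hi = 2.0, 5.0, 9.0

alpha = 0.5                            # alpha-cut level
y_lo = lo + alpha * (mode - lo)        # cut lower bound
y_hi = hi - alpha * (hi - mode)        # cut upper bound

# Hybrid propagation: each random sample carries an interval of loads.
q_lo = x + y_lo
q_hi = x + y_hi

# Interval bounds on the 95th-percentile design cooling load.
p95 = (np.quantile(q_lo, 0.95), np.quantile(q_hi, 0.95))
print(f"95th-percentile load in [{p95[0]:.1f}, {p95[1]:.1f}] kW at alpha={alpha}")
```

Repeating this over all alpha levels yields lower/upper probability bounds, whereas a pure probabilistic treatment would collapse the occupant interval to a single distribution and hide part of the epistemic spread.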

ASME J. Risk Uncertainty Part B. 2018;4(4):041009-041009-9. doi:10.1115/1.4039471.

A novel uncertainty quantification routine in the genre of adaptive sparse grid stochastic collocation (SC) is proposed in this study to investigate the propagation of parametric uncertainties in a stall flutter aeroelastic system. In a hypercube stochastic domain, the presence of strong nonlinearities can give way to steep solution gradients that adversely affect the convergence of nonadaptive sparse grid collocation schemes. A new adaptive scheme is proposed here that allows for accelerated convergence by clustering more discretization points in regimes characterized by steep fronts, using hat-like basis functions with nonequidistant nodes. The proposed technique is applied to a nonlinear stall flutter aeroelastic system to quantify the propagation of multiparametric uncertainty from both structural and aerodynamic parameters. Their relative importance on the stochastic response is presented through a sensitivity analysis.

ASME J. Risk Uncertainty Part B. 2018;4(4):041010-041010-9. doi:10.1115/1.4039357.

Go-karts are a common amusement park feature enjoyed by people of all ages. While intended for racing, contact between go-karts does occur. To investigate and quantify the accelerations and forces which result from contact, 44 low-speed impacts were conducted between a stationary (target) and a moving (bullet) go-kart. The occupant of the bullet go-kart was one of two human volunteers. The occupant of the target go-kart was a Hybrid III 50th percentile male anthropomorphic test device (ATD). Impact configurations consisted of rear-end impacts, frontal impacts, side impacts, and oblique impacts. Results demonstrated high repeatability for the vehicle performance and occupant response. Go-kart accelerations and speed changes increased with increased impact speed. Impact duration and restitution generally decreased with increased impact speed. All ATD acceleration, force, and moment values increased with increased impact speed. Common injury metrics such as the head injury criterion (HIC), Nij, and Nkm were calculated and were found to be below injury thresholds. Occupant response was also compared to published activities of daily living data.
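The head injury criterion mentioned above can be computed directly from an acceleration pulse: HIC is the maximum over time windows of (t2 - t1) times the mean acceleration (in g) raised to the 2.5 power. A brute-force sketch with a hypothetical half-sine pulse (not data from the tests):

```python
import numpy as np

def hic(a_g, dt_s, max_window=0.015):
    """Head injury criterion for an acceleration pulse a_g (in g),
    uniformly sampled every dt_s seconds; brute force over all
    window endpoints up to max_window (15 ms for HIC15)."""
    best = 0.0
    n = len(a_g)
    max_steps = int(round(max_window / dt_s))
    for i in range(n - 1):
        integral = 0.0
        for j in range(i + 1, min(n, i + max_steps + 1)):
            integral += 0.5 * (a_g[j - 1] + a_g[j]) * dt_s   # trapezoid rule
            window = (j - i) * dt_s
            best = max(best, window * (integral / window) ** 2.5)
    return best

# Hypothetical half-sine head pulse: 60 g peak, 10 ms duration.
dt = 5e-5
t = np.arange(0.0, 0.010 + dt, dt)
pulse = 60.0 * np.sin(np.pi * np.clip(t, 0.0, 0.010) / 0.010)
hic15 = hic(pulse, dt)
print(f"HIC15 = {hic15:.0f}  (a commonly cited threshold is 700)")
```

For this assumed pulse the optimal window is an interior slice of the pulse, not its full duration, which is why the search over both endpoints matters.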

Select Articles from Part A: Civil Engineering

Technical Papers

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000965.
Abstract

Stochastic soil modeling aims to provide reasonable mean, variance, and spatial correlation of soil properties with quantified uncertainty. Because of difficulties in integrating limited and imperfect prior knowledge (i.e., epistemic uncertainty) with observed site-specific information from tests (i.e., aleatoric uncertainty), obtaining a reasonably accurate estimate of the spatial correlation is significantly challenging. Possible reasons include (1) only sparse data being available (i.e., one-dimensional observations are collected at selected locations); and (2) from a physical point of view, the formation process of soil layers is considerably complex. This paper develops a Gaussian Markov random field (GMRF)-based modeling framework to describe the spatial correlation of soil properties conditional on observed electric cone penetration test (CPT) soundings at multiple locations. The model parameters are estimated using a novel stochastic partial differential equation (SPDE) approach and a fast Bayesian algorithm using the integrated nested Laplace approximation (INLA). An existing software library is used to implement the SPDE approach and Bayesian estimation. A real-world example using 185 CPT soundings from Alameda County, California, is provided to demonstrate the developed method and examine its performance. The analyzed results from the proposed model framework are compared with the widely accepted covariance-based kriging method. The results indicate that the new approach generally outperforms the kriging method in predicting the long-range variability. In addition, a better understanding of the fine-scale variability along the depth is achieved by investigating one-dimensional residual processes at multiple locations.
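The covariance-based baseline can be sketched as simple kriging with an exponential covariance; the 1D depth profile and hyperparameters below are illustrative assumptions, not the CPT dataset:

```python
import numpy as np

# Exponential covariance, the classical stationary model used in kriging.
def exp_cov(h, sigma2=1.0, ell=5.0):
    return sigma2 * np.exp(-np.abs(h) / ell)

z_obs = np.array([2.1, 2.6, 3.0, 2.4])      # soil property at known depths
d_obs = np.array([1.0, 4.0, 7.0, 10.0])     # depths (m)
d_new = 5.5                                  # prediction depth (m)

C = exp_cov(d_obs[:, None] - d_obs[None, :])  # observation covariance matrix
c0 = exp_cov(d_obs - d_new)                   # covariance with target depth

# Simple-kriging weights (known mean assumed) and prediction.
mean = z_obs.mean()
w = np.linalg.solve(C, c0)
z_hat = mean + w @ (z_obs - mean)
var = exp_cov(0.0) - w @ c0                   # kriging variance
print(f"z({d_new}) ~ {z_hat:.2f}, variance {var:.3f}")
```

Note the exponential covariance in 1D is Markovian, so the interior prediction is screened by its two nearest neighbors; the GMRF/SPDE approach of the paper exploits exactly this kind of Markov structure, but at field scale.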

Topics:
Modeling , Soil
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000950.
Abstract

Metamodeling techniques have been widely used as substitutes for high-fidelity and time-consuming models in various engineering applications. Examples include polynomial chaos expansions, neural networks, kriging, and support vector regression (SVR). This paper attempts to compare the latter two in different case studies so as to assess their relative efficiency on simulation-based analyses. Similarities are drawn between these two metamodel types, leading to the use of anisotropy for SVR. Such a feature is not commonly used in the SVR-related literature. Special care was given to a proper automatic calibration of the model hyperparameters by using an efficient global search algorithm, namely the covariance matrix adaptation evolution strategy. Variants of these two metamodels, associated with various kernel and autocorrelation functions, were first compared on analytical functions and then on finite element–based models. From the comprehensive comparison, it was concluded that anisotropy in the two metamodels clearly improves their accuracy. In general, anisotropic L2-SVR with the Matérn kernels was shown to be the most effective metamodel.
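Anisotropy amounts to giving each input dimension its own length scale inside the kernel. A sketch of an anisotropic Matérn 5/2 kernel (the general kernel family; the length-scale values are illustrative):

```python
import numpy as np

def matern52_aniso(x1, x2, lengthscales):
    """Anisotropic Matérn 5/2 kernel: each input dimension gets its own
    length scale, so the distance metric is an ellipse, not a sphere."""
    ell = np.asarray(lengthscales, dtype=float)
    r = np.sqrt(np.sum(((x1 - x2) / ell) ** 2))
    return (1.0 + np.sqrt(5.0) * r + 5.0 * r**2 / 3.0) * np.exp(-np.sqrt(5.0) * r)

x1 = np.array([0.0, 0.0])
x2 = np.array([1.0, 1.0])
k_iso = matern52_aniso(x1, x2, [1.0, 1.0])    # isotropic special case
k_aniso = matern52_aniso(x1, x2, [1.0, 10.0]) # second dimension smoother
print(k_iso, k_aniso)
```

A longer length scale in one dimension makes the kernel decay more slowly along it, which is what lets the calibrated metamodel adapt to inputs of unequal influence.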

Topics:
Structural engineering , Support vector machines
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000956.
Abstract

Corrosion is one of the main causes of pipeline failure, which can have large social, economic, and environmental consequences. To mitigate this risk, pipeline operators perform regular inspections and repairs. The results of the inspections aid decision makers in determining the optimal maintenance strategy. However, there are many possible maintenance strategies, and a large degree of uncertainty, leading to difficult decision making. This paper develops a framework to inform the decision of whether it is better over the long term to continuously repair defects as they become critical or to just replace entire segments of the pipeline. The method uses a probabilistic analysis to determine the expected number of failures for each pipeline segment. The expected number of failures informs the optimal decision. The proposed framework is tailored toward mass amounts of in-line inspection data and multiple pipeline segments. A numerical example of a corroding upstream pipeline illustrates the method.
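The expected number of failures per segment can be estimated by propagating uncertain corrosion growth over the inspection data. All quantities below (defect depths, growth-rate distribution, failure criterion) are illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)
n_sims = 20_000
horizon_yr = 10
wall_t = 10.0                                # wall thickness (mm)
crit_frac = 0.8                              # failure when depth > 80% of wall

# Hypothetical in-line inspection data for one segment: measured defect
# depths (mm) and uncertain corrosion growth rates (mm/yr).
depths = np.array([2.0, 3.5, 5.0, 6.5])
rates = rng.gamma(shape=2.0, scale=0.15, size=(n_sims, len(depths)))

# Grow each defect over the horizon and count critical defects per run.
final = depths + rates * horizon_yr
failures = (final > crit_frac * wall_t).sum(axis=1)
print(f"expected failures in {horizon_yr} yr: {failures.mean():.2f}")
```

Comparing this expected-failure count (and its repair cost) across segments against the cost of wholesale replacement is the core of the repair-versus-replace decision the paper formalizes.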

Topics:
Maintenance , Pipeline systems , Decision making
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000960.
Abstract

This paper is focused on the development of an efficient system-level reliability-based design optimization strategy for uncertain wind-excited building systems characterized by high-dimensional design variable vectors (in the order of hundreds). Indeed, although a number of methods have been proposed over the last 15 years for the system-level reliability-based design optimization of building systems subject to stochastic excitation, few have treated problems characterized by more than a handful of design variables. This limits their applicability to practical problems of interest, such as the design optimization of high-rise buildings. To overcome this limitation, a simulation-based method is proposed in this work that is capable of solving reliability-based design optimization problems characterized by high-dimensional design variable vectors while considering system-level performance constraints. The framework is based on approximately decoupling the reliability analysis from the optimization loop through the definition of a system-level subproblem that can be fully defined from the results of a single simulation carried out at the current design point. To demonstrate the efficiency, practicality, and strong convergence properties of the proposed framework, a 40-story uncertain planar frame defined by 200 design variables is optimized under stochastic wind excitation.

Topics:
Reliability-based optimization , Wind
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(2):. doi:10.1061/AJRUA6.0000964.
Abstract

Fragility functions define the probability of meeting or exceeding some damage measure (DM) for a given level of engineering demand (e.g., base shear) or hazard intensity measure (IM; e.g., wind speed, peak ground acceleration). Empirical fragility functions specifically refer to fragility functions that are developed from posthazard damage assessments, and, as such, they define the performance of structures or systems as they exist in use and under true natural hazard loading. This paper describes major sources of epistemic uncertainty in empirical fragility functions for building performance under natural hazard loading, and develops and demonstrates methods for quantifying these uncertainties using Monte Carlo simulation. Uncertainties are demonstrated using a dataset of 1,241 residential structures damaged in the May 22, 2011, Joplin, Missouri, tornado. Uncertainties in the intensity measure (wind speed) estimates were the largest contributors to the overall uncertainty in the empirical fragility functions. With a sufficient number of samples, uncertainties due to potential misclassification of the observed damage levels and due to sampling error were relatively small. The methods for quantifying uncertainty in empirical fragility functions are demonstrated using tornado damage observations, but they are applicable to any other natural hazard as well.
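An empirical fragility function of the common lognormal form P(DM >= dm | v) = Phi(ln(v/theta)/beta) can be fitted by maximum likelihood; the binned damage counts below are invented for illustration, not the Joplin dataset:

```python
import numpy as np
from statistics import NormalDist

Phi = NormalDist().cdf

# Hypothetical binned observations: wind speed (m/s), buildings surveyed,
# and buildings at or above the damage state of interest.
v = np.array([30.0, 40.0, 50.0, 60.0, 70.0])
n_obs = np.array([200, 300, 350, 250, 140])
n_dam = np.array([10, 60, 180, 200, 133])

def neg_log_like(theta, beta):
    """Binomial negative log-likelihood of a lognormal fragility curve."""
    p = np.array([Phi(np.log(vi / theta) / beta) for vi in v])
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -np.sum(n_dam * np.log(p) + (n_obs - n_dam) * np.log(1 - p))

# Coarse grid-search MLE for the fragility parameters (theta, beta).
thetas = np.linspace(35.0, 65.0, 121)
betas = np.linspace(0.1, 0.8, 71)
nll = np.array([[neg_log_like(t, b) for b in betas] for t in thetas])
i, j = np.unravel_index(nll.argmin(), nll.shape)
print(f"median capacity theta = {thetas[i]:.1f} m/s, beta = {betas[j]:.2f}")
```

To propagate epistemic uncertainty in the manner the paper describes, this fit would be repeated over Monte Carlo replicates with perturbed wind speeds and randomly misclassified damage labels, yielding a family of curves rather than a single one.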

Topics:
Uncertainty , Disasters , Damage assessment