
IN THIS ISSUE

### Research Papers

ASME J. Risk Uncertainty Part B. 2018;4(4):041001-041001-7. doi:10.1115/1.4039243.

The multistate weighted k-out-of-n system is a common system type of great importance in reliability engineering. Its components are usually treated as independent of one another, but in real life this is often not the case and the components are dependent. Moreover, component performance degrades over time, causing the components' weights to change as well. This paper therefore provides a method to evaluate the dynamic reliability of a multistate weighted k-out-of-n:G system with s-dependent components. The degradation of the components follows a Markov process, and the components are nonrepairable. A copula function is used to model the s-dependence of the components. Combining this with the LZ-transform for a discrete-state continuous-time Markov process, explicit expressions for the survival function and the mean time to failure (MTTF) of the system are obtained. A small electricity-generating system is studied to illustrate the method, and a detailed comparison is made between the dependent and independent cases. Dynamic reliability with varied levels of electricity generation, conforming to the actual situation for this generating system, is also calculated.
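
The effect of copula-induced dependence on a weighted k-out-of-n:G system can be illustrated with a brute-force Monte Carlo sketch. This is not the paper's LZ-transform/Markov-degradation method: components here are binary (working/failed) with exponential lifetimes coupled by a Gaussian copula, and all weights, rates, and the threshold k are illustrative.

```python
import numpy as np
from scipy.stats import norm

def weighted_kn_reliability(t, weights, rates, k, rho, n_sim=200_000, seed=0):
    """Monte Carlo reliability at time t of a binary weighted k-out-of-n:G
    system whose exponential component lifetimes are coupled by a Gaussian
    copula with common pairwise correlation rho (rho=0: independence)."""
    rng = np.random.default_rng(seed)
    n = len(weights)
    cov = np.full((n, n), rho)
    np.fill_diagonal(cov, 1.0)
    z = rng.multivariate_normal(np.zeros(n), cov, size=n_sim)
    u = norm.cdf(z)                                  # correlated uniforms
    lifetimes = -np.log1p(-u) / np.asarray(rates)    # inverse exponential CDF
    surviving = (lifetimes > t).astype(float)
    total_weight = surviving @ np.asarray(weights, dtype=float)
    return float(np.mean(total_weight >= k))

# Illustrative numbers: the system works while surviving weight >= 5
r_indep = weighted_kn_reliability(1.0, [2, 3, 4], [0.5, 0.4, 0.3], k=5, rho=0.0)
r_dep = weighted_kn_reliability(1.0, [2, 3, 4], [0.5, 0.4, 0.3], k=5, rho=0.8)
```

Comparing `r_indep` against `r_dep` reproduces, in miniature, the dependent-versus-independent comparison the paper performs for its generating system.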

Commentary by Dr. Valentin Fuster
ASME J. Risk Uncertainty Part B. 2018;4(4):041002-041002-12. doi:10.1115/1.4039149.

Bayesian networks (BNs) have been studied in recent years for system diagnosis, reliability analysis, and design of complex engineered systems. In many practical applications, BNs need to be learned from available data before being used for design or other purposes. Current BN learning algorithms are mainly developed for networks with only discrete variables, whereas engineering design problems often involve both discrete and continuous variables. This paper develops a framework to handle continuous variables in BN learning by integrating learning algorithms of discrete BNs with Gaussian mixture models (GMMs). We first make the topology learning more robust by optimizing the number of Gaussian components in the univariate GMMs currently available in the literature. Building on the learned topology, a new multivariate Gaussian mixture (MGM) strategy is developed to improve the accuracy of conditional probability learning in the BN. A method is proposed to address the difficulty of MGM modeling with data of mixed discrete and continuous variables by mapping the data for discrete variables into data for a standard normal variable. The proposed framework is capable of learning BNs without discretizing the continuous variables or making assumptions about their conditional probability densities (CPDs). The applications of the learned BN to uncertainty quantification and model calibration are also investigated. The results of a mathematical example and an engineering application example demonstrate the effectiveness of the proposed framework.
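
The abstract's first step, optimizing the number of Gaussian components in a univariate GMM, can be sketched with BIC-based model selection. The synthetic data and the use of scikit-learn are assumptions for illustration, not the paper's implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic continuous variable: a well-separated two-component mixture
x = np.concatenate([rng.normal(-2.0, 0.5, 400),
                    rng.normal(3.0, 1.0, 600)]).reshape(-1, 1)

def best_gmm(x, max_components=6):
    """Fit univariate GMMs with 1..max_components components and keep
    the one minimizing the Bayesian information criterion (BIC)."""
    fits = [GaussianMixture(n_components=k, random_state=0).fit(x)
            for k in range(1, max_components + 1)]
    return min(fits, key=lambda m: m.bic(x))

model = best_gmm(x)   # BIC typically recovers the two components here
```

Selecting the component count by BIC, rather than fixing it, is what makes the subsequent topology learning robust to the shape of each continuous variable's marginal.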

ASME J. Risk Uncertainty Part B. 2018;4(4):041003-041003-8. doi:10.1115/1.4039941.

The gear door lock system (GDLS) is a hydraulic and mechatronic system with a high degree of complexity and uncertainty, making performance assessment of the system especially intractable. We develop copula models to estimate the reliability of the GDLS with dependent failure modes. Based on the working principle of the GDLS, a kinematic and dynamic model with imperfect joints is built, in which Latin hypercube sampling (LHS) and kernel smoothing density estimation are utilized to obtain the marginal failure probabilities. Copula models are then used to describe the dependence between the two functional failure modes, and mixed copula models are developed for greater accuracy. The squared Euclidean distance is adopted to estimate the parameters of these reliability models. Finally, Monte Carlo simulation is conducted to evaluate the different reliability models.

ASME J. Risk Uncertainty Part B. 2018;4(4):041004-041004-7. doi:10.1115/1.4039464.

A series of pedestrian sideswipe impacts were computationally reconstructed; a fast-walking pedestrian collided laterally with the side of a vehicle moving at 25 km/h or 40 km/h, which rotated the pedestrian's body axially. The potential severity of traumatic brain injury (TBI) was assessed using the linear and rotational acceleration pulses applied to the head and by measuring intracranial brain tissue deformation. We found that the TBI risk due to the secondary head strike with the ground can be much greater than that due to the primary head strike with the vehicle. Further, an "effective" head mass, meff, was computed from the impulse and vertical velocity change involved in the secondary head strike; it mostly exceeded the mass of the adult head-form impactor (4.5 kg) commonly used in current regulatory impact tests for pedestrian safety assessment. Our results demonstrated that a sport utility vehicle (SUV) is more aggressive than a sedan owing to differences in frontal shape. Additionally, the striking vehicle's velocity should be lower than 25 km/h at the moment of impact to exclude the potential risk of sustaining TBI; this risk could be mitigated by actively controlling meff, because meff is closely associated with the rotational acceleration pulse applied to the head during the final event of ground contact.
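
The effective head mass the abstract describes is defined from the impulse and the vertical velocity change of the secondary head strike. The numbers below are hypothetical, chosen only to show the arithmetic and how the result compares with the 4.5 kg head-form impactor.

```python
def effective_head_mass(impulse, delta_v):
    """m_eff = J / dv: effective mass (kg) from the impulse J (N*s) and
    the vertical velocity change dv (m/s) of the head-ground strike."""
    return impulse / delta_v

# Hypothetical values: a 27 N*s impulse over a 4.5 m/s velocity change
# gives m_eff = 6.0 kg, above the 4.5 kg regulatory head-form mass.
m_eff = effective_head_mass(27.0, 4.5)
```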

ASME J. Risk Uncertainty Part B. 2018;4(4):041005-041005-7. doi:10.1115/1.4039465.

Valuation of transactive energy (TE) systems should be supported by a structured and systematic approach to uncertainty identification, assessment, and treatment in the interest of risk-informed decision making. The proposed approach, a variation of fault tree analysis, is intended to support valuation analysts in analyzing conventional and transactive system scenarios. It allows the entire tree to be expanded to the level of minute detail or collapsed to a level just sufficient for an overview of the problem. The quantification scheme for the described approach lends itself to valuation, and the method complements value exchange analysis, simulation, and field demonstration studies. The practicality of the proposed approach is demonstrated through an uncertainty assessment of the smart grid interoperability panel peak heat day scenario.
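
The quantification step of a fault-tree variant like this rests on standard gate algebra. The sketch below shows only that algebra for independent basic events; the tree structure and probabilities are made up for illustration and are not the paper's transactive-energy tree.

```python
import math

# Standard fault-tree gate quantification for independent basic events:
# an AND gate multiplies probabilities; an OR gate combines them as
# 1 - prod(1 - p).

def gate_and(ps):
    return math.prod(ps)

def gate_or(ps):
    return 1.0 - math.prod(1.0 - p for p in ps)

p_branch_a = gate_and([0.1, 0.2])          # both basic events must occur
p_branch_b = gate_or([0.05, 0.05])         # either basic event suffices
p_top = gate_or([p_branch_a, p_branch_b])  # collapsed top-event probability
```

Collapsing a subtree amounts to replacing it with its computed probability, which is why the expand/collapse view and the quantification scheme coexist naturally.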

ASME J. Risk Uncertainty Part B. 2018;4(4):041006-041006-17. doi:10.1115/1.4039558.

This paper examines the variability of predicted responses when multiple stress–strain curves (reflecting variability from replicate material tests) are propagated through a finite element model of a ductile steel can being slowly crushed. Over 140 response quantities of interest (QOIs) (including displacements, stresses, strains, and calculated measures of material damage) are tracked in the simulations. Each response quantity's behavior varies according to the particular stress–strain curves used for the materials in the model. We desire to estimate or bound response variation when only a few stress–strain curve samples are available from material testing. Propagation of just a few samples will usually result in significantly underestimated response uncertainty relative to propagation of a much larger population that adequately samples the presiding random-function source. A simple classical statistical method, tolerance intervals (TIs), is tested for effectively treating sparse stress–strain curve data. The method is found to perform well on the highly nonlinear input-to-output response mappings and non-normal response distributions in the can crush problem. The results and discussion in this paper support a proposition that the method will apply similarly well for other sparsely sampled random variable or function data, whether from experiments or models. The simple TI method is also demonstrated to be very economical.
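
A tolerance interval of the kind the abstract tests can be sketched as follows. This uses the classical normal-theory two-sided interval with Howe's k-factor approximation, one common construction (the paper may use another variant), applied to synthetic stand-ins for a few replicate test values.

```python
import numpy as np
from scipy.stats import chi2, norm

def normal_tolerance_interval(sample, coverage=0.95, confidence=0.90):
    """Two-sided normal tolerance interval via Howe's k-factor: with the
    stated confidence, the interval contains at least `coverage` of the
    population the sparse sample was drawn from."""
    x = np.asarray(sample, dtype=float)
    n = x.size
    z = norm.ppf(0.5 * (1.0 + coverage))
    k = z * np.sqrt((n - 1) * (1.0 + 1.0 / n) / chi2.ppf(1.0 - confidence, n - 1))
    return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

# Eight synthetic "replicate test" values standing in for sparse data
rng = np.random.default_rng(2)
lo, hi = normal_tolerance_interval(rng.normal(100.0, 5.0, size=8))
```

Because the k-factor grows as the sample shrinks, the interval widens to compensate for sparse data, which is exactly the guard against underestimated response uncertainty that the paper advocates.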

ASME J. Risk Uncertainty Part B. 2018;4(4):041007-041007-8. doi:10.1115/1.4039467.

The development of robust and adaptable methods of grasping force optimization (GFO) is an important consideration for robotic devices, especially those which are designed to interact naturally with a variety of objects. Along with considerations for the computational efficiency of such methods, it is also important to ensure that a GFO approach chooses forces which can produce a stable grasp even in the presence of uncertainty. This paper examines the robustness of three methods of GFO in the presence of variability in the contact locations and in the coefficients of friction between the hand and the object. A Monte Carlo simulation is used to determine the resulting probability of failure and sensitivity levels when variability is introduced. Two numerical examples representing two common grasps performed by the human hand are used to demonstrate the performance of the optimization methods. Additionally, the method which yields the best overall performance is also tested to determine its consistency when force is applied to the object's center of mass in different directions. The results show that both the nonlinear and linear matrix inequality (LMI) methods of GFO produce acceptable results, whereas the linear method produces unacceptably high probabilities of failure. Further, the nonlinear method continues to produce acceptable results even when the direction of the applied force is changed. Based on these results, the nonlinear method of GFO is considered to be robust in the presence of variability in the contact locations and coefficients of friction.
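
The Monte Carlo failure-probability check underlying this comparison can be illustrated with a toy grasp, far simpler than the paper's GFO formulations: a planned set of contact forces "fails" whenever an uncertain friction coefficient pushes a contact outside its friction cone. All forces and distribution parameters below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy planar two-finger grasp: it holds when each planned contact force
# lies inside its friction cone, |f_t| <= mu * f_n.
f_normal = np.array([10.0, 10.0])    # planned normal forces (N), assumed
f_tangent = np.array([3.5, 3.5])     # tangential demand per contact (N), assumed

# Friction coefficients treated as uncertain, N(0.4, 0.05) per contact
mu = rng.normal(0.4, 0.05, size=(100_000, 2))
holds = (np.abs(f_tangent) <= mu * f_normal).all(axis=1)
p_fail = 1.0 - holds.mean()
```

Repeating such a simulation for each candidate optimization method, and for perturbed contact locations as well, yields the probability-of-failure comparison the abstract reports.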

ASME J. Risk Uncertainty Part B. 2018;4(4):041008-041008-13. doi:10.1115/1.4039784.

Optimal sizing of peak loads has proven to be an important factor affecting the overall energy consumption of heating, ventilation, and air-conditioning (HVAC) systems. Uncertainty quantification of peak loads enables optimal configuration of the system by opting for a suitable size factor. However, the representation of uncertainty in HVAC sizing has been limited to probabilistic analysis and scenario-based cases, which may limit and bias the results. This study provides a framework for representing uncertainty in building energy modeling that accounts for both random factors and imprecise knowledge. The framework is demonstrated through a numerical case study of sizing cooling loads, in which uncertain climatic data are represented by probability distributions and human-driven activities are described by possibility distributions. Cooling loads obtained from the hybrid probabilistic–possibilistic propagation of uncertainty are compared to those obtained by pure probabilistic and pure possibilistic approaches. Results indicate that a pure possibilistic representation may not provide detailed information on the peak cooling loads, whereas a pure probabilistic approach may underestimate the effect of uncertain human behavior. The proposed hybrid representation and propagation of uncertainty can overcome these issues by proper handling of both random and limited data.
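
Hybrid probabilistic–possibilistic propagation can be sketched with a toy load model (not the paper's building simulation): the climate-driven term is sampled as a random variable, while the occupancy-driven term is a triangular possibility distribution propagated by alpha-cuts (nested intervals). All numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def alpha_cut_triangular(alpha, lo, mode, hi):
    """Interval of a triangular possibility distribution at level alpha."""
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

alphas = np.linspace(0.0, 1.0, 11)
climate = rng.normal(10.0, 1.5, size=2_000)   # probabilistic factor (kW)

lo_env = np.empty((climate.size, alphas.size))
hi_env = np.empty_like(lo_env)
for j, a in enumerate(alphas):
    occ_lo, occ_hi = alpha_cut_triangular(a, 2.0, 4.0, 7.0)  # kW, assumed
    lo_env[:, j] = climate + occ_lo    # lower load bound at this alpha level
    hi_env[:, j] = climate + occ_hi    # upper load bound at this alpha level
```

The output is a family of random intervals, one per alpha level, from which lower and upper probability bounds on the peak load can be read, rather than the single distribution a pure probabilistic treatment would give.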

ASME J. Risk Uncertainty Part B. 2018;4(4):041009-041009-9. doi:10.1115/1.4039471.

A novel uncertainty quantification routine in the genre of adaptive sparse grid stochastic collocation (SC) is proposed in this study to investigate the propagation of parametric uncertainties in a stall flutter aeroelastic system. In a hypercube stochastic domain, the presence of strong nonlinearities can give rise to steep solution gradients that adversely affect the convergence of nonadaptive sparse grid collocation schemes. A new adaptive scheme is proposed that accelerates convergence by clustering more discretization points in regimes characterized by steep fronts, using hat-like basis functions with nonequidistant nodes. The proposed technique is applied to a nonlinear stall flutter aeroelastic system to quantify the propagation of multiparametric uncertainty from both structural and aerodynamic parameters. Their relative importance to the stochastic response is presented through a sensitivity analysis.

ASME J. Risk Uncertainty Part B. 2018;4(4):041010-041010-9. doi:10.1115/1.4039357.

Go-karts are a common amusement park feature enjoyed by people of all ages. While intended for racing, contact between go-karts does occur. To investigate and quantify the accelerations and forces which result from contact, 44 low-speed impacts were conducted between a stationary (target) and a moving (bullet) go-kart. The occupant of the bullet go-kart was one of two human volunteers. The occupant of the target go-kart was a Hybrid III 50th percentile male anthropomorphic test device (ATD). Impact configurations consisted of rear-end impacts, frontal impacts, side impacts, and oblique impacts. Results demonstrated high repeatability for the vehicle performance and occupant response. Go-kart accelerations and speed changes increased with increased impact speed. Impact duration and restitution generally decreased with increased impact speed. All ATD acceleration, force, and moment values increased with increased impact speed. Common injury metrics such as the head injury criterion (HIC), $N_{ij}$, and $N_{km}$ were calculated and were found to be below injury thresholds. Occupant response was also compared to published activities of daily living data.
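
The head injury criterion mentioned above has a standard definition that is easy to compute from an acceleration history. The sketch below evaluates HIC15 on an illustrative half-sine pulse; the pulse shape and magnitude are assumptions, not the paper's measured data.

```python
import numpy as np

def hic(t, a, max_window=0.015):
    """Head injury criterion from a resultant head acceleration history
    (t in s, a in g): HIC = max over [t1, t2] of
    (t2 - t1) * (average acceleration over the window)**2.5,
    with the window length capped (15 ms here, i.e. HIC15)."""
    # cumulative trapezoidal integral of a(t) for fast window averages
    ca = np.concatenate([[0.0], np.cumsum(0.5 * (a[1:] + a[:-1]) * np.diff(t))])
    best = 0.0
    for i in range(len(t)):
        for j in range(i + 1, len(t)):
            dt = t[j] - t[i]
            if dt > max_window:
                break                       # t is increasing; window too long
            avg = (ca[j] - ca[i]) / dt
            best = max(best, dt * avg ** 2.5)
    return best

# Illustrative pulse (not measured data): 80 g half-sine lasting 10 ms
t = np.linspace(0.0, 0.010, 201)
a = 80.0 * np.sin(np.pi * t / 0.010)
value = hic(t, a)   # well below the commonly cited threshold of 700
```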

Select Articles from Part A: Civil Engineering

### Technical Papers

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000969.
Abstract

This paper examines the task of approximating or generating samples according to a target probability distribution when this distribution is expressed as a function of the response of an engineering system. Frequently such approximation is performed in a sequential manner, using a series of intermediate densities that converge to the target density, and may require a large number of evaluations of the system response, which for applications involving complex numerical models creates a significant computational burden. To alleviate this burden, an adaptive Kriging stochastic sampling and density approximation framework (AK-SSD) is developed in this work. The metamodel approximates the system response vector, whereas the adaptive characteristics are established through an iterative approach. At the end of each iteration, the target density, approximated through the current metamodel, is compared to the density established at the previous iteration, using the Hellinger distance as a comparison metric. If convergence has not been achieved, then additional simulation experiments are performed to inform the metamodel development, through a sample-based design of experiments that balances between the improvement of the metamodel accuracy and the addition of experiments in regions of importance for the stochastic sampling. These regions are defined by considering both the target density and any intermediate densities. The process then moves to the next iteration, with an improved metamodel developed using all the available simulation experiments. Although the theoretical discussions are general, the emphasis is placed on rare-event simulation. For this application, once the target density is approximated (first stage), it is used (second stage) as an importance sampling density for estimating the rare-event likelihood. For the second stage, use of either the metamodel or the exact numerical model is examined.
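
The convergence check the abstract describes rests on the Hellinger distance between successive density approximations. A minimal sketch of that metric on discretized densities (the Kriging metamodel itself is not reproduced here):

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discretized densities; 0 for
    identical densities, 1 for densities with disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

# Iteration stops once successive density approximations are close enough
d_same = hellinger([0.2, 0.3, 0.5], [0.2, 0.3, 0.5])
d_far = hellinger([1.0, 0.0, 0.0], [0.0, 0.0, 1.0])
converged = d_same < 0.01   # illustrative tolerance, not the paper's
```

Being bounded on [0, 1], the Hellinger distance gives a scale-free stopping tolerance, which is convenient when the intermediate densities change shape across iterations.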

Topics:
Density, Approximation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000983.
Abstract

Monte Carlo simulation is the most versatile solution method for problems in stochastic computational mechanics but suffers from a slow convergence rate. The number of simulations required to produce an acceptable accuracy is often impractical for complex and time-consuming numerical models. In this paper, an element-based control variate approach is developed to improve the efficiency of Monte Carlo simulation in stochastic finite-element analysis, with particular reference to high-dimensional and nonlinear geotechnical problems. The method uses a low-order element to form an inexpensive approximation to the output of an expensive, high-order model. By keeping the mesh constant, a high correlation between low-order and high-order models is ensured, enabling a large variance reduction to be achieved. The approach is demonstrated by application to the bearing capacity of a strip footing on a spatially variable soil. The problem requires 300 input random variables to represent the spatial variability by random fields, and would be difficult to solve by methods other than Monte Carlo simulation. Using an element-based control variate reduces the standard deviation of the mean bearing capacity by approximately half. In addition, two methods for estimating the cumulative distribution function as a complement to the improved mean estimator are presented.
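
The control-variate mechanism can be sketched with cheap closed-form stand-ins for the two models (not a finite-element pair): the cheap output is subtracted, scaled by its optimal coefficient, and its known mean added back, reducing the variance of the corrected estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

# y_high plays the "expensive" model, y_low a cheap, strongly correlated
# surrogate whose mean mu_low is known exactly. Both are illustrative.
x = rng.normal(size=50_000)
y_high = np.exp(0.3 * x) + 0.05 * rng.normal(size=x.size)
y_low = 1.0 + 0.3 * x
mu_low = 1.0

# Optimal coefficient beta = Cov(y_high, y_low) / Var(y_low)
beta = np.cov(y_high, y_low)[0, 1] / y_low.var()
theta_cv = np.mean(y_high - beta * (y_low - mu_low))   # corrected estimator

# Variance per sample, plain Monte Carlo vs control variate
var_plain = y_high.var()
var_cv = (y_high - beta * (y_low - mu_low)).var()
```

The variance reduction scales with the squared correlation between the two models, which is why the paper's trick of keeping the mesh constant (to keep that correlation high) is the key to its large gains.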

Topics:
Finite element analysis
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000980.
Abstract

Extreme precipitation is one of the most important climate hazards posing a significant threat to human property and life. Understanding extreme precipitation events helps to manage their risk to society and hence reduce potential losses. This paper provides two new stochastic methods to analyze and predict various extreme precipitation events based on nonstationary models, with or without consideration of the serial dependency associated with different days. These methods, together with Monte Carlo simulation and dynamic optimization, bridge nonextreme and extreme precipitation so that abundant nonextreme precipitation data can be used for extreme precipitation analysis. On an annual basis, the analysis produces distributions for the maximum daily precipitation, the number of days with heavy rainfall, and the maximum number of consecutive days with heavy rainfall. The accuracy of the new methods is examined using 10 decades of empirical data from the Washington, DC metropolitan area. Based on the new methods, predictions of various extreme events are provided under different assumptions. Finally, the impact of serial dependency on the results is discussed. The results show that, for the area studied, accounting for serial dependency further improves the analysis.
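
The bridge from nonextreme to extreme precipitation can be illustrated with a toy stationary daily model (the paper's models are nonstationary and serially dependent): simulate daily rainfall, then read off the annual extreme-event statistics the abstract lists. The occurrence probability and depth distribution below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy daily model: wet days occur with probability p_wet; wet-day depths
# are exponential with the given mean (mm). Parameters are illustrative.
p_wet, scale_mm, n_years = 0.3, 8.0, 2_000
wet = rng.random((n_years, 365)) < p_wet
depth = np.where(wet, rng.exponential(scale_mm, (n_years, 365)), 0.0)

annual_max = depth.max(axis=1)            # annual maximum daily precipitation
heavy_days = (depth > 25.0).sum(axis=1)   # days per year above a 25 mm threshold
```

Each simulated year contributes one draw from the distribution of each extreme statistic, so the full Monte Carlo ensemble approximates those distributions directly from the nonextreme daily model.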

Topics:
Climate, Precipitation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000981.
Abstract

This paper compiles a database of 16 tunnel excavation projects in clayey soils using the earth pressure balance (EPB) tunneling method. Statistical analysis of the data is performed, and the probability distributions for the ground loss ratio $η$ and the fitting parameter $n$, the two major parameters in Peck’s formula for transverse surface settlement, are identified based on 78 and 29 points of field data, respectively. The correlation between $η$ and $n$ is fitted based on the data, and its influence on the failure probability is evaluated. A serviceability performance function is modeled based on Peck’s settlement formula, and the serviceability failure probability is defined and analyzed with Monte Carlo simulations. A series of design charts, in terms of tunnel diameter, cover/diameter ratio, $η$, and $n$, are generated for practical use without running Monte Carlo simulations.
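
Peck's Gaussian settlement trough, on which the serviceability function above is built, is straightforward to evaluate. The abstract does not state how $η$ and $n$ enter the formula, so the parameterization below (ground-loss volume per metre V_s = η·πD²/4 and trough width i = n·z₀) is an assumption for illustration, as are all the numbers.

```python
import numpy as np

def peck_settlement(x, D, z0, eta, n):
    """Transverse settlement trough from Peck's Gaussian formula,
    S(x) = S_max * exp(-x**2 / (2 i**2)), S_max = V_s / (sqrt(2 pi) i).
    Assumed parameterization: ground-loss volume per metre
    V_s = eta * pi * D**2 / 4 and trough width i = n * z0."""
    i = n * z0
    v_s = eta * np.pi * D ** 2 / 4.0
    s_max = v_s / (np.sqrt(2.0 * np.pi) * i)
    return s_max * np.exp(-x ** 2 / (2.0 * i ** 2))

# Illustrative case: 6 m diameter tunnel, 15 m axis depth, 1% ground loss
x = np.linspace(-30.0, 30.0, 601)
s = peck_settlement(x, D=6.0, z0=15.0, eta=0.01, n=0.5)
```

A quick sanity check on any parameter set is that integrating the trough across x recovers the assumed ground-loss volume per metre, since the formula is a scaled Gaussian.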

Topics:
Tunnel construction, Failure, Probability, Soil

### Case Studies

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3):. doi:10.1061/AJRUA6.0000963.
Abstract

The Houston Ship Channel (HSC) is one of the busiest waterway corridors in the United States. Since the channel’s expansion in 2005, concerns have arisen about design deficiencies in the HSC in the area of the Bayport Ship Channel (BSC), especially north of the turn at Five Mile Cut. A mental models expert elicitation exercise was conducted in order to identify safety concerns arising from these design deficiencies and provide qualitative data that can structure analysis of technical data like those from automatic identification system (AIS) databases, which can better connect possible design deficiencies to incident outcomes. The elicitation produced an influence diagram to enable later causal reasoning and Bayesian analysis for the HSC and BSC confluence and nearby areas on the HSC, and helped to prime a comprehensive study of the feasibility of safety and performance modifications on this reach of the HSC.

Topics:
Safety, Navigation, Risk management, Ships