Newest Issue


Review Article

ASME J. Risk Uncertainty Part B. 2018;5(1):010801-010801-11. doi:10.1115/1.4040407.
FREE TO VIEW

In the challenging downhole environment, drilling tools are normally subject to high temperature, severe vibration, and other harsh operating conditions. Drilling activities generate massive field data, termed field reliability big data (FRBD), which includes downhole operation, environment, failure, degradation, and dynamic data. FRBD is characterized by large volume, high variety, and extreme complexity, and it presents both abundant opportunities and great challenges for drilling tool reliability analytics. As one of the key factors affecting drilling tool reliability, downhole vibration plays an essential role in reliability analytics based on FRBD. This paper reviews the important parameters of downhole drilling operations; examines the modes, physical effects, and reliability impact of downhole vibration; and presents the features of reliability big data analytics. Specifically, the paper explores how the vibration factor is applied in reliability big data analytics for tool lifetime and failure prediction, prognostics and diagnostics, condition monitoring (CM), and maintenance planning and optimization. Finally, the authors highlight future research on how to better apply the downhole vibration factor in reliability big data analytics to further improve tool reliability and optimize maintenance planning.


Research Papers

ASME J. Risk Uncertainty Part B. 2018;5(1):011001-011001-17. doi:10.1115/1.4039943.

The high position-estimation error of current commercial off-the-shelf global positioning system (GPS)/inertial navigation system (INS) units impedes precise autonomous takeoff and landing (TOL) flight operations. To overcome this problem, this paper proposes an integrated GPS/INS/optical flow (OF) solution in which the OF provides an accurate augmentation to the GPS/INS. To ensure accurate and robust OF augmentation, a robust modeling method is used to estimate the OF based on a set of real-time experiments conducted under various simulated helicopter-landing scenarios. Because the accuracy of the OF measurements depends on the accuracy of the height measurements, a real-time testing environment was developed to model and validate the dynamic OF model at various heights. The obtained OF model matches the real OF sensor with 87.70% fitting accuracy, and a mean error of 0.006 m/s between the real OF sensor velocity and the model velocity is achieved. The velocity measurements of the OF model and the position of the GPS/INS are fused in a dynamic model-based sensor fusion algorithm. In the proposed solution, the OF sensor is engaged when the vehicle approaches a landing spot equipped with a predefined landing pattern. The proposed solution succeeded in performing helicopter auto TOL with a maximum position error of 27 cm.
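
The abstract does not give the fusion equations, so the following is only a rough sketch of the general idea: a one-dimensional constant-velocity Kalman filter that fuses a GPS/INS position fix with an OF velocity measurement. Apart from the 0.006 m/s OF velocity error taken from the abstract, all matrices and noise levels are illustrative assumptions, not the paper's dynamic model.

    import numpy as np

    # Minimal 1D constant-velocity Kalman filter fusing a GPS/INS position
    # fix with an optical-flow (OF) velocity measurement. All values except
    # the 0.006 m/s OF error are assumptions for illustration.
    dt = 0.02                                  # 50 Hz update rate (assumed)
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition for [pos, vel]
    Q = np.diag([1e-4, 1e-3])                  # process noise (assumed)
    H = np.eye(2)                              # measure pos (GPS) and vel (OF)
    R = np.diag([1.0, 0.006**2])               # GPS pos var (assumed); OF vel var

    x = np.zeros(2)                            # state estimate [pos, vel]
    P = np.eye(2)                              # estimate covariance

    def fuse(x, P, z):
        """One predict/update cycle with measurement z = [gps_pos, of_vel]."""
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                    # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = fuse(x, P, np.array([0.5, 0.1]))    # one fused step, example data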

ASME J. Risk Uncertainty Part B. 2018;5(1):011002-011002-5. doi:10.1115/1.4039999.
OPEN ACCESS

More than four decades have passed since the introduction of safety standards for impact attenuation surfaces (IAS) used in playgrounds. Falls in children's playgrounds are a major source of injuries, and IAS are one of the best methods of preventing severe head injuries. However, the ability of IAS to prevent other types of injuries, such as upper limb fractures, is unclear. Accordingly, in this paper, ten synthetic playground surfaces were tested to examine their performance beyond the head injury criterion (HIC) and maximum G-force (Gmax) outputs recommended by ASTM F1292. The aim of this work was to investigate limitations of the current safety criteria and to propose additional criteria to filter out hazardous IAS that technically comply with the current thresholds of 1000 HIC and 200 Gmax. The proposed new criterion, called the impulse force criterion (If), combines two important injury-predictor characteristics: the HIC duration, i.e., the time duration of the most severe impact, and the change in momentum, which addresses the IAS properties associated with bounce. Additionally, the maximum jerk (Jmax), the bounce, and the work absorbed by the IAS are presented. HIC, Gmax, If, and Jmax followed similar trends with respect to material thickness and drop height. Moreover, the bounce and the work done by the IAS on the falling missile at increasing drop heights were similar for all surfaces apart from one viscoelastic foam sample. The results presented in this paper demonstrate the limitations of the current safety criteria and should therefore assist future research to reduce long-bone injuries in playgrounds.
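
For readers unfamiliar with the ASTM F1292 quantities, the sketch below computes Gmax and HIC from a sampled deceleration trace using the standard definition HIC = max over [t1, t2] of (t2 - t1) * [mean acceleration over the window]^2.5, with acceleration in g's and time in seconds. The half-sine pulse is synthetic test input, not data from the paper.

    import numpy as np

    # Illustrative Gmax and head injury criterion (HIC) computation from a
    # sampled deceleration trace a(t) in g's. The pulse is synthetic.
    fs = 10_000                                  # sample rate, Hz (assumed)
    t = np.arange(0, 0.02, 1 / fs)               # 20 ms impact window
    a = 150 * np.sin(np.pi * t / 0.02)           # synthetic pulse, peak 150 g

    gmax = a.max()

    def hic(a, fs, max_window=0.036):
        """Brute-force HIC: max over all windows [t1, t2] up to 36 ms."""
        n = len(a)
        cum = np.concatenate(([0.0], np.cumsum(a) / fs))  # integral of a dt
        best = 0.0
        wmax = int(max_window * fs)
        for i in range(n):
            for j in range(i + 1, min(i + wmax, n) + 1):
                dt_ij = (j - i) / fs
                avg = (cum[j] - cum[i]) / dt_ij           # mean accel in window
                best = max(best, dt_ij * avg ** 2.5)
        return best

    print(f"Gmax = {gmax:.0f} g, HIC = {hic(a, fs):.0f}")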

ASME J. Risk Uncertainty Part B. 2018;5(1):011003-011003-6. doi:10.1115/1.4040660.

Offshore petroleum platforms present complex, time-sensitive situations that can make emergency evacuations difficult to manage. Virtual environments (VE) can be used to train safety-critical tasks and help prepare personnel to respond to real-world offshore emergencies. Before industries can adopt VE training, its utility must be established to ensure the technology provides effective training. This paper presents the results of two experiments that investigated the training utility of VEs, focusing in particular on determining the most appropriate method to deliver offshore emergency egress training using a virtual environment. The first experiment used lecture-based teaching (LBT). The second investigated the utility of a simulation-based mastery learning (SBML) pedagogical method, adopted from the medical field, for offshore emergency egress training. Both training programs (LBT and SBML) were used to train naïve participants in basic onboard familiarization and emergency evacuation procedures. The paper discusses the training efficacy of the SBML method in this context and compares the results of the SBML experimental study with those of the LBT experiment. Efficacy is measured by a combination of the time spent training and the performance achieved by each training group. Results show that the SBML approach to VE training was more time effective and produced better performance in the emergency scenarios, and that SBML training can help address individual variability in competence. Limitations of SBML training are discussed, and recommendations to improve its delivery are presented. Overall, the results indicate that employing SBML training in industry can improve human reliability during emergencies through increased competence and compliance.

ASME J. Risk Uncertainty Part B. 2018;5(1):011004-011004-7. doi:10.1115/1.4040661.

There are several ways of quantifying flood hazard. When the scale of the analysis is large, flood hazard simulation for an entire city becomes costly and complicated. The first part of this paper proposes utilizing the experience and knowledge of local experts about flood characteristics in the area to produce first-level flood hazard and risk zoning maps by implementing overlay operations in ArcGIS. In this step, the authors use the concept of pairwise comparison to eliminate the need for a complicated simulation to quantify flood hazard and risk. The process begins with identifying the main factors that contribute to flooding in a particular area. Pairwise comparison is then used to elicit knowledge from local experts and to assign weights to each factor reflecting its relative importance to flood hazard and risk. In the second part of the paper, the authors present a decision-making framework to support a flood risk response plan: once the highest-risk zones have been identified, a city can develop a risk response plan, and the framework helps select an effective set of response alternatives. The framework integrates tools from multicriteria decision-making, a charrette design process to guide the pairwise elicitation, and a cost-effectiveness analysis to account for a city's limited budget. The first part of the paper is illustrated using the city of Addis Ababa; the second part uses a hypothetical case of Addis Ababa and a mock city infrastructure department to demonstrate the implementation of the framework.
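
A minimal sketch of the pairwise-comparison weighting step, using Saaty's principal-eigenvector method (a standard way to turn pairwise judgments into weights); the flood factors and judgment values are hypothetical stand-ins, not those elicited for Addis Ababa.

    import numpy as np

    # Derive factor weights from a pairwise-comparison matrix via the
    # principal eigenvector (Saaty's AHP). Factors and judgments are
    # hypothetical, not the paper's elicited values.
    factors = ["rainfall", "drainage", "slope", "land_use"]
    # A[i, j] = how much more important factor i is than factor j (1-9 scale)
    A = np.array([
        [1,   3,   5,   7],
        [1/3, 1,   3,   5],
        [1/5, 1/3, 1,   3],
        [1/7, 1/5, 1/3, 1],
    ])

    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                                # normalized weights

    # Consistency ratio (CR < 0.1 is the usual acceptance threshold)
    n = len(A)
    ci = (eigvals[k].real - n) / (n - 1)
    cr = ci / 0.90                              # Saaty random index for n = 4
    print(dict(zip(factors, w.round(3))), f"CR = {cr:.3f}")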

ASME J. Risk Uncertainty Part B. 2018;5(1):011005-011005-8. doi:10.1115/1.4040000.

In traditional reliability problems, the distribution of a basic random variable is usually unimodal; in other words, the probability density of the basic random variable has only one peak. In real applications, some basic random variables may follow bimodal distributions with two peaks in their probability density. When bimodal variables are involved, traditional reliability methods, such as the first-order second moment (FOSM) method and the first-order reliability method (FORM), are not accurate. This study investigates the accuracy of using the saddlepoint approximation (SPA) for bimodal variables and then employs SPA-based reliability methods with first-order approximation to predict the reliability. The limit-state function is first approximated with a first-order Taylor expansion so that it becomes a linear combination of the basic random variables, some of which are bimodally distributed. The SPA is then applied to estimate the reliability. Examples show that the SPA-based reliability methods are more accurate than FOSM and FORM.
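
A minimal sketch of the first-order SPA idea for a linearized limit state G = b0 + b1*X with X bimodal (a two-component Gaussian mixture), using the Lugannani-Rice tail formula. All coefficients and mixture parameters are illustrative assumptions, not the paper's examples; derivatives of the cumulant generating function are taken by finite differences for brevity.

    import numpy as np
    from scipy.optimize import brentq
    from scipy.stats import norm

    # First-order SPA sketch for P(G <= 0), G = b0 + b1*X, X bimodal.
    p, m1, s1, m2, s2 = 0.4, -2.0, 0.8, 3.0, 1.0   # mixture of X (assumed)
    b0, b1 = 1.0, 1.0                              # linearized G (assumed)

    def K(s):
        """Cumulant generating function of G = b0 + b1*X."""
        t = b1 * s
        return b0 * s + np.log(p * np.exp(m1*t + 0.5*(s1*t)**2)
                               + (1 - p) * np.exp(m2*t + 0.5*(s2*t)**2))

    def d(f, s, h=1e-5):
        """Central finite-difference derivative."""
        return (f(s + h) - f(s - h)) / (2 * h)

    y = 0.0                                        # failure when G <= 0
    s_hat = brentq(lambda s: d(K, s) - y, -20, 20) # saddlepoint: K'(s) = y
    w = np.sign(s_hat) * np.sqrt(2 * (s_hat * y - K(s_hat)))
    v = s_hat * np.sqrt(d(lambda s: d(K, s), s_hat))
    pf = norm.cdf(w) + norm.pdf(w) * (1/w - 1/v)   # Lugannani-Rice formula
    print(f"P(G <= 0) ~= {pf:.4e}")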

ASME J. Risk Uncertainty Part B. 2018;5(1):011006-011006-12. doi:10.1115/1.4040571.

Hierarchical Bayesian models (HBMs) have been increasingly used for various engineering applications. We classify two types of HBM found in the literature: the hierarchical prior model (HPM) and the hierarchical stochastic model (HSM). We then focus on the theoretical implications of the HSM. Using examples of polynomial functions, we show that the HSM is capable of separating different types of uncertainty in a system and of quantifying the uncertainty of reduced-order models under the Bayesian model class selection framework. To tackle the huge computational cost of analyzing the HSM, we propose an efficient approximation scheme based on importance sampling (IS) and the empirical interpolation method (EIM). We illustrate the method using two engineering examples: a molecular dynamics simulation for krypton and a pharmacokinetic/pharmacodynamic (PKPD) model for a cancer drug.
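
The paper's IS/EIM scheme is specific to the HSM structure, but the importance-sampling identity it builds on, E_p[f(X)] = E_q[f(X) p(X)/q(X)], can be sketched with simple stand-in densities (the target, proposal, and integrand below are not the paper's model).

    import numpy as np
    from scipy.stats import norm

    # Minimal importance-sampling sketch: estimate E_p[f(X)] from samples
    # of a proposal q. Densities are simple stand-ins, not the paper's HSM.
    rng = np.random.default_rng(0)
    p = norm(0.0, 1.0)                  # target density (assumed)
    q = norm(1.0, 2.0)                  # proposal density (assumed)
    f = lambda x: x**2                  # quantity of interest (assumed)

    x = rng.normal(1.0, 2.0, 100_000)   # draw from q
    w = p.pdf(x) / q.pdf(x)             # importance weights
    est = np.mean(w * f(x))             # ~= E_p[X^2] = 1
    ess = w.sum()**2 / (w**2).sum()     # effective sample size diagnostic
    print(f"estimate = {est:.3f}, ESS = {ess:.0f}")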

ASME J. Risk Uncertainty Part B. 2018;5(1):011007-011007-10. doi:10.1115/1.4040918.

The data collected on second-to-second operations of large-scale freight and logistics systems have increased in recent years. Data analytics can provide valuable insight, improve efficiency, and reduce waste of resources. Understanding sources of uncertainty, including emergent and future conditions, is critical to enterprise resilience, to recognizing regimes of operations, and to decision-making for capacity expansion. This paper demonstrates analyses of operations data at a marine container terminal, disaggregates layers of uncertainty, and discusses implications for operations decision-making and capacity expansion. The layers arise from various sources and perspectives, such as the level of detail in data collection and the compatibility of data sources, missing entries in databases, natural and human-induced disruptions, and competing stakeholder views of what the performance metrics should be. Among the resulting observations is that long truck turn times are correlated with high traffic volume, which is distributed across most states of operations. Furthermore, data quality and the presentation of performance metrics should be considered when interpreting results from data analyses. The potential influences of emergent and future conditions of technologies, markets, commerce, behaviors, regulations, organizations, the environment, and others on the regimes of terminal operations are examined.

ASME J. Risk Uncertainty Part B. 2018;5(1):011008-011008-7. doi:10.1115/1.4040919.

Real components always deviate from their ideal dimensions, which makes every component, even in serial production, unique. Although parts may look the same, differences can always be observed due to scattering factors and variations in the manufacturing process. All these factors inevitably lead to parts that deviate from their ideal shape and, therefore, have different properties than the ideal component. Changing properties can lead to major problems or even failure during operation. It is necessary to specify the permitted deviations to ensure that every single product nevertheless meets its technical requirements, and to estimate the consequences of the permitted deviations, which is done via tolerance analysis. During this process, components are assembled virtually and varied with the uncertainties specified by the tolerances. A variation simulation is one way to calculate these effects for geometric deviations. Since tolerance analysis enables engineers to identify weak points in an early design stage, it is important to know the contribution of every single tolerance to a given quality-relevant characteristic, so that the correct tolerances can be tightened or loosened. In this paper, a fuzzy-based method to calculate the sensitivity is introduced and compared with the commonly used extended Fourier amplitude sensitivity test (EFAST) method. A special focus of this work is the differentiation between the sensitivity of the total system and the sensitivities of the subsystems defined by the α-cuts of the fuzzy numbers. The paper also discusses the impact of the number of evaluations and of nonlinearity on the sensitivity for EFAST and the fuzzy-based method.
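
A minimal sketch of α-cut based tolerance analysis with triangular fuzzy numbers: each α-cut turns a fuzzy dimension into an interval (the "subsystems" in the paper's terminology), and the response range per cut is found here by brute-force grid search. The two dimensions and the closing-gap function are hypothetical; the paper's sensitivity computation on top of these cuts is not reproduced.

    import numpy as np

    # Alpha-cut tolerance analysis sketch with triangular fuzzy dimensions.
    def alpha_cut(lo, mode, hi, alpha):
        """Interval of a triangular fuzzy number at membership level alpha."""
        return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

    # Two toleranced dimensions (lower, nominal, upper) - assumed values
    dims = [(9.9, 10.0, 10.1), (4.95, 5.0, 5.05)]
    f = lambda x1, x2: x1 - 2.0 * x2          # assumed closing-gap function

    for alpha in (0.0, 0.5, 1.0):
        ivals = [alpha_cut(*d, alpha) for d in dims]
        grid = np.meshgrid(*[np.linspace(a, b, 21) for a, b in ivals])
        y = f(*grid)                          # response over the cut's box
        print(f"alpha={alpha:.1f}: gap in [{y.min():.3f}, {y.max():.3f}]")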

Select Articles from Part A: Civil Engineering

Technical Papers

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3). doi:10.1061/AJRUA6.0000969.
This paper examines the task of approximating or generating samples according to a target probability distribution when this distribution is expressed as a function of the response of an engineering system. Frequently such approximation is performed in a sequential manner, using a series of intermediate densities that converge to the target density, and may require a large number of evaluations of the system response, which for applications involving complex numerical models creates a significant computational burden. To alleviate this burden, an adaptive Kriging stochastic sampling and density approximation framework (AK-SSD) is developed in this work. The metamodel approximates the system response vector, and the adaptive characteristics are established through an iterative approach. At the end of each iteration, the target density, approximated through the current metamodel, is compared to the density established at the previous iteration, using the Hellinger distance as the comparison metric. If convergence has not been achieved, additional simulation experiments are performed to inform the metamodel development, through a sample-based design of experiments that balances improvement of the metamodel accuracy against the addition of experiments in regions of importance for the stochastic sampling; these regions are defined by considering both the target density and any intermediate densities. The process then moves to the next iteration, with an improved metamodel developed using all the available simulation experiments. Although the theoretical discussion is general, the emphasis is placed on rare-event simulation. For this application, once the target density is approximated (first stage), it is used (second stage) as an importance sampling density for estimating the rare-event likelihood. For the second stage, the use of either the metamodel or the exact numerical model is examined.
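
The iteration's convergence test can be illustrated with a small sketch of the Hellinger distance between two grid-evaluated densities; the Gaussian densities and the tolerance below are stand-ins, not the paper's target densities.

    import numpy as np
    from scipy.stats import norm

    # Hellinger-distance convergence check, as in an AK-SSD-style loop:
    # compare the density from the current metamodel iteration with the
    # previous one. Example densities are illustrative Gaussians.
    x = np.linspace(-6, 6, 2001)
    dx = x[1] - x[0]
    p_prev = norm.pdf(x, loc=0.0, scale=1.0)     # density, iteration k-1
    p_curr = norm.pdf(x, loc=0.1, scale=1.05)    # density, iteration k

    h2 = 0.5 * np.sum((np.sqrt(p_prev) - np.sqrt(p_curr))**2) * dx
    hellinger = np.sqrt(h2)                      # lies in [0, 1]

    tol = 0.01                                   # tolerance (assumed)
    print(f"H = {hellinger:.4f}", "-> converged" if hellinger < tol
          else "-> add experiments, refit metamodel")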

Topics:
Density, Approximation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3). doi:10.1061/AJRUA6.0000983.
Monte Carlo simulation is the most versatile solution method for problems in stochastic computational mechanics but suffers from a slow convergence rate: the number of simulations required to produce acceptable accuracy is often impractical for complex and time-consuming numerical models. In this paper, an element-based control variate approach is developed to improve the efficiency of Monte Carlo simulation in stochastic finite-element analysis, with particular reference to high-dimensional and nonlinear geotechnical problems. The method uses a low-order element to form an inexpensive approximation to the output of an expensive, high-order model. By keeping the mesh constant, a high correlation between the low-order and high-order models is ensured, enabling a large variance reduction. The approach is demonstrated by application to the bearing capacity of a strip footing on a spatially variable soil. The problem requires 300 input random variables to represent the spatial variability by random fields and would be difficult to solve by methods other than Monte Carlo simulation. Using an element-based control variate reduces the standard deviation of the mean bearing capacity estimate by approximately half. In addition, two methods for estimating the cumulative distribution function, complementing the improved mean estimator, are presented.
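
A minimal sketch of the control-variate construction: a cheap low-order model g(X) corrects the Monte Carlo estimate of an expensive high-order model f(X), with the achievable variance reduction governed by their correlation. The two "models" below are illustrative functions sharing the same input, echoing the paper's common-mesh construction; they are not finite-element solvers.

    import numpy as np

    # Control-variate Monte Carlo sketch with stand-in low/high-order models.
    rng = np.random.default_rng(1)
    x = rng.lognormal(mean=0.0, sigma=0.3, size=2_000)   # e.g., soil property

    f = lambda v: v**1.5 + 0.1 * np.sin(5 * v)   # "high-order" model (assumed)
    g = lambda v: v**1.5                         # "low-order" control (assumed)

    fx, gx = f(x), g(x)
    # The mean of g is cheap to pin down with many extra low-order runs.
    mu_g = g(rng.lognormal(0.0, 0.3, 1_000_000)).mean()

    beta = np.cov(fx, gx)[0, 1] / np.var(gx, ddof=1)   # optimal coefficient
    cv_est = np.mean(fx - beta * (gx - mu_g))          # corrected estimate

    rho = np.corrcoef(fx, gx)[0, 1]
    print(f"plain MC = {fx.mean():.4f}, CV = {cv_est:.4f}, "
          f"variance reduction ~ {1 / (1 - rho**2):.0f}x")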

Topics:
Finite element analysis
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3). doi:10.1061/AJRUA6.0000980.
Extreme precipitation is one of the most important climate hazards, posing a significant threat to human property and life. Understanding extreme precipitation events helps to manage their risk to society and hence reduce potential losses. This paper provides two new stochastic methods to analyze and predict various extreme precipitation events based on nonstationary models, with or without consideration of the serial dependency between different days. These methods, together with Monte Carlo simulation and dynamic optimization, bridge nonextreme and extreme precipitation so that abundant nonextreme precipitation data can be used for extreme precipitation analysis. On an annual basis, the analysis produces distributions for the maximum daily precipitation, the number of days with heavy rainfall, and the maximum number of consecutive days with heavy rainfall. The accuracy of the new methods is examined using ten decades of empirical data from the Washington, DC metropolitan area. Based on the new methods, predictions of various extreme events are provided under different assumptions. Finally, the impact of serial dependency on the results is discussed; for the area studied, accounting for serial dependency further improves the analysis.
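
Ignoring serial dependency, the bridge from a fitted daily (nonextreme) model to the paper's three annual extreme-event statistics can be sketched as follows; the wet-day/gamma daily model and the heavy-rain threshold are assumptions, not the paper's fitted model for Washington, DC.

    import numpy as np

    # Monte Carlo bridge from a daily precipitation model to annual
    # extreme-event statistics (serially independent days; all parameter
    # values are assumed stand-ins).
    rng = np.random.default_rng(2)
    p_wet, shape, scale = 0.3, 0.7, 12.0         # assumed daily model (mm)
    heavy = 25.0                                 # heavy-rain threshold (assumed)

    def simulate_year():
        wet = rng.random(365) < p_wet
        depth = np.where(wet, rng.gamma(shape, scale, 365), 0.0)
        exceed = depth >= heavy
        run = best = 0                           # longest heavy-rain run
        for e in exceed:
            run = run + 1 if e else 0
            best = max(best, run)
        return depth.max(), exceed.sum(), best

    sims = np.array([simulate_year() for _ in range(10_000)])
    print("mean annual max daily depth:", sims[:, 0].mean().round(1), "mm")
    print("P(>= 3 consecutive heavy days):", (sims[:, 2] >= 3).mean())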

Topics:
Climate, Precipitation
ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3). doi:10.1061/AJRUA6.0000981.
This paper compiles a database of 16 tunnel excavation projects in clayey soils using the earth pressure balance (EPB) tunneling method. Statistical analysis of the data is performed, and the probability distributions of the ground loss ratio η and the fitting parameter n, the two major parameters in Peck's formula for transverse surface settlement, are identified based on 78 and 29 points of field data, respectively. The correlation between η and n is fitted from the data, and its influence on the failure probability is evaluated. A serviceability performance function is modeled based on Peck's settlement formula, and the serviceability failure probability is defined and analyzed with Monte Carlo simulations. A series of design charts, in terms of tunnel diameter, cover-to-diameter ratio, η, and n, are generated for practical use without running Monte Carlo simulations.
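
A minimal sketch of a Monte Carlo serviceability check built on Peck's transverse settlement formula, S(x) = S_max exp(-x^2 / (2 i^2)) with S_max = η π D^2 / (4 √(2π) i). The trough width is taken here as i = n·z0, one common parameterization, and the distributions of η and n (and their independence) are assumptions, not the paper's fitted ones.

    import numpy as np

    # Monte Carlo serviceability failure probability via Peck's formula.
    # Geometry, limit, and parameter distributions are assumed stand-ins.
    rng = np.random.default_rng(3)
    D, z0 = 6.0, 15.0                    # tunnel diameter, axis depth (m)
    limit = 0.030                        # allowable surface settlement (m)

    n_sim = 100_000
    eta = rng.lognormal(np.log(0.01), 0.5, n_sim)     # ground loss ratio
    n = rng.normal(0.5, 0.08, n_sim).clip(0.2, 0.9)   # fitting parameter

    i = n * z0                                        # trough width (m)
    V_L = eta * np.pi * D**2 / 4                      # ground loss per meter
    S_max = V_L / (np.sqrt(2 * np.pi) * i)            # settlement above axis

    pf = np.mean(S_max > limit)                       # serviceability failure
    print(f"P(S_max > {limit*1000:.0f} mm) ~= {pf:.4f}")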

Topics:
Tunnel construction, Failure, Probability, Soil

Case Studies

ASCE-ASME Journal of Risk and Uncertainty in Engineering Systems, Part A: Civil Engineering. 2018;4(3). doi:10.1061/AJRUA6.0000963.
The Houston Ship Channel (HSC) is one of the busiest waterway corridors in the United States. Since the channel's expansion in 2005, concerns have arisen about design deficiencies in the HSC in the area of the Bayport Ship Channel (BSC), especially north of the turn at Five Mile Cut. A mental-models expert elicitation exercise was conducted to identify safety concerns arising from these design deficiencies and to provide qualitative data that can structure the analysis of technical data, such as those from automatic identification system (AIS) databases, thereby better connecting possible design deficiencies to incident outcomes. The elicitation produced an influence diagram to enable later causal reasoning and Bayesian analysis for the HSC and BSC confluence and nearby areas on the HSC, and helped to prime a comprehensive study of the feasibility of safety and performance modifications on this reach of the HSC.

Topics:
Safety, Navigation, Risk management, Ships
