Stochastic systems

Analysis and Clustering-Based Improvement of Particle Filter Optimization Algorithms

This study highlights how particle filter optimization (PFO) algorithms can explore objective functions and assess their robustness near optima. Improvements to the general algorithm are also introduced to increase search efficiency. Population-based optimization algorithms achieve outstanding performance by propagating not one but many candidate solutions. One novel representative of these methods is the PFO concept, which was created as an analogue of the particle filter state estimation algorithm. The PFO algorithm yields a probability distribution of the sample elements, which can represent the shape of the objective function. Several variants of the PFO can be found in the literature, but because of its novelty its elements are not clearly fixed. In the present study, a method is introduced to gain information on the shape of the objective function by following the propagation of the particles along the iterations. The contributions of the paper are as follows: 1) a comparative study is proposed examining the different variants of the algorithm, and some improvements are introduced (e.g., weight differentiation) to increase the efficiency of the general PFO algorithm; 2) the propagation of the particles is investigated to explore the shape of the objective function; 3) a clustering-based technique is proposed to obtain information about the local optima (e.g., their robustness). The results verify that the proposed method is applicable to finding local optima and evaluating their robustness, which is a promising prospect for robust optimization problems, where often not the global but a more stable local optimum gives the best solution.
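
To fix ideas, a minimal sketch of the generic PFO loop follows (a plain importance-weighting/resampling/perturbation scheme on a sphere test function; the Boltzmann-type weighting, jitter level, and all parameter values are illustrative assumptions, not the paper's specific variant):

```python
import numpy as np

def pfo_minimise(f, dim, n_particles=200, n_iter=100,
                 lo=-5.0, hi=5.0, beta=5.0, jitter=0.3, seed=0):
    """Generic particle filter optimisation sketch:
    weight -> resample -> perturb, as in particle filtering."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, size=(n_particles, dim))
    best_x, best_f = None, np.inf
    for _ in range(n_iter):
        fx = np.apply_along_axis(f, 1, x)
        i = np.argmin(fx)
        if fx[i] < best_f:
            best_f, best_x = fx[i], x[i].copy()
        # Boltzmann-type weights: lower objective -> larger weight
        w = np.exp(-beta * (fx - fx.min()))
        w /= w.sum()
        # Resample particles proportionally to their weights
        idx = rng.choice(n_particles, size=n_particles, p=w)
        # Gaussian perturbation keeps diversity (analogue of process noise)
        x = x[idx] + rng.normal(0.0, jitter, size=(n_particles, dim))
    return best_x, best_f

# Example: 2-D sphere function
xb, fb = pfo_minimise(lambda v: float(np.sum(v**2)), dim=2)
```

The empirical distribution of the particles after a few iterations is what carries the information about the objective's shape; clustering that distribution is the basis of the local-optimum analysis described above.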


Post Date: 17 April 2024

P-graph-based multi-objective risk analysis and redundancy allocation in safety-critical energy systems

As most energy production and transformation processes are safety-critical, it is vital to develop tools that support the analysis and minimisation of their reliability-related risks. The resulting optimisation problem should reflect the structure of the process, which requires the utilisation of flexible and problem-relevant models. This paper highlights that P-graphs extended by logical condition units can be transformed into reliability block diagrams, and based on the cut and path sets of the graph a polynomial risk model can be extracted, which opens up new opportunities for the definition of optimisation problems related to reliability redundancy allocation. A novel multi-objective optimisation-based method has been developed to evaluate the criticality of the units and subsystems. The applicability of the proposed method is demonstrated using a real-life case study related to a reforming reaction system. The results highlight that P-graphs can serve as an interface between process flow diagrams and polynomial risk models, and that the developed tool can improve the reliability of energy systems in retrofitting projects.
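
The cut-set-based risk model can be illustrated with a small sketch (the unit names and failure probabilities are invented for illustration; the enumeration below is a brute-force stand-in for the polynomial extracted from the P-graph):

```python
from itertools import product

def system_reliability(cut_sets, p_fail):
    """Exact system reliability by state enumeration.
    cut_sets: list of minimal cut sets (sets of unit names);
    the system fails iff every unit of some cut set has failed.
    p_fail: dict unit -> independent failure probability."""
    units = sorted(p_fail)
    r = 0.0
    for state in product([False, True], repeat=len(units)):  # True = failed
        failed = {u for u, s in zip(units, state) if s}
        if any(cs <= failed for cs in cut_sets):
            continue  # some cut set fully failed -> system down
        pr = 1.0
        for u, s in zip(units, state):
            pr *= p_fail[u] if s else 1.0 - p_fail[u]
        r += pr
    return r

# Two parallel pumps in series with one reactor (hypothetical data):
# minimal cut sets {pump1, pump2} and {reactor}
cuts = [{"pump1", "pump2"}, {"reactor"}]
p = {"pump1": 0.1, "pump2": 0.1, "reactor": 0.05}
r = system_reliability(cuts, p)  # (1 - 0.1*0.1) * (1 - 0.05) = 0.9405
```

Collecting the state probabilities symbolically instead of numerically yields exactly the polynomial risk model referred to in the abstract.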

Test-sequence optimisation by survival analysis

Testing is an indispensable process for ensuring product quality in production systems. Reducing the time and cost spent on testing whilst minimising the risk of not detecting faults is an essential problem of process engineering. The optimisation of complex testing processes consisting of independent test steps is considered. Survival analysis-based models of an elementary test were developed to efficiently combine the time-dependent outcome of the tests and the costs related to the operation of the testing system. A mixed-integer non-linear programming (MINLP) model was proposed to formalise how the total cost of testing depends on the sequence and the parameters of the elementary test steps. To provide an efficient formalisation of the scheduling problem and avoid difficulties due to the relaxation of the integer variables, the MINLP model was considered as a P-graph representation-based process network synthesis problem. The applicability of the methodology is demonstrated by a realistic case study taken from the computer manufacturing industry. With the application of the optimal test times and sequence provided by the SCIP (Solving Constraint Integer Programs) solver, 0.1–5% of the cost of the testing can be saved.
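
The core sequencing trade-off can be sketched in a few lines (a simplified stop-at-first-failure model with invented costs and pass probabilities, not the paper's survival-analysis formulation; brute force is used instead of the MINLP/P-graph machinery):

```python
from itertools import permutations

def expected_cost(seq, cost, p_pass):
    """Expected total cost when testing stops at the first failed step."""
    total, survive = 0.0, 1.0
    for t in seq:
        total += survive * cost[t]   # step t is only reached if all
        survive *= p_pass[t]         # earlier steps passed
    return total

def best_sequence(cost, p_pass):
    """Brute-force optimum; for independent steps, sorting by
    cost/(1 - p_pass) ascending gives the same order."""
    return min(permutations(list(cost)),
               key=lambda s: expected_cost(s, cost, p_pass))

# Three hypothetical test steps
cost = {"A": 3.0, "B": 1.0, "C": 2.0}
p_pass = {"A": 0.9, "B": 0.95, "C": 0.6}
seq = best_sequence(cost, p_pass)  # cheap, failure-prone tests go first
```

In this toy instance the cheap test with the highest failure rate ("C") is scheduled first, which is the intuition the full MINLP model generalises with time-dependent survival functions.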

Reliability - Redundancy Allocation in Process Graphs

Process graphs (P-graphs) have been proven to be useful in identifying optimal structures of process systems and business processes. The provision of redundant critical units can significantly reduce operational risk. Redundant units and subsystems can be modelled in P-graphs by adding nodes that represent logical conditions of the operation of the units. It is revealed in this paper that P-graphs extended by logical condition units can be transformed into reliability block diagrams, and based on the cut sets and path sets of the graph a polynomial risk model can be extracted. Since the exponents of the polynomial represent the number of redundant units, the cost function of the reliability–redundancy allocation problem can be formalised as a nonlinear integer programming model, where the cost function handles the costs associated with the consequences of equipment failure and repair times. The applicability of this approach is illustrated in a case study related to the asset-intensive chemical, oil, gas and energy sector. The results show that the proposed algorithm is useful for risk-based priority resource allocation in a reforming reaction system.
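
A stripped-down version of the resulting integer programme can be sketched as follows (a pure series system with parallel redundancy per subsystem and invented failure probabilities, unit costs and reliability target; exhaustive search replaces the paper's nonlinear integer programming formulation):

```python
from itertools import product
from math import prod

def allocate_redundancy(q, c, r_target, n_max=4):
    """Exhaustive redundancy allocation: subsystem i with n parallel
    units has reliability 1 - q[i]**n (the exponent n plays the role
    of the polynomial exponent); the series system must reach
    r_target at minimum purchase cost."""
    best = None
    for ns in product(range(1, n_max + 1), repeat=len(q)):
        r = prod(1 - qi**n for qi, n in zip(q, ns))
        if r >= r_target:
            cost = sum(ci * n for ci, n in zip(c, ns))
            if best is None or cost < best[0]:
                best = (cost, ns, r)
    return best

# Three hypothetical subsystems: unit failure probabilities and unit costs
cost, levels, reliability = allocate_redundancy(
    q=[0.10, 0.05, 0.20], c=[4.0, 6.0, 2.0], r_target=0.99)
```

Cheap, unreliable subsystems attract the most redundant units, which mirrors the risk-based priority allocation reported in the case study.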

Empirical working time distribution-based line balancing with integrated simulated annealing and dynamic programming

According to the Industry 4.0 paradigms, the balancing of stochastic production lines requires easily implementable, flexible and robust tools for the assignment of tasks to workstations. An algorithm is proposed that calculates the performance indicators of the production line based on the convolution of the empirical density functions of the working times and applies dynamic programming to assign tasks to the workstations. The sequence of tasks is optimised by an outer simulated annealing loop that operates on the set of interchangeable task pairs extracted from the precedence graph of the task-ordering constraints. Eight line-balancing problems were studied and the results were validated by Monte Carlo simulations to demonstrate the applicability of the algorithm. The results confirm that our methodology not only provides optimal solutions but is also an excellent tool for the sensitivity analysis of stochastic production lines.
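
The convolution step at the heart of the method can be sketched directly (the two task distributions and the cycle time below are invented for illustration; the grid is unit time steps starting at 0):

```python
import numpy as np

def convolve_pmfs(pmfs):
    """Convolve discrete working-time distributions given on a common
    unit time grid starting at 0; returns the pmf of the station's
    total processing time."""
    out = np.array([1.0])
    for p in pmfs:
        out = np.convolve(out, p)
    return out

def p_exceeds_cycle(pmf, cycle_time):
    """Probability that the station's total time exceeds the cycle
    time (array index = time in grid units)."""
    return float(pmf[cycle_time + 1:].sum())

# Two tasks assigned to one station, empirical pmfs over times 0, 1, 2
task1 = np.array([0.0, 0.7, 0.3])   # 1 min w.p. 0.7, 2 min w.p. 0.3
task2 = np.array([0.0, 0.5, 0.5])
station = convolve_pmfs([task1, task2])
risk = p_exceeds_cycle(station, 3)  # P(total time > 3) = 0.3 * 0.5 = 0.15
```

Such overload probabilities are the station-level performance indicators that the dynamic programming layer evaluates for each candidate assignment.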

Improvement of the PSO algorithm by memory-based gradient search - application in inventory management

Advanced inventory management in complex supply chains requires effective and robust nonlinear optimization due to the stochastic nature of supply and demand variations. The application of estimated gradients can boost the convergence of the Particle Swarm Optimization (PSO) algorithm, but classical gradient calculation cannot be applied to stochastic and uncertain systems. In these situations, Monte Carlo (MC) simulation can be applied to determine the gradient. We developed a memory-based algorithm where, instead of generating and evaluating new simulated samples, the stored and shared former function evaluations of the particles are sampled to estimate the gradients by locally weighted least squares regression. The performance of the resulting regional gradient-based PSO is verified on several benchmark problems and in a complex application example where the optimal reorder points of a supply chain are determined.
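
The regional gradient estimation step can be sketched as follows (a locally weighted linear fit over stored evaluations; the Gaussian kernel, bandwidth and the noisy sphere test data are illustrative assumptions, not the paper's exact settings):

```python
import numpy as np

def regional_gradient(x0, X, F, bandwidth=1.0):
    """Estimate the gradient at x0 from stored evaluations (X, F) by
    locally weighted least squares: fit f ~ a + g . (x - x0) with
    Gaussian kernel weights; no new function evaluations are needed."""
    d = X - x0
    w = np.exp(-np.sum(d**2, axis=1) / (2 * bandwidth**2))
    A = np.hstack([np.ones((len(X), 1)), d])  # design matrix [1, x - x0]
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(sw * A, sw[:, 0] * F, rcond=None)
    return coef[1:]  # slope part = estimated gradient

# Noisy sphere f(x) = ||x||^2 sampled from a particle "memory";
# the true gradient at x0 is 2 * x0
rng = np.random.default_rng(1)
x0 = np.array([0.5, -0.5])
X = x0 + 0.3 * rng.normal(size=(200, 2))
F = np.sum(X**2, axis=1) + 0.01 * rng.normal(size=200)
g = regional_gradient(x0, X, F)  # close to (1.0, -1.0)
```

Because the fit reuses evaluations already paid for by the swarm, the gradient term comes at essentially no extra simulation cost, which is the point of the memory-based scheme.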