Luis A. Moncayo-Martínez

About Me

Luis A. Moncayo-Martínez is an associate professor at the Instituto Tecnológico Autónomo de México (ITAM) and a member of the Mexican National System of Researchers (SNI in Spanish). He earned a PhD in Engineering from the University of Exeter (UK). His research focuses on developing algorithms to optimise logistics and manufacturing systems, and the results have been published in peer-reviewed journals such as the Int. J. of Production Economics, Expert Systems with Applications, the J. of Manufacturing Systems, and the Int. J. of Advanced Manufacturing Technology. He has carried out several process-improvement projects funded by CONACYT and private companies. He is a member of INFORMS, IISE, and the Int. Association for Statistical Computing.

Faculty Profile at ITAM

Education


+ PhD in Engineering (2008) - The University of Exeter (UK)

Thesis: "Bi-criterion optimisation for configuring the supply chain using Pareto ant colony metaheuristics"

+ MSc in Quality Control and Productivity (2002) - Tecnológico de Monterrey - Monterrey Campus (Mexico)

Thesis: Methodologies integration for performance improvement of a material handling system

+ BSc in Industrial Engineering (2000) - Instituto Politécnico Nacional - UPIICSA (Mexico)

Concentration: Project Evaluation and Optimization


Research Interests

Luis Moncayo's research interests are in the areas of modelling and optimization of manufacturing and logistics systems using operations research techniques such as linear and non-linear programming, simulation, and analytics. Current work is focused on:

Algorithms Design to Optimise Supply Chain and Manufacturing Systems

Supply Chain Design (SCD) problems encompass selecting suppliers, manufacturing plants, and the channels used to deliver products to the final customer. Two key issues in SCD are deciding who performs each of those activities and computing how much inventory to place at each stage to avoid disruptions at minimum cost.

When the supply chain has many stages, the problem is far from trivial because the number of solutions grows exponentially. It is therefore necessary to model the problem and then design algorithms that return approximate (meta-heuristic) or exact (large-scale) solutions for medium-size instances. The work centres on the guaranteed-service time inventory model, solved with large-scale optimization methods; these techniques extend naturally to assembly systems.
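As a minimal sketch of the guaranteed-service idea, the toy routine below enumerates, for a small serial chain, the outgoing service time each stage quotes to the next and keeps the assignment with the lowest total safety-stock cost. The stage data (processing times, holding costs, demand-variability factor `z_sigma`) are invented for illustration; real instances are solved with large-scale methods, not enumeration:

```python
import itertools
import math

def guaranteed_service_enum(T, h, z_sigma, s_max):
    """Toy guaranteed-service model on a serial chain.
    T[i]: processing time at stage i; h[i]: holding cost at stage i.
    Enumerate outgoing service times S[i] (what stage i quotes downstream)
    and minimise the total safety-stock cost, which grows with the square
    root of each stage's net replenishment time."""
    n = len(T)
    best_cost, best_S = math.inf, None
    for S in itertools.product(range(s_max + 1), repeat=n):
        if S[-1] != 0:            # last stage must serve customers immediately
            continue
        cost, feasible = 0.0, True
        for i in range(n):
            inbound = S[i - 1] if i > 0 else 0   # raw material always on hand
            tau = inbound + T[i] - S[i]          # net replenishment time
            if tau < 0:                          # cannot quote faster than possible
                feasible = False
                break
            cost += h[i] * z_sigma * math.sqrt(tau)
        if feasible and cost < best_cost:
            best_cost, best_S = cost, S
    return best_S, best_cost

# three-stage serial chain: supplier -> plant -> warehouse (made-up data)
S, cost = guaranteed_service_enum(T=[2, 3, 1], h=[1.0, 2.0, 4.0],
                                  z_sigma=10.0, s_max=6)
```

In this instance the cheapest policy makes the supplier quote its full processing time downstream (holding no safety stock) while the later, more expensive stages absorb the resulting replenishment time, which is exactly the trade-off the guaranteed-service model captures.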

Fast Simulation of Queues to Optimise Manufacturing Systems

In a traditional queueing system, entities wait in a queue until the server is free to process a new one. There are well-known commercial and open-source applications for simulating a complete system of connected queues, such as manufacturing (assembly) systems. Unfortunately, when the system has to process around 10 million entities, those applications either take a very long time or cannot handle all the entities.

My research interest lies in programming simulation routines in C++ and R that handle not only entities arriving at the queue but also entities already queued and entities scheduled at the start of the simulation.

The problem becomes harder when the system is a whole manufacturing system with multiple queues in series and in parallel. One open question is how these routines behave when implemented using Graphics Processing Unit computing.
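For the single-queue case, the kind of fast routine described above can be sketched in a few lines (a Python illustration only; the actual research routines are in C++ and R). For one FIFO server, each entity's waiting time follows Lindley's recursion, which runs in O(n) and therefore scales to millions of entities; entities already queued at the start of the simulation simply share arrival time 0:

```python
def lindley_waits(arrival_times, service_times):
    """Waiting time of each entity in a single FIFO queue via Lindley's
    recursion: W[k] = max(0, W[k-1] + service[k-1] - interarrival[k]).
    Entities already queued when the simulation starts have arrival time 0,
    and scheduled entities appear with their planned arrival times."""
    waits = [0.0]                     # first entity finds the server idle
    for k in range(1, len(arrival_times)):
        interarrival = arrival_times[k] - arrival_times[k - 1]
        waits.append(max(0.0, waits[-1] + service_times[k - 1] - interarrival))
    return waits

# three entities already queued at t = 0, plus two scheduled arrivals
waits = lindley_waits(arrival_times=[0, 0, 0, 5, 12],
                      service_times=[2, 2, 2, 3, 1])
```

Because the recursion touches each entity once and stores only the previous waiting time, it avoids the event-list overhead that slows general-purpose simulators at the 10-million-entity scale; extending it to queues in series and parallel is where the difficulty lies.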

Application of Operations Research and Analytics

In recent years, practitioners and researchers have applied data-driven analytics and fact-based decision making to practical problems. INFORMS defines analytics as the "process of transforming data into insights to make better decisions".

The research interest lies in "prescriptive analytics"; thus, the problem under study requires developing a mathematical model and then processing a large amount of data using advanced computer science and statistical techniques.

Some problems I have studied include optimizing inventory in bike-sharing systems using millions of trips, modelling Data Envelopment Analysis with capacity constraints, and minimizing the length of a telecommunications network that covers communities with optical fibre.

Each of those problems must first be modelled and then solved with metaheuristics, exact methods, or simulation to find an optimal solution.
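The data-to-decision pipeline can be illustrated with a deliberately small sketch (the demand figures and costs below are invented, and the real bike-sharing work uses millions of trips and richer models): an empirical newsvendor chooses a station's stock level directly from observed demand, turning raw data into a prescriptive decision:

```python
def optimal_stock(demands, shortage_cost, holding_cost):
    """Empirical newsvendor: pick the stock level q that minimises the
    average cost over the observed demand samples, where each sample
    incurs a shortage cost per unmet unit and a holding cost per
    leftover unit. This is prescriptive analytics in miniature: the
    model is fed by data, and its output is a decision."""
    best_q, best_cost = None, float("inf")
    for q in range(max(demands) + 1):
        cost = sum(shortage_cost * max(d - q, 0) + holding_cost * max(q - d, 0)
                   for d in demands) / len(demands)
        if cost < best_cost:
            best_q, best_cost = q, cost
    return best_q, best_cost

# made-up daily demand observations at one station
q, avg_cost = optimal_stock(demands=[1, 2, 3, 4, 5],
                            shortage_cost=2, holding_cost=1)
```

With shortage twice as costly as holding, the routine stocks above the median demand, matching the classical critical-fractile logic; with millions of trips the same idea is applied with far more data and far better optimization machinery.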

LuisMoncayoCV.pdf