This course is the second part of a two-part series on Operations Research. While the first part covers Linear Programming, this part focuses on Decision Analysis, Game Theory, and Markov Processes, with applications to economic decision-making, strategic interaction, and stochastic systems.
The key objectives are:
Understanding decision-making under uncertainty using expected monetary payoffs and utility theory.
Applying game theory to model economic and strategic interactions.
Analyzing Markov processes and their economic applications in dynamic decision-making, pricing strategies, and inventory models.
📌 Decision Analysis: understanding decision-making under uncertainty and comparing expected monetary payoffs with utility-based approaches.
Expected Monetary Payoffs
Decision Trees and Probabilistic Outcomes
Risk-Neutral Decision Making
Economic Applications: Investment Choices, Firm Strategy
Expected Utility Theory
Von Neumann–Morgenstern Utility Function
Risk Aversion and Certainty Equivalents
Applications: Insurance Markets, Portfolio Choice
🛠 Computational Tools: Python (NumPy for simulations, Matplotlib for decision trees)
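To give a feel for the computations in this module, here is a minimal NumPy sketch that compares two hypothetical investment alternatives by expected monetary value and by certainty equivalent under an exponential (constant absolute risk aversion) utility. The payoffs, probabilities, and risk-aversion parameter are illustrative assumptions, not course data.

```python
import numpy as np

# Hypothetical investment example: two alternatives, three market states.
# Probabilities and payoffs are illustrative only.
probs = np.array([0.3, 0.5, 0.2])            # P(strong), P(average), P(weak) market
payoffs = {
    "risky_project": np.array([120_000, 40_000, -60_000]),
    "safe_bond":     np.array([20_000, 20_000, 20_000]),
}

def emv(payoff, p):
    """Expected monetary value: probability-weighted average payoff."""
    return float(p @ payoff)

def expected_utility(payoff, p, r):
    """Expected utility under exponential utility u(x) = 1 - exp(-r*x)."""
    return float(p @ (1.0 - np.exp(-r * payoff)))

def certainty_equivalent(payoff, p, r):
    """Sure amount whose utility equals the lottery's expected utility."""
    eu = expected_utility(payoff, p, r)
    return -np.log(1.0 - eu) / r

r = 1e-5  # assumed coefficient of absolute risk aversion
for name, payoff in payoffs.items():
    print(f"{name}: EMV = {emv(payoff, probs):,.0f}, "
          f"CE = {certainty_equivalent(payoff, probs, r):,.0f}")
```

A risk-neutral decision maker ranks alternatives by EMV alone; the certainty equivalent shows how risk aversion shrinks the attractiveness of the risky project relative to its EMV.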
📌 Game Theory: an introduction to strategic interactions and equilibrium concepts, focusing on zero-sum games and mixed strategies.
Zero-Sum Games
Minimax Theorem and Optimal Strategies
Solving Zero-Sum Games via Linear Programming
Applications: Trade Wars, Auction Design, Labor Bargaining
Mixed Strategy Nash Equilibrium
Concept of Mixed Strategies and Expected Payoffs
Applications: Pricing Competition, Market Entry Models
Computational Methods: Simplex for Mixed-Strategy Computation
🛠 Computational Tools: Python (Game Theory Toolkit, Linear Programming solvers)
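As a sketch of the linear-programming route to zero-sum games, the example below sets up the row player's LP and solves it with SciPy's `linprog` (the choice of solver and the 3×3 payoff matrix are assumptions for illustration; any LP solver from the first course would do).

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative zero-sum payoff matrix for the row player (entries are made up),
# e.g., two firms each choosing among three competitive moves.
A = np.array([
    [ 3, -1,  2],
    [-2,  4,  0],
    [ 1,  1, -3],
])
m, n = A.shape

# Row player's LP: choose mixed strategy x and game value v to
#   maximize v   subject to   (A^T x)_j >= v for every column j,  sum(x) = 1,  x >= 0.
# linprog minimizes, so minimize -v; the decision vector is (x_1, ..., x_m, v).
c = np.concatenate([np.zeros(m), [-1.0]])
A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v - (A^T x)_j <= 0
b_ub = np.zeros(n)
A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]     # x >= 0, v unrestricted in sign

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[-1]
print("Row player's optimal mixed strategy:", np.round(x, 3))
print("Value of the game:", round(v, 3))
```

The column player's optimal strategy comes from the dual of this LP, which gives a concrete illustration of the minimax theorem.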
📌 Markov Processes: understanding stochastic processes and their role in dynamic economic decision-making.
Introduction to Markov Chains
Transition Matrices and Steady-State Probabilities
Economic Applications: Customer Retention, Brand Loyalty Models
Markov Decision Processes (MDPs)
Bellman Equations and Dynamic Programming
Economic Applications: Pricing Strategies, Optimal Stopping Problems
Applications in Economics and Finance
Stochastic Inventory Models
Credit Rating Transitions and Default Probabilities
🛠 Computational Tools: Python (Markov Chain Simulation, Dynamic Programming Algorithms)
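As an illustration of both tools, the sketch below first computes the steady-state distribution of a hypothetical brand-loyalty transition matrix, then runs value iteration (a dynamic-programming solution of the Bellman equations) on a toy two-state pricing MDP. All transition probabilities, rewards, and the discount factor are illustrative assumptions.

```python
import numpy as np

# --- Markov chain: brand loyalty -------------------------------------------
# Rows = current brand, columns = next brand; hypothetical monthly switching rates.
P = np.array([
    [0.80, 0.15, 0.05],   # customers of brand A
    [0.10, 0.75, 0.15],   # customers of brand B
    [0.20, 0.30, 0.50],   # customers of brand C
])

# Steady-state distribution pi solves pi P = pi with the entries summing to 1:
# stack (P^T - I) pi = 0 with the normalization row and solve by least squares.
k = P.shape[0]
A = np.vstack([P.T - np.eye(k), np.ones(k)])
b = np.concatenate([np.zeros(k), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("Steady-state market shares:", np.round(pi, 4))

# --- Markov decision process: pricing --------------------------------------
# Two demand states (high, low), two actions; P_act[a][s, s'] and R[a][s] are hypothetical.
P_act = {
    "high_price": np.array([[0.6, 0.4],
                            [0.3, 0.7]]),
    "low_price":  np.array([[0.8, 0.2],
                            [0.5, 0.5]]),
}
R = {
    "high_price": np.array([10.0, 2.0]),   # expected immediate profit per state
    "low_price":  np.array([6.0, 4.0]),
}

gamma = 0.95              # assumed discount factor
V = np.zeros(2)           # value function over the two demand states
for _ in range(1000):
    # Bellman update: V(s) = max_a [ R(a, s) + gamma * sum_s' P_a(s, s') V(s') ]
    Q = np.array([R[a] + gamma * P_act[a] @ V for a in P_act])
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        V = V_new
        break
    V = V_new

policy = [list(P_act)[i] for i in Q.argmax(axis=0)]
print("Optimal values per demand state:", np.round(V, 2))
print("Optimal pricing policy:", policy)
```

The same value-iteration loop extends directly to stochastic inventory models by reinterpreting states as inventory levels and actions as order quantities.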
Frederick Hillier & Gerald Lieberman (2020) – Introduction to Operations Research (Core concepts in OR, decision analysis, and Markov processes)
Wayne L. Winston (2003) – Operations Research: Applications and Algorithms (Algorithmic approaches to game theory and stochastic processes)
Robert Gibbons (1992) – Game Theory for Applied Economists (Economic applications of game theory)
Sheldon Ross (2014) – Introduction to Stochastic Processes (Comprehensive treatment of Markov processes)