Pankaj Mehra's new home
Measurement
Modeling
Analysis
Simulation
Workload mix
For each workload in the mix
Allocation/De-allocation and long-vs-short-lived behaviors
Access (and co-access) patterns in space and time
Page-level hotness/coldness
Page classification attributes
Data structure tags if any
as DTMCs (Discrete-Time Markov Chains) and PFQNs (Product-Form Queueing Networks)
CNN-based page classifiers
Watermark (high and low) modeling using Decision Trees and Random Forests
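The DTMC modeling idea above can be sketched minimally: treat each page's hotness as a two-state (HOT/COLD) discrete-time Markov chain and power-iterate to the stationary distribution, which estimates the long-run fraction of epochs the page spends hot. The transition probabilities here are illustrative assumptions, not measured values.

```python
# Two-state DTMC for page hotness. P[i][j] = probability of moving
# from state i to state j per observation epoch.
# States: 0 = HOT, 1 = COLD. Probabilities are assumed for illustration.
P = [
    [0.9, 0.1],   # HOT  -> HOT, HOT  -> COLD
    [0.2, 0.8],   # COLD -> HOT, COLD -> COLD
]

def stationary(P, iterations=1000):
    """Power-iterate a row distribution pi until it converges to pi = pi @ P."""
    pi = [1.0, 0.0]
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(2)) for j in range(2)]
    return pi

pi = stationary(P)
print(f"long-run P(hot) = {pi[0]:.3f}, P(cold) = {pi[1]:.3f}")
```

For these assumed probabilities the chain settles at P(hot) = 2/3; a placement policy could, for example, pin pages with high stationary hotness in near memory.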
[Pankaj] Our modeling work begins by learning from an all-time great, David Bailey of NASA Ames and DOE's NERSC at LBL. His 1998 paper "Challenges of Future High-End Computing" models a system of compute and memory using the simplest rule of queueing theory, Little's Law, and draws beautiful and timeless conclusions about the relationship between memory latency and the parallelism needed to exploit that memory at a given bandwidth. From it we start to understand where the price will be paid for the extra latency CXL adds to the memory path.
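Bailey's Little's Law argument can be worked as a one-liner: concurrency N = throughput x latency, so to sustain bandwidth B at access latency L a core must keep N = (B / line_size) x L requests in flight. The bandwidth and latency figures below are illustrative assumptions, not measurements of any specific CXL device.

```python
# Little's Law applied to memory: outstanding requests needed to
# sustain a target bandwidth at a given access latency.
LINE_SIZE = 64  # bytes per cache line

def outstanding_requests(bandwidth_gbps: float, latency_ns: float) -> float:
    """Concurrency needed to sustain bandwidth_gbps (GB/s) at latency_ns.

    1 GB/s == 1 byte/ns, so bandwidth in GB/s is also bytes per ns.
    """
    requests_per_ns = bandwidth_gbps / LINE_SIZE
    return requests_per_ns * latency_ns

# Assumed numbers for illustration: same 32 GB/s target, with a
# hypothetical CXL hop adding latency over local DRAM.
local = outstanding_requests(bandwidth_gbps=32, latency_ns=100)
cxl = outstanding_requests(bandwidth_gbps=32, latency_ns=250)
print(f"local DRAM: {local:.0f} outstanding lines")
print(f"CXL-attached: {cxl:.0f} outstanding lines")
```

The point falls out immediately: under these assumptions the extra latency multiplies the in-flight requests the hardware must track by 2.5x, which is exactly where the price of CXL's added latency is paid.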
Analysis of Allocation, Promotion and Demotion (aka Data Placement) policies
[Marginal] Overall memory cost and utilization analysis and optimization with pools and FAM
[Marginal] Access performance analysis and optimization in the presence of far/shared/computational memory
[Joint] Cost-performance multi-objective optimization: Branch-and-Bound with dominance relations and other Pareto-optimality techniques
[Multi-tenancy, Win-Win-Win strategies, and Nash Equilibria] Exploring the conditions of optimal benefit among End Users
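The dominance relation driving the Branch-and-Bound / Pareto-optimality item above can be sketched directly: a placement candidate dominates another if it is no worse on every objective and strictly better on at least one; pruning dominated candidates leaves the Pareto front. The (cost, latency) values below are hypothetical placements, assumed for illustration.

```python
# Pareto dominance over (cost $/GB, avg latency ns); lower is better
# on both objectives.

def dominates(a, b):
    """True if candidate a Pareto-dominates b (minimization on all axes)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only candidates not dominated by any other candidate."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o is not c)]

# Hypothetical placements: e.g. all-local DRAM, all-CXL pool, a tiered mix,
# and a configuration that is worse on both axes.
placements = [(8.0, 100), (4.0, 250), (6.0, 180), (9.0, 260)]
print(pareto_front(placements))
```

In a Branch-and-Bound search the same test is applied to partial solutions: any branch whose best-case (cost, latency) bound is dominated by an already-found placement is pruned without expansion.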