Introduction

Course Goal

By the end of this course you will be able to create dynamic, interactive models for others to use, with minimal risk and maximum transparency.

Introduction

TBM is a paradigm shift in Excel modeling. Legacy Excel modeling methods are designed for simplicity. TBM is designed to make complexity simple. 

To illustrate how legacy methods fall short when faced with complexity, we will examine a multi-dimensional model created to meet Paul Mireault's Multi-Dimensional Spreadsheet Challenge. Paul challenged the world's best modelers to solve his puzzle. Click here to see the challenge's results. Most modelers used traditional methods; I (and a friend of mine) used a database approach. Paul wrote:

"Participants using the Database approach usually have the simplest formulas for aggregated variables. We expect that they will be the easiest to maintain."

Indeed, TBM models are easier to maintain, expand, modify, and construct, and they accept analytical functionality more readily. This course will show how. But before we tackle complex problems, we will explore familiar territory for today's financial modeler: we will see how TBM can also handle a simple 3-way financial model (see GIF). And we will learn about TBM from a professional perspective.

Course Outline

Professional Modeling


“An amateur practises until he can do a thing right,

a professional until he can’t do it wrong.” 


Psychology for Musicians by Percy C. Buck

Most people think skill is what differentiates amateurs from professional modelers. That's not how I see it. Non-professionals can be highly skilled too; I know some. In my opinion, what differentiates amateurs from professionals is who uses the model: the modeler or someone else.

Modeling for Ourselves vs Modeling for Others

There are fundamental differences in how we model when we model for ourselves versus for others. When modeling for ourselves we are: free to decide what the model should do, free to structure it any way we like, and free to fix it whenever it breaks, because we are its only user.

All these freedoms spell disaster when modeling for others. When modeling for others we are: required to deliver exactly what our customers need, required to make the model easy to understand and use, and required to make it robust enough that it cannot easily break.

Neglecting any of these requirements can be disastrous. Deliver the wrong thing and we fail. Deliver something our customers cannot understand how to use and we fail. Deliver something customers can easily break, or that breaks while being used, and we fail.

Models that Cannot Fail

To re-frame Percy C. Buck's excellent quote: An amateur models until everything works, a professional until nothing can fail.

This course focuses on building models that cannot fail. But more than that, it shows how to improve transparency and efficiency while reducing complexity and risk. It also shows how to provide greater flexibility, scalability, and connectivity than traditional methods. These benefits grow in value as models become more complex and connected to other models, data sources, and people.

Modeling for Complexity and Flexibility

Modeling with spreadsheets started in 1979 with the birth of VisiCalc. VisiCalc ran on a 32 KB Apple II, which limited spreadsheets to 254 rows and 63 columns. Models had to be extremely simple. Early modelers typically took a P&L printout, keyed it into a spreadsheet, and replaced the totals with formulas. With these simple models we could answer questions like, "How will a 5% increase in raw materials impact profits?"

PCs now run more than 1,000 times faster across as many as 64 cores, and Excel can hold over a million rows and more than 16,000 columns. Clearly, the spreadsheet's evolution enables far more complex models, but the foundation of spreadsheet modeling in many sectors remains unchanged - calculations still look like financial statements - and that severely limits complexity and flexibility.

Today's Excel is no longer restricted to mimicking financial statements. Today's Excel has the power to mimic the systems that create them. These systems are often called ERP (Enterprise Resource Planning) software. All ERPs share the same basic architecture: tables store data, programs process data, and presentations display data. This architecture supports not just financials but also human resources, payroll, sales forecasting, production planning and scheduling, supply chain planning and scheduling, and much more. It can support all these functions because it is designed for flexibility and complexity.
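As a rough illustration of how that three-layer architecture might map onto Excel (the specific feature names are my own sketch, not a formal definition): Excel Tables play the role of the tables layer, holding one row per record and one column per field; formulas, Power Query, and Data Model measures play the role of the programs layer, processing that data; and PivotTables, charts, and report sheets play the role of the presentations layer, displaying the results.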

Making Complexity Easier to Understand

To state the obvious, complex models have more cells than simple models. To understand any model in Excel, we must examine those cells to see how they relate to each other. To understand a formula that uses cell addresses for its precedents, such as =A1+B1, we must trace each reference to learn what A1 and B1 actually are. Hopefully, those cells have labels nearby. That takes a lot of time and quickly becomes too cumbersome for complex models. There is a better way, and that better way is how every ERP on the planet references information.

Every ERP shares the same basic architecture, of which tables are the foundation. Excel now has tables. In 2007 Excel's Lists became Tables with structured references, and at that point Excel's tables began to look like an ERP's files. Both have column headings (also known as field names). When programs use a file's data, they refer to data elements by field name, not position. This is a huge leap for enabling complexity because it removes the translation step required to understand cell references. We can just look at the formula and know what each precedent is: =[@[Material Costs]] + [@[Labor Costs]]. That saves huge amounts of time during development and auditing.
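Here is a minimal sketch of the difference (the table and column names are my own illustration). Imagine an Excel Table named Costs with columns Material Costs, Labor Costs, and Total Costs. With plain cell addresses, the Total Costs formula in the first data row might read:

=B2+C2

With structured references, the same formula reads:

=[@[Material Costs]]+[@[Labor Costs]]

The second version documents itself: every precedent is named, the formula no longer depends on where the columns happen to sit, and Excel fills it down the calculated column automatically as rows are added.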

Making Models Flexible

Flexibility means being able to use one design for many purposes. ERPs are not built for just one business type; they are flexible enough to be leveraged across many different businesses. One reason they are so flexible is that a lot of thought went into their table design. That thinking is formalized in what is known as normalized data. Normalized data is structured so that any specific piece of information is stored once and can easily be associated with related information. The ability to relate data easily was missing from Excel until Microsoft added the Data Model.
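A minimal sketch of what normalization looks like in Excel terms (the table and column names are my own illustration): a Products table stores each product's Unit Price exactly once, while a Sales table records only Product and Quantity. A formula in the Sales table can relate the two without ever copying the price:

=[@Quantity] * XLOOKUP([@Product], Products[Product], Products[Unit Price])

Alternatively, both tables can be loaded into the Data Model and related on the Product column, so the relationship lives in the model rather than in lookup formulas.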

Summary

New Excel features like tables, structured references, and the Data Model may seem exotic. But if you have mastered cell references and all of their manifestations (relative, absolute, mixed), you can master these in just a few minutes. Once learned, you will never go back to the old methods.
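For comparison (illustrative only): a relative reference like A1 shifts as a formula is copied, an absolute reference like $A$1 never shifts, and mixed references like $A1 or A$1 lock only the column or only the row. A structured reference such as [@[Material Costs]] replaces all of that positional bookkeeping with a name.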


Conclusion

TBM mimics systems proven to address the needs of complex projects across a broad spectrum of applications. This makes TBM ideal for modelers in engineering, operations, supply chain, sales, and especially FP&A (Financial Planning and Analysis) departments that must integrate with ERPs to model and analyze not just financials but also human resources, payroll, sales forecasting, production planning and scheduling, supply chain planning and scheduling, and much more.