Data Analysis

Preface

We are at a very interesting point in the history of Information Technology and Decision Support Systems. On one side, we claim that our advances in technology offer access to data beyond imagination, with incredible processing speed; on the other side, our business decision makers do not feel any better equipped than the previous generation to pick the right alternative supported by relevant data. Our train is running faster and faster over time, yet it is not getting us to our destination any sooner. Perhaps the speed we are gaining is not being channelled in the right direction, because we are not paying enough attention to the route it takes. On the contrary, the availability of super-computing speed and cheap access to data makes us more reluctant to chart out a well-thought-out plan before undertaking a data analysis exercise, which compromises the efficiency of the analysis and can reduce the net improvement from the effort to zero. This article is an effort to rediscover the basic steps and principles of data analysis, and to find out which steps or methods really add value to a data analysis project for a Decision Support System, taking a back-to-basics approach.

We will critically analyze a couple of successful data analysis exercises that supported decision making in different eras of computing history, and try to figure out what makes a data analysis effort useful and impactful for business decision making, in spite of the wide gap in data accessibility and processing power between those eras.

Executive Summary

Analyzing decision support systems from the early introduction of the computer in industry onwards, it can be observed that technology has given management ever more accurate and better organized data for decision making, but the core principle of a successful data analysis method for decision making has always remained the same.

The crux of this methodology is:

 

Pre-computerization decision support system

 

Prior to 1960, there was hardly any electronic copy of information; any analysis had to be done manually.

 

In this era, all organizations maintained their documents and historical data in the form of paper files. Organizational information management at this time can be visualized as an old library management system. The efficiency of information-based decisions relied heavily on two factors.

With little or no help from computers, any information crunching was a time-consuming and very costly process. To carry out any decision support analysis, a manager had to make a trade-off: how much effort could be diverted without sacrificing regular production-related activities. Extensive data analysis that did not produce value was a luxury that the management of those days could not afford.

Case study of DSS prior to computerization

A very common decision-support data gathering process that almost all manufacturing industries used to perform was inventory classification and analysis. The most common of all the classifications was the ABC classification process.

ABC analysis, Pareto’s law and the “80/20 rule” are all synonyms for the same tool, which basically states that 20% of a given population represents 80% of a specific characteristic. In purchasing, a basic ABC analysis is used to identify which segments represent most of the spend in a given category or portfolio. Most of the time, a few segments in a portfolio constitute the largest part of the total spend.

Usually,

- The A segments represent approximately 80% of the total spend within a category

- The B segments represent the following 15% of the total spend within a category

- The C segments are the remaining ones (most of the time several segments), which represent the final 5% of the total spend

 

The following six steps are used to perform an ABC analysis:

1st step – Identify the objective and the analysis criterion

2nd step – Collect data about the analyzed population

3rd step – Sort out the list by decreasing impact

4th step – Calculate the accumulated impact and the percentage

5th step – Identify the classes

6th step – Analyze the classes and take appropriate decisions

 

1st step – Identify the objective and the analysis criterion

In this step, the objective is clearly defined. In the proposed example below, we are responsible for buying safety material for a plant. Our objective is to generate savings on the prices of the different pieces of equipment by renegotiating existing contracts. We would like to know which equipment segments should be targeted with renegotiation efforts and resources, so that the maximum savings are generated. Therefore, we choose to undertake an ABC analysis of the purchasing category, classified by annual spend.

2nd step – Collect data about the analyzed population

In this step, collect the data necessary to carry out the analysis: the list of the different pieces of safety equipment and the corresponding annual spend.

3rd step – Sort out the list by decreasing impact

In this step, sort the list of equipment in decreasing order according to our criterion of analysis, i.e. annual spend.

 

 4th step – Calculate the accumulated impact and the percentage

In this step, accumulate the annual spend and calculate the percentage of the accumulated spend compared to the total. Example: the coverall segment alone amounts to 650 k€, which is 44.11% of the total spend; coveralls and monitoring systems together make 1,017 k€, which is 69.01% of the total spend; coveralls + monitoring systems + safety boots make 1,139 k€, which is 77.32% of the total spend; and so on.

 

5th step – Identify the buy classes

In this step, separate the segments into three different classes. Keep in mind that the aim of the game is not to hit a precise 80/20 split, but rather to understand which segments to focus efforts on in order to get the expected results, in this example by renegotiating prices.

6th step – Analyze the classes and make appropriate decisions

In this final step, do a final analysis to determine the next steps (a code sketch of the whole computation follows the list below). The decisions coming out of this analysis could be:

• To investigate how many suppliers offer coveralls (and monitoring systems and safety boots); if there are several, what do we gain by consolidating the volume with one supplier?
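To make steps 2 to 5 concrete, here is a minimal Python sketch of the ABC computation. The figures for coveralls, monitoring systems and safety boots are taken from the worked example above; the remaining segments and their spends are hypothetical, chosen only so that the cumulative percentages stay close to those in the example.

    # Minimal ABC analysis sketch (steps 2 to 5).
    # Coveralls, monitoring systems and safety boots use the figures from
    # the worked example; the other segments are hypothetical filler.
    segments = {
        "coveralls": 650,           # k€ annual spend (from the example)
        "monitoring systems": 367,  # 1017 - 650
        "safety boots": 122,        # 1139 - 1017
        "helmets": 120,             # hypothetical
        "gloves": 100,              # hypothetical
        "ear protection": 70,       # hypothetical
        "goggles": 44,              # hypothetical
    }

    # Step 3: sort the list by decreasing spend.
    ranked = sorted(segments.items(), key=lambda kv: kv[1], reverse=True)

    # Steps 4 and 5: cumulative spend, percentage of total, and class.
    total = sum(segments.values())
    cumulative = 0
    for name, spend in ranked:
        cumulative += spend
        pct = 100 * cumulative / total
        cls = "A" if pct <= 80 else ("B" if pct <= 95 else "C")
        print(f"{name:20s} {spend:5d} k€  cum {pct:6.2f}%  class {cls}")

Running this reproduces the pattern described above: the top three segments fall in class A (roughly 77% of the spend), the next two in class B, and the tail segments in class C.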

 

At least once a year, management used to undertake this painstaking analysis to control inventory expenses. It involved a big team effort: not only collecting and preparing the data, but also analyzing it, which normally meant long meetings for multiple team members.

Early introduction of computer in the industry

 

Prior to the mid-1960s, it was not cost effective to build large-scale information systems. The first Management Information Systems (MIS) were developed at about that time in large companies. MIS focused on providing managers with structured, periodic reports. Much of the information was from accounting and transaction systems.  

Academic works for decision support systems before 80’s

 

In the late 1960s, a new type of information system became practical: model-oriented DSS, or management decision systems. Two DSS pioneers, Peter Keen and Charles Stabell (1978), claim the concept of decision support evolved from “the theoretical studies of organizational decision making done at the Carnegie Institute of Technology during the late 1950s and early '60s and the technical work on interactive computer systems, mainly carried out at the Massachusetts Institute of Technology in the 1960s.”

In 1971, Michael S. Scott Morton’s book Management Decision Systems:  Computer-Based Support for Decision Making was published. In 1968-69 Scott  Morton studied how computers and analytical models could help managers make a key  decision. He conducted an experiment in which managers actually used a Management Decision System (MDS). Marketing and production managers used an MDS to coordinate production planning for laundry equipment. Scott Morton's research was a pioneering implementation, definition and research test of a model-based decision support system. 

T.P. Gerrity, Jr. focused on Decision Support System design issues in his 1971 Sloan Management Review article titled "The Design of Man-Machine Decision Systems: An Application to Portfolio Management". His system was designed to support investment managers in their daily administration of a client's stock portfolio. DSS for portfolio management have become very sophisticated since Gerrity began his research.

In 1974, Gordon Davis, a Professor at the University of Minnesota, published his influential text Management Information Systems: Conceptual Foundations, Structure, and Development. He asserted that the MIS concept was “a substantial extension of the concepts of managerial accounting taking into consideration the ideas and techniques of management science and the behavioral theories of management and decision making”.

Davis defined a Management Information System as "an integrated, man/machine system for providing information to support the operations, management, and decision-making functions in an organization. The systems utilize computer hardware and software, manual procedures, management and decision models, and a database."   

Davis's Chapter 12 titled "Information System Support for Decision Making", and Chapter 13 titled "Information System Support for Planning and Control" created the setting for the development of a broad foundation for Decision Support Systems research and practice. MIS was in many ways beginning to converge with DSS concepts. 

By 1975, J. D. C. Little was expanding the frontiers of computer-supported modeling. 

Little's DSS, called Brandaid, was designed to support product, promotion, pricing and advertising decisions. Little, in his Management Science article titled "Models and Managers: The Concept of a Decision Calculus", identified criteria for designing models to support management decision making. His criteria included robustness, ease of control, simplicity, and completeness of relevant detail.

Peter G. W. Keen and Michael Scott Morton's DSS textbook titled Decision Support Systems: An Organizational Perspective was published in 1978. Their text provided a comprehensive behavioral orientation to DSS analysis, design, implementation, evaluation and development.

The Software Industry in the 1970s

 

The 1970s saw the contract programming industry continue to grow at a rapid pace. These companies came to be known as "professional services" firms, reflecting the fact that they often provided a broad range of consulting, analysis and design services in addition to programming.

The software products industry became firmly established as a viable source of software for computer users.   If, at the beginning of the 1970s, customers were skeptical that software purchased from a vendor could meet their needs as well as software written in-house, by the end of the decade almost all computer users were buying some portion of their software from software products companies.  As a result of unbundling, the hardware vendors were also major players in the software products field. 

In 1971, ICP held its first annual Million Dollar Awards program recognizing software products that had generated over $1 million in revenue. Twenty-nine software products made the list.  By 1976, the Million Dollar Awards roster had grown to 100 products from 64 software companies.  Fifty-two products had passed $5 million in revenue, 15 had passed $10 million, 4 had passed $20 million, and Cincom’s TOTAL database management system received an award for revenues in excess of $50 million.

The invention of the PC in the mid-1970s led to the founding of the first PC software firms, such as Microsoft and Software Arts (the producer of the VisiCalc spreadsheet program).

Despite the growth and demonstrated success of the software industry throughout the 1970s, it was not generally recognized as a major investment opportunity. Funding dried up even for those professional services firms that had gone public in the 1960s, and the newly established software products firms had an extremely difficult time raising capital to fund their growth. Almost all of their growth was internally financed, and many of the company founders extensively leveraged their own personal assets to keep their companies going. In 1978, a full ten years after its founding, Cullinane Corporation went public, the first software products company to do so. But the perception of the software business as an investment opportunity was still a number of years away.

 

Mainframe Computers

 

A mainframe computer is a large computer situated in its own air-conditioned room. It normally contains at least one very powerful processor and many times more memory than a personal computer. Such computers were the main workhorses of industrial computing up to the late 1980s.

Several manufacturers produced mainframe computers from the late 1950s through the 1970s. The group of manufacturers was first known as "IBM and the Seven Dwarfs": IBM, Burroughs, UNIVAC, NCR, Control Data, Honeywell, General Electric and RCA. Later, as the group shrank, it was referred to as IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data and Honeywell). IBM's dominance grew out of its 700/7000 series and, later, the development of the 360 series mainframes. The latter architecture has continued to evolve into the current zSeries/z9 mainframes which, along with the then Burroughs and now Unisys MCP-based mainframes, are among the few mainframe architectures still extant that can trace their roots to this early period. That said, while they can still run 24-bit System/360 code, the 64-bit zSeries and System z9 CMOS servers have nothing physically in common with the older systems. Notable manufacturers outside the USA were Siemens and Telefunken in Germany, ICL in the United Kingdom, and Fujitsu, Hitachi, Oki and NEC in Japan. The Soviet Union and Warsaw Pact countries manufactured close copies of IBM mainframes during the Cold War; the Strela is an example of an independently designed Soviet computer.

 

Desktop computing revolution

 

The ‘70s and the ‘80s could easily be described as the era of the ‘computer wars’. Every company had a new kind of computer, better than the last, with which it wanted to change the world. Everyone knew it was only a matter of time before one was adopted as the standard, with all the advantages for software compatibility this would bring, and each company was desperate for its own model to make the big time.

In the late ‘70s and early ‘80s, two computers nearly became dominant: the Apple II and the Commodore 64. Both of these computers sold in the millions, inspiring a whole generation; they were used for everything from office tasks to games.

It was in 1981, however, that IBM launched its IBM PC, and things really went crazy. The IBM PC's architecture was open rather than proprietary. IBM went to a small company named Microsoft for an operating system for this computer and ended up with DOS, but Microsoft was free to license DOS to anyone else who paid its fee. By 1984, ‘IBM PC compatible’ computers were available, and a de facto standard was born. Software makers could finally write their programs for one operating system and one hardware configuration, and any computer that didn't follow the specification to the letter was quickly left with no programs to run.

In 1990, Microsoft released Windows 3.0 (the first really successful version of Windows), and the PC's lock on the marketplace was set in stone. The release of the Pentium and Windows 95 finally made it the fastest, cheapest and easiest system around, and it quickly stopped making sense to develop software for anything else.

From then on, the PC was the dominant computer – today, it is estimated to have between 95% and 98% of the market, with almost all the rest being held by Apple Macintosh computers.

Case study from the period of the early introduction of computer-based MIS in the industry

A leading automobile manufacturer in India decided to phase out one of its models. The purpose of the phase-out was to discontinue an old model for the sake of better brand imaging. The cars still had some demand, so continuing to produce the model would still generate revenue, but management felt the production facility could be better utilized, and the company would gain a competitive edge in the market, if the new models could be introduced as soon as possible.

But this car model had been rolling out of the factory for the last 20 years, so the plant still held a good amount of inventory in various stores and warehouses; the initial estimate of that inventory ran to multiple crores of rupees. An immediate phase-out of the model would therefore register a big loss in the accounts, since finance would need to write off the inventory value from the books. On the other hand, if production of the model continued without knowing where to stop, the regular Production Planning and Control department would keep bringing in material to meet production requirements.

This is a typical model phase-out dilemma for this kind of industry. We will now discuss how, with the computer resources available at the time, the phase-out decision was taken with an information-driven, planned approach that was more scientific than the intuition-based decision making widely practiced before the introduction of computers in the industry.

What computer resources did the organization have at that time?

The company had an Ingres-based mainframe system, developed in house, for managing inventory, production and finance, and for tracking sales.

In addition, every department had a DOS-based PC loaded with dBase, Lotus 1-2-3, WS and similar tools.

Steps followed to execute the project:

 

Once these two graphs were available, management could decide, with facts and figures at hand, how many additional cars to manufacture and how much of the remaining inventory to leave to be exhausted by spare-part demand. A sketch of the underlying computation follows.
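The source does not spell out the intermediate steps, so the following Python sketch only illustrates the kind of projection the two graphs would have been built from. All part names and quantities here are hypothetical; the original data came from the plant's mainframe inventory system and was analyzed on the departmental PCs.

    # Hypothetical sketch of the phase-out projection.
    # All figures are illustrative, not from the case study.
    on_hand = {"part_a": 12000, "part_b": 4500, "part_c": 30000}     # units in stock
    per_car = {"part_a": 4, "part_b": 1, "part_c": 10}               # BoM usage per car
    spares_per_month = {"part_a": 200, "part_b": 50, "part_c": 400}  # spare-part demand

    # Projection 1: how many complete cars the current stock can support.
    buildable_cars = min(on_hand[p] // per_car[p] for p in on_hand)
    print(f"Cars buildable from current stock: {buildable_cars}")

    # Projection 2: how long the leftover stock would last as spares
    # if production stopped after `buildable_cars` cars.
    for p in on_hand:
        leftover = on_hand[p] - buildable_cars * per_car[p]
        months = leftover / spares_per_month[p]
        print(f"{p}: leftover {leftover} units covers {months:.1f} months of spares")

Plotted over time, projections of this kind would yield the two graphs the text refers to: inventory consumed by continued production on one side, and the residue drained by spare-part demand on the other.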

March of ERP systems

 

What is ERP?

Enterprise Resource Planning or ERP is an industry term for integrated, multi-module application software packages that are designed to serve and support multiple business functions. An ERP system can include software for manufacturing, order entry, accounts receivable and payable, general ledger, purchasing, warehousing, transportation and human resources. Evolving out of the manufacturing industry, ERP implies the use of packaged software rather than proprietary software written by or for one customer. ERP modules may be able to interface with an organization's own software with varying degrees of effort, and, depending on the software, ERP modules may be alterable via the vendor's proprietary tools as well as proprietary or standard programming languages.

 

Brief History of ERP

 

The focus of manufacturing systems in the 1960s was on inventory control. Most of the software packages of the time (usually customized) were designed to handle inventory based on traditional inventory concepts. In the 1970s the focus shifted to MRP (Material Requirements Planning) systems, which translated the master schedule built for the end items into time-phased net requirements for the planning and procurement of sub-assemblies, components and raw materials.

In the 1980s the concept of MRP-II (Manufacturing Resource Planning) evolved, an extension of MRP to shop floor and distribution management activities. In the early 1990s, MRP-II was further extended to cover areas like engineering, finance, human resources and project management, i.e. the complete gamut of activities within any business enterprise. Hence, the term ERP (Enterprise Resource Planning) was coined.

The following table summarizes the evolution of ERP from the 1960s to the 1990s:

Decade | System | Focus
1960s | Inventory control packages | Inventory handling based on traditional concepts
1970s | MRP | Time-phased net requirements from the master schedule
1980s | MRP-II | Extension to shop floor and distribution management
1990s | ERP | The complete gamut of enterprise activities

Case Study in ERP era

Problem statement: Establish a game plan to maintain the supply of a critical part whose shortage was stopping the plant. The part was under-supplied by an overseas vendor because of wrong supply planning, caused by an understated quantity in the BoM.

Problem details: An automobile company faced a typical issue with its supply chain Bill of Materials (BoM). One of its door pad clips is required at 6 per car, but the BoM showed 4, so planning under-ordered the part by 2 clips per car and stock was consumed faster than planned. Without this part the plant would have to stop production: the door pads cannot be fitted without it, and a car rolled out without door pads lets rainwater inside. The rolled-out cars are stored in an open storage area, it was monsoon time, and the area receives heavy rainfall for 4 months.

This is an imported part that comes in kit form, packed as 24 cars' worth of parts per kit. It takes about 30 days to order additional material and bring it by ship, and 7 days to bring it by air, but the air price is 20 times higher.

Without this part the plant would need to close, and that is a heavy loss for the company for every closed day; it also impacts the delivery schedule.

So a task force was assigned the job of chalking out a plan to ensure a regular supply of the part with minimum expenditure, avoiding a plant shutdown, and solving the problem for good.

The recommendation that came out is described below.

But to execute this plan, a real-time snapshot of the material throughout the supply chain was needed. Fortunately, the company had a well maintained ERP system for operating the business, where this data was available.

The challenge now was how to extract the data from the system, use it to make the decisions needed to resolve the problem, and present correct data to upper management.

So they took the help of the IT department, who ran a couple of queries on the database to capture a real-time snapshot of the material across the supply chain, and provided the data in this format:
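The actual queries and the output format are not reproduced in the source. Purely as an illustration of the kind of extraction involved, a snapshot query might look like the following Python sketch; the table and column names (parts, stock, transit and so on) are hypothetical.

    # Illustrative only: the source does not show the real queries.
    # Table and column names are hypothetical.
    import sqlite3

    conn = sqlite3.connect("erp_snapshot.db")  # hypothetical extract of the ERP data

    SNAPSHOT_SQL = """
    SELECT p.part_no,
           p.description,
           s.on_hand_qty,           -- stock in stores and warehouses
           t.in_transit_qty,        -- material already shipped by the vendor
           t.expected_arrival
    FROM parts p
    JOIN stock s ON s.part_no = p.part_no
    LEFT JOIN transit t ON t.part_no = p.part_no
    WHERE p.part_no = ?              -- the critical door pad clip
    """

    for row in conn.execute(SNAPSHOT_SQL, ("CLIP-001",)):
        print(row)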

To keep up regular production of 25 cars per day with the available supply of the shortage part, we need to figure out when and by how much the part will go out of stock. So a graph was created of the expected on-hand balance for the next 30 days, taking the on-hand and in-transit material as supply and 25 cars' worth of material per day as consumption.

Find the maximum shortage quantity within these 30 days and order that quantity by air (here it is 1099). Since the first stock-out comes only after 7 days, the plant will not actually face a stock-out, because the air shipment arrives in 7 days. The remainder of the total deficit quantity is then ordered by ship, which takes 30 days to arrive. With the expected air shipment, the projected on-hand balance will look like this:
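The projected balance itself is not reproduced here, but the computation behind it can be sketched in a few lines of Python. The on-hand and in-transit figures below are hypothetical; the daily consumption of 25 cars x 6 clips = 150 clips per day, and the 7-day air and 30-day ship lead times, come from the case study. The assumption that corrected regular supply resumes at day 60 is mine, added only to make the ship-order quantity concrete.

    # Sketch of the projected on-hand balance and the air/ship split.
    # On-hand and in-transit figures are hypothetical.
    DAILY_USE = 25 * 6                  # 25 cars/day x 6 clips/car = 150 clips/day
    on_hand = 1200                      # hypothetical current stock
    arrivals = {10: 1500, 22: 900}      # hypothetical in-transit: day -> quantity
    AIR_WINDOW = 30                     # the air order must cover the first 30 days
    FULL_HORIZON = 60                   # assumed day when corrected supply resumes

    balance, worst30 = on_hand, 0
    for day in range(1, FULL_HORIZON + 1):
        balance += arrivals.get(day, 0) - DAILY_USE
        if day <= AIR_WINDOW:
            worst30 = min(worst30, balance)    # deepest dip within 30 days

    air_qty = -worst30                     # deepest 30-day shortage, ordered by air
    ship_qty = max(0, -balance - air_qty)  # rest of the deficit, ordered by ship

    print(f"Order by air (the case study figure was 1099): {air_qty}")
    print(f"Order by ship, arriving in 30 days: {ship_qty}")

With the hypothetical numbers above this prints an air order of 900 and a ship order of 4500; in the actual case, the deepest 30-day shortage came to 1099.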

So the steps are:

1. Take a real-time snapshot of the on-hand and in-transit material from the ERP system.
2. Project the on-hand balance for the next 30 days against a consumption of 25 cars' worth of material per day.
3. Order the maximum projected shortage quantity by air (7-day lead time).
4. Order the remaining deficit quantity by ship (30-day lead time).
5. Correct the BoM quantity from 4 to 6 so that regular supply planning no longer under-orders the part.

Once that delivery arrives, the plant will have overcome the problem for good. Thus the real-time data available in the ERP system helped management make the decision to resolve this problem in a very controlled way.

It should be noted that an expert IT professional was needed here to create the right queries against the ERP system, understand the need from the executing professionals, and present the data to them in some tool. This is how the executives reached a decisive conclusion by analyzing the data.

Conclusion

Analyzing decision support systems from the days of no computers, through the early introduction of computers in industry, and into the recent era of computerization, it can be observed that technology provides ever more accurate and better organized data to management for decision making, but the core principle of the data analysis method for decision making is always the same.

The crux of the methodology: