
Training Ideas

General Notes

 

On the topic of stats …

 

If/when I can reduce workload in other areas, I would be excited to continue earlier efforts to expand the skill sets and competencies of plant QEs in key areas, such as those noted below.

 

If you would like, we can work with anyone at your facilities who you feel currently is, or would like to become, a subject-matter expert in any area.  They might assist with course preparation, teach various aspects, or simply focus on mastering specialized material (where it adds value for you).

 

Regards,

 

Cliff

What the goal is:   to develop rock-solid competencies for company QEs

What the goal is not:   the goal is NOT to turn everyone into a fully-qualified statistician

Most of these topics would include discussions regarding:

  • the assumptions behind the statistical methods,
  • how to conduct diagnostics on assumption validity, and 
  • what to do when the assumptions aren't met

We would also discuss formats for reports of statistical analyses (some templates exist), aside from the Protocol Report formats that are already templated.

Basic Methods

  • Data management
    • importing from various sources (.csv files, MS Excel, MS Access, ODBC)
    • stacking/unstacking
    • recoding
    • standardizing
    • validation of a dataset
  • descriptive stats
  • graphs and charts
  • t-test
  • chi-squared test
  • etc. etc. etc.
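As a flavor of the from-scratch style this module could use, here is a minimal sketch of a two-sample (Welch) t-test done "by hand" with only the Python standard library, so the formula is visible next to what the software reports.  The two production-line samples are invented for illustration.

```python
# Welch's t-test from scratch -- connects the textbook formula to the
# numbers a stats package would print.  Data are illustrative only.
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's t statistic and degrees of freedom for two samples."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = statistics.fmean(sample_a), statistics.fmean(sample_b)
    va, vb = statistics.variance(sample_a), statistics.variance(sample_b)
    se2 = va / na + vb / nb                      # squared standard error
    t = (ma - mb) / math.sqrt(se2)
    # Welch-Satterthwaite degrees of freedom
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

line_1 = [10.2, 9.9, 10.4, 10.1, 10.3]
line_2 = [10.6, 10.8, 10.5, 10.9, 10.7]
t, df = welch_t(line_1, line_2)
print(f"t = {t:.2f}, df = {df:.1f}")
```

The same "compute it yourself, then check it against the software" pattern works for the chi-squared test and descriptive statistics.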

Attribute Sampling Plans (Z1.4, MIL-STD-105x)


I get the sense that there are those who still only have a superficial understanding of how these plans are used, why they work, and when they are appropriate.  The goal would be to develop understanding to the level that the QE can reproduce the tables and O-C curves.  We would also discuss when these sampling plans are appropriate, and when they are not (with a case study from a previous project).
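Reproducing an O-C curve comes down to one binomial sum.  The sketch below computes points on the operating-characteristic curve for a single sampling plan; the plan (n = 80, c = 2) is a hypothetical example, not taken from any specific Z1.4 table.

```python
# O-C curve points for a single attribute sampling plan: the chance of
# accepting a lot as a function of its true fraction defective p.
import math

def prob_accept(p, n, c):
    """P(accept lot) = P(at most c defectives in a sample of n),
    using the binomial model (lot large relative to the sample)."""
    return sum(math.comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(c + 1))

n, c = 80, 2          # hypothetical plan: sample 80, accept on <= 2
for p in (0.005, 0.01, 0.02, 0.05):
    print(f"p = {p:.3f}  Pa = {prob_accept(p, n, c):.3f}")
```

Sweeping p from 0 upward traces the full O-C curve, which is exactly how the published curves are generated.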

 

We can cover case studies, such as:

  • a Double Sampling plan put in place to address concerns from engineering regarding the cost of testing;
  • the use of the Isolated Lot sampling plan; and
  • the use of a Continuous Sampling plan.

 

We would also cover other sampling plans and approaches as a natural part of this training (for instance, ANSI/ASQC Q3 for Isolated Lots).

Regression


This isn't the standard course in regression.  A course in regression needs to be the backbone of most of the material in other areas.  We will develop expertise beyond the simple linear first-order least-squares fit that many apply reflexively.  We will also cover - in detail - regression diagnostics to check model validity and goodness of fit.  There are MANY topics that need to be covered that I am not listing here.

 

We definitely need to cover regression in-depth prior to any detailed coverage of ANOVA, DOE, and such.
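To make the "beyond the reflexive fit" point concrete, here is a minimal from-scratch least-squares fit with the two diagnostics any fit should start with, R-squared and the residuals themselves.  The data are invented for illustration.

```python
# Simple least squares with basic diagnostics, written out so the
# formulas are visible rather than hidden inside a library call.
import statistics

x = [1, 2, 3, 4, 5, 6]
y = [2.1, 4.0, 6.2, 7.9, 10.1, 11.8]

xbar, ybar = statistics.fmean(x), statistics.fmean(y)
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx                 # slope
b0 = ybar - b1 * xbar          # intercept

residuals = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
ss_res = sum(r ** 2 for r in residuals)
ss_tot = sum((yi - ybar) ** 2 for yi in y)
r2 = 1 - ss_res / ss_tot       # coefficient of determination

print(f"fit: y = {b0:.2f} + {b1:.2f}x,  R^2 = {r2:.4f}")
```

A residual-versus-fitted plot, a normality check on the residuals, and lack-of-fit tests would follow in the course as the real diagnostic workhorses.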

Logistic Regression


This is a tool that has been used much more than anticipated.  It might be time for a rigorous course in its use for the QEs.
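A rigorous course would open the black box; as a taste, here is a one-predictor logistic regression fit by plain gradient ascent on the Bernoulli log-likelihood, with no statistics library at all.  The stress-versus-failure data are made up for illustration.

```python
# One-predictor logistic regression by gradient ascent, to show what
# the packaged routines are doing under the hood.
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]     # e.g. test stress level
y = [0,   0,   1,   0,   1,   1]       # 1 = failure observed

b0, b1 = 0.0, 0.0
lr = 0.01                              # small step size for stability
for _ in range(20000):
    # gradient of the Bernoulli log-likelihood at (b0, b1)
    g0 = g1 = 0.0
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
        g0 += yi - p
        g1 += (yi - p) * xi
    b0 += lr * g0
    b1 += lr * g1

p_at_4 = 1.0 / (1.0 + math.exp(-(b0 + b1 * 4.0)))
print(f"b0 = {b0:.2f}, b1 = {b1:.2f}, P(fail | x=4) = {p_at_4:.2f}")
```

In practice the fit would come from a standard routine; the point of the exercise is that the coefficients are maximum-likelihood estimates, which matters later when we interpret them.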

Statistical Process Control


This would not be the standard offering commonly available.  This would be based on an Advanced SPC course.  We would cover the theory behind why SPC charts work and when they are valid.  We would also use the "hypothesis testing" approach to SPC, which enables plant economic considerations to be included in the design of an SPC study or program and its decision criteria.

 

We would also cover development of an SPC program, to include Phase 1 and Phase 2 SPC methods. 

 

We would cover economic considerations, including the concepts of ARL1, ARL2, and ATS, leading to rational and economically balanced choices of subgroup sample size and sampling frequency.
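The ARL/ATS arithmetic is simple enough to show in a few lines.  The sketch below computes the in-control average run length of a standard 3-sigma Shewhart chart and converts it to an average time to signal; the 2-hour sampling interval is an assumed example value.

```python
# In-control ARL and ATS for a 3-sigma Shewhart chart with normal data.
import math

def false_alarm_prob(k=3.0):
    """P(point outside +/- k sigma limits when in control, normal data)."""
    return math.erfc(k / math.sqrt(2))   # two-sided tail area

alpha = false_alarm_prob(3.0)            # ~ 0.0027
arl_in_control = 1.0 / alpha             # ~ 370 subgroups between false alarms
hours_between_samples = 2.0              # assumed sampling frequency
ats = arl_in_control * hours_between_samples

print(f"alpha = {alpha:.4f}, ARL = {arl_in_control:.0f}, ATS = {ats:.0f} h")
```

The economic design question is then the trade-off: widening limits or sampling less often stretches the time between false alarms, but also the time to detect a real shift.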

 

We would also cover the combination of SPC and EPC (engineering process control), along with an introduction to supporting time series methods.

 

Additional and more advanced SPC tools would be covered - some for proficiency, some for familiarity.


 

An observation regarding advanced SPC methods:


Realizing that I'm preaching to the choir, I feel that these methods will do us no good if we lack the will to employ the basic SPC methods on a regular basis and to act on the results.  If we want to achieve "World Class" anything, we need to start acting on special-cause variation, incorporating SPC methods into the monitoring of ongoing production (for key products and characteristics, as well as key process inputs), and insisting on the use of DOE-based methods - sometimes contrary to what engineering thinks - to optimize our manufacturing processes.  We will also need, as several have pointed out, to tie all of this back to the Voice of the Customer in a quantitatively meaningful way.

Measurement System Analysis


Again, this would not be the typical offering of GR&R.  We would cover the theory behind the methods, which has enabled us to extend the GR&R tool to many nontraditional situations.  The approach is based on the use of EMS (expected mean squares) tables, random effects models and variance components, and nested designs when designing the experiment and conducting the analysis.  We have many case studies that could be incorporated.
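To show the EMS/variance-components mechanics in miniature, here is a crossed (parts x operators) GR&R analysis done from scratch.  The 3-part, 2-operator, 2-replicate data set is invented purely to make the arithmetic concrete; a real study would of course be larger.

```python
# Crossed Gauge R&R via ANOVA mean squares and the EMS equations.
from statistics import fmean

# data[part][operator] = list of replicate readings (invented)
data = [
    [[10.1, 10.2], [10.4, 10.3]],
    [[11.0, 11.1], [11.3, 11.2]],
    [[ 9.6,  9.5], [ 9.8,  9.9]],
]
p, o, r = len(data), len(data[0]), len(data[0][0])

grand = fmean(y for part in data for op in part for y in op)
part_mean = [fmean(y for op in part for y in op) for part in data]
oper_mean = [fmean(y for part in data for y in part[j]) for j in range(o)]
cell_mean = [[fmean(data[i][j]) for j in range(o)] for i in range(p)]

ss_p = o * r * sum((m - grand) ** 2 for m in part_mean)
ss_o = p * r * sum((m - grand) ** 2 for m in oper_mean)
ss_po = r * sum((cell_mean[i][j] - part_mean[i] - oper_mean[j] + grand) ** 2
                for i in range(p) for j in range(o))
ss_e = sum((y - cell_mean[i][j]) ** 2
           for i in range(p) for j in range(o) for y in data[i][j])

ms_p, ms_o = ss_p / (p - 1), ss_o / (o - 1)
ms_po, ms_e = ss_po / ((p - 1) * (o - 1)), ss_e / (p * o * (r - 1))

# Solve the EMS equations for the variance components (floored at 0)
var_repeat = ms_e
var_po = max(0.0, (ms_po - ms_e) / r)
var_oper = max(0.0, (ms_o - ms_po) / (p * r))
var_part = max(0.0, (ms_p - ms_po) / (o * r))
grr = var_repeat + var_oper + var_po      # total gauge variance

print(f"repeatability = {var_repeat:.4f}, "
      f"reproducibility = {var_oper + var_po:.4f}")
print(f"%GRR (of total variance) = {100 * grr / (grr + var_part):.1f}%")
```

The same machinery, with a different EMS table, handles the nested and nontraditional designs mentioned above; that generality is exactly why the course teaches the EMS route rather than a fill-in-the-form GR&R.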

 

Based on the theory, we have offered several innovations in this area.  There is at least one more innovation being explored (as I have time) … Repeated Measures GR&R.  Also possible is a model for GR&R with censoring in the data.

Design of Experiments


The study of DOE should be preceded by ANOVA and Regression.  We already have quite a library of DOE training topics.

  • Factorial
  • Fractional factorial (including Plackett-Burman screening experiments, both geometric and nongeometric)
  • 2-level vs. 3-level vs. mixed level
  • Central Composite Designs (CCD)
  • Mixture DOE
  • Response Surface methods 


Related DOE topics include:

  • blocking
  • aliasing/confounding
  • design resolution
  • orthogonality
  • contrasts
  • design generator and defining relation.
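The generator/defining-relation bullets can be demonstrated in a few lines.  This sketch builds a 2^(4-1) fractional factorial from the generator D = ABC and reads off the aliasing from the defining relation I = ABCD; the design choice is a standard textbook example, not one of our specific studies.

```python
# A 2^(4-1) fractional factorial: full factorial in A, B, C, with the
# generator D = A*B*C (coded levels +/- 1), giving resolution IV.
from itertools import product

runs = [(a, b, c, a * b * c) for a, b, c in product((-1, 1), repeat=3)]
for run in runs:
    print(run)

# Defining relation I = ABCD: each effect is aliased with the word that
# completes it to ABCD (the symmetric difference of the letter sets).
def alias(word, defining="ABCD"):
    """Effect aliased with `word` under I = ABCD."""
    return "".join(sorted(set(defining) ^ set(word)))

print(alias("A"))    # BCD -- main effects aliased with 3-way interactions
print(alias("AB"))   # CD  -- 2-way interactions aliased in pairs
```

Because main effects are aliased only with three-factor interactions, while two-factor interactions pair up with each other, the design is resolution IV - the kind of reasoning the course would drill.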

Survival (Reliability) Analysis


We would continue to offer the "advanced introduction", and can expand on this topic as needed.


A related topic would cover Maximum Likelihood methods, and how they allow parameter estimation in the presence of data censoring.


  • Nonparametric survival analysis (Kaplan-Meier)
  • Parametric survival/reliability analysis (Weibull, Lognormal, Exponential, Gamma, other)
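For the exponential model, the Maximum Likelihood result with right-censoring is simple enough to show whole: the likelihood works out so that the estimate is failures divided by total time on test, with censored units contributing running time but no event.  The life-test numbers below are invented for illustration.

```python
# Censored-data MLE for the exponential model: censored units add
# exposure time to the denominator but no event to the numerator.
failures = [120.0, 340.0, 410.0]          # observed failure times, hours
censored = [500.0, 500.0, 500.0, 500.0]   # units still running at 500 h

d = len(failures)                          # number of events
total_time = sum(failures) + sum(censored) # total time on test
lam_hat = d / total_time                   # failure rate per hour
mttf = 1.0 / lam_hat

print(f"lambda_hat = {lam_hat:.5f} /h, MTTF = {mttf:.0f} h")
```

Ignoring the censored units (or worse, treating them as failures) would badly bias the estimate, which is the core lesson of the MLE-with-censoring topic; Weibull and lognormal fits use the same likelihood idea, just without a closed form.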


 

Nonparametric Methods


These are the methods that are often called on when the assumptions behind a statistical model/method are not met.  There are too many to list here.  We would probably provide a broad but shallow coverage of the many nonparametric methods, and can tailor any other coverage as needs arise.
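As one small example of the flavor: the sign test replaces the paired t-test's normality assumption with nothing more than a binomial count of positive differences.  The paired differences below are invented for illustration.

```python
# Sign test: a nonparametric alternative to the paired t-test.
import math

diffs = [0.8, 1.1, -0.2, 0.5, 0.9, 1.4, 0.3, -0.1, 0.7, 0.6]
plus = sum(1 for d in diffs if d > 0)
n = sum(1 for d in diffs if d != 0)        # exact zeros are dropped

# two-sided p-value from Binomial(n, 0.5): double the smaller tail
k = min(plus, n - plus)
p_value = 2 * sum(math.comb(n, i) for i in range(k + 1)) / 2 ** n

print(f"{plus} of {n} differences positive, p = {p_value:.3f}")
```

The price of dropping the distributional assumption is some power, which is exactly the trade-off the broad survey would make explicit method by method.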

Time Series Analysis


This may or may not be useful for the QEs, but there are others outside of QM who might benefit.  I am occasionally called on to do work for other groups in this area.

Bayesian Methods


This is an extremely powerful set of tools that can have application in our areas.  I have had some basic (graduate-level) training.  I would like to introduce these concepts/tools and keep an eye open for potential application.  One area might be Bayesian Acceptance Sampling (ASQC Basic References in Quality Control: Statistical Techniques Volume 7).  Bayesian Survival Analysis might be another potential area for application.
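The basic mechanics behind ideas like Bayesian acceptance sampling fit in a few lines: a Beta prior on the lot fraction defective is updated by binomial inspection data via the conjugate rule.  The prior and the inspection results below are invented example values, not a recommended plan.

```python
# Beta-binomial conjugate update for a lot fraction defective.
a, b = 1.0, 49.0            # Beta prior: prior mean 1/50 = 2% defective
n, defects = 100, 1         # inspect 100 units, find 1 defective

a_post = a + defects        # conjugate update: add events ...
b_post = b + (n - defects)  # ... and non-events to the Beta parameters
post_mean = a_post / (a_post + b_post)

print(f"posterior: Beta({a_post:.0f}, {b_post:.0f}), "
      f"mean fraction defective = {post_mean:.4f}")
```

The accept/reject decision then comes from posterior probabilities (or posterior risk) rather than a fixed acceptance number, which is where the economic appeal of the Bayesian plans comes from.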

There are some things that we can do to demonstrate key aspects of the theory in the areas noted above. 


Examples include: 

  • Where do the constants come from that are used in SPC? 
  • How do we test the assumptions and validity of a statistical model? 
  • How exactly is censored (missing) data used in the analysis? 
  • What is MLE (used in many statistical algorithms) and why is it useful and valid?
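The first question above is a good example of a quick demonstration: the SPC constant d2 is just the expected range of n standard-normal observations, which a short Monte Carlo makes visible (the tabled value for n = 5 is about 2.326).  The sample size and replicate count below are illustrative choices.

```python
# Monte Carlo demonstration of where the SPC constant d2 comes from:
# d2(n) = E[range of n independent N(0,1) observations].
import random

random.seed(1)

def d2_estimate(n, reps=20000):
    """Monte Carlo estimate of the expected range for samples of size n."""
    total = 0.0
    for _ in range(reps):
        s = [random.gauss(0.0, 1.0) for _ in range(n)]
        total += max(s) - min(s)
    return total / reps

d2_5 = d2_estimate(5)
print(f"d2(n=5) ~ {d2_5:.3f}")   # tabled value: 2.326
```

Seeing the constant emerge from simulation, rather than from a table, is exactly the kind of theory-made-tangible exercise intended here.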

