Transformation

Transformation allows a large number of cases to be solved with far fewer algorithms. The idea is to place a piece or group of pieces into a position where correctly solved pieces would normally go, perform an algorithm that solves the remaining pieces relative to the incorrectly placed piece or pieces, and then undo the transformation. For example, this can reduce L5E to around the same number of cases as ELL, and it can reduce L5C from 614 cases to 42 cases.

As a concept, transformation originated in NMLL, the last layer method. In NMLL, as can be seen in the Google Sheets document, the best transformation is chosen to improve move count. In 2012, I developed a table showing the relationship among the corners when transformed: it shows which CLL case the corners transform into when a single turn or a rotation is performed. I later used this to further develop the A2 method. Also in 2012, I showed how the concept can be applied to OLLCP, CLL+1, and other algorithm sets. Transformation was rediscovered in 2017 by Joseph Briggs, who used it to create the very cool method called 42. He calls the concept conjugation.
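In group-theory terms, a transformation is a conjugate: a setup sequence S, an algorithm A, then the inverse of the setup, S'. The structure can be sketched in a few lines of Python that work purely on move notation; the function names here are illustrative, not part of any existing cube tool:

```python
def invert_move(move):
    """Invert a single move in standard notation: U -> U', U' -> U, U2 -> U2."""
    if move.endswith("'"):
        return move[:-1]
    if move.endswith("2"):
        return move  # half turns are their own inverse
    return move + "'"

def invert_sequence(seq):
    """Invert a sequence of moves: reverse the order and invert each move."""
    return [invert_move(m) for m in reversed(seq)]

def conjugate(setup, algorithm):
    """Apply the transformation, solve relative to it, then undo it: S A S'."""
    return setup + algorithm + invert_sequence(setup)

# Example: transform with R, perform a short algorithm, then undo the R.
print(conjugate(["R"], ["U", "R", "U'"]))
```

Because the setup is undone at the end, every conjugate of a solving algorithm is itself a solving algorithm for the transformed case, which is why one known algorithm can cover many cases.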