For the proofs of these theorems and the definition of the RLCT, see:
(1) S. Watanabe, "Asymptotic equivalence of Bayes cross validation and widely applicable information criterion in singular learning theory," Journal of Machine Learning Research, vol. 11, pp. 3571-3594, 2010.
Note: Hold-out cross-validation, often referred to as an out-of-sample test, has much larger variance than leave-one-out cross-validation. Hold-out cross-validation is therefore generally not recommended, except when no other option is viable.
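As a rough illustration of the variance claim, the following hypothetical simulation (not taken from the cited paper) compares the two estimators on the simple problem of estimating a Gaussian mean under squared-error loss. Both the setup and all function names are assumptions for the sketch; the point is only that, over repeated datasets, the hold-out estimate of generalization error fluctuates more than the leave-one-out estimate.

```python
import numpy as np

def loo_estimate(x):
    """Leave-one-out CV: predict each point from the mean of the others."""
    n = len(x)
    total = x.sum()
    preds = (total - x) / (n - 1)  # mean of the remaining n-1 points
    return np.mean((x - preds) ** 2)

def holdout_estimate(x):
    """Hold-out (out-of-sample) test: train on the first half, test on the rest."""
    half = len(x) // 2
    pred = x[:half].mean()
    return np.mean((x[half:] - pred) ** 2)

def compare_variances(n=20, trials=5000, seed=0):
    """Variance of each error estimate across many simulated datasets."""
    rng = np.random.default_rng(seed)
    loo, hold = [], []
    for _ in range(trials):
        x = rng.normal(size=n)  # hypothetical data: standard Gaussian samples
        loo.append(loo_estimate(x))
        hold.append(holdout_estimate(x))
    return np.var(loo), np.var(hold)

var_loo, var_hold = compare_variances()
print(f"LOO variance: {var_loo:.4f}, hold-out variance: {var_hold:.4f}")
```

In this toy setting the hold-out estimate is noisier for two reasons: it trains on only half the data, and it averages the loss over only half the points.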