A randomized (probabilistic) algorithm is an algorithm that employs a degree of randomness as part of its logic; as a result, either the running time, the output, or both are random variables. In some cases, randomized algorithms are the only practical means of solving a problem. Such an algorithm typically uses uniformly random bits as an auxiliary input to guide its behavior, in the hope of achieving good performance in the "average case" over all possible choices of random bits.
Consider, for example, the problem of finding an 'a' in an array of n elements, given that half are 'a's and the other half are 'b's.
We could examine the elements one by one, but in the worst case this takes n/2 operations, for instance if the array consists of all the 'b's followed by all the 'a's. Checking in the reverse order, or checking every second element, suffers from the same drawback.
For any strategy in which the order of the checks is fixed (that is, for any deterministic algorithm), we cannot guarantee that the algorithm finishes quickly on all possible inputs.
On the other hand, if we check array elements at random, then we quickly find an 'a' with high probability, whatever the input.
Randomized algorithms fall into two main classes. Las Vegas algorithms always output the correct answer, but their running time is a random variable. Monte Carlo algorithms have a fixed execution time (as a function of the input size), but allow a small probability of error.
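The two classes can be illustrated on the array-search problem above. The sketch below uses hypothetical function names; the Las Vegas version probes until it succeeds (always correct, random running time), while the Monte Carlo version makes at most k probes and may fail.

```python
import random

def finding_a_las_vegas(arr, rng=random):
    # Las Vegas: repeat until an 'a' is found. The answer is always
    # correct; when half the entries are 'a', the expected number of
    # probes is 2, but the running time is a random variable.
    while True:
        i = rng.randrange(len(arr))
        if arr[i] == 'a':
            return i

def finding_a_monte_carlo(arr, k, rng=random):
    # Monte Carlo: at most k probes, so the running time is fixed,
    # but when half the entries are 'a' the search fails (returns
    # None) with probability at most (1/2)**k.
    for _ in range(k):
        i = rng.randrange(len(arr))
        if arr[i] == 'a':
            return i
    return None
```

Note the trade-off: the Las Vegas version would loop forever on an array with no 'a' at all, while the Monte Carlo version always terminates but may miss an 'a' that is present.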
If the choice of the pivot in quicksort is randomized, the algorithm runs in expected O(n log n) time regardless of how the input is arranged; since the output is always correctly sorted, randomized quicksort is a Las Vegas algorithm.
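A minimal sketch of randomized quicksort (written here in a simple out-of-place style for clarity, rather than the usual in-place partitioning):

```python
import random

def quicksort(arr):
    # Choosing the pivot uniformly at random makes every input behave
    # like the "average case": expected O(n log n) comparisons, with no
    # fixed input ordering that forces quadratic time.
    if len(arr) <= 1:
        return arr
    pivot = random.choice(arr)
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

The output is identical on every run; only the number of comparisons varies with the random pivot choices.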
For instance, the Solovay–Strassen primality test is used to determine whether a given number is prime. It always answers true for prime inputs; for composite inputs, it answers false with probability at least 1/2 and true with probability at most 1/2. Thus, false answers from the algorithm are certain to be correct, whereas true answers remain uncertain; such an algorithm is called a (1/2)-correct false-biased algorithm.
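A sketch of the Solovay–Strassen test, which checks Euler's criterion pow(a, (n-1)/2) ≡ (a/n) (mod n) for random bases a, where (a/n) is the Jacobi symbol (the function names and the parameter k are choices for this illustration):

```python
import random

def jacobi(a, n):
    # Jacobi symbol (a/n) for odd n > 0, via quadratic reciprocity.
    a %= n
    result = 1
    while a != 0:
        while a % 2 == 0:          # pull out factors of 2
            a //= 2
            if n % 8 in (3, 5):
                result = -result
        a, n = n, a                # reciprocity step
        if a % 4 == 3 and n % 4 == 3:
            result = -result
        a %= n
    return result if n == 1 else 0  # 0 when gcd(a, n) > 1

def solovay_strassen(n, k=20, rng=random):
    # Returns False only if n is certainly composite; returns True if n
    # passed k rounds, i.e. n is prime or we were unlucky k times
    # (probability at most (1/2)**k for composite n).
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    for _ in range(k):
        a = rng.randrange(2, n - 1)
        x = jacobi(a, n)
        if x == 0 or pow(a, (n - 1) // 2, n) != x % n:
            return False  # a is a witness: n is composite for certain
    return True
```

Note the false bias: a single witness proves compositeness, so a false answer is always correct, while a true answer only bounds the error probability.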
Applying a randomized algorithm like the one above several times increases confidence in the result: if a composite number passes a single round with probability at most 1/2, it passes k independent rounds with probability at most (1/2)^k, so repetition drives the error probability down exponentially.
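This amplification can be checked by simulation. The sketch below (hypothetical names, assuming a single round errs with probability exactly 1/2 on a composite input) compares the empirical failure rate of k repeated rounds against the (1/2)^k bound:

```python
import random

def one_round_errs(rng):
    # Model a single round of a (1/2)-correct false-biased test on a
    # composite input: it wrongly answers "probably prime" with
    # probability 1/2.
    return rng.random() < 0.5

def k_rounds_all_err(k, rng):
    # The repeated test is only fooled if every round is fooled.
    return all(one_round_errs(rng) for _ in range(k))

def empirical_error(k, trials, rng):
    return sum(k_rounds_all_err(k, rng) for _ in range(trials)) / trials

rng = random.Random(42)
for k in (1, 2, 5, 10):
    print(f"k={k}: empirical {empirical_error(k, 100_000, rng):.5f}, "
          f"bound {0.5 ** k:.5f}")
```

The empirical error tracks the (1/2)^k bound closely, halving with each additional round.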