Introduction
Layman's explanation
Technical explanation
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean).
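Formally, with X a random variable having finite mean μ and finite non-zero standard deviation σ, the statement above can be written (a standard LaTeX rendering, for any k > 0):

```latex
\Pr\bigl(|X - \mu| \ge k\sigma\bigr) \;\le\; \frac{1}{k^{2}}
\qquad\Longleftrightarrow\qquad
\Pr\bigl(|X - \mu| < k\sigma\bigr) \;\ge\; 1 - \frac{1}{k^{2}}
```

Note that the bound is only informative for k > 1, since for k ≤ 1 the right-hand side is at least 1.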
Mathematical explanation
Below is a worked example for k = 2.
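Plugging k = 2 into the bound gives the arithmetic directly:

```latex
\Pr\bigl(|X - \mu| \ge 2\sigma\bigr) \;\le\; \frac{1}{2^{2}} = \frac{1}{4} = 25\%
\qquad\Longrightarrow\qquad
\Pr\bigl(|X - \mu| < 2\sigma\bigr) \;\ge\; 75\%
```

In words: whatever the distribution, at most 25% of its values can lie more than 2 standard deviations from the mean, so at least 75% lie within 2 standard deviations. The bound can also be checked empirically; below is a minimal Python sketch (the exponential distribution and sample size are illustrative assumptions, not from the original article):

```python
import random
import statistics

# Empirical check of Chebyshev's inequality for k = 2.
# The exponential distribution is deliberately skewed and non-Gaussian,
# since the bound holds for any distribution with finite mean and variance.
random.seed(0)
data = [random.expovariate(1.0) for _ in range(100_000)]

mu = statistics.fmean(data)       # sample mean
sigma = statistics.pstdev(data)   # sample standard deviation

k = 2
outside = sum(1 for x in data if abs(x - mu) >= k * sigma)
fraction_outside = outside / len(data)

print(f"fraction beyond {k} std devs: {fraction_outside:.4f}")  # ~0.05 here
print(f"Chebyshev bound 1/k^2:        {1 / k**2:.4f}")          # 0.2500
```

For an exponential sample the observed tail fraction comes out around 0.05, comfortably below the guaranteed ceiling of 0.25; Chebyshev's inequality is a worst-case bound, so most distributions beat it by a wide margin.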
Benefits
Role in machine learning
Use in neural networks
References
- Video: https://youtu.be/DWsDqKlW7Z4
- Wikipedia, "Chebyshev's inequality": https://en.wikipedia.org/wiki/Chebyshev%27s_inequality
- Analytics Vidhya (Medium), "Basic Understanding of Gaussian Distribution and Chebyshev's Inequality": https://medium.com/analytics-vidhya/basic-understanding-of-gaussian-distribution-and-chebyshevs-inequality-ba57ba79a635
- Image source: https://images.app.goo.gl/dq3eXqTdx1hLbbfPA