Gosset's famous paper introducing the t-test was published a hundred years ago, and the century of statistics since then has been devoted to characterizing sampling error and accounting for small sample sizes. The next century, in which Big Data comes of age, will focus on different issues.
https://www.youtube.com/watch?v=3xOK2aJ-0Js&feature=player_embedded
Thirty trillion sensors... so sample size is effectively infinite. But does that mean uncertainty disappears? Welcome to the dawn of the second century of statistics, in which methods to handle mensurational uncertainty will dominate. Three issues in particular will take center stage (a small simulation after the list illustrates the first):

- mensurational uncertainty: error in the measurements themselves, not in which units were sampled
- nonstationarity: the process generating the data drifts over time
- model uncertainty: we are never sure the model we fit is the right one
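To see why a huge sample doesn't banish uncertainty, here is a minimal simulation sketch (hypothetical numbers; it assumes each sensor reading carries a small constant bias). As n grows, the sampling error shrinks toward zero, but the estimate converges to the wrong value, and no amount of additional data reveals that.

```python
import numpy as np

rng = np.random.default_rng(42)

true_value = 10.0   # quantity we are trying to measure
sensor_bias = 0.5   # hypothetical constant measurement error per reading
noise_sd = 2.0      # random noise per reading

for n in (100, 10_000, 1_000_000, 10_000_000):
    readings = true_value + sensor_bias + rng.normal(0.0, noise_sd, size=n)
    estimate = readings.mean()
    std_error = readings.std(ddof=1) / np.sqrt(n)
    print(f"n={n:>10,}  estimate={estimate:.4f}  std. error={std_error:.6f}")

# The standard error (sampling uncertainty) goes to zero, but the estimate
# converges to true_value + sensor_bias, not true_value: the mensurational
# uncertainty never shows up in the error bars.
```

With ten million readings the error bars are microscopic, yet the answer is still off by the full bias. That residual error is exactly what the first century's sampling-error machinery was never designed to detect.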
Beyond the statistician's bag of marbles
How binary sampling data informs us when we can’t make the usual assumptions