kernel density estimate
KDE smooths the empirical distribution of the data by placing a kernel function on every sample and summing the contributions. With a Gaussian kernel, each sample contributes a smooth bell-shaped bump, so the resulting curve is smooth everywhere.
The tophat kernel is a uniform (box) kernel: each sample contributes a constant amount within a window of width 2×bandwidth around it, so the resulting density is piecewise constant rather than smooth.
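A tiny sketch of the tophat behavior (the two sample values and query points here are made up for illustration): each sample contributes a constant 1/(2·bandwidth) inside its window and nothing outside it.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

# two samples; bandwidth 0.5 means each sample covers a window of +/- 0.5
X = np.array([[0.0], [1.0]])
kde = KernelDensity(kernel="tophat", bandwidth=0.5).fit(X)

# density at a point = (number of windows covering it) / (n_samples * 2 * bandwidth)
dens = np.exp(kde.score_samples(np.array([[0.2], [2.0]])))
print(dens)  # x=0.2 is inside one window -> 0.5; x=2.0 is inside none -> 0.0
```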
The summed contributions are normalized by the number of samples, so the result is a proper probability density: it is non-negative and integrates to 1.
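As a sanity check that the estimate really integrates to 1, one can sum the estimated curve numerically over a wide grid (a sketch; the sample size, grid limits, and bandwidth here are arbitrary choices):

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(0)
X = rng.normal(0, 1, 200).reshape(-1, 1)

kde = KernelDensity(kernel="gaussian", bandwidth=0.5).fit(X)

# evaluate exp(log density) on a grid wide enough to capture the tails
grid = np.linspace(-8, 8, 2001).reshape(-1, 1)
dens = np.exp(kde.score_samples(grid))

# simple Riemann-sum approximation of the integral
dx = grid[1, 0] - grid[0, 0]
print(round(float(dens.sum() * dx), 3))  # close to 1
```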
The script below plots the histogram on the left and the KDE density on the right. The bandwidth parameter of KernelDensity controls the width of the kernel placed on each sample. Choose it according to the data range and the smoothness you need: a small bandwidth preserves detail but looks noisy, a large one smooths aggressively.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import KernelDensity
X = (np.random.normal(0, 1, 100) * 100).astype(int)
X = X.reshape(-1, 1)
# evaluate the density on a dense grid spanning the data range
X_plot = np.linspace(X.min() - 50, X.max() + 50, 500).reshape(-1, 1)
fig, ax = plt.subplots(1, 2, figsize=(12, 6))
# histogram
ax[0].hist(X[:, 0], bins=10)
ax[0].set_xlabel('Value')
ax[0].set_ylabel('Count')
ax[0].set_title("Histogram")
# Gaussian KDE
kde = KernelDensity(kernel="gaussian", bandwidth=20).fit(X)
log_dens = kde.score_samples(X_plot)  # score_samples returns the log density
ax[1].fill(X_plot[:, 0], np.exp(log_dens))  # exp() converts back to density
ax[1].set_title("Gaussian Kernel Density")
ax[1].set_ylabel('Density')
ax[1].set_xlabel('Value')
plt.show()
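To see the trade-off the bandwidth controls, the sketch below evaluates the same kind of data at a few bandwidths (the values 5, 20, and 80 are arbitrary illustrations, not recommendations): a narrow kernel gives a taller, spikier peak, while a wide one flattens the curve out.

```python
import numpy as np
from sklearn.neighbors import KernelDensity

rng = np.random.default_rng(42)
X = (rng.normal(0, 1, 100) * 100).reshape(-1, 1)
grid = np.linspace(-400, 400, 801).reshape(-1, 1)

peaks = {}
for bw in (5, 20, 80):  # narrow, moderate, wide (illustrative values)
    dens = np.exp(KernelDensity(kernel="gaussian", bandwidth=bw)
                  .fit(X).score_samples(grid))
    peaks[bw] = dens.max()  # the peak shrinks as the bandwidth grows
    print(bw, round(float(peaks[bw]), 5))
```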