Understanding why AI/ML models built on deep neural network architectures make specific predictions is important for establishing trust.
We propose TERP, which explains black-box behavior by fitting local linear models.
However, some fitted models (explanations) are easier for humans to understand than others.
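The idea of a local linear surrogate can be sketched as follows. This is a generic perturb-and-fit procedure (in the spirit of LIME), not TERP's exact algorithm; the toy black-box function, Gaussian kernel, and perturbation scale are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    # Toy stand-in for an opaque model: a smooth nonlinear function.
    return np.tanh(2.0 * X[:, 0] - X[:, 1]) + 0.1 * X[:, 2] ** 2

def local_linear_explanation(x0, n_samples=500, sigma=0.5, kernel_width=1.0):
    """Fit a proximity-weighted linear model to the black box near x0."""
    d = x0.shape[0]
    X = x0 + sigma * rng.normal(size=(n_samples, d))   # perturb the instance
    y = black_box(X)
    dist2 = np.sum((X - x0) ** 2, axis=1)
    w = np.exp(-dist2 / kernel_width ** 2)             # closer samples count more
    # Weighted least squares with an intercept column.
    A = np.hstack([np.ones((n_samples, 1)), X])
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y, rcond=None)
    return coef[0], coef[1:]                           # intercept, feature coefficients

intercept, coefs = local_linear_explanation(np.array([0.2, -0.1, 0.3]))
```

The fitted coefficients then serve as the explanation: locally, feature 0 pushes the prediction up and feature 1 pushes it down, mirroring the black box's local gradient.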
Model complexity is not a good descriptor of human interpretability.
(a) Illustrative input feature coefficients for linear model 1. (b) Coefficients for linear model 2. Both models have the same number of parameters (six), yet model 2 is significantly more human-interpretable than model 1: two of its six features stand out as most relevant for predictions.
We introduce interpretation entropy (S), which quantifies the human interpretability of any linear model.
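One way to make this concrete: reading S as a Shannon-type entropy over the normalized absolute coefficients (my interpretation of the idea; the precise definition is in the paper), a model whose weight concentrates in a few features scores a lower S than one with diffuse weights. A minimal sketch:

```python
import numpy as np

def interpretation_entropy(coefs):
    """Shannon entropy of the normalized absolute linear coefficients.

    Lower S means the weight is concentrated in few features,
    i.e. the model is easier for a human to interpret.
    """
    p = np.abs(coefs) / np.sum(np.abs(coefs))
    p = p[p > 0]                      # convention: 0 * log 0 = 0
    return -np.sum(p * np.log(p))

# Six coefficients each: diffuse weights (like model 1 in the figure)
# versus two dominant features (like model 2). Values are made up.
model_1 = np.array([0.9, -1.1, 1.0, -0.8, 1.2, -1.0])
model_2 = np.array([2.5, -2.2, 0.1, 0.05, -0.1, 0.08])

s1 = interpretation_entropy(model_1)
s2 = interpretation_entropy(model_2)
```

Here s1 sits near the maximum log(6) (all six features matter roughly equally), while s2 is much lower, matching the intuition that model 2 is the more interpretable of the two.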
Read my recent work to learn more ...
A fun AI-generated podcast titled "Dazzling results, but the how?, the why?, it's …", in which the hosts summarize this paper without the technical details, can be found here (released by Dr. Sam Scarpino).
The α-aminoisobutyric acid nonamer, (Aib)9, is a small polypeptide that undergoes transitions between left- and right-handed helical states. Due to time-scale limitations, traditional Molecular Dynamics (MD) is not viable for studying such a system. In my recent work, I employed an AI-based method (SPIB) to learn the reaction coordinate (RC) that describes this system, then performed metadynamics along the SPIB-learned RC and successfully observed these transitions.
Small-molecule permeation through a biological membrane remains a challenging problem for MD. The challenge arises primarily from the entropic component of the problem, i.e., the molecule needs to find a suitable pathway from one side of the membrane to the other.
Similar to the helical-transition study described above, I employed SPIB-based metadynamics and successfully simulated the permeation of a small compound (benzoic acid) through a phospholipid (DMPC) bilayer.
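The metadynamics step common to both applications can be illustrated with a toy 1D version: Gaussian bias hills are periodically deposited along the reaction coordinate (here, the role played by the SPIB-learned RC) so that the system is pushed out of free-energy minima it would otherwise stay trapped in. This is a generic plain-metadynamics sketch on a synthetic double-well landscape; the hill height, width, deposition stride, and temperature are illustrative choices, not the values used in these studies.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_potential(s):
    # Double-well free-energy surface along the RC: V(s) = s^4 - 2 s^2,
    # with minima at s = -1 and s = +1 separated by a barrier at s = 0.
    return 4.0 * s ** 3 - 4.0 * s

centers = []          # centers of deposited Gaussian hills
H, W = 0.1, 0.2       # hill height and width (illustrative)

def grad_bias(s):
    """Gradient of the accumulated metadynamics bias at s."""
    if not centers:
        return 0.0
    c = np.asarray(centers)
    return float(np.sum(-H * (s - c) / W ** 2
                        * np.exp(-((s - c) ** 2) / (2 * W ** 2))))

# Overdamped Langevin dynamics on the biased landscape.
s, dt, kT = -1.0, 1e-3, 0.5          # start in the left-hand well
crossed = False
for step in range(100_000):
    if step % 500 == 0:
        centers.append(s)            # deposit a hill at the current RC value
    force = -(grad_potential(s) + grad_bias(s))
    s += force * dt + np.sqrt(2 * kT * dt) * rng.normal()
    if s > 1.0:                      # reached the other well
        crossed = True
        break
```

As the hills fill the starting well, the effective barrier shrinks and the transition that would be rare in unbiased dynamics is observed within the short simulation; in the actual studies this deposition is done by the metadynamics engine along the SPIB-learned RC rather than by hand.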
Read my work involving these two systems to learn more ...