2nd Award, Physical Science and Engineering Category
Gravitational waves from massive and violent events billions of light-years away permeate space, compressing and stretching everything in their path. These waves are propagating fluctuations and disturbances in gravitational fields, and are often referred to as “ripples in spacetime”. However, because gravity is the weakest of the four fundamental forces (roughly 4.15×10^42 times weaker than the electromagnetic force), gravitational waves are exceedingly small, making their detection all the more difficult. Furthermore, nearly all gravitational wave detections are contaminated by background noise from various cosmological, astrophysical, and local effects. Using machine learning, the gravitational wave signal can be extracted from the stochastic background noise, increasing the quality of gravitational wave data for future analysis.
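The quoted factor can be sanity-checked by comparing the Coulomb and Newtonian forces between two particles; the exact value depends on which particles are chosen, and the figure here assumes a pair of electrons:

$$
\frac{F_{\text{EM}}}{F_{\text{grav}}}
= \frac{e^2 / (4\pi\varepsilon_0 r^2)}{G m_e^2 / r^2}
= \frac{e^2}{4\pi\varepsilon_0 G m_e^2}
\approx 4.2 \times 10^{42}
$$

The separation r cancels, so the ratio is distance-independent and consistent with the ~4.15×10^42 figure above.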
Training and testing data were sourced from LIGO's strain data releases from observational runs and merger events. Training sessions were performed on the 8-core GPU of an Apple M1 chip running TensorFlow in Python 3.8.5. Owing to the similarities between gravitational wave signal processing and automatic speech recognition, an LSTM (long short-term memory) autoencoder framework was implemented, as sketched below. The algorithm improved the signal-to-noise ratio (SNR) by about 24%, a figure that was further improved by additional functions used to whiten and bandpass the data. Although the model was trained on white Gaussian noise, it still coped with the shifting noise distributions of real, non-stationary LIGO noise. Future work includes learning the loss function through a GAN and implementing parameter estimation.
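The abstract does not give the exact architecture, so the following is a minimal sketch of a denoising LSTM autoencoder in TensorFlow/Keras; the segment length, latent dimension, and the synthetic chirp-like training pair are illustrative assumptions, not the author's actual configuration.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

SEG_LEN = 512    # samples per training segment (hypothetical choice)
N_FEATURES = 1   # single strain channel

def build_lstm_autoencoder(seg_len=SEG_LEN, latent_dim=64):
    """Encoder compresses the noisy segment; decoder reconstructs the clean signal."""
    model = models.Sequential([
        layers.Input(shape=(seg_len, N_FEATURES)),
        layers.LSTM(latent_dim, return_sequences=False),   # encoder
        layers.RepeatVector(seg_len),                      # bridge to decoder
        layers.LSTM(latent_dim, return_sequences=True),    # decoder
        layers.TimeDistributed(layers.Dense(N_FEATURES)),  # per-timestep output
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Training pairs as described in the text: clean template plus white
# Gaussian noise as input, clean template as the reconstruction target.
clean = np.sin(np.linspace(0, 8 * np.pi, SEG_LEN))[None, :, None].astype("float32")
noisy = (clean + 0.5 * np.random.randn(*clean.shape)).astype("float32")

model = build_lstm_autoencoder()
model.fit(noisy, clean, epochs=5, verbose=0)
denoised = model.predict(noisy)
```

Training on (noisy, clean) pairs is what makes this a denoising autoencoder rather than a plain one: the mean-squared-error loss pushes the decoder toward the clean waveform instead of reproducing the noisy input.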
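The whitening and bandpass functions are likewise not specified in the abstract; a plausible sketch using SciPy is shown below, assuming 4096 Hz LIGO strain data. The 35–350 Hz band and the filter order are illustrative choices, not values from the original work.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 4096  # LIGO strain sample rate (Hz)

def whiten(strain, fs=FS):
    """Flatten the noise spectrum by dividing the FFT by the estimated ASD."""
    freqs = np.fft.rfftfreq(len(strain), d=1.0 / fs)
    psd_freqs, psd = welch(strain, fs=fs, nperseg=fs)   # Welch PSD estimate
    asd = np.sqrt(np.interp(freqs, psd_freqs, psd))     # amplitude spectral density
    return np.fft.irfft(np.fft.rfft(strain) / asd, n=len(strain))

def bandpass(strain, low=35.0, high=350.0, fs=FS, order=4):
    """Keep the band where compact-binary signals live; suppress the rest."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, strain)  # zero-phase filtering

strain = np.random.randn(8 * FS)            # stand-in for a real strain segment
conditioned = bandpass(whiten(strain), fs=FS)
```

Whitening removes the steep colored slope of the detector noise so that all frequencies contribute comparably, and the subsequent bandpass discards the low- and high-frequency regions where the detector is least sensitive, which is consistent with the SNR gain the text attributes to these steps.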