Hi! I am a research associate in the Department of Computer Science at Stanford University, working with Prof. Jure Leskovec.
Before that, I was a postdoctoral researcher in the Department of Electrical and Systems Engineering at the University of Pennsylvania (Penn) under the supervision of Professor Alejandro Ribeiro. I received my Ph.D. in Electrical and Computer Engineering from the University of Minnesota, where I worked with the Signal and Tensor Analytics Research (STAR) group under the supervision of Professor Nikos Sidiropoulos. Before that, I earned my Diploma (5-year degree) in Electrical and Computer Engineering at the National Technical University of Athens.
My research interests include Signal Processing, Machine Learning, Tensor Analytics, Graph AI, Network Mining, Optimization and Data Analysis.
August 2025: 🏆 Exciting news! Our paper "Relational Graph Transformer" received the Best Paper Award at the Temporal Graph Learning Workshop at KDD 2025! 🎉
August 2025: I served as a panelist at the Temporal Graph Learning Workshop at KDD 2025.
July 2025: New preprint alert! Check out our new paper "GREmLN: A Cellular Regulatory Network-Aware Transcriptomics Foundation Model". We present the first transformer-based foundation model for single-cell genomics that uses gene regulatory networks. This model can accelerate research on diseases like cancer and Alzheimer’s.
June 2025: Our workshop "New Perspectives in Advancing Graph Machine Learning" was accepted at the Thirty-Ninth Annual Conference on Neural Information Processing Systems (NeurIPS 2025). The submission deadline is September 2nd, 2025, 11:59 pm AoE!
June 2025: New preprint alert! Check out our new survey paper "Relational Deep Learning: Challenges, Foundations and Next-Generation Architectures". We discuss the latest and greatest on Relational Deep Learning and share our vision towards foundation models for Relational Databases.
May 2025: Our tutorial "Relational Deep Learning: Challenges, Foundations and Next-Generation Architectures" was accepted at ACM KDD 2025. See you in Toronto!
April 2025: Two papers accepted at ICML 2025: "RelGNN: Composite Message Passing for Relational Deep Learning" and "Zero-Shot Generalization of GNNs over Distinct Attribute Domains". Congratulations to everyone!
February 2025: New preprint alert! Check out our new paper "KGGen: Extracting Knowledge Graphs from Plain Text with Language Models". We introduce KGGen, a Python package (pip install kg-gen) that generates high-quality KGs from text, reducing sparsity through entity clustering. We also release MINE, the first benchmark for evaluating KG extraction.
February 2025: New preprint alert! Check out our new paper "RelGNN: Composite Message Passing for Relational Deep Learning". In this paper, we design a novel GNN architecture tailored to answering predictive queries in Relational Databases. Our novel approach achieves state-of-the-art performance with up to 25% improvement over the competing baselines!
January 2025: Our paper "Learning Efficient Positional Encodings with Graph Neural Networks" was accepted at the 13th International Conference on Learning Representations (ICLR). See you all in Singapore!
December 2024: Giving an invited talk on "Relational Deep Learning: Graph Representation Learning on Relational Databases" at the Caltech AI Bootcamp, California Institute of Technology (Caltech).
November 2024: Giving an invited talk on "Next Generation Positional Encodings for Graph Representation Learning" at the Learning on Graphs (LoG) meetup, Stanford University.
November 2024: Our paper "A Transferable Graph Autoencoder Framework for Network Alignment" was accepted at the Learning on Graphs Conference (LoG 2024)!
November 2024: Giving a talk titled "Towards Next Generation Graph Transformers" at the Stanford Graph Learning Workshop, Stanford University.
October 2024: We are organizing the Stanford Graph Learning Workshop (November 5, Stanford, CA). The workshop will bring together leaders from academia and industry to showcase recent advances in Machine Learning and AI in Relational domains, Foundation Models, and Agents.
October 2024: Check out our new paper "LoRTA: Low Rank Tensor Adaptation of Large Language Models". In this paper, we propose a novel low-rank tensor adaptation framework that finetunes large language, vision, and protein models with increased accuracy and at least an order of magnitude fewer parameters.
October 2024: I am co-teaching CS 224W: Machine Learning with Graphs along with Jure!
September 2024: Check out our new paper "Generalizability of Graph Neural Networks for Decentralized Unlabeled Motion Planning". In this paper, we propose a decentralized policy for motion planning, learned via a Graph Neural Network. The GNN policy trained on 100 robots generalizes to scenarios with up to 500 robots, outperforming state-of-the-art solutions by 8.6% on average and significantly surpassing greedy decentralized methods.
August 2024: Our paper "Online Canonical Correlation Analysis via Rayleigh-Ritz Projections" was accepted at the 2024 Asilomar Conference on Signals, Systems, and Computers (Oct. 27th – Oct 30th, Monterey, CA).
July 2024: Our paper "Zero-Shot Generalization of GNNs over Distinct Attribute Domains" was accepted at the TF2M workshop at the International Conference on Machine Learning (ICML) 2024, in Vienna, Austria!
May 2024: Our paper "A Graph Autoencoder Approach to Crowdsourcing" was accepted at the IEEE 13th Sensor Array and Multichannel Signal Processing Workshop (SAM 2024). See you in Corvallis!
April 2024: I am organizing a special session with Prof. Traganitis on "Multiview learning", at the 2024 Asilomar Conference on Signals, Systems, and Computers (Oct. 27th – Oct 30th, Monterey, CA).
February 2024: I will be attending the 2024 Information Theory and Applications Workshop (ITA) to give an invited talk on Counting Graph Substructures with Graph Neural Networks.
February 2024: I am co-teaching CS 246: Mining Massive Datasets along with Jure!
January 2024: Our paper "Counting Graph Substructures with Graph Neural Networks" was accepted at the 12th International Conference on Learning Representations (ICLR). See you all in Vienna!
January 2024: I am organizing a special session with Prof. Traganitis on "Learning with few labels" at the 2024 IEEE 13th Sensor Array and Multichannel Signal Processing Workshop.
December 2023: Our paper "Graph Neural Networks Are More Powerful Than We Think" was accepted for presentation at the 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). See you all in Seoul!
December 2023: Our paper "Multi-Target Tracking With Transferable Convolutional Neural Networks" won a best student paper award at the IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing!
November 2023: I will be joining the Department of Computer Science at Stanford University as a research associate working with Prof. Jure Leskovec in January 2024.
November 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks: Harnessing Message-Passing beyond Weisfeiler-Lehman" at the Department of Electrical, Computer & Systems Engineering, Rensselaer Polytechnic Institute.
October 2023: Invited poster titled "A Spectral Analysis on the Representation Power of GNNs" at the Fall Fourier Talks, University of Maryland.
October 2023: Our paper "Multi-Target Tracking With Transferable Convolutional Neural Networks" was accepted for presentation at the IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing. It was also selected as a best student paper award finalist.
July 2023: Check out our new paper "Network Alignment with Transferable Graph Autoencoders". In this paper, we propose a novel graph neural network architecture to solve network alignment (graph matching) for very large graphs.
September 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks" at the Department of Computer Science, Stanford University.
September 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks" at the Department of Computer Science, Ecole Polytechnique.
August 2023: Our paper "Solving Large-scale Spatial Problems with Convolutional Neural Networks" was accepted for presentation at the Asilomar Conference on Signals, Systems and Computers, Pacific Grove 2023. It was also selected as a best student paper award finalist.
July 2023: Check out our new paper "Transferability of Convolutional Neural Networks in Stationary Learning Tasks". In this paper, we propose a deep convolutional neural network (CNN) framework to solve spatial problems at a very large scale. The proposed framework is strongly supported by our developed theory on the transferability properties of the CNNs!
June 2023: Giving an invited talk on "Representation Learning on Graphs and Tensors" at the Department of Electrical Engineering, Technical University Darmstadt.
May 2023: Giving an invited talk on "Representation Learning on Graphs and Tensors" at the Department of Electrical and Computer Engineering, Southern Methodist University.
April 2023: Giving an invited talk on "Deep Learning through the Lens of Signal Processing" at the Department of Electrical and Computer Engineering, University of California San Diego.
March 2023: Giving an invited talk on "Graph Neural Networks Are More Powerful Than We Think" at the Department of Electrical and Computer Engineering, Oklahoma State University.
March 2023: Giving an invited talk on "Representation Learning on Heterogeneous Data" at the Department of Electrical and Computer Engineering, University of Georgia.
February 2023: Our paper "Space-Time Graph Neural Networks with Stochastic Graph Perturbations" was accepted for presentation at the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2023. See you all in Rhodes!
February 2023: I will be attending the 2023 Information Theory and Applications Workshop (ITA) to give an invited talk on the representational power of Graph Neural Networks.
January 2023: We will be presenting a short course on Graph Neural Networks at the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), which will take place in Greece from June 04 to June 09, 2023.
November 2022: Check out our new paper "Space-Time Graph Neural Networks with Stochastic Graph Perturbations". In this paper, we study the stability properties of Space-Time Graph Neural Networks (ST-GNNs) under stochastic graph perturbations. We further propose an ST-GNN architecture that efficiently processes dynamic graphs!
October 2022: Check out our new paper "Deep Convolutional Neural Networks for Multi-Target Tracking: A Transfer Learning Approach". In this paper we propose a deep convolutional neural network (CNN) architecture for Multi-Target Tracking. We also provide theoretical conditions on the transferability properties of the CNNs!
August 2022: This Fall semester I am teaching ESE 5140 Graph Neural Networks at the University of Pennsylvania.
July 2022: I will be attending the NSF-Simons Mathematical and Scientific Foundations of Deep Learning Annual Meeting on September 29-30 at the Simons Foundation in New York City.
May 2022: Check out our new paper "Graph Neural Networks Are More Powerful Than We Think"! In this paper, we study the representational power of Graph Neural Networks (GNNs). Contrary to common belief, we show that the Weisfeiler-Lehman (WL) algorithm is not the real limit and that GNNs can discriminate between the majority of real graphs. We also design convolutional architectures, illustrated below, that are provably more expressive than the WL algorithm and achieve strong performance in the task of graph classification!
April 2022: Check out our video presentation for Space-Time Graph Neural Networks!
January 2022: Our paper "Space-Time Graph Neural Networks" was accepted for presentation at the Tenth International Conference on Learning Representations (ICLR 2022).
November 2021: Check out our new paper "Space-Time Graph Neural Networks". In this paper, we propose a novel convolutional neural network architecture tailored to data that are time-varying and also supported on a graph. Exciting results in decentralized control systems, backed by strong theoretical analysis!
October 2021: Our paper "GAGE: Geometry Preserving Attributed Graph Embedding" was accepted for presentation as a regular paper at The Fifteenth International Conference on Web Search and Data Mining (WSDM 2022, acceptance rate: 20.23%).
April 2021: Check out the video of our webinar in the Signal Processing Society (SPS), titled "Tensor Completion from Regular Sub-Nyquist Samples".
March 2021: Our paper "Generalized Canonical Correlation Analysis: A Subspace Intersection Approach" has been published in IEEE Transactions on Signal Processing (TSP).
February 2021: Our paper "PREMA: Principled Tensor Data Recovery from Multiple Aggregated Views" has been published in IEEE Journal of Selected Topics in Signal Processing (J-STSP).
January 2021: Prof. Sidiropoulos and I will be giving a seminar titled "Tensor Completion from Regular Sub-Nyquist Samples" on Wednesday, January 20, 2021, at 11:00 am EST. The webinar is part of the IEEE Signal Processing Society's (SPS) webinar series. Find more details and RSVP here.
December 2020: Our paper "TeX-Graph: Coupled Tensor-Matrix Knowledge-Graph embedding for COVID-19 drug repurposing" was accepted as a regular paper at the SIAM International Conference on Data Mining (SDM 2021, acceptance rate: 21.25%).
November 2020: Check out our new paper "GAGE: Geometry Preserving Attributed Graph Embedding".
November 2020: Check out the video of my Ph.D. thesis defense titled "Tensor Methods for Signal Reconstruction and Network Embedding".
October 2020: I will be joining University of Pennsylvania (Penn) as a postdoctoral researcher at the Department of Electrical and Systems Engineering under the supervision of Professor Alejandro Ribeiro.
October 2020: Check out our new paper "TeX-Graph: Coupled Tensor-Matrix Knowledge-Graph embedding for COVID-19 drug repurposing".
October 2020: I defended my Ph.D. thesis on "Tensor Methods for Signal Reconstruction and Network Embedding".
September 2020: Our paper titled "Downlink Channel Feedback in FDD Massive MIMO Systems via Tensor Compression and Sampling" will be presented at Asilomar Conference on Signals, Systems, and Computers.
August 2020: Check out the video of my presentation titled "Hyperspectral Super-resolution: A Tensor Factorization Approach" as part of Agricultural Robotics and Automation webinar series.
July 2020: I will be giving a seminar titled, "Hyperspectral Super-resolution: A Tensor Factorization Approach", as part of Agricultural Robotics and Automation webinar series. The webinar will take place on Friday, July 24th 10:00 am ET via Zoom and is free and open to the public, please register. More details at: http://ieeeagra.com/events/webinar-july-24-2020.
May 2020: Check out Faisal's new webpage! Faisal's research spans the areas of machine learning, data mining and very large databases.
May 2020: Check out the video and slides for our paper "Tendi: Tensor Disaggregation from Multiple Coarse Views" as presented in the 24th PAKDD. The approach fuses aggregated multidimensional data from multiple sources to produce clean data in fine granularity. The developed framework benefits a plethora of machine learning and data science tasks!
March 2020: Invited talk @ Baidu Cognitive Computing Lab titled "Tensor Completion from Regular Samples: Theory and Applications".
January 2020: Our paper "Tendi: Tensor Disaggregation from Multiple Coarse Views" has been accepted for presentation in the 24th PAKDD (21% acceptance rate), Singapore 11-14 May 2020.
November 2019: Check out our new paper "PREMA: Principled Tensor Data Recovery from Multiple Aggregated Views". Code can be found here.
October 2019: Our paper "Tensor Completion from Regular Sub-Nyquist Samples" has been accepted for publication in IEEE Transactions on Signal Processing (TSP). Available code can be found at github.com/marhar19/fMRI_acceleration.
September 2019: Our paper "Large-scale Canonical Polyadic Decomposition via Regular Tensor Sampling" has been accepted for presentation at EUSIPCO 2019, A Coruña, Spain, 2-6 Sept 2019.
March 2019: Check out my new Github page: github.com/marhar19.
March 2019: Check out our new paper "Tensor Completion from Regular Sub-Nyquist Samples".
February 2019: Our paper "Regular Sampling of Tensor Signals: Theory and Application to fMRI" has been accepted to ICASSP, Brighton 12-17 May 2019.
January 2019: Our paper "Hyperspectral Super-resolution: A Coupled Tensor Factorization Approach" has been published in IEEE Transactions on Signal Processing (TSP). Available code can be found at github.com/marhar19/HSR_via_tensor_decomposition.
December 2018: Our paper "Structured SUMCOR Multiview Canonical Correlation Analysis for Large-Scale Data" has been published in IEEE Transactions on Signal Processing (TSP).
May 2018: Our paper "Hyperspectral Super-resolution: Combining Low Rank Tensor and Matrix Structure" has been accepted to ICIP, Athens 7-10 October 2018.
April 2018: Check out our new paper "Hyperspectral Super-resolution: A Coupled Tensor Factorization Approach". Available code can be found at github.com/marhar19/HSR_via_tensor_decomposition.
April 2018: Check out our new paper "Structured SUMCOR Multiview Canonical Correlation Analysis for Large-Scale Data".
January 2018: Our paper "Hyperspectral Super-resolution via Coupled Tensor Factorization: Identifiability and Algorithms" has been accepted to ICASSP, Calgary 2018.
January 2018: Our paper "Large Scale Regularized SUMCOR GCCA via Penalty Dual Decomposition" has been accepted to ICASSP, Calgary 2018.
November 2015: Our paper "Max-min feasible point pursuit for non-convex QCQP" was presented in Asilomar Conference on Signals, Systems and Computers, Pacific Grove 2015.