Home
WELCOME TO MY WEBSITE!
Hi! I am a research associate in the Department of Computer Science at Stanford University, working with Prof. Jure Leskovec.
Before that, I was a postdoctoral researcher in the Department of Electrical and Systems Engineering at the University of Pennsylvania (Penn) under the supervision of Professor Alejandro Ribeiro. I received my Ph.D. in Electrical and Computer Engineering from the University of Minnesota, where I worked with the Signal and Tensor Analytics Research (STAR) group under the supervision of Professor Nikos Sidiropoulos. Earlier, I earned my Diploma (5-year degree) in Electrical and Computer Engineering at the National Technical University of Athens.
My research interests include Signal Processing, Machine Learning, Tensor Analytics, Graph AI, Network Mining, Optimization and Data Analysis.
News
February 2024: I will be attending the 2024 Information Theory and Applications Workshop (ITA) to give an invited talk on Counting Graph Substructures with Graph Neural Networks.
February 2024: I am co-teaching CS 246: Mining Massive Datasets along with Jure!
January 2024: Our paper "Counting Graph Substructures with Graph Neural Networks" was accepted at the 12th International Conference on Learning Representations (ICLR). See you all in Vienna!
January 2024: I am organizing a special session with Prof. Traganitis on "Learning with few labels" at the 2024 IEEE 13th Sensor Array and Multichannel Signal Processing Workshop.
December 2023: Our paper "Graph Neural Networks Are More Powerful Than We Think" was accepted for presentation at the 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP). See you all in Seoul!
December 2023: Our paper "Multi-Target Tracking With Transferable Convolutional Neural Networks" won a best student paper award at the IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing!
November 2023: I will be joining the Department of Computer Science at Stanford University as a research associate working with Prof. Jure Leskovec in January 2024.
November 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks: Harnessing Message-Passing beyond Weisfeiler-Lehman" at the Department of Electrical, Computer & Systems Engineering, Rensselaer Polytechnic Institute.
October 2023: Invited poster titled "A Spectral Analysis on the Representation Power of GNNs" at the Fall Fourier Talks, University of Maryland.
October 2023: Our paper "Multi-Target Tracking With Transferable Convolutional Neural Networks" was accepted for presentation at the IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing. It was also selected as a best student paper award finalist.
September 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks" at the Department of Computer Science, Stanford University.
September 2023: Giving an invited talk on "Representation Learning with Graph Neural Networks" at the Department of Computer Science, Ecole Polytechnique.
August 2023: Our paper "Solving Large-scale Spatial Problems with Convolutional Neural Networks" was accepted for presentation at the Asilomar Conference on Signals, Systems and Computers, Pacific Grove 2023. It was also selected as a best student paper award finalist.
July 2023: Check out our new paper "Network Alignment with Transferable Graph Autoencoders". In this paper, we propose a novel graph neural network architecture to solve network alignment (graph matching) for very large graphs.
July 2023: Check out our new paper "Transferability of Convolutional Neural Networks in Stationary Learning Tasks". In this paper, we propose a deep convolutional neural network (CNN) framework to solve spatial problems at a very large scale. The proposed framework is strongly supported by our developed theory on the transferability properties of CNNs!
June 2023: Giving an invited talk on "Representation Learning on Graphs and Tensors" at the Department of Electrical Engineering, Technical University Darmstadt.
May 2023: Giving an invited talk on "Representation Learning on Graphs and Tensors" at the Department of Electrical and Computer Engineering, Southern Methodist University.
April 2023: Giving an invited talk on "Deep Learning through the Lens of Signal Processing" at the Department of Electrical and Computer Engineering, University of California San Diego.
March 2023: Giving an invited talk on "Graph Neural Networks Are More Powerful Than we Think" at the Department of Electrical and Computer Engineering, Oklahoma State University.
March 2023: Giving an invited talk on "Representation Learning on Heterogeneous Data" at the Department of Electrical and Computer Engineering, University of Georgia.
February 2023: Our paper "Space-Time Graph Neural Networks with Stochastic Graph Perturbations" was accepted for presentation at the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2023. See you all in Rhodes!
February 2023: I will be attending the 2023 Information Theory and Applications Workshop (ITA) to give an invited talk on the representational power of Graph Neural Networks.
January 2023: We will be presenting a short course on Graph Neural Networks at the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), which will take place in Greece from June 04 to June 09, 2023.
November 2022: Check out our new paper "Space-Time Graph Neural Networks with Stochastic Graph Perturbations". In this paper, we study the stability of Space-Time Graph Neural Networks (ST-GNNs) under stochastic graph perturbations. We further propose an ST-GNN architecture that efficiently processes dynamic graphs!
October 2022: Check out our new paper "Deep Convolutional Neural Networks for Multi-Target Tracking: A Transfer Learning Approach". In this paper we propose a deep convolutional neural network (CNN) architecture for Multi-Target Tracking. We also provide theoretical conditions on the transferability properties of the CNNs!
August 2022: This Fall semester I am teaching ESE 5140 Graph Neural Networks at the University of Pennsylvania.
July 2022: I will be attending the NSF-Simons Mathematical and Scientific Foundations of Deep Learning Annual Meeting on September 29-30 at the Simons Foundation in New York City.
May 2022: Check out our new paper "Graph Neural Networks Are More Powerful Than We Think"! In this paper, we study the representational power of Graph Neural Networks (GNNs). Contrary to common belief, we show that the Weisfeiler-Lehman (WL) algorithm is not the real limit and that GNNs can discriminate between the majority of real graphs. We also design convolutional architectures that are provably more expressive than the WL algorithm and achieve great performance in the task of graph classification!
April 2022: Check out our video presentation for Space-Time Graph Neural Networks!
January 2022: Our paper "Space-Time Graph Neural Networks" was accepted for presentation at the Tenth International Conference on Learning Representations (ICLR 2022).
November 2021: Check out our new paper "Space-Time Graph Neural Networks". In this paper, we propose a novel convolutional neural network architecture tailored to data that are time-varying and supported on a graph. Exciting results in decentralized control, backed by strong theoretical analysis!
October 2021: Our paper "GAGE: Geometry Preserving Attributed Graph Embedding" was accepted for presentation as a regular paper at The Fifteenth International Conference on Web Search and Data Mining (WSDM2022--acceptance rate: 20.23%).
April 2021: Check out the video of our webinar in Signal Processing Society (SPS), titled "Tensor Completion from Regular Sub-Nyquist Samples"
March 2021: Our paper "Generalized Canonical Correlation Analysis: A Subspace Intersection Approach" has been published in IEEE Transactions on Signal Processing (TSP).
February 2021: Our paper "PREMA: Principled Tensor Data Recovery from Multiple Aggregated Views" has been published in IEEE Journal of Selected Topics in Signal Processing (J-STSP).
January 2021: Prof. Sidiropoulos and I will be giving a seminar titled "Tensor Completion from Regular Sub-Nyquist Samples" on Wednesday, January 20, 2021, at 11:00 am EST. The webinar is part of the IEEE Signal Processing Society's (SPS) webinar series. Find more details and RSVP here.
December 2020: Our paper "TeX-Graph: Coupled Tensor-Matrix Knowledge-Graph embedding for COVID-19 drug repurposing" was accepted as a regular paper at SIAM International Conference on Data Mining (SDM21--acceptance rate: 21.25%).
November 2020: Check out our new paper "GAGE: Geometry Preserving Attributed Graph Embedding".
November 2020: Check out the video of my Ph.D. thesis defense titled "Tensor Methods for Signal Reconstruction and Network Embedding".
October 2020: I will be joining University of Pennsylvania (Penn) as a postdoctoral researcher at the Department of Electrical and Systems Engineering under the supervision of Professor Alejandro Ribeiro.
October 2020: Check out our new paper "TeX-Graph: Coupled Tensor-Matrix Knowledge-Graph embedding for COVID-19 drug repurposing".
October 2020: I defended my Ph.D. thesis on "Tensor Methods for Signal Reconstruction and Network Embedding".
September 2020: Our paper titled "Downlink Channel Feedback in FDD Massive MIMO Systems via Tensor Compression and Sampling" will be presented at Asilomar Conference on Signals, Systems, and Computers.
August 2020: Check out the video of my presentation titled "Hyperspectral Super-resolution: A Tensor Factorization Approach" as part of Agricultural Robotics and Automation webinar series.
July 2020: I will be giving a seminar titled "Hyperspectral Super-resolution: A Tensor Factorization Approach" as part of the Agricultural Robotics and Automation webinar series. The webinar will take place on Friday, July 24th, at 10:00 am ET via Zoom; it is free and open to the public, so please register. More details at: http://ieeeagra.com/events/webinar-july-24-2020.
May 2020: Check out Faisal's new webpage! Faisal's research spans the areas of machine learning, data mining and very large databases.
May 2020: Check out the video and slides for our paper "Tendi: Tensor Disaggregation from Multiple Coarse Views" as presented in the 24th PAKDD. The approach fuses aggregated multidimensional data from multiple sources to produce clean data in fine granularity. The developed framework benefits a plethora of machine learning and data science tasks!
March 2020: Invited talk @ Baidu Cognitive Computing Lab titled "Tensor Completion from Regular Samples: Theory and Applications".
January 2020: Our paper "Tendi: Tensor Disaggregation from Multiple Coarse Views" has been accepted for presentation in the 24th PAKDD (21% acceptance rate), Singapore 11-14 May 2020.
November 2019: Check out our new paper "PREMA: Principled Tensor Data Recovery from Multiple Aggregated Views". Code can be found here.
October 2019: Our paper "Tensor Completion from Regular Sub-Nyquist Samples" has been accepted for publication in IEEE Transactions on Signal Processing (TSP). Available code can be found at github.com/marhar19/fMRI_acceleration.
September 2019: Our paper "Large-scale Canonical Polyadic Decomposition via Regular Tensor Sampling" has been accepted for presentation at EUSIPCO 2019, A Coruña, Spain, 2-6 September 2019.
March 2019: Check out my new Github page: github.com/marhar19.
March 2019: Check out our new paper "Tensor Completion from Regular Sub-Nyquist Samples".
February 2019: Our paper "Regular Sampling of Tensor Signals: Theory and Application to fMRI" has been accepted to ICASSP, Brighton, 12-17 May 2019.
January 2019: Our paper "Hyperspectral Super-resolution: A Coupled Tensor Factorization Approach" has been published in IEEE Transactions on Signal Processing (TSP). Available code can be found at github.com/marhar19/HSR_via_tensor_decomposition.
December 2018: Our paper "Structured SUMCOR Multiview Canonical Correlation Analysis for Large-Scale Data" has been published in IEEE Transactions on Signal Processing (TSP).
May 2018: Our paper "Hyperspectral Super-resolution: Combining Low Rank Tensor and Matrix Structure" has been accepted to ICIP, Athens 7-10 October 2018.
April 2018: Check out our new paper "Hyperspectral Super-resolution: A Coupled Tensor Factorization Approach". Available code can be found at github.com/marhar19/HSR_via_tensor_decomposition.
April 2018: Check out our new paper "Structured SUMCOR Multiview Canonical Correlation Analysis for Large-Scale Data".
January 2018: Our paper "Hyperspectral Super-resolution via Coupled Tensor Factorization: Identifiability and Algorithms" has been accepted to ICASSP, Calgary 2018.
January 2018: Our paper "Large Scale Regularized SUMCOR GCCA via Penalty Dual Decomposition" has been accepted to ICASSP, Calgary 2018.
November 2015: Our paper "Max-min feasible point pursuit for non-convex QCQP" was presented in Asilomar Conference on Signals, Systems and Computers, Pacific Grove 2015.