Luana Ruiz
Assistant Professor, Department of Applied Mathematics and Statistics, Johns Hopkins University
Wyman Park Building N452
Baltimore, MD 21211
lrubini1-at-jh-dot-edu
I am an assistant professor in the Department of Applied Mathematics and Statistics at Johns Hopkins University, with affiliations at the Mathematical Institute for Data Science (MINDS) and the Data Science and Artificial Intelligence Institute (DSAI). Before joining Hopkins, I was a METEOR and FODSI postdoctoral fellow working with Prof. Stefanie Jegelka at MIT, and a Google Research Fellow at the Simons Institute for the Theory of Computing. I obtained my PhD from the Electrical and Systems Engineering Department at Penn, where I was very fortunate to be advised by Prof. Alejandro Ribeiro.
My research is at the intersection of machine learning, signal processing, and network science, with a focus on developing scalable algorithms for learning on non-Euclidean domains such as graphs and data manifolds. I work on large-scale graph information processing, graph neural networks (GNNs), and the theoretical limits of transferability and generalization in graph-based learning. I am also interested in physics-informed machine learning, manifold learning, and combinatorial optimization, especially as they relate to the structure and dynamics of complex systems.
You can find my CV here, and a selected list of publications is included below.
Selected publications
2025
- Local Distance-Preserving Node Embeddings and Their Performance on Random Graphs. arXiv preprint arXiv:2504.08216, 2025.
- Improved Image Classification with Manifold Neural Networks. In 50th ICASSP, 2025.
- A Generative Model for Controllable Feature Homophily in Graphs. arXiv preprint arXiv:2509.23230, 2025. Submitted to ICASSP 2026.
- Graph Semi-Supervised Learning for Point Classification on Data Manifolds. arXiv preprint arXiv:2506.12197, 2025. Submitted to IEEE TSP.
- Subsampling Graphs with GNN Performance Guarantees. arXiv preprint arXiv:2502.16703, 2025.
- A Local Graph Limits Perspective on Sampling-Based Graph Neural Networks. In 2025 IEEE International Symposium on Information Theory (ISIT), 2025.
- Dirichlet Meets Horvitz and Thompson: Estimating Homophily in Large Graphs via Sampling. In 59th Asilomar Conf. on Sig. and Syst., 2025. To appear.
- Graph Sampling for Scalable and Expressive Graph Neural Networks on Homophilic Graphs. In 33rd European Signal Processing Conference (EUSIPCO), 2025.
2024
- Stability to Deformations of Manifold Filters and Manifold Neural Networks. IEEE Trans. Signal Process., 2024.
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs. IEEE Trans. Signal Process., 2024.
- A Poincaré Inequality and Consistency Results for Signal Sampling on Large Graphs (Spotlight). In 12th ICLR, 7–11 May 2024.
- A Spectral Analysis of Graph Neural Networks on Dense and Sparse Graphs. In 49th ICASSP, 14–19 Apr. 2024.
2023
- Learning by Transference: Training Graph Neural Networks on Growing Graphs. IEEE Trans. Signal Process., 2023.
- Transferability Properties of Graph Neural Networks. IEEE Trans. Signal Process., 2023.
- Graph Neural Tangent Kernel: Convergence on Large Graphs. In 40th ICML, 23–29 Jul. 2023.
2021
- Graph Neural Networks: Architectures, Stability and Transferability. Proc. IEEE, 2021.
- Graphon Signal Processing. IEEE Trans. Signal Process., 2021.
- Stability of Neural Networks on Riemannian Manifolds (Best Paper Award). In 29th EUSIPCO, 23–27 Aug. 2021.
- Iterative Decoding for Compositional Generalization in Transformers. arXiv preprint arXiv:2110.04169, 2021.
2020
- Gated Graph Recurrent Neural Networks. IEEE Trans. Signal Process., 2020.
- The Graphon Fourier Transform. In 45th ICASSP, 4–8 May 2020.
- Graphon Neural Networks and the Transferability of Graph Neural Networks. In 34th NeurIPS, 6–12 Dec. 2020.
2019
- Gated Graph Convolutional Recurrent Neural Networks (Best Paper Award). In 27th EUSIPCO, 2–6 Sep. 2019.