Luana Ruiz
Assistant Professor, Department of Applied Mathematics and Statistics, Johns Hopkins University

Office N452
Wyman Park Building
Baltimore, MD 21211
I am an Assistant Professor in the Department of Applied Mathematics and Statistics at Johns Hopkins University. Before that, I was a METEOR and FODSI postdoctoral fellow working with Prof. Stefanie Jegelka at MIT, and a Google Research Fellow at the Simons Institute for the Theory of Computing. I obtained my PhD from the Electrical and Systems Engineering Department at Penn, where I was very fortunate to be advised by Prof. Alejandro Ribeiro.
My primary research interests are in machine learning and signal processing over networks. My current work focuses on large-scale graph information processing and graph neural network architectures. Please see a selected list of my publications below.
news
Nov 16, 2023 | Invited long talk “Large-Scale Graph Machine Learning: Tradeoffs, Guarantees and Dynamics” at DeepMath 2023.
Nov 10, 2023 | I am serving as TC chair at ICASSP 2024 and as publicity chair at the Graph Signal Processing Workshop (GSPW) 2024. Consider submitting an abstract to GSPW!
Oct 17, 2023 | New preprint “A Local Graph Limits Perspective on Sampling-Based GNNs” with Yeganeh and Amin.
Oct 16, 2023 | Talk “Large-Scale Graph Machine Learning: Tradeoffs, Guarantees and Dynamics” at INFORMS.
Sep 22, 2023 | Talk “Large-Scale Graph Machine Learning: Tradeoffs, Guarantees and Dynamics” at the IFML Seminar at UT Austin.
Sep 14, 2023 | Talk “Manifold Neural Networks for Large-Scale Geometric Information Processing” at the JHU AMS Department Seminar.
Aug 21, 2023 | In Tokyo for ICIAM 2023 to speak about graphon neural tangent kernels at the “Geometric Methods in Machine Learning” minisymposium, organized by Jeff Calder and Leon Bungert.
Aug 1, 2023 | I have officially started my new position at JHU.
Jul 23, 2023 | Paper “Graph Neural Tangent Kernel: Convergence on Large Graphs” at ICML 2023.
selected publications
2023
- Learning by Transference: Training Graph Neural Networks on Growing Graphs. IEEE Trans. Signal Process., 2023
- Geometric Graph Filters and Neural Networks: Limit Properties and Discriminability Trade-offs. arXiv:2305.18467 [cs.LG], 2023
2021
- Stability to Deformations of Manifold Filters and Manifold Neural Networks. arXiv:2106.03725 [cs.LG], 2021
- Stability of Neural Networks on Riemannian Manifolds (Best Paper Award). In 29th EUSIPCO, 23-27 Aug. 2021
- Iterative Decoding for Compositional Generalization in Transformers. arXiv:2110.04169 [cs.LG], 2021
2020
- Invariance-Preserving Localized Activation Functions for Graph Neural Networks. IEEE Trans. Signal Process., 2020
- Graphon Neural Networks and the Transferability of Graph Neural Networks. In 34th NeurIPS, 6-12 Dec. 2020
2019
- Gated Graph Convolutional Recurrent Neural Networks (Best Paper Award). In 27th EUSIPCO, 2-6 Sep. 2019