June 29, 2022

Q&A with Stratis Ioannidis, associate professor at Northeastern University and Meta academic collaborator

By: Meta Research

In this monthly interview series, we turn the spotlight on members of the academic community and the important research they do — as thought partners, collaborators, and independent contributors.

For June, we nominated Stratis Ioannidis, an associate professor of electrical and computer engineering at Northeastern University. Stratis has worked with Meta researchers throughout his career and led one of the six research teams that received a Meta Research Award in 2020. In this Q&A, Stratis talks about the project he worked on as a short-term employee with the Core Data Science (CDS) team during a sabbatical and shares what’s next for his research.

Q: Can you tell us about your background in academia, your role at Northeastern, and the research you specialize in?

Stratis Ioannidis: I’m an associate professor at Northeastern University’s College of Engineering. I spent several years in industry research before returning to academia. My research focuses on machine learning, distributed optimization, networking, and privacy. I address diverse questions in these fields, such as, “How can we make AI systems more secure, robust to noise, and resistant to attacks?”

One of the projects I’ve been working on with my colleagues at Northeastern is called “Learning from Comparisons.” We’re creating classifiers for the early detection of retinopathy of prematurity, the leading cause of blindness in premature babies worldwide, which can be cured if it’s detected early enough. Our classifiers are trained on image rankings, which we have found to be more informative and less susceptible to noise than diagnostic labels.
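The interview doesn’t spell out the model, but the general idea of learning from comparisons can be sketched with a Bradley-Terry-style pairwise logistic loss: instead of fitting labels per image, the model learns a score that ranks the more severe image of each pair higher. Everything below (the synthetic features, the linear score, the training setup) is illustrative, not the authors’ actual classifier.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: each "image" is a feature vector, and a
# hypothetical true severity is a linear function of the features.
n_images, n_features = 100, 5
X = rng.normal(size=(n_images, n_features))
w_true = rng.normal(size=n_features)
severity = X @ w_true

# Comparison labels: for random pairs (i, j), y = 1 if image i is
# ranked as more severe than image j.
pairs = rng.integers(0, n_images, size=(500, 2))
pairs = pairs[pairs[:, 0] != pairs[:, 1]]
y = (severity[pairs[:, 0]] > severity[pairs[:, 1]]).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Fit a linear score s(x) = w @ x with gradient descent on the pairwise
# logistic loss: the learned score should place the higher-ranked image
# of each pair above the other.
w = np.zeros(n_features)
lr = 0.1
for _ in range(200):
    diff = X[pairs[:, 0]] - X[pairs[:, 1]]  # feature difference per pair
    p = sigmoid(diff @ w)                   # model's P(i ranked above j)
    grad = diff.T @ (p - y) / len(y)        # gradient of the logistic loss
    w -= lr * grad

# Fraction of training comparisons the learned score orders correctly.
pred = (sigmoid((X[pairs[:, 0]] - X[pairs[:, 1]]) @ w) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

A useful property of this setup is that noisy or subjective per-image labels never enter the loss; only relative orderings do, which is one reason comparison data can be more robust than diagnostic labels.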

Q: How did your collaboration with Meta start?

SI: I’ve been in touch with Meta team members for a while. I’ve given presentations at the Meta campus, cowritten a few papers with Meta researchers, and connected with Core Data Science (CDS) team members at conferences and other academic functions. I started an official collaboration with Meta in 2020 when my team at Northeastern received a Meta Research Award for our “Learning from Comparisons” project. We used funding from the award to accelerate the training of deep neural network classifiers over ranking data. I took a sabbatical in 2021 after we published the results of our research, and approached the CDS team about opportunities to visit Meta.

Q: What project did you work on with Meta’s Core Data Science team during your sabbatical?

SI: During my short-term visit with the CDS team, I worked on understanding the trajectories that users follow online, the types of interactions they have with online entities, and how these interactions influence their subsequent decisions. We developed frameworks to understand how users engage with different communities and other types of content, and to anticipate their potential future actions. We were particularly interested in content gateways — that is, entities that promote future interactions with other content. The end goal of this project is to improve the assessment of gateway scores. In the short term, my work on the project helped us improve the quality of gateway definitions and create better algorithms to detect them. The findings from our work in this stage of the project will help industry and academic communities evaluate the concept of “gateway-ness” and give researchers directions on how to develop better prediction tools.
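The interview doesn’t define how a gateway score is computed, but one illustrative (and purely hypothetical) proxy for “gateway-ness” is: after a user first interacts with an entity, how many distinct new pieces of content do they go on to touch? The toy trajectories and the `gateway_score` helper below are assumptions for the sake of the sketch, not Meta’s actual methodology.

```python
# Each trajectory is an ordered list of content items one user interacted
# with. "gate" is a toy entity we expect to act as a content gateway.
trajectories = [
    ["a", "gate", "b", "c"],
    ["gate", "b", "d"],
    ["a", "b"],
    ["a", "gate", "c", "d"],
]

def gateway_score(entity, trajectories):
    """Average number of distinct *new* items a user touches after first
    interacting with `entity` — a crude, hypothetical proxy for how much
    the entity promotes future interactions with other content."""
    gains = []
    for traj in trajectories:
        if entity not in traj:
            continue  # this user never reached the entity
        idx = traj.index(entity)
        seen_before = set(traj[: idx + 1])
        new_after = {x for x in traj[idx + 1:] if x not in seen_before}
        gains.append(len(new_after))
    return sum(gains) / len(gains) if gains else 0.0
```

In this toy data, `gateway_score("gate", trajectories)` is higher than `gateway_score("b", trajectories)`, matching the intuition that a gateway leads users onward to content they hadn’t seen. A real framework would also need to control for position effects and baseline activity, which this sketch ignores.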

Q: Are there any research challenges or topics you’re particularly excited about at the moment?

SI: I’m interested in a lot of topics right now. In general, I am interested in how machine learning algorithms should be updated and revised to consider desiderata other than accuracy. This includes many properties that users might expect from a machine learning system, such as privacy, uncertainty quantification, robustness to both noise and adversarial attacks, and fairness. Understanding how to design algorithms that have these properties, while also attaining good trade-offs with respect to accuracy, is quite important for increasing our trust in machine learning systems and algorithms.

Q: Where do you see your collaboration with Meta going in the future?

SI: Working with the CDS team has given me a tremendous opportunity to work on real-world problems that I would never encounter in academia. It’s an interesting experience that requires flexing different “brain muscles,” because the end goal can be more impactful than a research paper. In industry, you’re developing tools to solve a problem at scale, and in this type of work you have to think about projects at different timescales and stages — from ideation through experimentation and deployment. You also have to show that your algorithms will work in practice. I love this challenge and would like to work with Meta’s CDS team again in the future.

Q: Where can people learn more about your research?

SI: My university webpage is the best place to learn more about my research, including the “Learning from Comparisons” project.