Omid Sadeghi
I'm a final-year Ph.D. candidate in the Electrical and Computer Engineering department at the University of Washington, supervised by Prof. Maryam Fazel. I also recently obtained a Master's degree in Mathematics (with a focus on mathematical optimization) here at UW. Before coming to UW, I studied at Sharif University of Technology, where I earned B.S. degrees in Electrical Engineering and Mathematics. During my undergraduate studies, I spent the summer of 2015 as a junior research assistant at the Chinese University of Hong Kong (CUHK) under the supervision of Prof. Chandra Nair.
I specialize in applying convex optimization tools to non-convex problems in machine learning (particularly continuous submodular maximization), under additional considerations such as limited resource/budget availability, privacy, incentive compatibility, and fairness, in both online and offline settings. My research finds applications in online advertising and online resource allocation.
Competencies: Python, CVXPY, Scikit-learn, Pandas.
In my spare time, you can find me learning new languages (currently German and Spanish), going for a run, or playing soccer.
Publications
No-Regret Online Prediction with Strategic Experts, With Maryam Fazel, Under Review.
Online SuBmodular + SuPermodular (BP) Maximization with Bandit Feedback, With Adhyyan Narang, Lillian J. Ratliff, Maryam Fazel, and Jeff Bilmes, Under Review.
Function Design for Improved Competitive Ratio in Online Resource Allocation with Procurement Costs, With Mitas Ray, Lillian J. Ratliff, and Maryam Fazel, Under Review.
Fast First-Order Methods for Monotone Strongly DR-Submodular Maximization, With Maryam Fazel, SIAM Conference on Applied and Computational Discrete Algorithms (ACDA23).
Improved Regret Bounds for Online Submodular Maximization, With Prasanna Raut and Maryam Fazel, ICML 2021 Workshop on Subset Selection in Machine Learning: From Theory to Applications.
Differentially Private Monotone Submodular Maximization Under Matroid and Knapsack Constraints, With Maryam Fazel, AISTATS 2021. [Talk], [Poster]
A Single Recipe for Online Submodular Maximization with Adversarial or Stochastic Constraints, With Prasanna Raut and Maryam Fazel, NeurIPS 2020 (Spotlight presentation: 280/9454 submissions). [Short Talk], [Long Talk], [Poster]
Online DR-Submodular Maximization: Minimizing Regret and Constraint Violation, With Prasanna Raut and Maryam Fazel, AAAI 2021. [Talk], [Poster], [Full Version]
Online Continuous DR-Submodular Maximization with Long-Term Budget Constraints, With Maryam Fazel, AISTATS 2020. [Talk], [Poster]
Online Algorithms for Budget-Constrained DR-Submodular Maximization, With Reza Eghbali and Maryam Fazel, ICML 2020 Workshop on Negative Dependence and Submodularity for ML (NDSML 2020). [Slides], [Full Version]
Notes and Surveys
CV
Download here.
Contact
omids@uw.edu