Zahra Atashgahi

I’m Zahra Atashgahi, a Ph.D. candidate in the Data Management & Biometrics (DMB) group at the University of Twente. My Ph.D. research focuses on deep learning and, in particular, sparse neural networks. I aim to develop algorithms that solve a variety of tasks efficiently in terms of both computational cost and data requirements.

Research Interests

Machine Learning, Sparse Neural Networks, Deep Learning, Cost-efficient Neural Networks, Time Series Analysis, Feature Selection, Anomaly Detection, Healthcare

News

  • 04/2024, I will defend my Ph.D. thesis, “Advancing Efficiency in Neural Networks through Sparsity and Feature Selection”, on the 30th of April at 16:30 at the University of Twente.
  • 01/2024, Our paper, “Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks”, has been accepted at the AISTATS conference.
  • 05/2023, My doctoral consortium paper has been accepted for publication at IJCAI 2023.
  • 05/2023, Our tutorial, “T27: Sparse Training for Supervised, Unsupervised, Continual, and Deep Reinforcement Learning with Deep Neural Networks”, has been accepted at IJCAI 2023.
  • 05/2023, We organized the ICLR 2023 Workshop on Sparsity in Neural Networks in Kigali, Rwanda, on May 5th, 2023.
  • 04/2023, I have been accepted for a 3-month Ph.D. internship at Booking.com, starting in July 2023.
  • 03/2023, I gave a talk at the TrustML Young Scientist Seminars, organized by the RIKEN AIP center, on “Learning Efficiently from Data using Sparse Neural Networks” (Link).
  • 01/2023, I have started my research visit at the University of Cambridge (van der Schaar Lab), under the supervision of Prof. Dr. Mihaela van der Schaar.
  • 01/2023, Our paper, “Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks”, has been accepted at the TMLR journal (OpenReview, code).
  • 09/2022, Our paper, “Where to pay attention in sparse training for feature selection?” has been accepted at NeurIPS 2022 (OpenReview).
  • 07/2022, Our paper, “A Brain-inspired Algorithm for Training Highly Sparse Neural Networks”, has been accepted for publication in the Machine Learning Journal (ECML-PKDD 2022 journal track). Find more information here. (paper, code)
  • 04/2022, Our tutorial, “Sparse Neural Networks Training”, has been accepted at the ECML-PKDD 2022 conference. Find more information here.
  • 01/2022, Our paper, “Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity”, has been accepted at ICLR 2022. Find more information here.
  • 09/2021, Our paper, “Sparse Training via Boosting Pruning Plasticity with Neuroregeneration”, has been accepted at NeurIPS 2021. Find more information here.
  • 07/2021, Our manuscript, “Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders”, has been accepted for publication in the Machine Learning Journal (ECML-PKDD 2022 journal track). Find more information here. (paper, code)

For more news, check here.

Education

Period                | Degree | University
May 2020 - present    | Ph.D.  | University of Twente
Oct. 2019 - Apr. 2020 | Ph.D.  | Eindhoven University of Technology *
Sep. 2017 - Sep. 2019 | M.Sc.  | Amirkabir University of Technology (Tehran Polytechnic)
Sep. 2013 - Jul. 2017 | B.Sc.  | Amirkabir University of Technology (Tehran Polytechnic)

* My Ph.D. started at the Eindhoven University of Technology; after a few months, I moved to the University of Twente together with Dr. Decebal Mocanu, my Ph.D. supervisor.