Zahra Atashgahi
I’m a Data Scientist passionate about applying machine learning to positively impact people’s everyday lives. I completed my PhD in artificial intelligence at the University of Twente, where I focused on enhancing the efficiency of deep learning models through sparsity and feature selection. Starting in October 2024, I’m excited to join the IKEA AI lab as a Data Scientist.
Research Interests
Machine Learning, Sparse Neural Networks, Deep Learning, Cost-efficient Neural Networks, Time Series Analysis, Feature Selection, Anomaly Detection, Healthcare
News
- 09/2024, I presented our paper, “Adaptive Sparsity Level during Training for Efficient Time Series Forecasting with Transformers”, at ECML-PKDD 2024. Please find the presentation slides here.
- 07/2024, Our paper, “Unveiling the Power of Sparse Neural Networks for Feature Selection” has been accepted at ECAI 2024.
- 06/2024, I will be joining the IKEA AI lab as a Data Scientist as of October 2024.
- 06/2024, Our paper, “Adaptive Sparsity Level during Training for Efficient Time Series Forecasting with Transformers” has been accepted at ECML-PKDD 2024.
- 04/2024, I defended my Ph.D. thesis, “Advancing Efficiency in Neural Networks through Sparsity and Feature Selection”, on the 30th of April at 16:30 at the University of Twente.
- 01/2024, Our paper, “Supervised Feature Selection via Ensemble Gradient Information from Sparse Neural Networks” has been accepted at AISTATS 2024.
- 05/2023, My doctoral consortium paper has been accepted for publication at IJCAI 2023.
- 05/2023, Our tutorial, “T27: Sparse Training for Supervised, Unsupervised, Continual, and Deep Reinforcement Learning with Deep Neural Networks”, has been accepted at IJCAI 2023.
- 05/2023, We organized the ICLR 2023 Workshop on Sparsity in Neural Networks in Kigali, Rwanda, on May 5th, 2023.
- 04/2023, I have been accepted for a 3-month Ph.D. internship at Booking.com starting in July 2023.
- 03/2023, I gave a talk, “Learning Efficiently from Data using Sparse Neural Networks”, at the TrustML Young Scientist Seminars organized by the RIKEN-AIP center (Link).
- 01/2023, I started my research visit at the van der Schaar Lab, University of Cambridge, under the supervision of Prof. Dr. Mihaela van der Schaar.
- 01/2023, Our paper, “Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks” has been accepted at TMLR journal (OpenReview, code).
- 09/2022, Our paper, “Where to pay attention in sparse training for feature selection?” has been accepted at NeurIPS 2022 (OpenReview).
- 07/2022, Our paper, “A Brain-inspired Algorithm for Training Highly Sparse Neural Networks”, has been accepted for publication in the Machine Learning Journal (ECML-PKDD 2022 journal track). Find more information here. (paper, code)
- 04/2022, Our tutorial, “Sparse Neural Networks Training”, has been accepted at the ECML-PKDD 2022 conference. Find more information here.
- 01/2022, Our paper, “Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity” has been accepted at ICLR 2022. To find more information check here.
- 09/2021, Our paper, “Sparse Training via Boosting Pruning Plasticity with Neuroregeneration” has been accepted at NeurIPS 2021. To find more information check here.
- 07/2021, Our manuscript, “Quick and Robust Feature Selection: the Strength of Energy-efficient Sparse Training for Autoencoders”, has been accepted for publication in the Machine Learning Journal (ECML-PKDD 2022 journal track). Find more information here. (paper, code)
For more news check here.
Education
Period | Degree | University |
---|---|---|
May. 2020 - Apr. 2024 | Ph.D. | University of Twente |
Oct. 2019 - Apr. 2020 | Ph.D. | Eindhoven University of Technology * |
Sep. 2017 - Sep. 2019 | M.Sc. | Amirkabir University of Technology (Tehran Polytechnic) |
Sep. 2013 - Jul. 2017 | B.Sc. | Amirkabir University of Technology (Tehran Polytechnic) |
* My Ph.D. started at the Eindhoven University of Technology; after a few months, I moved to the University of Twente together with Dr. Decebal Mocanu, my Ph.D. supervisor.