Karim Haroun
From 2022 to 2025, I was a PhD student in deep learning, affiliated with the Embedded Artificial Intelligence Lab (LIAE) at CEA-LIST and the SPARKS team at the Laboratory of Computer Science, Signals and Systems of Sophia Antipolis (I3S), Université Côte d'Azur. In my thesis, I worked on dynamic neural network compression, a novel compression paradigm in which the computational graph adapts to the difficulty of each input instance, leading to input-adaptive computation. This contrasts with traditional static compression techniques, which reduce model size during training, either by pruning weights and/or activations or by lowering their numerical precision through quantization. Specifically, I was interested in how attention in Transformer models can be leveraged to guide inference, designing dynamic compression strategies that vary the length of the input sequence by exploiting information redundancies between tokens.
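As a rough illustration of the general idea (not the specific method developed in the thesis), one common form of attention-guided sequence-length reduction scores tokens by the attention the [CLS] token pays to them and discards the least-attended ones; all names, shapes, and the scoring rule below are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def prune_tokens(tokens, W_q, W_k, keep_ratio=0.5):
    """Keep the tokens that receive the most attention from [CLS] (index 0).

    tokens: (n, d) token embeddings; W_q, W_k: (d, d) projection matrices.
    Returns a shorter (m, d) sequence, m < n, with [CLS] always retained.
    """
    q = tokens @ W_q                                   # queries, (n, d)
    k = tokens @ W_k                                   # keys,    (n, d)
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))     # attention map, (n, n)
    cls_scores = attn[0, 1:]                           # [CLS] -> other tokens
    n_keep = max(1, int(keep_ratio * cls_scores.size))
    top = 1 + np.argsort(cls_scores)[::-1][:n_keep]    # best-scoring indices
    kept = np.concatenate(([0], np.sort(top)))         # keep [CLS], keep order
    return tokens[kept]

rng = np.random.default_rng(0)
n, d = 8, 16
tokens = rng.normal(size=(n, d))
W_q, W_k = rng.normal(size=(d, d)), rng.normal(size=(d, d))
pruned = prune_tokens(tokens, W_q, W_k, keep_ratio=0.5)
print(pruned.shape)  # → (4, 16): half of the 7 non-CLS tokens, plus [CLS]
```

Because "easy" inputs concentrate attention on few tokens, such a rule can shorten the sequence more aggressively for them, which is what makes the computation input-adaptive.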
My current research mainly focuses on theoretical aspects of neural networks. These include a more theoretically grounded treatment of the dynamic neural network compression studied in my PhD thesis, dynamic deep learning during training and how it can be leveraged for continual learning, and generative models. I am also exploring several applications, including dense tasks in computer vision, natural language processing, and EEG-based Brain-Computer Interfaces (BCIs).