Evolutionary Multi-Objective Quantization of Randomization-Based Neural Networks
Published in IEEE Symposium Series on Computational Intelligence (SSCI), 2023
The full paper can be found here.
Abstract: The deployment of Machine Learning models on hardware devices has motivated notable research activity around strategies to reduce their complexity and size, such as neural architecture search or pruning in Deep Learning. This work focuses on simplifying randomization-based neural networks by discovering fixed-point quantization policies that optimally balance the trade-off between the predictive performance and the complexity reduction of these models.
Specifically, we propose a combinatorial formulation of this problem and show that it can be solved efficiently by multi-objective evolutionary algorithms. A time series forecasting benchmark with Echo State Networks over 400 datasets reveals that high compression ratios can be achieved at practically admissible levels of performance degradation, showcasing the utility of the proposed formulation for deploying reservoir computing models on resource-constrained hardware devices.
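To give a flavor of the idea, the following is a minimal, self-contained sketch, not the paper's actual method: it quantizes the input and reservoir weight matrices of a toy Echo State Network to fixed-point with per-matrix bit widths, and runs a small evolutionary search that keeps the Pareto front trading forecasting error against total stored bits. Every name here (`quantize`, `evaluate`, the sine forecasting task, population and generation sizes) is an illustrative assumption, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(W, bits):
    """Symmetric uniform fixed-point quantization with `bits` total bits.
    The scale is chosen so the largest magnitude in W fits the signed range.
    (Illustrative; the paper's quantization policy may differ.)"""
    if bits >= 32:  # treat as full precision
        return W
    qmax = 2 ** (int(bits) - 1) - 1
    scale = qmax / (np.abs(W).max() + 1e-12)
    return np.round(W * scale) / scale

def make_esn(n_in=1, n_res=100, rho=0.9):
    """Random ESN weights, reservoir rescaled to spectral radius rho."""
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, u):
    x = np.zeros(W.shape[0])
    states = []
    for t in range(len(u)):
        x = np.tanh(W_in[:, 0] * u[t] + W @ x)
        states.append(x.copy())
    return np.array(states)

def evaluate(bits_in, bits_res, W_in, W, u, y, split=300):
    """Objectives: test RMSE of a ridge-regression readout trained on
    quantized-reservoir states, and total bits needed to store the weights."""
    X = run_reservoir(quantize(W_in, bits_in), quantize(W, bits_res), u)
    Xtr, ytr = X[:split], y[:split]
    w_out = np.linalg.solve(Xtr.T @ Xtr + 1e-6 * np.eye(Xtr.shape[1]),
                            Xtr.T @ ytr)
    err = np.sqrt(np.mean((X[split:] @ w_out - y[split:]) ** 2))
    cost = int(bits_in) * W_in.size + int(bits_res) * W.size
    return err, cost

# Toy one-step-ahead forecasting task (hypothetical stand-in for a benchmark).
t = np.arange(400)
u = np.sin(0.1 * t)
y = np.roll(u, -1)
W_in, W = make_esn()

# (mu+lambda)-style search over bit-width pairs, keeping non-dominated genomes.
pop = [(int(rng.integers(2, 17)), int(rng.integers(2, 17))) for _ in range(8)]
for _ in range(5):
    children = [(max(2, b1 + int(rng.integers(-2, 3))),
                 max(2, b2 + int(rng.integers(-2, 3)))) for b1, b2 in pop]
    scored = {g: evaluate(*g, W_in, W, u, y) for g in set(pop + children)}
    front = [g for g, (e, c) in scored.items()
             if not any(e2 <= e and c2 <= c and (e2, c2) != (e, c)
                        for e2, c2 in scored.values())]
    pop = front + [front[i % len(front)] for i in range(max(0, 8 - len(front)))]

for b_in, b_res in sorted(set(pop)):
    e, c = evaluate(b_in, b_res, W_in, W, u, y)
    print(f"bits=({b_in},{b_res})  RMSE={e:.4f}  stored_bits={c}")
```

In a realistic setting the genome would cover a finer-grained policy (e.g., per-neuron or per-block bit widths) and the search would use a full multi-objective algorithm such as NSGA-II rather than this simple Pareto-filtering loop.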