Animesh Trivedi (VU) receives NWO KLEIN-1 grant
Machine Learning from Storage: Democratizing Machine Learning for All Using Scalable, Efficient, Distributed Non-Volatile Memory Storage Technology. The award is worth 350K euro, spread over four years. Animesh Trivedi is an Assistant Professor in the Massivizing Computer Systems research group.
In machine learning, larger models are associated with higher accuracy and intelligence. Building and training large models requires processing large amounts of data in CPU-attached and accelerator on-board DRAM memories, which do not scale, are energy-inefficient, and are extremely expensive, putting large-model training out of reach for many users. The project proposes to leverage Non-Volatile Memory (NVM) storage technology to design a Machine-Learning-from-Storage model training paradigm, in which data is moved dynamically and on demand between NVM storage and DRAM, thus unlocking a previously unavailable class of cost- and energy-efficient model training for all.