Dataloader Parameter Tuner: An Automated Dataloader Parameter Tuner for Deep Learning Models

JooYoung Park, DoangJoo Synn, XinYu Piao, Jong-Kook Kim
Korea University, Seoul, Republic of Korea
arXiv:2210.05244 [cs.DC]

@misc{park2022dataloader,
   doi={10.48550/ARXIV.2210.05244},
   url={https://arxiv.org/abs/2210.05244},
   author={Park, JooYoung and Synn, DoangJoo and Piao, XinYu and Kim, Jong-Kook},
   keywords={Distributed, Parallel, and Cluster Computing (cs.DC), FOS: Computer and information sciences},
   title={Dataloader Parameter Tuner: An Automated Dataloader Parameter Tuner for Deep Learning Models},
   publisher={arXiv},
   year={2022},
   copyright={arXiv.org perpetual, non-exclusive license}
}


Deep learning has recently become one of the most compute- and data-intensive methods and is widely used in many research areas and businesses. A critical challenge of deep learning is that it has many adjustable parameters whose optimal values may need to be determined for faster operation and higher accuracy. The focus of this paper is the adjustable parameters of the dataloader. The dataloader groups the data into batches and loads them into main memory for the deep learning model to use. We introduce an automated framework called Dataloader Parameter Tuner (DPT) that determines the optimal values of the parameters required for the dataloader. This framework discovers the optimal values for the number of the dataloader's subprocesses (i.e., workers) and the prefetch factor through grid search, accelerating data transfer for machine learning systems.
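The page does not reproduce the paper's code, so below is a minimal sketch of the grid-search idea the abstract describes, assuming a PyTorch DataLoader. The dataset, batch size, and search grids are illustrative placeholders, not the authors' settings, and the timing loop is a stand-in for DPT's actual measurement procedure.

import time

import torch
from torch.utils.data import DataLoader, TensorDataset


def time_one_epoch(loader):
    """Wall-clock time to iterate once over the loader."""
    start = time.perf_counter()
    for _batch in loader:
        pass  # a real tuner could run the actual training step here
    return time.perf_counter() - start


def tune_dataloader(dataset, batch_size=64,
                    worker_grid=(2, 4, 8), prefetch_grid=(2, 4, 8)):
    """Grid search over num_workers and prefetch_factor; return fastest combo."""
    # num_workers=0 (main-process loading) serves as the baseline;
    # prefetch_factor only applies when worker subprocesses are used.
    configs = [{"num_workers": 0}]
    configs += [{"num_workers": w, "prefetch_factor": p}
                for w in worker_grid for p in prefetch_grid]
    best = None
    for cfg in configs:
        loader = DataLoader(dataset, batch_size=batch_size, **cfg)
        elapsed = time_one_epoch(loader)
        if best is None or elapsed < best[0]:
            best = (elapsed, cfg)
    return best


if __name__ == "__main__":
    # Synthetic stand-in for a real dataset (hypothetical sizes).
    data = TensorDataset(torch.randn(10_000, 3, 32, 32),
                         torch.randint(0, 10, (10_000,)))
    elapsed, cfg = tune_dataloader(data)
    print(f"fastest config: {cfg} ({elapsed:.2f}s per epoch)")

Exhaustive grid search is tractable here because the search space is small (a handful of worker counts times a handful of prefetch factors), and each candidate can be scored with a single timed pass over the data.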

* * *

HGPU group © 2010-2024 hgpu.org

All rights belong to the respective authors
