Hello Mr. Ahmed
I'm working with a large dataset (~70GB of raw logs) for training, but I'm limited to 128GB of system RAM and 24GB of GPU memory. How can I efficiently build the dataset and train this model in stages under these memory constraints? I'm currently considering data generators and K-fold splits, but I would appreciate any insights or alternative approaches you have tried.