Closed
Labels: bug (Something isn't working), oom (Out-of-memory risk with large datasets)
Summary
focal.mean(), focal.apply(), and focal_stats() raise NotImplementedError for dask arrays backed by cupy. Users with GPU clusters are forced to call .compute() first, which negates dask's out-of-core memory benefits and causes out-of-memory (OOM) failures on large datasets.
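The failure mode described above typically comes from a backend-dispatch pattern like the following. This is a hypothetical sketch, not xrspatial's actual code: the helper names (`_is_dask`, `_is_cupy_chunked`, `_numpy_mean`, `_dask_numpy_mean`) are invented for illustration.

```python
import numpy as np

def _is_dask(arr):
    # Hypothetical helper: detect a dask-backed array by module name,
    # avoiding a hard import of dask just for the check.
    return type(arr).__module__.startswith("dask")

def _is_cupy_chunked(arr):
    # Hypothetical helper: dask arrays expose their chunk type via `._meta`.
    meta = getattr(arr, "_meta", arr)
    return type(meta).__module__.startswith("cupy")

def focal_mean(data):
    """Sketch of the dispatch shape that produces the reported error."""
    if _is_dask(data):
        if _is_cupy_chunked(data):
            # This is the branch GPU-cluster users currently hit.
            raise NotImplementedError(
                "focal.mean does not support dask arrays backed by cupy"
            )
        return _dask_numpy_mean(data)  # existing dask+numpy path
    return _numpy_mean(data)           # eager single-machine path

def _numpy_mean(data):
    # Placeholder for the eager numpy implementation.
    return np.asarray(data, dtype=float)

def _dask_numpy_mean(data):
    # Placeholder for the existing dask+numpy implementation.
    return data
```

The only escape hatch for the dask+cupy branch today is `.compute()`, which pulls the entire raster into memory at once.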
Affected Functions
| Line | Function | Issue |
|---|---|---|
| `focal.py:164` | `mean()` | `NotImplementedError` for dask+cupy |
| `focal.py:502` | `apply()` | `NotImplementedError` for dask+cupy |
| `focal.py:922` | `focal_stats()` | `NotImplementedError` for dask+cupy |
Severity
MODERATE: only dask+cupy users are affected, but the `.compute()` workaround forces full materialisation of the raster in memory, which causes OOM on large datasets.
Suggested Fix
- Implement dask+cupy backends using `map_overlap` with cupy kernels, mirroring the existing dask+numpy path
- The pattern is well established in other spatial ops in this codebase (e.g., slope, aspect, convolution)