Hi everyone. I have an Enterprise account and have just started developing with Sentinel-2 data and Python; I'm finding it quite overwhelming.
I'll get straight to the point:
I need to create an ML model using input data from two countries. I'd like to use all raw band data from several years (as a time series) and do the post-processing locally. What's the best (and most cost-efficient) way to do that?
To go deeper, this is exactly the data science case explained in "Large-scale data preparation — introducing Batch Processing" by Grega Milcinski on the Sentinel Hub Blog (Medium).
I've already tried a few alternatives, but I can't figure out which one is the "right" one:
→ a "naive" approach: split the big area into smaller tiles, then loop standard Process API requests over every tile and time frame (sketch 1 below)
→ use Batch Processing, but since it only returns one image per time period, loop it over small time frames (sketch 2 below)
→ use a multi-temporal evalscript with Batch Processing, though I have almost zero knowledge of JavaScript (sketch 3 below)
→ use the eo-learn library's SentinelHubInputTask, which seems to solve the issue, but it doesn't use Batch Processing under the hood (and the tutorials on it are quite outdated) (sketch 4 below)
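
To make this concrete, here is roughly what my naive approach looks like (a minimal sketch with sentinelhub-py; `country_geometry`, the band list, and the monthly frames are placeholders I made up for this post):

```python
# Sketch 1 -- "naive" tiling: loop standard Process API requests
# over every tile and time frame
import datetime as dt

from sentinelhub import (
    CRS, BBoxSplitter, DataCollection, MimeType, SHConfig,
    SentinelHubRequest, bbox_to_dimensions,
)

config = SHConfig()  # credentials already set up

# country_geometry is a placeholder shapely geometry of one country
splitter = BBoxSplitter([country_geometry], CRS.WGS84, split_shape=(20, 20))

evalscript = """
//VERSION=3
function setup() {
  return {
    input: [{bands: ["B02", "B03", "B04", "B08"], units: "DN"}],
    output: {bands: 4, sampleType: "UINT16"}
  };
}
function evaluatePixel(sample) {
  return [sample.B02, sample.B03, sample.B04, sample.B08];
}
"""

# monthly frames over several years (month end approximated)
time_frames = [
    (dt.date(y, m, 1), dt.date(y, m, 28))
    for y in range(2018, 2021) for m in range(1, 13)
]

for bbox in splitter.get_bbox_list():
    size = bbox_to_dimensions(bbox, resolution=10)
    for start, end in time_frames:
        request = SentinelHubRequest(
            evalscript=evalscript,
            input_data=[SentinelHubRequest.input_data(
                data_collection=DataCollection.SENTINEL2_L1C,
                time_interval=(start.isoformat(), end.isoformat()),
            )],
            responses=[SentinelHubRequest.output_response("default", MimeType.TIFF)],
            bbox=bbox,
            size=size,
            config=config,
        )
        data = request.get_data()[0]  # one mosaicked image per tile/frame
```

It works, but the number of requests explodes with tiles × time frames, which is why I looked at Batch Processing.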
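For the Batch Processing loop, this is the pattern I tried (again just a sketch; I'm on sentinelhub-py 3.x and not certain I'm using the batch client as intended; the bucket name and grid settings are placeholders):

```python
# Sketch 2 -- one Batch Processing job per small time frame
from sentinelhub import (
    DataCollection, MimeType, SHConfig, SentinelHubBatch, SentinelHubRequest,
)

config = SHConfig()
batch_client = SentinelHubBatch(config=config)

for start, end in time_frames:  # same monthly frames as in sketch 1
    request = SentinelHubRequest(
        evalscript=evalscript,  # same single-scene evalscript as in sketch 1
        input_data=[SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL2_L1C,
            time_interval=(start.isoformat(), end.isoformat()),
        )],
        responses=[SentinelHubRequest.output_response("default", MimeType.TIFF)],
        bbox=country_bbox,  # placeholder: bbox of the whole country
        config=config,
    )
    batch_request = batch_client.create(
        request,
        tiling_grid=SentinelHubBatch.tiling_grid(grid_id=1, resolution=10, buffer=(50, 50)),
        bucket_name="my-bucket",  # placeholder
    )
    batch_client.start_analysis(batch_request)  # optional dry run first
    batch_client.start_job(batch_request)       # results land in the bucket
```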
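And this is as far as I got with a multi-temporal evalscript before my JavaScript ran out. My understanding from the docs is that mosaicking "ORBIT" hands evaluatePixel all acquisitions in the interval, and updateOutput sizes the output dynamically; corrections welcome:

```python
# Sketch 3 -- multi-temporal evalscript, kept as a Python string so it can
# replace the single-scene evalscript in the batch request from sketch 2
multitemporal_evalscript = """
//VERSION=3
function setup() {
  return {
    input: [{bands: ["B02", "B03", "B04", "B08"], units: "DN"}],
    output: {bands: 4, sampleType: "UINT16"},
    mosaicking: "ORBIT"  // deliver every acquisition, not a single mosaic
  };
}

function updateOutput(outputs, collection) {
  // one set of 4 bands per available acquisition
  outputs.default.bands = collection.scenes.length * 4;
}

function evaluatePixel(samples) {
  let result = [];
  for (let i = 0; i < samples.length; i++) {
    result.push(samples[i].B02, samples[i].B03, samples[i].B04, samples[i].B08);
  }
  return result;
}
"""
```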
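Finally, the eo-learn attempt (a sketch; the bbox and time interval are placeholders, and as far as I can tell SentinelHubInputTask issues plain Process API requests, not Batch):

```python
# Sketch 4 -- eo-learn: one EOPatch holding the full time series per tile
import datetime as dt

from eolearn.core import FeatureType
from eolearn.io import SentinelHubInputTask
from sentinelhub import BBox, CRS, DataCollection

input_task = SentinelHubInputTask(
    data_collection=DataCollection.SENTINEL2_L1C,
    bands=["B02", "B03", "B04", "B08"],
    bands_feature=(FeatureType.DATA, "BANDS"),
    resolution=10,
    maxcc=0.8,
    time_difference=dt.timedelta(hours=2),  # merge acquisitions closer than this
    max_threads=4,
)

bbox = BBox((12.0, 41.8, 12.2, 42.0), crs=CRS.WGS84)  # placeholder tile
eopatch = input_task.execute(bbox=bbox, time_interval=("2018-01-01", "2020-12-31"))
print(eopatch.data["BANDS"].shape)  # (time, height, width, bands)
```

This gives me exactly the time-series array I want per tile, which is why I'd love to know whether there is a recommended way to get the same result via Batch Processing.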
