I apologize for the many questions on somewhat different topics, but it would be impractical to split them over different threads…


1- If you could kindly double-check…


Retrieving Sentinel-1 data with the Processing API over a geometry with a bbox of 1000x1000 pixels, 2 polarization bands, float32, full SAR processing options (terrain correction, speckle filtering), 30 timestamps: 1000² / 512² * 2 / 3 * 2 * 2.5 * 2 * 30 = 762.94 PU


Retrieving Sentinel-2 data with the Processing API over a geometry with a bbox of 1000x1000 pixels, 11 bands, uint16, 36 timestamps: 1000² / 512² * 11 / 3 * 36 = 503.54 PU


2- Do pixels outside of the AOI geometry, masked as nodata, count towards the PU calculation?


3- Would the SH API through the Copernicus Data Space and an actual paid subscription interact in any way? Would it be possible to use CDS to retrieve some data (catalog, statistical, even actual images) and then switch to paid SH for others, for example offline Sentinel-1?


4- How long does it generally take to obtain polarimetric data with the Processing API (assuming the setup above)?


5- In rare instances where the bbox and overall PU are larger than the limits (2500 pixels per dimension, 2000 PU per request if I recall), is it doable to split data retrieval (spatially and/or temporally) over multiple requests (remaining within the per-minute limits)? I fear the Batch API might be a bit overkill at the moment…


6- Pre-paid plans are equivalent to Basic but with a yearly pool of PUs, correct? So they might be better suited for workloads that are unevenly distributed over the year?


7- Any plans for meteorological data, or suggestions for other services that might integrate easily with the SH API?


Many thanks

Hi,

  1. Seems correct and is aligned with the documentation (a quick arithmetic check follows after this list).
    • S1: 1000² / 512² (area of interest) * 2 / 3 (number of input bands) * 2 (float32) * 2.5 (RTC) * 2 (speckle filtering) * 30 (timestamps)
    • S2: 1000² / 512² (area of interest) * 11 / 3 (number of input bands) * 36 (timestamps)
  2. The height and width of the bounding box of your requested geometry are taken for the PU calculation, i.e. pixels within the bounding box that are masked as nodata still count.
  3. The Sentinel Hub API deployed on the Copernicus Data Space Ecosystem is an independent deployment. Could you please elaborate on your use case and explain what kind of “interaction” you expect?
  4. The response time of the Processing API is generally in seconds. Of course, it can be longer, maybe a couple of minutes, for a request involving a large amount of data or requiring heavy processing.
  5. The Asynchronous Processing API is perfect for this use case, and it will be available soon on the Copernicus Data Space Ecosystem.
  6. Pre-paid plans have the feature set of the Basic plan, but the amount of PUs allocated to you will be different. I’d suggest creating a trial account and tracking your usage; then you can choose the plan that fits you best. If your workload’s usage of processing units and requests does not fit any existing plan, please feel free to contact us at info@sentinel-hub.com and we will be happy to help you.
  7. AFAIK there’s no plan to include meteorological data as a Sentinel Hub default data collection. However, if you’d like to develop a workflow on top of the Sentinel Hub APIs for your own data, please have a look at the Bring Your Own COG API, which enables you to import your own data to Sentinel Hub and work with it using the features supported by the Sentinel Hub APIs.
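
As a quick check of the arithmetic in point 1, here is the same calculation in plain Python (a minimal sketch; the multipliers are exactly the ones quoted above):

```python
# PU estimate for the two requests discussed above.
# Multipliers as quoted: 512x512 px = base area, 3 input bands = base,
# float32 x2, RTC x2.5, speckle filtering x2.
area = (1000 * 1000) / (512 * 512)  # area-of-interest factor

s1_pu = area * (2 / 3) * 2 * 2.5 * 2 * 30  # bands, float32, RTC, speckle, timestamps
s2_pu = area * (11 / 3) * 36               # bands, timestamps

print(f"S1: {s1_pu:.2f} PU")  # 762.94
print(f"S2: {s2_pu:.2f} PU")  # 503.54
```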


  1. Well, the (probably half-cooked) idea was simply to use the monthly PU budget on CDS as much as possible and then, when it is exhausted, switch to your deployment for the remaining data/executions, either during processing of the same AOI or by picking one at the start (something like the sketch below these points). Not sure if it’s worth the extra work required though, or if the cost estimates would be robust enough.




  2. It would fit very well indeed; will it be available for regular CDS users too? However, I fear the PU budget there might be too limited, since this would be needed at least for the big AOIs… here it would only be available for enterprise plans, right?
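
To make the idea in point 1 concrete, something along these lines with sentinelhub-py (a minimal sketch; the CDSE endpoints are the publicly documented ones, and the client credentials are placeholders for OAuth clients you would create in each deployment):

```python
from sentinelhub import SHConfig

def make_config(use_cdse: bool) -> SHConfig:
    """Return an SHConfig pointing at either deployment."""
    config = SHConfig()
    if use_cdse:
        # Copernicus Data Space Ecosystem deployment
        config.sh_base_url = "https://sh.dataspace.copernicus.eu"
        config.sh_token_url = (
            "https://identity.dataspace.copernicus.eu"
            "/auth/realms/CDSE/protocol/openid-connect/token"
        )
        config.sh_client_id = "<cdse-client-id>"          # placeholder
        config.sh_client_secret = "<cdse-client-secret>"  # placeholder
    else:
        # default base/token URLs point at services.sentinel-hub.com
        config.sh_client_id = "<sh-client-id>"            # placeholder
        config.sh_client_secret = "<sh-client-secret>"    # placeholder
    return config

# e.g. switch once the CDSE monthly PU budget is exhausted
config = make_config(use_cdse=True)
```

The same request could then be re-run against the other deployment, assuming the data collection exists on both.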



Thanks for your clarifications, appreciated.


Hi,

  1. As I mentioned, they are independent deployments. Let’s say you have a workflow which costs 100 PUs on average and you save the result to your local machine. You can of course run it on the Copernicus Data Space Ecosystem until you run out of PUs, then switch to a Sentinel Hub subscription and complete the rest. However, if you are using, for example, the Bring Your Own COG API on the Copernicus Data Space Ecosystem, you won’t be able to reach those collections with your Sentinel Hub subscription.

  2. Please follow the news on the Copernicus Data Space Ecosystem website and you’ll have more info when it’s released. Apart from the Asynchronous Processing API, we also have large area utilities which may fit your use case; please have a look at the example notebook (and the sketch below).
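
To illustrate the large area utilities mentioned in point 2, here is a minimal sketch with sentinelhub-py’s BBoxSplitter (the AOI coordinates are made up for the example):

```python
from shapely.geometry import box
from sentinelhub import CRS, BBoxSplitter

# Hypothetical AOI as a lon/lat rectangle; replace with your geometry
aoi = box(12.30, 41.80, 12.70, 42.10)

# Split the AOI into a 3x3 grid so that each sub-box stays within the
# per-request limits, then issue one Processing API request per sub-box
splitter = BBoxSplitter([aoi], CRS.WGS84, split_shape=(3, 3))

for bbox in splitter.get_bbox_list():
    print(bbox)  # pass each bbox to its own request
```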


Thanks for the insights, I’ll check the utilities and further developments on the async API.

Best regards

