
Hi everyone,

Project snapshot

  • AOI: Costa Rica (~51,000 km²)

  • Plots: 14,000 (1 ha each)

  • Dates to inspect: Jan-2021, Jan-2022, Jan-2023, Jan-2024

  • Tools: Collect Earth Desktop + QGIS

  • Team: 5 interpreters working in parallel

  • Current plan: Insights Platform – Basic (70,000 PU/month, 500 PU/min)

Issue

  • With Visual-tile streaming (1 PU/tile) we exhausted the quota in a week.

  • The usage dashboard has shown 0 PU and only 9 requests for several days (possibly a burst-triggered block).

  • We need to restart work and pick a workflow that won’t hit the limit again.

Options I’m considering

  1. Stay with streaming but upgrade temporarily (Enterprise S or L) and add a caching proxy.

  2. Download all Visual Quads (Basemap Global) and serve them via MapProxy/GeoServer; work offline.

  3. Use NICFI quads (1,500 PU/quad) only in forest zones and Basemap Global elsewhere.

  4. Push quads to GEE and let interpreters view via Earth Engine.

Key questions

  • What workflow do others use when multiple analysts review large visual samples?

  • Any best practices to stay within the 500 PU/min burst when 5 people pan/zoom simultaneously?

  • For Basemap Global, can anyone confirm ~300 PU per quad? (It’s not in public docs.)

  • Is it smarter to take Enterprise S for one month and downgrade back to Basic, or buy top-ups on Basic?

  • If you’ve combined Collect Earth with quads served from MapProxy, any performance tips?

Any guidance, sample scripts, or lessons learned would be greatly appreciated.
Thanks in advance!

If you’d like to download TFO/NICFI quads, you’d need to use the Basemaps API.
You cannot access NICFI basemap quads via Sentinel Hub at all (you can access visual previews, but not SR data and not the underlying quads).

Also note that if you have TFO (aka NICFI) access, you should be able to use the data in Google Earth Engine directly with no PU charges. There’s no need to “push” things to GEE, as the data is already accessible there. See https://docs.planet.com/platform/integrations/google-earth-engine/tfo-gee/
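
If your Earth Engine account has been approved under TFO/NICFI, the basemaps show up as ImageCollections you can filter directly. Here's a minimal sketch with the Earth Engine Python API; the asset path is the one the NICFI Americas collection is published under, but confirm it (and the available bands) against the docs link above:

```python
import ee

ee.Initialize()

# NICFI/TFO monthly basemaps for the Americas; your GEE account must be
# approved under the TFO program for this collection to resolve.
nicfi = ee.ImageCollection("projects/planet-nicfi/assets/basemaps/americas")

# Approximate Costa Rica bounding box (lon_min, lat_min, lon_max, lat_max).
costa_rica = ee.Geometry.Rectangle([-85.95, 8.03, -82.55, 11.22])

# Grab the January 2023 monthly mosaic and clip it to the AOI.
jan_2023 = nicfi.filterDate("2023-01-01", "2023-02-01").first().clip(costa_rica)

print(jan_2023.bandNames().getInfo())  # inspect the available bands
```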

Streaming via the Tile Services API is a good option for this, though it does consume tile views (currently not PUs), and you can exhaust your quota if multiple users share one account. However, streaming lets you pull only the regions you actually want to work with, and work at multiple scales.
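
For interactive panning/zooming in QGIS (or behind a MapProxy cache), each interpreter can load the mosaic as an XYZ tile layer instead of re-fetching quads. A rough sketch of building the URL template follows; the mosaic name is a hypothetical example, so list the mosaics your plan exposes via the Basemaps API first and double-check the tile endpoint against the Tile Services docs:

```python
# Build an XYZ tile URL template that QGIS ("Add XYZ Tiles...") or MapProxy can
# consume. Replace the key and the (hypothetical) mosaic name with your own.
API_KEY = "YOUR_PLANET_API_KEY"
mosaic_name = "planet_medres_visual_2023-01_mosaic"

xyz_template = (
    "https://tiles.planet.com/basemaps/v1/planet-tiles/"
    f"{mosaic_name}/gmap/{{z}}/{{x}}/{{y}}.png?api_key={API_KEY}"
)
print(xyz_template)
```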

Streaming is especially useful when you’re accessing the data programmatically, because you can stream full bit-depth values very efficiently. GDAL, rasterio, QGIS, etc. let you use streaming protocols to treat each mosaic as if it were a single local file and read the same values as the downloaded data, efficiently and at multiple spatial scales. For example, see https://github.com/planetlabs/notebooks/blob/master/jupyter-notebooks/Basemaps-API/streaming.ipynb (there’s an example of working with NICFI/TFO data specifically towards the end). Streaming does consume tile views, but it enables efficient analysis (and interactivity) over much larger regions than is feasible with downloaded data.
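
As a rough illustration of what that looks like in practice: rasterio hands an HTTP URL to GDAL’s /vsicurl driver and fetches only the byte ranges needed for each windowed read. The quad URL below is a placeholder for the download link a Basemaps API quad response gives you:

```python
import rasterio
from rasterio.windows import Window

# Placeholder: use the "_links" -> "download" URL from a quad returned by the
# Basemaps API. rasterio/GDAL stream it over HTTP range requests, so only the
# blocks covering the requested window are transferred.
quad_url = (
    "https://api.planet.com/basemaps/v1/mosaics/<mosaic_id>"
    "/quads/<quad_id>/full?api_key=YOUR_KEY"
)

with rasterio.open(quad_url) as src:
    # Read a 512x512 window at native resolution without downloading the
    # whole quad; the values match the downloaded GeoTIFF exactly.
    chunk = src.read(window=Window(col_off=0, row_off=0, width=512, height=512))
    print(src.crs, src.res, chunk.shape, chunk.dtype)
```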

With that said, if you want to minimize quota usage and are willing to host your own data, then download may make more sense. There’s an introduction to the basemaps API (and an example client) here: https://github.com/planetlabs/notebooks/blob/master/jupyter-notebooks/Basemaps-API/basemaps_api_introduction.ipynb
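
For reference, here’s a minimal sketch of that download workflow with plain requests: resolve the mosaic by name, page through the quads intersecting an AOI bounding box, and collect their download links. The mosaic name below is a hypothetical example, and the paging and field names should be checked against the notebook above:

```python
import requests

API_KEY = "YOUR_PLANET_API_KEY"
session = requests.Session()
session.auth = (API_KEY, "")  # Basemaps API: basic auth, API key as username

# Hypothetical mosaic name; GET /mosaics without filters to see what your plan includes.
mosaic_name = "planet_medres_visual_2023-01_mosaic"
resp = session.get(
    "https://api.planet.com/basemaps/v1/mosaics",
    params={"name__is": mosaic_name},
)
resp.raise_for_status()
mosaic_id = resp.json()["mosaics"][0]["id"]

# Approximate Costa Rica bounding box (lon_min, lat_min, lon_max, lat_max).
bbox = "-85.95,8.03,-82.55,11.22"
url = f"https://api.planet.com/basemaps/v1/mosaics/{mosaic_id}/quads"
params = {"bbox": bbox, "minimal": "true"}

# Walk the paginated quad listing and print each quad's download link.
while url:
    page = session.get(url, params=params)
    page.raise_for_status()
    data = page.json()
    for quad in data["items"]:
        print(quad["id"], quad["_links"]["download"])
    url = data.get("_links", {}).get("_next")
    params = None  # the _next link already embeds the query parameters
```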

