
Hi Community,
I feel there is a need to make the step from analyzing just a couple of EO images to analyzing hundreds or more for many applications. Even local projects involve many images. Building data cubes seems to be a good solution, so I would like to hear about your experience with (local) data cube environments or frameworks and Planet data!
Some exciting solutions are around, such as rasdaman, Open Data Cube, and gdalcubes. Google Earth Engine seems to work, but I don’t think I can get the code to deploy GEE on our own infrastructure. So far, however, I haven’t found a reading procedure for Planet data, whereas the usual (and boring 😜) Sentinel and Landsat data seem to be supported out of the box.
It would be very interesting to get inspiration.

Cheers 🤙,
Christoph

 

Hi Christoph

We have had some success processing large time series of Planet data using GEE and SEPAL (sepal.io). We order and download Planet daily data to GEE, then use the functionality of SEPAL to create best-pixel composites and time series for any number of classification tasks, CCDC, BFAST, etc.

I’m sure using Open Data Cube, etc. would also work quite well... but I haven’t actually tried that myself.

Best

Erik


Great discussion topic @Christoph Hütt - thanks for bringing it up. I’d love to hear more about what people are doing, and it sparks an idea for our Dev Rel team to better document the most popular solutions, perhaps adding something to our notebook collection.

One route that I’ve not yet tried, but which I believe should work, is to order Planet data with STAC metadata (ideally with the harmonization and co-registration tools), and then use odc-stac to load it into Open Data Cube. I think Planet’s STAC output has all the fields needed, and if it doesn’t now, it should soon, with the next STAC metadata update.

@Erik Lindquist - do you all do co-registration in your GEE data cube creation? There was a recent question about it in this forum.

Curious if you have any insight, as I know GEE less well.

best regards,

Chris


@cholmes - It’s a good question. I typically don’t do any co-registration within GEE because, by the time the imagery gets there, there is not much one can do via GEE (I don’t think). In my experience, the Planet imagery arrives more or less OK for time-series analysis. Some individual acquisitions can be mis-registered; those I sometimes have to find manually and remove from the time series.

We do use the on-order harmonization tools, which seem to work well. We also have GEE code to ‘histogram match’ daily Planet acquisitions to a custom-processed ‘basemap’ - e.g. BRDF-corrected Landsat. 
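This is not Erik's GEE code, but the 'histogram match' idea can be sketched in NumPy: remap a source band so its empirical cumulative distribution matches a reference band's, as one might do before comparing daily Planet scenes against a basemap.

```python
# Sketch only: CDF-based histogram matching of one band to a reference.
import numpy as np

def histogram_match(source, reference):
    """Return `source` remapped so its histogram matches `reference`'s."""
    src = np.asarray(source, dtype=float).ravel()
    ref = np.asarray(reference, dtype=float).ravel()
    # Unique source values, their positions in `src`, and their counts.
    src_vals, src_idx, src_counts = np.unique(
        src, return_inverse=True, return_counts=True
    )
    src_cdf = np.cumsum(src_counts) / src.size
    # Reference CDF over its sorted unique values.
    ref_vals, ref_counts = np.unique(ref, return_counts=True)
    ref_cdf = np.cumsum(ref_counts) / ref.size
    # Map each source CDF position to the reference value of equal rank.
    matched_vals = np.interp(src_cdf, ref_cdf, ref_vals)
    return matched_vals[src_idx].reshape(np.asarray(source).shape)
```

In GEE one would express the same remapping with reducers over image histograms; the math is identical.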

I think to do any sort of custom co-registration, one would have to do it outside of GEE then import to GEE as an image collection. I haven’t worried too much about it, however, instead relying on the data depth of daily Planet time-series and relatively robust algorithms to deal with any ‘noise’. Still looking into it all…

 

Erik


Christoph,

I’ll just add a little to what Erik and Chris indicated: besides working with the data locally, it is also important to note that there are already analysis-ready Planet data in Google Earth Engine (GEE), updated monthly. These are the mosaics that Planet is providing through the partnership with Norway’s International Climate and Forests Initiative (NICFI), and the data are already there in the form of “image collections”, which are essentially data cubes. More details on those image collections are provided here.

If you’re interested in seeing some basic GEE scripts that we have developed on the SERVIR side for using those mosaics, see: https://code.earthengine.google.com/?accept_repo=users/servirscience/planet_nicfi. Access to the mosaics in GEE provides the benefits of the cloud, i.e., being able to process the data without having to download anything.

Emil

