
I have a question about how Planet is processing NDVI imagery.


I can download a 4- or 8-band image from Planet, process it in QGIS using the NDVI formula (NIR - Red) / (NIR + Red), and get a nice-looking NDVI.
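For comparison, this is roughly what my hand-processed step looks like outside of QGIS; a minimal sketch using rasterio and NumPy, assuming a local 4-band PlanetScope analytic GeoTIFF where band 3 is Red and band 4 is NIR (file names are placeholders):

```python
# Minimal sketch of the "hand processed" NDVI, assuming a 4-band
# PlanetScope analytic GeoTIFF (band 3 = Red, band 4 = NIR).
# File names are placeholders.
import numpy as np
import rasterio

with rasterio.open("planet_scene_analytic.tif") as src:
    red = src.read(3).astype("float32")   # band 3 = Red
    nir = src.read(4).astype("float32")   # band 4 = NIR
    profile = src.profile

# NDVI = (NIR - Red) / (NIR + Red), with zero-denominator pixels set to NaN
with np.errstate(divide="ignore", invalid="ignore"):
    ndvi = np.where((nir + red) == 0, np.nan, (nir - red) / (nir + red))

print("NDVI min/max:", float(np.nanmin(ndvi)), float(np.nanmax(ndvi)))

# Write a single-band 32-bit float raster that can be opened in QGIS
profile.update(count=1, dtype="float32")
with rasterio.open("ndvi_hand_processed.tif", "w", **profile) as dst:
    dst.write(ndvi.astype("float32"), 1)
```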

When I get the same image from the API and drag it into QGIS, the image has much lower values overall; it's about 70% darker. For example, in the hand-processed image the low and high values are .219 and .585, while in the same image pulled from the API the values are -.04 and .268.


In the subscription there is a line called bandmath: {"type": "bandmath", "parameters": {"b1": "(b4-b3)/(b4+b3)", "pixel_type": "32R"}}
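For reference, here is the same tool block written out cleanly; only the "type" and "parameters" values come from my subscription, and wrapping it in a "tools" list is my assumption about how the Subscriptions API nests it (shown here as a Python dict):

```python
# The bandmath tool from the subscription, cleaned up. Only "type" and
# "parameters" are taken from the subscription itself; putting it in a
# "tools" list is an assumption about the Subscriptions API request format.
bandmath_tool = {
    "type": "bandmath",
    "parameters": {
        "b1": "(b4-b3)/(b4+b3)",  # NDVI: b4 = NIR, b3 = Red
        "pixel_type": "32R",      # 32-bit float output
    },
}

tools = [bandmath_tool]
```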

 

It looks like bandmath is running the same NDVI formula I am: b4 is NIR and b3 is Red. How am I getting such different values between the two processes, and why are there negative values in every image I get? In my hand-processed images I have maybe once had a low of less than .12 (a -.04 maybe), and the high is never much above .90.


Any ideas what processing is going on with the Planet NDVI that's giving me these values?

Thanks!

@Nick Simonian there are many reasons why this could be happening. The image could be taken by a different satellite or under different lighting conditions, and QGIS also applies its own stretch to the imagery. With regard to negative values, those could be due to clouds or the presence of water.
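One way to rule out the QGIS display stretch is to compare the raw statistics of the two files outside of QGIS; a minimal sketch with rasterio, where the file paths are placeholders:

```python
# Compare raw raster statistics directly, independent of any display stretch.
# File paths are placeholders.
import numpy as np
import rasterio

def raster_stats(path, band=1):
    with rasterio.open(path) as src:
        data = src.read(band).astype("float32")
        nodata = src.nodata
    if nodata is not None:
        data = data[data != nodata]
    data = data[np.isfinite(data)]  # also drop NaN pixels
    return float(data.min()), float(data.max()), float(data.mean())

print("hand processed:", raster_stats("ndvi_hand_processed.tif"))
print("API bandmath:  ", raster_stats("ndvi_from_api.tif"))
```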


It's the same image.

We have an API that pulls a clipped image daily (or when available) for fields of our choice.

I pull the image off our server, download it, and open it in QGIS. It's already been processed by Planet, and I can see the grayscale values like I listed.

I can upload the same boundary in Planet Explorer, find the same image taken at the same time, download the 4- or 8-band image, and then process it in QGIS using the NDVI formula (which looks the same as in the dev code). That lets me see the raw grayscale output before I apply any color maps or anything like that.

The histograms look nearly identical and the image looks the same; it's just darker than the hand-processed image.

Planet seems to be applying some other form of correction, and I need to know what that is.

The difference between the hand-processed image and the Planet image is not linear; it can vary between matching images by more than 12% of the mean pixel value. The % change at the low end can be around 140%, and at the high end it can be 50% or more.
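For reference, this is roughly how I am computing those percentages; a sketch that assumes both NDVI rasters are clipped to the same footprint on the same pixel grid (file names are placeholders):

```python
# Per-pixel comparison of the two NDVI rasters, assuming they share the
# same footprint and pixel grid. File names are placeholders.
import numpy as np
import rasterio

with rasterio.open("ndvi_hand_processed.tif") as a, rasterio.open("ndvi_from_api.tif") as b:
    hand = a.read(1).astype("float32")
    api = b.read(1).astype("float32")

valid = np.isfinite(hand) & np.isfinite(api) & (np.abs(hand) > 1e-6)
diff = api[valid] - hand[valid]
pct = 100.0 * diff / hand[valid]

print("mean absolute difference:", float(np.abs(diff).mean()))
print("difference as % of mean NDVI:", float(100.0 * np.abs(diff).mean() / np.abs(hand[valid]).mean()))
print("per-pixel % change, 5th/95th percentiles:", np.percentile(pct, [5, 95]))
```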


Hi @Nick Simonian 


I spoke to @Pooja Pandey and she informed me that this would perhaps require a deeper investigation. We have created a support ticket on your behalf, copied in the email address you used for your Community account, and recommend that you send some examples with scene IDs.


Generally speaking, we do apply some post-processing techniques that could be the reason for the differences. The image quality reports explain this in depth here:
https://support.planet.com/hc/en-us/articles/360037649614-L2-Data-Quality-Reports-for-the-PlanetScope-Constellation


To other users facing this same issue: 

When this issue is resolved or answered, we will post an update here in the comments to hopefully help you in the future.

We will leave this post open for other users to comment.


Hi Elyhienrich, 

I couldn’t find any updates in this topic and was wondering if it has been solved.

Thank you!

