
Hello folks, I’m new to the Statistical API and to Python/sentinelhub in general. What I’m trying to do is get daily NDVI (or something similar; weekly would be great too) for seven research sites. The shapefile I load with geopandas contains the seven separate areas, each ~4 ha. Four of the sites are quite close together, and the other three are grouped about 200 km away.


I would use this to identify when peak NDVI occurs at each area, and to correlate it with Eddy Covariance towers. I should note that I am on a 30-day trial, and I don’t know whether that will affect this request.


Here is the error that I am getting:



DownloadFailedException: Failed to download from:
https://services.sentinel-hub.com/api/v1/statistics
with HTTPError:
400 Client Error: Bad Request for url: https://services.sentinel-hub.com/api/v1/statistics
Server response: "{"error":{"status":400,"reason":"Bad Request","message":"Processing error.","code":"COMMON_EXCEPTION"}}"

Here is the code for my request:


polygons_gdf = gpd.read_file('Faro_NDVI/shps/SRKFMR_All_Towers_Buffer.shp')
polygons_gdf

yearly_time_interval = '2020-01-01', '2020-12-31'

ndvi_evalscript = """
//VERSION=3

function setup() {
  return {
    input: [
      {
        bands: [
          "B04",
          "B08",
          "dataMask"
        ]
      }
    ],
    output: [
      {
        id: "ndvi",
        bands: 1
      },
      {
        id: "dataMask",
        bands: 1
      }
    ]
  };
}

function evaluatePixel(samples) {
  return {
    ndvi: [index(samples.B08, samples.B04)],
    dataMask: [samples.dataMask]
  };
}
"""

aggregation = SentinelHubStatistical.aggregation(
    evalscript=ndvi_evalscript,
    time_interval=yearly_time_interval,
    aggregation_interval='P1D',
    resolution=(10, 10)
)

input_data = SentinelHubStatistical.input_data(
    DataCollection.SENTINEL2_L2A
)

histogram_calculations = {
    "ndvi": {
        "histograms": {
            "default": {
                "nBins": 20,
                "lowEdge": -1.0,
                "highEdge": 1.0
            }
        }
    }
}

ndvi_requests = []

for geo_shape in polygons_gdf.geometry.values:
    request = SentinelHubStatistical(
        aggregation=aggregation,
        input_data=[input_data],
        geometry=Geometry(geo_shape, crs=CRS(polygons_gdf.crs)),
        calculations=histogram_calculations,
        config=config
    )
    ndvi_requests.append(request)



Please let me know if any additional information is needed to piece together what’s going wrong.


Thank you so much!!


Jeff

Hi Jeff,


Your request looks correct and is suitable for obtaining daily NDVI statistics.


A possible cause of the error could be that you are using a resolution that is too small or too large. Note that the units of the resolution you specify in the request are defined by the CRS of the polygons.


If the polygons are in EPSG:4326, for example, then the resolution should be in degrees:


resolution=(0.0001, 0.0001)


If that doesn’t work, try experimenting with smaller resolutions, and let me know the result. A quick check like the sketch below can help pick matching units.
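
For instance, something like this (a sketch, assuming a recent geopandas where .crs is a pyproj CRS object) can help choose resolution units that match the CRS of the polygons:


# Sketch: choose resolution units based on the polygons' CRS.
# Assumes geopandas >= 0.8, where polygons_gdf.crs is a pyproj CRS.
if polygons_gdf.crs.is_geographic:
    resolution = (0.0001, 0.0001)  # degrees; roughly 10 m near the equator
else:
    resolution = (10, 10)          # projected CRS, typically metres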


You mentioned that you would like to get weekly statistics as well. To achieve that, set the mosaicking parameter to mosaicking: "ORBIT" in the evalscript and use aggregation_interval='P7D'. I have provided an example evalscript below:


ndvi_evalscript = """
//VERSION=3
function setup() {
  return {
    input: [{
      bands: [
        "B04",
        "B08",
        "dataMask"
      ]
    }],
    mosaicking: "ORBIT",
    output: [
      {
        id: "data",
        bands: ["weekly_max_ndvi"]
      },
      {
        id: "dataMask",
        bands: 1
      }
    ]
  };
}

function evaluatePixel(samples) {
  var max = 0;
  var hasData = 0;
  for (var i = 0; i < samples.length; i++) {
    if (samples[i].dataMask == 1 && samples[i].B04 + samples[i].B08 != 0) {
      hasData = 1;
      var ndvi = (samples[i].B08 - samples[i].B04) / (samples[i].B08 + samples[i].B04);
      max = ndvi > max ? ndvi : max;
    }
  }

  return {
    data: [max],
    dataMask: [hasData]
  };
}
"""

You can find more examples here.


All the best


Hi Dorothy,


Thanks so much! Unfortunately, I couldn’t get that to work. I tried resolutions in multiples of 10, from 100 down to 0.000001, and got the same response each time. For a little more context on what I might be doing wrong, here’s the rest of my code:


%matplotlib inline

import json
import datetime as dt
from collections import defaultdict

import pandas as pd
import geopandas as gpd
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns

from sentinelhub import (
    SentinelHubStatistical, DataCollection, CRS, BBox, bbox_to_dimensions,
    Geometry, SHConfig, parse_time, parse_time_interval,
    SentinelHubStatisticalDownloadClient, SentinelHubDownloadClient
)

# Load the tower-area polygons and re-save them as GeoJSON
tower_area = gpd.read_file('Faro_NDVI/shps/SRKFMR_All_Towers_Buffer_26908_s.shp')
tower_area.to_file("tower_area.geojson", driver="GeoJSON")

df = tower_area
print(df)

tower_sites = df['Site'].to_list()
print(tower_sites)

from sentinelhub import SHConfig

sh_client_id = "-----removed----"

sh_client_secret = "-----removed----"

config = SHConfig()

config.sh_client_id = sh_client_id

config.sh_client_secret = sh_client_secret

if not config.sh_client_id or not config.sh_client_secret:
    print("Warning! To use Statistical API, please provide the "
          "credentials (OAuth client ID and client secret).")

polygons_gdf = gpd.read_file("tower_area.geojson")

yearly_time_interval = '2020-01-01', '2020-12-31'

ndvi_evalscript = """
//VERSION=3

function setup() {
  return {
    input: [
      {
        bands: [
          "B04",
          "B08",
          "dataMask"
        ]
      }
    ],
    output: [
      {
        id: "ndvi",
        bands: 1
      },
      {
        id: "dataMask",
        bands: 1
      }
    ]
  };
}

function evaluatePixel(samples) {
  return {
    ndvi: [index(samples.B08, samples.B04)],
    dataMask: [samples.dataMask]
  };
}
"""

aggregation = SentinelHubStatistical.aggregation(
    evalscript=ndvi_evalscript,
    time_interval=yearly_time_interval,
    aggregation_interval='P1D',
    resolution=(10, 10)
)

input_data = SentinelHubStatistical.input_data(
    DataCollection.SENTINEL2_L2A
)

histogram_calculations = {
    "ndvi": {
        "histograms": {
            "default": {
                "nBins": 20,
                "lowEdge": -1.0,
                "highEdge": 1.0
            }
        }
    }
}

ndvi_requests = []

for geo_shape in polygons_gdf.geometry.values:
    request = SentinelHubStatistical(
        aggregation=aggregation,
        input_data=[input_data],
        geometry=Geometry(geo_shape, crs=CRS(polygons_gdf.crs)),
        calculations=histogram_calculations,
        config=config
    )
    ndvi_requests.append(request)


%%time

download_requests = [ndvi_request.download_list[0] for ndvi_request in ndvi_requests]

client = SentinelHubStatisticalDownloadClient(config=config)

ndvi_stats = client.download(download_requests)

len(ndvi_stats)

The output of the GeoJSON dataframe is:


 Site                                           geometry
0 WCF POLYGON ((502514.000 6717980.000, 502522.838 6...
1 WCCF POLYGON ((495570.631 6716361.003, 495579.955 6...
2 WCP POLYGON ((489562.176 6712221.574, 489571.500 6...
3 GSC POLYGON ((592174.365 6904126.756, 592183.698 6...
4 FSR POLYGON ((586189.174 6910720.770, 586198.507 6...
5 GHB POLYGON ((582068.475 6914633.482, 582077.807 6...
6 WCB POLYGON ((489175.258 6709630.742, 489184.586 6...
['WCF', 'WCCF', 'WCP', 'GSC', 'FSR', 'GHB', 'WCB']
