
I encountered a very strange issue (error) with the Stats API.


If the time range extends up to the current UTC time, the API returns NaN for all statistical values in the time series:


Current UTC time: "2023-06-23T00:57:10Z"


"timeRange": {
  "to": "2023-06-23T00:57:10Z",
  "from": "2023-06-09T00:57:10Z"
}

→ The result is all NaNs for the entire range of returned statistics.


However, if I move it back to yesterday's UTC time by subtracting a few hours (depending on the time), values are returned normally for the entire time series.


{
  "timeRange": {
    "to": "2023-06-22T23:01:50Z",
    "from": "2023-06-08T23:01:50Z"
  }
}

What is going on behind the scenes here, and is this a bug that will be fixed? How can I get the most recent data if I cannot use the current UTC time and need yesterday's date instead?
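In code, the workaround I am using boils down to the following sketch: clamp the end of the time range to the last second of the previous UTC day. The helper name is mine, not part of any API.

```python
from datetime import datetime, timedelta, timezone

def end_of_yesterday_utc(now=None):
    """Return the last second of the previous UTC day, e.g. 2023-06-22T23:59:59Z."""
    now = now or datetime.now(timezone.utc)
    yesterday = now.date() - timedelta(days=1)
    return datetime(yesterday.year, yesterday.month, yesterday.day,
                    23, 59, 59, tzinfo=timezone.utc)

# Example with the timestamp from this post:
now = datetime(2023, 6, 23, 0, 57, 10, tzinfo=timezone.utc)
print(end_of_yesterday_utc(now).strftime("%Y-%m-%dT%H:%M:%SZ"))
# 2023-06-22T23:59:59Z
```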


Thanks.

I'd like to add a further case. Note that my aggregationInterval is P1D, i.e. one day.


If I simply subtract 24 hours from the current UTC time, it does not work (returns NaNs):


"timeRange": {
  "to": "2023-06-22T05:18:43Z",
  "from": "2023-06-08T05:18:43Z"
}

If I set the to field to yesterday at midnight but don't change the from time, it does not work (returns NaNs):


"timeRange": {
  "to": "2023-06-22T23:59:59Z",
  "from": "2023-06-08T05:10:19Z"
}

If I set both the to and from fields to exactly midnight yesterday, it finally works:


"timeRange": {
  "to": "2023-06-22T23:59:59Z",
  "from": "2023-06-12T23:59:59Z"
}

It seems as though:



  1. to cannot be today; it must be yesterday at midnight.

  2. You have to set exactly the same timestamp for the to and from fields.

Why?
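If those two rules hold, a timeRange that satisfies both can be built like this. This is only a sketch of my observations, not documented API behaviour, and the helper name is mine.

```python
from datetime import datetime, timedelta, timezone

def aligned_time_range(days, now=None):
    """Build a timeRange whose `to` is yesterday at 23:59:59 UTC and whose
    `from` carries the exact same wall-clock time `days` days earlier."""
    now = now or datetime.now(timezone.utc)
    to = datetime.combine(now.date() - timedelta(days=1),
                          datetime.min.time(),
                          timezone.utc).replace(hour=23, minute=59, second=59)
    frm = to - timedelta(days=days)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return {"to": to.strftime(fmt), "from": frm.strftime(fmt)}

print(aligned_time_range(10, datetime(2023, 6, 23, 0, 57, 10, tzinfo=timezone.utc)))
# {'to': '2023-06-22T23:59:59Z', 'from': '2023-06-12T23:59:59Z'}
```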


Can you perhaps add a full request example (without the authentication token, obviously) so that we can debug it? It is much easier to look at it that way.


Sure! Thanks Grega.


This returns NaNs:


"stats_request": {
  "input": {
    "data": [
      {
        "dataFilter": {
          "mosaickingOrder": "mostRecent"
        },
        "type": "sentinel-2-l2a"
      }
    ],
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    }
  },
  "aggregation": {
    "timeRange": {
      "to": "2023-06-22T06:02:10Z",
      "from": "2023-06-12T06:02:10Z"
    },
    "evalscript": "\n//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"CLM\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n }\n}\nfunction evaluatePixel(samples) {\n return {\n data: [samples.CLM],\n dataMask: [samples.dataMask]\n }\n}\n",
    "aggregationInterval": {
      "of": "P1D"
    }
  }
}

This returns the data:


"stats_request": {
  "input": {
    "data": [
      {
        "dataFilter": {
          "mosaickingOrder": "mostRecent"
        },
        "type": "sentinel-2-l2a"
      }
    ],
    "bounds": {
      "bbox": [
        12.44693,
        41.870072,
        12.541001,
        41.917096
      ]
    }
  },
  "aggregation": {
    "timeRange": {
      "to": "2023-06-22T23:59:59Z",
      "from": "2023-06-12T23:59:59Z"
    },
    "evalscript": "\n//VERSION=3\nfunction setup() {\n return {\n input: [{\n bands: [\n \"CLM\",\n \"dataMask\"\n ]\n }],\n output: [\n {\n id: \"data\",\n bands: 1\n },\n {\n id: \"dataMask\",\n bands: 1\n }]\n }\n}\nfunction evaluatePixel(samples) {\n return {\n data: [samples.CLM],\n dataMask: [samples.dataMask]\n }\n}\n",
    "aggregationInterval": {
      "of": "P1D"
    }
  }
}
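In case it helps reproduce this as a script, here is the working request expressed in Python. The payload is taken from the request above; the endpoint URL and the requests-based helper are my assumptions, not something confirmed in this thread.

```python
# The evalscript below is the same one as in the request above, just unescaped.
EVALSCRIPT = """
//VERSION=3
function setup() {
  return {
    input: [{ bands: ["CLM", "dataMask"] }],
    output: [
      { id: "data", bands: 1 },
      { id: "dataMask", bands: 1 }
    ]
  }
}
function evaluatePixel(samples) {
  return {
    data: [samples.CLM],
    dataMask: [samples.dataMask]
  }
}
"""

# Payload mirroring the working request above.
payload = {
    "input": {
        "data": [
            {"dataFilter": {"mosaickingOrder": "mostRecent"},
             "type": "sentinel-2-l2a"}
        ],
        "bounds": {"bbox": [12.44693, 41.870072, 12.541001, 41.917096]},
    },
    "aggregation": {
        "timeRange": {"to": "2023-06-22T23:59:59Z",
                      "from": "2023-06-12T23:59:59Z"},
        "evalscript": EVALSCRIPT,
        "aggregationInterval": {"of": "P1D"},
    },
}

def fetch_statistics(token,
                     url="https://services.sentinel-hub.com/api/v1/statistics"):
    """POST the payload with an OAuth bearer token and return the parsed JSON.
    The URL is my assumption about the Statistical API endpoint."""
    import requests  # third-party; imported lazily so the payload is usable without it
    resp = requests.post(url, json=payload,
                         headers={"Authorization": f"Bearer {token}"})
    resp.raise_for_status()
    return resp.json()
```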

The only difference is the date fields.


My best guess is that because the aggregation interval is one day, the time range must span more than one full interval or the request fails (presumably this holds for any interval length). So, assuming we want to search only the last day, the from field must be the end of yesterday (one day back, at 23:59:59) and the to field must be the beginning of tomorrow (one day ahead, at 00:00:00, even though it is in the future), as follows:


Current UTC time: "2023-06-27T10:00:00Z"

"timeRange": {
  "to": "2023-06-28T00:00:00Z",
  "from": "2023-06-26T23:59:59Z"
},
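Under that guess, the bounds for a last-day query could be computed like this. Again, this is illustrative only and follows the observed pattern, not documented behaviour.

```python
from datetime import datetime, timedelta, timezone

def last_day_range(now=None):
    """from = end of the previous UTC day, to = start of the next UTC day,
    so the window is strictly wider than the P1D aggregation interval."""
    now = now or datetime.now(timezone.utc)
    frm = datetime.combine(now.date() - timedelta(days=1),
                           datetime.min.time(),
                           timezone.utc).replace(hour=23, minute=59, second=59)
    to = datetime.combine(now.date() + timedelta(days=1),
                          datetime.min.time(), timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    return {"to": to.strftime(fmt), "from": frm.strftime(fmt)}

print(last_day_range(datetime(2023, 6, 27, 10, 0, 0, tzinfo=timezone.utc)))
# {'to': '2023-06-28T00:00:00Z', 'from': '2023-06-26T23:59:59Z'}
```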

It would be great if the API could automatically handle arbitrary input dates smaller or larger than the aggregation period rather than returning NaNs, or at least throw a date-aggregation error so the user knows their timeRange field is incorrect.


Hi Mark, apologies for not updating the thread. We are looking into it, and when we have found a fix we will update the thread here. All the best.


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.

