I'm working with the Batch API to create a time series for S1. Grega shared a notebook with me containing the following evalscript (simplified):


function setup() {
  return {
    input: ["VV"],
    output: [
      { id: "VVdB", bands: 1, sampleType: SampleType.FLOAT32 },
    ],
    mosaicking: Mosaicking.ORBIT
  };
}

// Set the number of output bands to the number of available scenes (one band per orbit).
function updateOutput(outputs, collection) {
  Object.values(outputs).forEach((output) => {
    output.bands = collection.scenes.length;
  });
}

// Store the acquisition dates in the output metadata (userdata.json).
function updateOutputMetadata(scenes, inputMetadata, outputMetadata) {
  var dds = [];
  for (let i = 0; i < scenes.length; i++) {
    dds.push(scenes[i].date);
  }
  outputMetadata.userData = { "acquisition_dates": JSON.stringify(dds) };
}

// Convert linear backscatter to decibels.
function toDb(linear) {
  return 10 * Math.log(linear) / Math.LN10;
}

function evaluatePixel(samples) {
  var n_observations = samples.length;
  let array_vv = new Array(n_observations).fill(0);
  samples.forEach((sample, index) => {
    array_vv[index] = toDb(sample.VV);
  });
  return {
    VVdB: array_vv,
  };
}

How could I adjust this to use Mosaicking.SIMPLE and mosaic every ~6 days into a single image? (Basically to save on PUs.)


With the Process API and eolearn, I used time_difference to achieve this:


s1_input = SentinelHubEvalscriptTask(
    features=...,
    evalscript=...,
    data_collection=...,
    time_difference=datetime.timedelta(days=6),
)

To clarify: I want a time series/cube where each sample is a mosaic of six days of imagery.

Hi,

The answer to your question is a bit convoluted, but in principle your safest bet is to create several batch requests (one per time interval, e.g. six days) and set Mosaicking.SIMPLE.

The convoluted part is how much you actually benefit (particularly in terms of PUs). If a tile (from your batch grid) contains several (Sentinel-1) observations in the given (six-day) time window, Mosaicking.SIMPLE will, based on mosaickingOrder, retrieve data from the smallest possible number of observations needed to completely fill the tile (or it will stop if it runs out of input tiles in the given time interval). So if your time window is large enough to contain many observations, the savings in PUs will be large (one or a few observations are used instead of tens); if it only contains a few observations, the effect will be smaller, particularly for tiles on the border of an orbit, where the result has to be stitched from several observations.

That being said, creating several Batch requests with Mosaicking.SIMPLE should be rather simple; the following snippet creates a Process API request:

from sentinelhub import SentinelHubRequest, MimeType, DataCollection, BBox, CRS, bbox_to_dimensions
import matplotlib.pyplot as plt

evalscript = """
function setup() {
  return {
    input: ["VV"],
    output: [
      { id: "VVdB", bands: 1, sampleType: SampleType.FLOAT32 },
    ],
    mosaicking: Mosaicking.SIMPLE
  };
}

// Convert linear backscatter to decibels.
function toDb(linear) {
  return 10 * Math.log(linear) / Math.LN10;
}

function evaluatePixel(sample) {
  return {
    VVdB: [toDb(sample.VV)]
  };
}
"""

bbox = BBox([8.908281, 44.402331, 8.931627, 44.415666], crs=CRS.WGS84)

request = SentinelHubRequest(
    evalscript=evalscript,
    size=bbox_to_dimensions(bbox, 10),  # output dimensions at 10 m resolution
    input_data=[
        SentinelHubRequest.input_data(
            data_collection=DataCollection.SENTINEL1_IW,
            time_interval=('2021-08-01', '2021-08-15'),
            mosaicking_order='mostRecent'
        )
    ],
    responses=[
        SentinelHubRequest.output_response('VVdB', MimeType.TIFF)
    ],
    bbox=bbox
)

vv_db = request.get_data()[0]
plt.imshow(vv_db)
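For illustration, here is a minimal sketch (not a definitive recipe) of how the same request could be repeated over consecutive six-day windows and stacked into a cube. It reuses evalscript, bbox and the sentinelhub imports from the snippet above; the window length, the mostRecent ordering and the numpy stacking are assumptions carried over from this thread rather than anything built into the API:

import datetime as dt
import numpy as np

def six_day_intervals(start, end, step_days=6):
    """Yield (from, to) ISO date pairs covering [start, end) in consecutive windows."""
    current = start
    step = dt.timedelta(days=step_days)
    while current < end:
        yield current.isoformat(), min(current + step, end).isoformat()
        current += step

mosaics = []
interval_starts = []
for time_from, time_to in six_day_intervals(dt.date(2021, 8, 1), dt.date(2021, 9, 1)):
    req = SentinelHubRequest(
        evalscript=evalscript,
        size=bbox_to_dimensions(bbox, 10),
        input_data=[
            SentinelHubRequest.input_data(
                data_collection=DataCollection.SENTINEL1_IW,
                time_interval=(time_from, time_to),
                mosaicking_order='mostRecent'
            )
        ],
        responses=[SentinelHubRequest.output_response('VVdB', MimeType.TIFF)],
        bbox=bbox
    )
    mosaics.append(req.get_data()[0])  # one mosaicked image per six-day window
    interval_starts.append(time_from)

# Stack into a (time, height, width) cube; interval_starts records which window each slice covers.
cube = np.stack(mosaics, axis=0)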

Additionally, using defaultTilePath in the Batch request (see details) you could create a template for where your (several) batch processes will write their output, so that the output from several batch jobs ends up under the same prefix in an S3 bucket. For instance, a template like
s3://some-bucket/some/folder/<tileName>/2021-08-01_2021-08-07/<outputId>.<format> would output the results for each tile in its own folder based on the time interval (you'd of course have to change the template for each batch job).
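As a small sketch of that templating idea (the bucket name and folder layout are placeholders, and the resulting string would be supplied as the defaultTilePath of each batch request's output settings), the per-interval paths could be generated along these lines:

import datetime as dt

def tile_path_template(bucket, prefix, time_from, time_to):
    """Build a defaultTilePath template for one six-day batch request.
    <tileName>, <outputId> and <format> are placeholders resolved by the Batch API."""
    return f"s3://{bucket}/{prefix}/<tileName>/{time_from}_{time_to}/<outputId>.<format>"

start = dt.date(2021, 8, 1)
for week in range(4):
    time_from = start + dt.timedelta(days=6 * week)
    time_to = time_from + dt.timedelta(days=6)
    print(tile_path_template("some-bucket", "some/folder", time_from.isoformat(), time_to.isoformat()))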

I hope this gives you some info. As the topic is quite broad, don’t hesitate to ask additional questions.

Best regards


Thanks, that's great. I'll loop over the intervals as you suggest; I just wanted to first check whether there was a built-in mechanism to handle this more efficiently.

