
I am using Sentinel Hub's Python SDK to download large-scale Sentinel-2/Sentinel-1 imagery. This works well, but we still have some issues at the edges of Sentinel orbits, since we are downloading imagery for specific dates as ~5-6 km tiles.


The Python request looks something like this:


image_request = WcsRequest(
    layer='L2A10_ORBIT',
    bbox=box,
    time=dates,
    image_format=MimeType.TIFF,
    maxcc=1.,
    resx='10m',
    resy='10m',
    config=api_key,
    custom_url_params={
        constants.CustomUrlParam.DOWNSAMPLING: 'BICUBIC',
        constants.CustomUrlParam.UPSAMPLING: 'BICUBIC',
    },
    time_difference=datetime.timedelta(hours=48),
)

Setting time_difference to 48 hours typically allows Sentinel Hub to collect images from both orbits, rather than returning a single orbit with no data outside of its bounds.
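As a sanity check, you can list the acquisition dates the request resolves to after the time_difference merging. This is a minimal sketch, assuming the image_request object defined above:

# Minimal check, assuming the `image_request` defined above.
# get_dates() returns the acquisition datetimes the request will use,
# after merging acquisitions closer together than time_difference.
for acquisition_date in image_request.get_dates():
    print(acquisition_date.isoformat())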


Here is my setup function in the evalscript, where I have set mosaicking to orbit:


function setup() {
  return {
    input: [{
      bands: [
        "B02",
        "B03",
        "B04",
        "B08",
        "dataMask",
      ],
    }],
    output: {
      bands: 4,
      sampleType: "UINT16",
    },
    mosaicking: "ORBIT",
  };
}

This works insofar as all pixels have data, but the images appear to be superimposed, leaving artifacts. Is it possible to average them? (Or is there perhaps a better way to combine them?) I have tried following the example here, but with no success: https://docs.sentinel-hub.com/api/latest/user-guides/metadata/, e.g. this evaluatePixel only returns 0.



function evaluatePixel(samples, scenes, inputMetadata, customData, outputMetadata) {
  // Average value of each band over the valid (dataMask == 1) samples
  var sumOfValidSamplesB02 = 0
  var sumOfValidSamplesB03 = 0
  var sumOfValidSamplesB04 = 0
  var sumOfValidSamplesB08 = 0
  var numberOfValidSamples = 0
  var factor = 65535
  for (var i = 0; i < samples.length; i++) {
    var sample = samples[i]
    if (sample.dataMask == 1) {
      sumOfValidSamplesB02 += sample.B02
      sumOfValidSamplesB03 += sample.B03
      sumOfValidSamplesB04 += sample.B04
      sumOfValidSamplesB08 += sample.B08
      numberOfValidSamples += 1
    }
  }
  return [(sumOfValidSamplesB02 / numberOfValidSamples) * factor,
          (sumOfValidSamplesB03 / numberOfValidSamples) * factor,
          (sumOfValidSamplesB04 / numberOfValidSamples) * factor,
          (sumOfValidSamplesB08 / numberOfValidSamples) * factor]
}

function updateOutputMetadata(scenes, inputMetadata, outputMetadata) {
  outputMetadata.userData = {
    "inputMetadata": inputMetadata
  }
  outputMetadata.userData["orbits"] = scenes.orbits
}

Solved this! By taking the minimum of each sample rather than the mean.
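For anyone landing here later, a per-band minimum composite over the valid samples would look roughly like the sketch below. This is a minimal illustration of that idea, not the exact script from the original post; it assumes the same setup() with ORBIT mosaicking shown above.

function evaluatePixel(samples) {
  // Keep the per-band minimum across all samples that have data (dataMask == 1).
  var factor = 65535
  var minB02 = Infinity, minB03 = Infinity, minB04 = Infinity, minB08 = Infinity
  var hasValidSample = false
  for (var i = 0; i < samples.length; i++) {
    var sample = samples[i]
    if (sample.dataMask == 1) {
      minB02 = Math.min(minB02, sample.B02)
      minB03 = Math.min(minB03, sample.B03)
      minB04 = Math.min(minB04, sample.B04)
      minB08 = Math.min(minB08, sample.B08)
      hasValidSample = true
    }
  }
  // If no sample had data, return zeros instead of Infinity.
  if (!hasValidSample) {
    return [0, 0, 0, 0]
  }
  return [minB02 * factor, minB03 * factor, minB04 * factor, minB08 * factor]
}

Taking the minimum keeps, for each pixel, the darker of the overlapping acquisitions, which presumably suppresses the bright superimposed artifacts at the orbit edges.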


Great! Happy that you solved it, John! Thanks for sharing how you solved it with the community! 👍 👏

