Frequently Asked Questions


General

This answer summarizes and consolidates the discussion in this forum thread. We will update it based on users' feedback so that it remains a consistent set of information.

At the end of June we will set the S2 L1C bucket to "Requester Pays", as is already the case for S2 L2A and S1 GRD. This change is required to allow a further increase of the shared data: continuation of Sentinel-2 L1C coverage, the global rollout of Sentinel-2 L2A, and hosting of European and later global Sentinel-2 analysis-ready mosaics.

How might this affect your workflow:

  1. If you are using the data within the AWS EU (Frankfurt) region using the S3 protocol, there will be practically no change. A very small additional charge will be incurred to your account for "GET, SELECT and all other Requests", currently at $0.43 per 1 million requests.
  2. If you are using the data within any other AWS region using the S3 protocol, a small data transfer charge will be incurred to your account ("Data transfer OUT from Amazon EC2 To Another AWS region", currently at $0.02 per GB). You can avoid this by moving part (or all) of your processing to the AWS EU (Frankfurt) region.
  3. If you are using the data within AWS using the HTTP protocol, you will need to sign your requests (more info here). You will find examples of how to do this here. We will also upgrade our sentinelhub-py Python library to work with S3 data soon (by mid-May at the latest) to make this easier for you.
  4. If you are using the data outside of AWS, you should consider setting up an AWS instance within the EU (Frankfurt) region. There is a free tier available (https://aws.amazon.com/free/) and you might be eligible for AWS research credits (https://aws.amazon.com/earth/research-credits/). Alternatively, see this thread on how to access the data directly over the Internet. Note that there will be a small charge for this as well ("Data transfer OUT from Amazon EC2 To Internet", currently at $0.05-0.09 per GB).

Note that whichever category you fall into above, you will need to make changes to your code to continue to access Sentinel-2 data on AWS. In most cases, it’s as simple as adding a flag to your object request.
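For illustration, the "flag" mentioned above is the requester-pays marker on each object request. A minimal sketch of the request parameters, assuming the AWS SDK and the sentinel-s2-l1c bucket (the tile key here is only an example):

```javascript
// Sketch of the parameters for fetching one band from a requester-pays bucket.
// With the AWS SDK you would pass these to an S3 GetObject call; the key line
// is RequestPayer, without which requests to a "Requester Pays" bucket are
// rejected with 403 Access Denied.
const params = {
  Bucket: 'sentinel-s2-l1c',
  Key: 'tiles/10/S/DG/2015/12/7/0/B04.jp2',
  RequestPayer: 'requester', // the flag that opts you into paying request costs
};
console.log(params.RequestPayer);
```

Check the documentation of the SDK version you use for the exact parameter name in your language binding.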

As there are several web applications out there making use of Sentinel meta-data, we have decided to make some of these data available over HTTP permanently:

  • L1C and L2A tiles
    • readme.html
    • preview.jpg
    • tileInfo.json
    • productInfo.json
    • metadata.xml
  • L1C and L2A products
    • readme.html
    • metadata.xml
    • productInfo.json

In case you believe we should make some other meta-data available in a similar manner, please describe the need in the thread below.

We understand that this change might be an inconvenience to some users. However, the goal of this experiment is to discover how best to stage data for analysis and processing in the cloud. When the Sentinel Public Dataset was established two years ago, getting hold of Sentinel imagery was quite a challenge. It was distributed in unwieldy chunks, Copernicus SciHub had trouble managing the demand, and there was no other place to get it. At that time, we made a decision to make the data available as easily and as openly as possible. Things are changing now, with the collaborative ground segment running, four DIAS-es coming in a few months, and data generally being more easily accessible. So we believe the time is right to go back to the original purpose: experimenting with how best to stage data for analysis in the cloud.

Best Regards,
Sinergise (Sentinel custodian at AWS Public Datasets)

 

You can find a quick tutorial on Sentinel Hub here.

The Sentinel Hub forum is the best place to get in touch with support staff as well as other Sentinel Hub users.

We put a lot of effort into making data available as soon as possible. However, we depend almost entirely on how fast the data provider can process and disseminate the data.

For Sentinel products, ESA usually makes them available on their OpenHub within 6-12 hours (the target timing is 3-6 hours; sometimes it takes longer, even more than one day). We pull the data down immediately, but as it takes some time to download complete products, it might take up to one additional hour before they are available on Sentinel Hub.

In the 2017/11 major release we changed the way styles work; see the full description of changes.

In the previous version, styles were split into three groups:

  1. Visualized, Raw (for Custom scripts)
  2. Color map, Grayscale, Blue/Red, Green White Linear, Red Temperature, Index (for single values / single input band)
  3. RGB, RGB Advanced, Reflectance, Sentinel DN (for three-color visualization, such as True colour, False colour, etc.)

In the 2017/11 version, scripts basically consist of two parts: setup and evaluatePixel.
The setup part defines the input and output components.
 

function setup(ds) {
    setInputComponents([ds.B02, ds.B03, ds.B04]);
    setOutputComponentCount(3);
}

The evaluatePixel part is the main difference between the old version and 2017/11. In 2017/11, different Visualizers can be used, and each Visualizer has its own set of parameters.

  • ColorGradientVisualizer (valColPairs, minVal=0.0, maxVal=1.0)
  • ColormapVisualizer (valColPairs)
  • DefaultVisualizer (minValue=0.0, maxValue=1.0, gain=1.0, offset=0.0, gamma=1.0)
  • HighlightCompressVisualizer (minValue=0.0, maxValue=1.0, gain=1.0, offset=0.0, gamma=1.0)
  • HighlightCompressVisualizerSingle (minValue=0.0, maxValue=1.0, gain=1.0, offset=0.0, gamma=1.0)
  • Identity (val)

The example evaluatePixel function below creates a Visualizer in which values are spanned between a minimum and a maximum value. Bands B04, B03 and B02 are used as data input. Each band value is then processed with the specified Visualizer, and the function returns an array of the resulting values.

let minVal = 0;
let maxVal = 255;
let gain = 0.7;
let viz = new HighlightCompressVisualizer(minVal, maxVal, gain);

function evaluatePixel(samples) {
    let val = [samples[0].B04, samples[0].B03, samples[0].B02];
    return val.map(v => viz.process(v));
}

Each of the old style groups 1, 2 or 3 mentioned above can use any Visualizer and returns its resulting values as an array. The array size corresponds to the Visualizer used and the number of input bands.

The general process for converting old-style eval scripts to new-style eval scripts can be summarized by the following examples. The examples do not use Visualizers.

Single input band / single output band

function evaluatePixel(s) {
    // the eval function takes inputs as defined in setInputComponents([ds.B07]),
    // so s[0].B07 accesses band B07
    let val = s[0].B07;
    return [val * 5 + 120];
}

function setup(ds) {
    setInputComponents([ds.B07]);
    setOutputComponentCount(1);
}

Multiple input bands / single output band

function evaluatePixel(s) {
    return [0.2 * s[0].B05 + 0.3 * s[0].B06 + 0.5 * s[0].B02];
}

function setup(ds) {
    setInputComponents([ds.B05, ds.B06, ds.B02]);
    setOutputComponentCount(1);
}

Multiple input bands / multiple output bands

function evaluatePixel(s) {
    return [
        0.2 * s[0].B05 + 0.3 * s[0].B06 + 0.5 * s[0].B02,
        s[0].B08,
        s[0].B09
    ];
}

function setup(ds) {
    setInputComponents([ds.B05, ds.B06, ds.B02, ds.B08, ds.B09]);
    setOutputComponentCount(3);
}

The general process for converting old-style Custom scripts to new-style Custom scripts with Visualizers is to call the Visualizer before returning.
Example:

return [
    0.2 * s[0].B05 + 0.3 * s[0].B06 + 0.5 * s[0].B02,
    s[0].B08,
    s[0].B09
];

changes to:

let minVal = 0;
let maxVal = 255;
let viz = new HighlightCompressVisualizer(minVal, maxVal);

function evaluatePixel(s) {
    let val = [
        0.2 * s[0].B05 + 0.3 * s[0].B06 + 0.5 * s[0].B02,
        s[0].B08,
        s[0].B09
    ];
    return val.map(v => viz.process(v));
}

General example of migration:
The user sees a Custom script in the new system (2017/11).
It looks like:

let minVal = 0.0;
let maxVal = 0.4;
let viz = new HighlightCompressVisualizer(minVal, maxVal);

function evaluatePixel(samples) {
    let val = [samples[0].B04, samples[0].B03, samples[0].B01];
    return val.map(v => viz.process(v));
}

function setup(ds) {
    setInputComponents([ds.B01, ds.B03, ds.B04]);
    setOutputComponentCount(3);
}

When the user clicks "Save", the system responds with a warning about the deprecation of styles.
If you want to use some other style, you can set the Custom script accordingly.

The required changes can be implemented in two ways:

  • Use the new built-in scripts.
  • Modify your old custom script so that it uses the new style as described in the first part.

Basically you need to modify the eval function so that it uses the input bands in your calculations. The final results can then be returned via the preferred Visualizer.

Some examples are listed below:

RGB:

let minVal = 0.0;
let maxVal = 0.4;
let viz = new HighlightCompressVisualizer(minVal, maxVal);

function evaluatePixel(samples) {
    // pure RGB values
    let val = [samples[0].B04, samples[0].B03, samples[0].B02];

    // fill band values:
    // val[0] - band R
    // val[1] - band G
    // val[2] - band B

    // copy your calculations here and take care that the correct variables
    // are used for the different bands

    // an example of a simple calculation:
    // let val = [samples[0].B04 * 0.2,
    //            (samples[0].B03 + samples[0].B02) * 0.5,
    //            samples[0].B02 * 0.7];

    return val.map(v => viz.process(v));
}

function setup(ds) {
    setInputComponents([ds.B02, ds.B03, ds.B04]);
    setOutputComponentCount(3);
}

In the tables below are the Visualizers used for the different styles.

Three band values:

let val = [samples[0].B04, samples[0].B03, samples[0].B02];
return val.map(v => viz.process(v));

  style        viz                          description
  RGB          HighlightCompressVisualizer  RGB visualization
  REFLECTANCE  Identity                     Original reflectance in the range [0, 1]
  NORMALIZED   DefaultVisualizer

Single band value:

let val = samples[0].B04;
return [viz.process(val)];

  style            viz                                                    description
  COLOR_MAP        ColorMapVisualizer.createDefaultColorMap()             Mapping of the input value into discrete colors
  GRAYSCALE        HighlightCompressVisualizerSingle()                    Grayscale visualization
  BLUE_RED         ColorGradientVisualizer.createBlueRed(minVal, maxVal)  Rainbow, from blue to red, through green and yellow
  GREEN_WHITE      ColorGradientVisualizer.createWhiteGreen()             Green to white linear scale
  RED_TEMPERATURE  ColorGradientVisualizer.createRedTemperature()         Red to white linear scale, representing the black-body color depending on the temperature
  INDEX            Identity()                                             Direct index value

 

The 2017-11-23 release of the Sentinel Hub services brings some changes in the use of the APIs, which will make it possible to continue with future improvements. You might want to read the public announcement about it here.

We've put a lot of effort into making the changes backward compatible, and the majority of users should not notice a difference. However, as our services are used in highly innovative ways, some actions might be needed on the side of our users.

Note: for users who use our services solely through EO Browser, there are no significant changes.

These are the changes:

  • The configuration utility (formerly called WMS Configurator) is available at the new location. It is also getting simplified (many users complained about the complexity of the old one). Almost all options are still configurable but we have hidden the rarely used ones behind an "Advanced" button.
  • The service end-point is changing from https://services.sentinel-hub.com/v1/* to https://services.sentinel-hub.com/ogc/*. The old end-point will be redirected (after the transition period ends) to the new one so there is no immediate need to change this. However, we would still recommend you do this as soon as possible. Note: this is only for request URLs, there is no need to change any URLs in the configurator!
  • Hard-coded EO products are gone as our users asked for more transparency on the path from the original reflectance/sensor values to the output from our services. We replaced these with "EO Products", essentially custom script product templates (we have learned that our users love custom scripts), which can be used directly exactly like before, or further adjusted in line with a user's expectations.
  • We are getting rid of "STYLES" as people found them confusing. Each layer therefore has only one output option. For existing layers created prior to these changes, STYLES will continue to be supported; however, any modification to such a layer will require its multiple styles to be removed. All previous visualization options (RGB, grayscale, Red-blue, etc.) will continue to be available, each under a different EO Product template. Check this FAQ entry for more information.
  • Each service call can only accept one layer. It did not make sense to query for "TRUE_COLOR,NDVI" even before, but some users were querying our API with a combination of raw band layers, e.g. "LAYERS=B04,B03,B02,B08". This can still be done now by configuring a layer with a custom script: "return [B04,B03,B02,B08];". One can also use this custom script as a part of a URL request, see the EVALSCRIPT parameter. If you want to get an image composite similar to the one in the old version, try adding gain factor 2.5 to each of the bands, e.g. "return [2.5*B04,2.5*B03,2.5*B02];" For raw data (or reflectance), there should not be any multiplication.
  • An exception to the rule above is, for now, combinations containing the meta-data layer "DATE", e.g. "TRUE_COLOR,DATE", which will show true color imagery and imagery dates of the scene. This feature will remain to assure backward compatibility as we have noticed that quite a few of our users use this.
  • There are no "default" or preset layers available to be used in the API. Each instance needs its own configuration of layers. E.g. if one used to query for B08 (NIR sensor in Sentinel-2), one needs to configure this layer in the Configuration utility. Similarly one cannot query only for "DATE" meta-data. For those who would like to only get DATE information, you can create a layer returning white (e.g. "return [1,1,1];") and append the "DATE" layer in your request.
  • GAIN, GAMMA and OFFSET parameters will be deprecated as they did not act consistently in the past. They should now be included in custom script.
  • Maximum area, a very rarely used feature for limiting the extent at which a layer is visualized, can now be set only on the instance level. For those who need different extent limits, you can always create several different instances. Or better yet, use the "GEOMETRY" parameter when calling the service to clip the image.
  • The channel placeholder names in FIS API have changed. It now returns enumerated keys C0, C1, C2,... for described channels, when it returned named keys like RED, GREEN, BLUE in the previous version.
  • WCS specific
    • WCS service now supports version 1.0 only (support for 1.1 and 1.2 was poor beforehand anyway). We have decided to put more focus into implementing our own API for data provision than to support OGC in the case of WCS.
    • Parameters LAYERS (for WMS) and COVERAGE (for WCS) are now mandatory. For those, working with EVALSCRIPT parameter in a dynamic fashion, you can simply create a default layer in Configuration utility (e.g. you create one with Custom Script and define "return [1,0,0];" in it).
  • WFS specific
    • Datasource specification became mandatory as more datasources are coming in. E.g. if one used "TYPENAMES=TILE", it should be changed to "TYPENAMES=S2.TILE".
  • FIS specific:
    • "NDVI" element was changed to "C0" as we rolled out FIS in a more generic fashion, e.g. supporting any index, not just NDVI.
  • Custom script specific:
    • eval() function was renamed to evaluatePixel()
  • Mosaic Generator is getting deprecated and replaced with EO Browser.
  • TILE, FILL and OUTLINE default layers were deprecated.
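The FIS channel renaming above can be illustrated with a sketch; the response fragment below is hypothetical, loosely structured after the FIS output format:

```javascript
// Hypothetical FIS response fragment. Before the 2017-11 release the first
// described channel was returned under a named key such as "NDVI"; it is
// now returned under the enumerated key "C0".
const fisResponse = {
  C0: [
    { date: '2017-11-23', basicStats: { min: 0.1, max: 0.9, mean: 0.5, stDev: 0.2 } },
  ],
};

// Old code reading fisResponse.NDVI must switch to the enumerated key:
const firstChannel = fisResponse.C0;
console.log(firstChannel[0].basicStats.mean); // prints 0.5
```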

The roll-out of the new version on various infrastructures:

  • The core infrastructure, where Sentinel-2 is available (services.sentinel-hub.com) will be migrated in the week commencing the 20th of November 2017. There will be a temporary period of one month during which both the old and new version will operate.
  • The US West infrastructure, where we serve Landsat, MODIS and DEM data (services.aws-uswest2.sentinel-hub.com), was migrated on the 8th of February 2018. There will be a temporary period of one month during which both the old and the new version will operate.
  • EO Cloud infrastructure, where we serve Sentinel-1, Sentinel-3, etc., will be migrated in Q2 2018.

NOTE: This post will be updated over time, so check back.
Watch the quick tutorial on Sentinel Hub.

If one needs, for a specific purpose, the source data used for some scene, one can use a WFS request with the same parameters as for WMS. The response will contain a path attribute, e.g. something along the lines of:

"path": "s3://sentinel-s2-l1c/tiles/310/S/DG/2015/12/7/0"

It is then possible to access these data on AWS, where they are stored, by appending the relevant filenames; see the typical structure of a scene at AWS below.

If you want to download the data from outside of the AWS network, you can change the address to:

https://sentinel-s2-l1c.s3.amazonaws.com/tiles/10/S/DG/2015/12/7/0
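The address change described above can be sketched as a small helper (the function name is ours; the bucket-to-URL mapping follows the example addresses in this answer):

```javascript
// Convert an s3:// path, as returned in the WFS "path" attribute, into the
// public HTTP address of the same location on AWS.
function toHttpUrl(s3Path) {
  const match = s3Path.match(/^s3:\/\/([^/]+)\/(.+)$/);
  if (!match) throw new Error('not an s3:// path: ' + s3Path);
  const [, bucket, key] = match;
  return `https://${bucket}.s3.amazonaws.com/${key}`;
}

console.log(toHttpUrl('s3://sentinel-s2-l1c/tiles/10/S/DG/2015/12/7/0'));
// → https://sentinel-s2-l1c.s3.amazonaws.com/tiles/10/S/DG/2015/12/7/0
```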

Typical structure of the scene:

 aws s3 ls s3://sentinel-s2-l1c/tiles/10/S/DG/2015/12/7/0/
                           PRE auxiliary/
                           PRE preview/
                           PRE qi/
2017-08-27 20:39:26    2844474 B01.jp2
2017-08-27 20:39:26   83913648 B02.jp2
2017-08-27 20:39:26   89505314 B03.jp2
2017-08-27 20:39:26   97645600 B04.jp2
2017-08-27 20:39:26   27033174 B05.jp2
2017-08-27 20:39:26   27202251 B06.jp2
2017-08-27 20:39:26   27621995 B07.jp2
2017-08-27 20:39:26  100009241 B08.jp2
2017-08-27 20:39:26    2987729 B09.jp2
2017-08-27 20:39:26    1218596 B10.jp2
2017-08-27 20:39:26   27661057 B11.jp2
2017-08-27 20:39:26   27712884 B12.jp2
2017-08-27 20:39:26   27909008 B8A.jp2
2017-08-27 20:39:26  134890604 TCI.jp2
2017-08-27 20:39:26     379381 metadata.xml
2017-08-27 20:39:26     166593 preview.jp2
2017-08-27 20:39:49     106056 preview.jpg
2017-08-27 20:39:47       1035 productInfo.json
2017-08-27 20:39:26       1678 tileInfo.json

 

Due to a different licensing model we cannot provide free access to these datasets. One therefore first has to order the imagery through us (most commonly on-going monitoring of a specific area on a daily/weekly/monthly basis). Once this order is processed, we will make the purchased data available through the Sentinel Hub services in the same manner as other data.

Find your INSTANCE ID in the Sentinel Hub Configuration Utility. See the example below:


Watch the quick tutorial on Sentinel Hub.

Some areas never have cloud coverage below 20%. This is why in Playground, where the default maximum cloud coverage is 20%, it can seem for some areas that no satellite imagery is available. However, if we move the maximum cloud coverage slider to 100%, we can see the available data.

You can use the rjson R package and then convert the JSON object into an R object as in the following code snippets:


library("rjson")

loadFisResponse <- function(fisQuery) {
    return(rjson::fromJSON(paste(readLines(fisQuery), collapse = "")))
}

getDates <- function(fisResponse) {
    numberOfDates <- length(fisResponse[[1]])
    datesVector <- rep("", numberOfDates)
    for (listIndex in 1:numberOfDates) {
        datesVector[listIndex] <- fisResponse[[1]][[listIndex]]$date
    }
    return(datesVector)
}

makeHistogram <- function(fisResponse, itemIndex) {
    h <- fisResponse[[1]][[itemIndex]]$histogram
    numberOfBins <- length(h$bins)

    counts = rep(0, numberOfBins)
    values = rep(0, numberOfBins)
    for (i in 1:numberOfBins) {
        counts[i] <- h$bins[[i]]$count
        values[i] <- h$bins[[i]]$value
    }
    return(data.frame(values, counts))
}

getBasicStats <- function(fisResponse, itemIndex) {
    stats <- fisResponse[[1]][[itemIndex]]$basicStats
    minValue <- stats$min
    maxValue <- stats$max
    meanValue <- stats$mean
    standardDeviation <- stats$stDev
    return(data.frame(minValue, maxValue, meanValue, standardDeviation))
}

getNumberOfDates <- function(fisResponse) {
    return(length(fisResponse[[1]]))
}

The fisResponse object is expected to be a nested list that reflects the output format specified under FIS documentation.

The OGC consortium specifications for the WMS, WFS, WCS, … services define the coordinate axis order. Older OGC service specifications assumed "X, Y" order for all coordinate reference systems (CRSs), even for WGS84 (EPSG:4326), while newer OGC service specifications obey the axis order defined by the CRS, no longer assuming "X, Y" order.

Sinergise services conform to the standards' definitions. The WGS84 axis order is thus version dependent:

- WMS:
   - version 1.1.1: longitude, latitude
   - version 1.3.0: latitude, longitude
- WFS:
   - version 1.0.0: longitude, latitude
   - version 2.0.0: latitude, longitude
- WCS:
   - version 1.0.0: longitude, latitude
 

The user should always request a specific version of the OGC services by providing an explicit "VERSION" parameter in the URL.
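As a sketch of the version dependence above, a hypothetical helper that orders a WGS84 (EPSG:4326) coordinate pair for a WMS request (the function name is ours):

```javascript
// Order a WGS84 coordinate pair for the BBOX parameter of a WMS request.
// WMS 1.1.1 expects longitude,latitude; WMS 1.3.0 expects latitude,longitude.
function wgs84AxisOrder(wmsVersion, lon, lat) {
  if (wmsVersion === '1.1.1') return [lon, lat];
  if (wmsVersion === '1.3.0') return [lat, lon];
  throw new Error('request an explicit, supported VERSION; got: ' + wmsVersion);
}

console.log(wgs84AxisOrder('1.1.1', 14.5367, 46.0246)); // [ 14.5367, 46.0246 ]
console.log(wgs84AxisOrder('1.3.0', 14.5367, 46.0246)); // [ 46.0246, 14.5367 ]
```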
 

To learn more about the licensing, please click here.

All Sentinel Hub services are available using both HTTP and HTTPS protocols.

You can cite our applications / web pages in the following way (link is optional):

When referring to Sentinel data acquired through the Sentinel Hub services, please use the following credit:

        "Modified Copernicus Sentinel data [Year]/Sentinel Hub"

Subscription, packages and pricing

You can find an example mobile app here.

Note that the subscription option does not include the mobile app itself. It is simply the service option best suited for integration in various apps.

There is no pricing plan activated during the trial stage. You get full access to the system, without technical limitations.
You should not, however, use the service for commercial purposes.

After 30 days you can decide which subscription service you want and subscribe. There is no automatic payment.

A PayPal subscription option can be found here:
https://www.sentinel-hub.com/recurring-monthly-subscription-commercial-and-enterprise-models

This gives you the option to set up an automatically recurring payment by PayPal, which you can cancel at any time.

Generally we advise subscribing to the annual package, as it comes with a discount.
https://www.sentinel-hub.com/pricing-plans

A request stands for a tile of 512*512 px, which is the recommended option for integration in web applications due to the best performance/size ratio.
A limit of 200,000 requests per month therefore means that you can retrieve 200,000 tiles of 512*512 px per month from our service. If you ask for smaller tiles, e.g. 256*256 px, the volume of requests available is proportionally larger. Similarly, if you ask for large tiles (e.g. 2000*2000 px), it is smaller.
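One plausible reading of this proportional accounting, as a sketch (the function name and the exact rounding behaviour are ours, not documented here):

```javascript
// A 512*512 px tile counts as one request; other sizes count proportionally
// to their pixel area against the monthly quota.
function requestCost(widthPx, heightPx) {
  return (widthPx * heightPx) / (512 * 512);
}

console.log(requestCost(512, 512));   // 1
console.log(requestCost(256, 256));   // 0.25
console.log(requestCost(2000, 2000)); // ~15.26, so one large tile uses many requests
```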

What generally does not count as a request:

  • WFS requests used to retrieve information about available dates in the area.
  • Requests which result in an error/warning rather than data (e.g. requesting data over an area/time where there is none, etc.)

Statistical API requests over long time-range can count as multiple requests (e.g. as many as there are available scenes in the defined time range).

Multi-temporal API requests over long time-range can count as multiple requests (e.g. as many as there are available scenes in the defined time range).

There is no difference in functionality or data between a trial account and a subscription account. With the trial account we want to give all users the opportunity to test the Sentinel Hub services to their fullest.

The Sentinel Hub trial account is not allowed for commercial use.

Our subscription plans are described here: https://sentinel-hub.com/pricing-plans

The difference between the consumer/research and commercial plans lies in the purpose for which you use our services. If you are using the Sentinel Hub services for research purposes or for yourself, the consumer/research option is the right one. If you are using them within your company, you should choose the commercial option.

We do not charge anything automatically, nor do we extend a subscription period automatically. A few weeks before the end of the subscription period, users will be notified about the upcoming expiration and asked whether they would like to extend the subscription. If users decide on an extension, they will be asked for an additional payment. Otherwise, the account will be frozen for a one-month period and deleted afterwards.

If you decide to extend the subscription, there is also a PayPal subscription option:
https://www.sentinel-hub.com/recurring-monthly-subscription-commercial-a...

This gives you the option to set up an automatically recurring payment by PayPal, which you can cancel at any time.

An individual subscription is meant for one named user. She or he can use it on any device, but the account cannot be shared among different users.

EO Browser

Simply write coordinates in the "Search places" tool in "lat,lon" form, e.g.:

46.0246,14.5367

and press Enter.

When you create a custom layer in the Sentinel Hub Configuration Utility, it is possible to select that layer in EO Browser. To do that, first select your configuration on the Search tab, then select the appropriate search result, and in the Visualization tab select the said custom layer.

However, such layers usually contain a color visualization (RGB) of the values. Because the Feature Info Service operates on the indicator values themselves (and not on RGB), you must add an additional layer (in the Sentinel Hub Configuration Utility) which returns the value without conversion to color. The name of this layer must be constructed from the ID (not the name) of the visualization layer:

  __FIS_<original-layer-id>

That is: a double underscore, followed by "FIS", followed by another underscore, followed by the original layer's ID. For example, to use Feature Info on a layer with ID "3-NDVI2", we should add a layer with ID "__FIS_3-NDVI2".
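The naming rule can be sketched as a one-line helper (the function name is ours):

```javascript
// Build the ID of the FIS companion layer from a visualization layer's ID:
// double underscore, "FIS", underscore, then the original ID.
function fisLayerId(originalLayerId) {
  return `__FIS_${originalLayerId}`;
}

console.log(fisLayerId('3-NDVI2')); // → __FIS_3-NDVI2
```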

The data processing script in the newly added "FIS layer" should either:

  • return a single indicator value or
  • return two values, the last of which indicates cloud coverage (1 if the point is covered by the cloud, 0 if not).

Example data processing script:

function index(x, y) {
    return (x - y) / (x + y);
}

const ndvi = index(B08, B04);
return [ndvi];

Example data processing script with cloud coverage:

function index(x, y) {
    return (x - y) / (x + y);
}

const NGDR = index(B03, B04);
const bRatio = (B03 - 0.175) / (0.39 - 0.175);

const isCloud = bRatio > 1 || (bRatio > 0 && NGDR > 0);
const ndvi = index(B08, B04);
return isCloud ? [ndvi, 1] : [ndvi, 0];

Note that this answer is specific to EO Browser and the way it uses layers.

If you are trying to get raw data directly in EO Browser by following the SciHub link shown below, please note that SciHub is a 3rd-party service operated by Copernicus and you need a separate account there.

You can create it for free here: https://scihub.copernicus.eu/

SentinelHub EO Data

We put quite a lot of effort into leaving the data as they are, without modifying them. However, this also means that some mistakes that are in the data get through to our users. One of these is geolocation shift.

There is a nice blog post by CNES colleague Olivier Hagolle on this topic. We suggest reading it and sending a notice to ESA and Copernicus to do something about it.

You can get information from the scene classification layer produced by Sen2Cor, for areas where L2A data are available. The data can be retrieved via the identifier "SCL" (e.g. instead of return [B02]; for blue, one can use return [SCL/10];).
The data can then be used e.g. for validation of the pixel value, along these lines:

var scl = SCL;
if (scl == 0) { // No Data
  return [0, 0, 0]; // black
} else if (scl == 1) { // Saturated / Defective
  return [1, 0, 0.016]; // red
} else if (scl == 2) { // Dark Area Pixels
  return [0.525, 0.525, 0.525]; // gray
} else if (scl == 3) { // Cloud Shadows
  return [0.467, 0.298, 0.043]; // brown
} else if (scl == 4) { // Vegetation
  return [0.063, 0.827, 0.176]; // green
} else if (scl == 5) { // Bare Soils
  return [1, 1, 0.325]; // yellow
} else if (scl == 6) { // Water
  return [0, 0, 1]; // blue
} else if (scl == 7) { // Clouds low probability / Unclassified
  return [0.506, 0.506, 0.506]; // medium gray
} else if (scl == 8) { // Clouds medium probability
  return [0.753, 0.753, 0.753]; // light gray
} else if (scl == 9) { // Clouds high probability
  return [0.949, 0.949, 0.949]; // very light gray
} else if (scl == 10) { // Cirrus
  return [0.733, 0.773, 0.925]; // light blue/purple
} else if (scl == 11) { // Snow / Ice
  return [0.325, 1, 0.980]; // cyan
} else { // should never happen
  return [0,0,0];
}

Please note that it only makes sense to use this layer at full resolution, as any interpolation based on a classification codelist will not produce reasonable results. You should also use the NEAREST upsampling/downsampling setting, which you can find in the "Advanced layer editor".

Also please note that this map is not a land cover classification map in the strict sense; its main purpose is to be used internally in Sen2Cor's atmospheric correction module to distinguish between cloudy, clear and water pixels.

An example:

Blinnenhorn, Switzerland (Sentinel-2 L2A, SCL, acquired with EO Browser on October 14, 2017)

Blinnenhorn, Switzerland (Sentinel-2 L2A, true color, acquired with EO Browser on October 14, 2017)

This is a prototype feature and can change in the future releases.

We have our services installed in several places to provide the fastest access to the data. This is why there are different access points for different data sources. Use the following entry points to configure the APIs:

  • https://apps.sentinel-hub.com/configurator/ (recommended! the most stable and performant service)
    • Sentinel-2
    • Sentinel-2 L2A
  • https://apps.sentinel-hub.com/wms-configurator-uswest2/
    • Landsat-8 (global coverage)
    • MODIS Aqua and Terra
    • Digital elevation model
  • https://apps.eocloud.sentinel-hub.com/wms-configurator/ (this service is in prototype mode, so some short interruptions may occur)
    • Landsat-5 ESA coverage
    • Landsat-7 ESA coverage
    • Landsat-8 ESA coverage
    • Sentinel-1 GRD
    • Envisat MERIS

We are working to unify the end-points, but until then please use the above.

Yes, we offer an on-demand service where we monitor specific places in the world and process the data with Sen2Cor. This service requires additional payment on top of the overall Sentinel Hub subscription.

Sentinel-2 - global coverage, full archive
Sentinel-3 - global coverage, full OLCI archive
Sentinel-1 - global coverage, full GRD archive
Landsat-8 USGS - global coverage, almost full archive
Landsat-5, 7 and 8 - ESA Archive - Europe and North Africa, full archive
ENVISAT MERIS - global coverage, full archive
Digital Elevation Model – DEM - a static dataset based on MapZen's DEM updates
MODIS - Terra and Aqua

For more details check also our list of available data sources on our web page.

To explore availability, please check EO Browser, and Sentinel Playground (for DEM).

Use a Sentinel Hub WFS request to retrieve all relevant geometries for a given bounding box and time frame. From the response, gather the unique dates. For each date, construct a WCS request to retrieve the image.
See the code example below.

<script>
        // using Sentinel Hub OGC web services - https://www.sentinel-hub.com/develop/capabilities/wms
        // config
        window.SENTINEL_HUB_INSTANCE_ID = '<SENTINEL_HUB_INSTANCE_ID>';
        window.layerName = '1_NATURAL_COLOR';
        window.from = '2015-01-01';
        window.to = '2017-04-20';
        window.bbox = '-410925.4640611076,4891969.810251283,-391357.58482010243,4911537.689492286';
        window.maxFeatures = 100; // 100 is the maximum

        let images = [];
        let url = `https://services.sentinel-hub.com/ogc/wfs/${window.SENTINEL_HUB_INSTANCE_ID}` +
            `?service=WFS&version=2.0.0&request=GetFeature&time=${window.from}/${window.to}/P1D` +
            `&typenames=TILE&maxfeatures=${window.maxFeatures}&srsname=EPSG:3857` +
            `&bbox=${window.bbox}&outputformat=application/json`;

        (async () => {
            // retrieve the relevant geometries/images in bbox at time from-to
            // Sentinel Hub - WFS request - https://www.sentinel-hub.com/develop/documentation/api/ogc_api/wfs-request
            try {
                let response = await fetch(url);
                return await response.json();
            } catch (e) {
                throw new Error('There was an error fetching the list of geometries from the WFS service.\nDid you substitute your SENTINEL_HUB_INSTANCE_ID?');
            }
        })().then(geometries => {
            // parse the relevant geometries into a list of unique dates
            if (geometries.features === undefined) geometries.features = [];

            let dates = new Set();
            geometries.features.forEach(value => {
                dates.add(value.properties.date);
            });

            return Array.from(dates);
        }).then(dates => {
            // map each date to an image URL
            // images available via WCS request - https://www.sentinel-hub.com/develop/documentation/api/ogc_api/wcs-request
            dates.forEach(date => {
                let niceName = encodeURIComponent(`${window.layerName} from ${date}.tiff`);

                let imageUrl = `https://services.sentinel-hub.com/ogc/wcs/${window.SENTINEL_HUB_INSTANCE_ID}` +
                    `?service=WCS&version=1.1.2&request=GetCoverage&time=${date}` +
                    `&coverage=${window.layerName}&nicename=${niceName}&bbox=${window.bbox}`;
                images.push(imageUrl);
            });

            shout(images);
        });

        let shout = value => {
            console.log('Images', value);
        }
    </script>

There are several options, depending on what you would like to do:
- You can add a “DATE” layer to your WMS request and you will see the dates of the scenes (note that we generate a mosaic, which means that there might be different dates on different parts of the screen). See the example below.

- You can use a feature info request and point to some location. The response will contain the date.
- You can use WFS service with the same parameters as WMS (e.g. date, cloud coverage). You will get a list of features representing scenes fitting the criteria.

In case it would be useful, you can see an example of such an integration on GitHub here.

Our “calendar feature” uses WFS service.

In case you would like to get point values at some location (e.g. reflectance, NDVI, etc.) you can use the GetFeatureInfo request (e.g. "Identify features" or "feature info" in various GIS applications), e.g:

https://services.sentinel-hub.com/ogc/wms/<INSTANCE_ID>?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&I=0&J=0&QUERY_LAYERS=NDVI&INFO_FORMAT=application/json&BBOX=38.55105530425345,-7.883667078518689,38.55269383803389,-7.885252872445627&CRS=EPSG:4326&MAXCC=100&WIDTH=1&HEIGHT=1&TIME=2017-03-16

Check also Statistical API.

Projections depend on the use case, however since Sentinel-2 data is originally provided in the UTM projection this is also a good choice for export. For larger areas spanning multiple UTM zones a more global projection is preferable, such as Web Mercator (EPSG:3857) or WGS84 (EPSG:4326).

SentinelHub EO Products

This is a bit complicated as it depends on several design decisions we’ve had to make in order to keep our service as generic as possible.

There are several basic rules used within Sentinel Hub:

  • whenever we work with sensor data (e.g. B02, B08, etc.), a reflectance value is used
  • when outputting the value from the system, it is clamped to the range 0-1 and then stretched to the value range of the output format:
    • if format is 32-bit float, it is 0-1
    • if format is 16-bit, it is 0-65535; 0 -> 0 and 1 -> 65535
    • if format is 8-bit, it is 0-255; 0 -> 0 and 1 -> 255
  • IEEE 754 single-precision is used within calculations (e.g. 32-bit float)
  • internally during script evaluation, there is no clamping being done
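These clamping and stretching rules can be sketched as a simplified client-side model (illustrative only, not the service's actual implementation):

```javascript
// Model of how a 0-1 value is written to an output format:
// clamp to [0, 1], then stretch to the format's full integer range.
function toOutputValue(val, depth) {
  const clamped = Math.min(1, Math.max(0, val));
  if (depth === '32f') return clamped;        // 32-bit float keeps 0-1
  const maxVal = depth === 16 ? 65535 : 255;  // 16-bit or 8-bit integer range
  return Math.round(clamped * maxVal);
}

console.log(toOutputValue(1.0, 16)); // 65535
console.log(toOutputValue(0.5, 8));  // 128
console.log(toOutputValue(1.7, 8));  // 255 (clamped first)
```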

As an example, if one wants to get Sentinel-2 reflectance value, one can use this simple custom script:
return [B02];

And depending on the output format will return:

  • reflectance values where reflectance = 1 is at a pixel value of 65535 (format=image/tiff;depth=16)
  • reflectance values where reflectance = 1 is at a pixel value of 255 (format=image/tiff;depth=8, format=image/jpg  or format=image/png)
  • reflectance values in the range 0-1 (format=image/tiff;depth=32f)

If one is calculating NDVI, e.g.:
var ndvi = (B08-B04)/(B08+B04);
return [ndvi];

This will return:

  • range 0-1 adjusted to an integer 0-65535 for 16 bit output (format=image/tiff;depth=16)
  • range 0-1 adjusted to an integer 0-255 if 8 bit output (format=image/tiff;depth=8, format=image/jpg  or format=image/png)
  • range 0-1 if 32 bit float output (format=image/tiff;depth=32f)

As NDVI values can sometimes be negative (e.g. -0.2) or if using a different index with values more than 1 (e.g. 1.5), one has to adapt the output if one wants to get this information. E.g. if one expects an index value from -1 to 2, the final script would have to be adapted to:
index = (index+1)/3;

The output between 0 and 1 then maps back to the original index values as follows:

  • output 0 -> value -1
  • output 0.333 -> value 0
  • output 0.667 -> value 1
  • output 1 -> value 2
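The remapping and its inverse can be sketched as follows (the helper names are illustrative, not part of the Sentinel Hub API):

```javascript
// Remap an index in [-1, 2] into the [0, 1] output range, as in the script above.
function encodeIndex(index) {
  return (index + 1) / 3;
}

// Invert the mapping on the client side to recover the original index value.
function decodeIndex(output) {
  return output * 3 - 1;
}

console.log(encodeIndex(-1)); // 0
console.log(encodeIndex(2));  // 1
console.log(decodeIndex(encodeIndex(0.7)));
```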

Note that if one’s desired output is color visualisation, this can be as simple as this:

return colorBlend(ndvi,
  [-0.2, 0, 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0],
  [[0, 0, 0],                   // < -.2 = #000000 (black)
   [165/255, 0, 38/255],        // -> 0 = #a50026
   [215/255, 48/255, 39/255],   // -> .1 = #d73027
   [244/255, 109/255, 67/255],  // -> .2 = #f46d43
   [253/255, 174/255, 97/255],  // -> .3 = #fdae61
   [254/255, 224/255, 139/255], // -> .4 = #fee08b
   [255/255, 255/255, 191/255], // -> .5 = #ffffbf
   [217/255, 239/255, 139/255], // -> .6 = #d9ef8b
   [166/255, 217/255, 106/255], // -> .7 = #a6d96a
   [102/255, 189/255, 99/255],  // -> .8 = #66bd63
   [26/255, 152/255, 80/255],   // -> .9 = #1a9850
   [0, 104/255, 55/255]]);      // -> 1.0 = #006837

 

We recommend checking our “EO Product” page as well as the Index DataBase, which holds numerous index descriptions along with scientific articles.

Step 1: Open the Configuration utility.
Step 2: Click the "Add new layer" button.
Step 3: Enter the Layer title.
Step 4: Fill in the input fields and select a "Product" from the dropdown menu, or write your "Custom script" by clicking on the "Pencil" icon.
Step 5: Type your custom script and click "Save". See detailed information about custom scripting here.
Step 6: Click "Save All" to finish creating the new layer.


For detailed information about WMS configuration click here.

Choose the NDVI layer in the Sentinel Hub Configuration Utility or set the LAYERS parameter to NDVI (LAYERS=NDVI) in your request URL.

Integrating in web applications

1. Go to https://canvas.acugis.com/ 
2. Sign up and login to Canvas.
3. Click on WMS Services in the left menu as shown below:

4. Click Add WMS Layer as shown below:

5. Enter the WMS Service information for Sentinel in the URL field

6. Go to Projects, and click the Add Project Button as show below:

7. In the Project you created above, click the Maps button as shown below:

8. Click the Add Map button as shown below:

9. In the left menu of the new map, expand the WMS layer you created in step 4 above. Toggle the desired layer(s). Below, we are adding the NDVI layer.

10. Define the Geometry Area

      There are two methods you can use to define a Geometry Area:

      1. A data layer
      2. Draw a Polygon or Rectangle area

While there are some simple use cases in which you might wish to draw an area, in most cases you will be using a data layer.
Below, we are using a data layer of the state of Rhode Island.

11. Click the settings icon on the NDVI Layer you created in Step 9 above. Select ‘Geometry’ from the menu as shown below:

12. Select the Geometry Area
      a.    For a data layer (such as we are using), select Custom as the Geometry Area
      b.    For a polygon you have created via the free-hand toolbar, select Polygon

13. Click Save and then click the Save button for your map. 

14. Congratulations! Your map is now ready.

 

1. Configure your Instance in Configuration Utility.
2. Click "VIEW IN PLAYGROUND" button.
3. Click  button at the top right corner to open snippet tool.
4. Select the Google Maps maps library, type your Google Maps Api key and copy the code generated for the configured visualization.



Figure 1: Snippet tool is accessible in Sentinel Playground if it's opened from Configuration Utility. Before integration, you must enter your Google Maps API key.

1. Configure your Instance in Configuration Utility.
2. Click "VIEW IN PLAYGROUND" button.
3. Click  button at the top right corner to open snippet tool.
4. Select the Leaflet maps library and copy the code generated for the configured visualization.



Figure 1: Snippet tool is accessible in Sentinel Playground if it's opened from Configuration Utility

1. Configure your Instance in Configuration Utility.
2. Click "VIEW IN PLAYGROUND" button.
3. Click  button at the top right corner.
4. Select the OpenLayers maps library and copy the code generated for the configured visualization.



Figure 1: Snippet tool is accessible in Sentinel Playground if it's opened from Configuration Utility

Integrating in GIS applications

In some cases ArcMap turns on all layers and calls the WMS. The Sentinel Hub WMS service does not support visualisation of many layers at the same time, so we suggest you turn off all layers and turn them on one by one.

If you get the following error:

Turn off all layers and turn them on one by one:

Image manipulation

In Configuration Utility, it is possible to edit a Layer's Data Processing and observe live changes in the Layer's preview before even saving the Layer. This feature is useful for tweaking custom script parameters.

The best way to do this is to duplicate a Layer and open the preview by clicking the "Show preview" button in both the original and the duplicated Layer for easier comparison. Open a Layer's "Data Processing" settings by clicking the "Pencil" button and edit one of the two to see the differences between the scripts. After the appropriate EO product is chosen, hit the "Copy script to editor" button to tweak the script. After setting the custom script you can immediately inspect the changes in the preview window without saving the Layer.

You can also choose different EO Products under "Data processing settings".

Hitting the  button resets the layer.

If one wants to have pixels transparent (or semi-transparent), the following needs to be done:

  • format=image/png (note that PNGs are larger than JPGs, which might affect download speed)
  • custom script output needs to have 4 channels, fourth one being alpha, e.g. "return[1,0,0,0.5]" for semi-transparent red pixel or "return[0,0,0,0]" for full transparency
  • the TRANSPARENT parameter needs to be set to FALSE, so that the two transparency mechanisms do not mix

E.g. if one wants to have a Hollstein's cloud overview layer shown so that everything except clouds is transparent, one just needs to change

let naturalColour = [B04, B03, B02].map(a => gain * a);
let CLEAR  = naturalColour;
let SHADOW = naturalColour;
let WATER  = [0.1,0.1,0.7];
let CIRRUS = [0.8,0.1,0.1];
let CLOUD  = [0.3,0.3,1.0];
let SNOW   = [1.0,0.8,0.4];

to

let naturalColour = [0,0,0,0];
let CLEAR  = naturalColour;
let SHADOW = naturalColour;
let WATER  = [0.1,0.1,0.7,1];
let CIRRUS = [0.8,0.1,0.1,1];
let CLOUD  = [0.3,0.3,1.0,1];
let SNOW   = [1.0,0.8,0.4,1];

Note that all other outputs need to be 4-channel ones as well.

Instead of WIDTH and HEIGHT parameters one can use RESX and RESY.

E.g. if one adds "RESX=10m&RESY=10m", the image will be returned in 10m resolution.
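For illustration, such a request URL can be assembled like this (the instance ID is a placeholder and the parameter values are examples):

```javascript
// Build an OGC GetMap query that specifies resolution (RESX/RESY)
// instead of pixel dimensions (WIDTH/HEIGHT).
const params = new URLSearchParams({
  service: 'WMS',
  request: 'GetMap',
  layers: 'TRUE_COLOR',
  bbox: '-7.885,38.540,-7.870,38.560',
  crs: 'EPSG:4326',
  resx: '10m',  // instead of WIDTH
  resy: '10m',  // instead of HEIGHT
});
const url = `https://services.sentinel-hub.com/ogc/wms/<INSTANCE_ID>?${params}`;
console.log(url.includes('resx=10m')); // true
```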

If you want to get float values out of the service, you will have to use the 32-bit float image type (basic 8- and 16-bit TIFF files only support integers).

You can define a Custom script along the following lines:

var ndvi = (B08-B04)/(B08+B04);
return [ndvi];

You can either save this in Configuration utility or you can pass it as EVALSCRIPT parameter. For the latter the example of the call is:

https://services.sentinel-hub.com/ogc/wms/<INSTANCE_ID>?service=WMS&requ...

We suggest also checking the FAQ about the details of the internal calculation.

When a user selects an "EO Product template", they can see the Custom script behind each visualization/processing, to make it as transparent as possible what is happening with the data. We change these configurations over time, adding new ones and improving existing ones. Those who want to ensure that the processing of their layers does not change any more can simply edit the original Custom script and gain full control over it.

Common attributes

  • id (Id) - Id of the layer
  • title (String) - Layer title
  • description (String) - Description, displayed in the WMS/WMTS capabilities
  • styles - Styles section
      • name (String) - Name of the style
      • dataProduct
          • id (Id) - Id of the data product
  • orderHint (Integer) - Hint for the layer's place in the layer list
  • instance id (Id) - Id of the instance
  • dataset id (Id) - Id of the dataset
  • datasetSource (Url) - Source of the dataset
  • defaultStyleName ("default") - The default rendering style to use for the selected product on this layer. This will be used if no other style is specified.

Data source settings

  • datasourceDefaults
      • type (S2L1C, S2L2A) - Type of the data source
      • timeRange - Time range described by start/end time
          • startTime - Start time of the time range. Can be specified with relative or absolute values:
              • type = ABSOLUTE - value is a date and time in the format Year-Month-DayTHour:Minute:Second.Microsecond. Example: "2017-11-20T18:24:25.004200"
              • type = RELATIVE - value is a number (Integer) of time units; unit is the type of time unit: MILLENIUM, CENTURY, QUARTER, WEEK, DAY, HOUR or MINUTE
          • endTime - Same structure as startTime
      • maxCloudCoverage (Double) - Maximum cloud coverage in percent
      • upsampling - Image upsampling mode, used when input data is scaled up to a bigger target size:
          • NEAREST - nearest neighbour, fastest
          • BILINEAR - bilinear
          • BICUBIC - bicubic, best quality
      • downsampling - Image downsampling mode, used when input data is scaled down to a smaller target size:
          • NEAREST - nearest neighbour, fastest
          • BILINEAR - bilinear
          • BICUBIC - bicubic, best quality
          • BOX - average downsampling
      • previewMode - Controls the use of lower-resolution data for zoomed-out views:
          • DETAIL (0) - only high resolution data from the selected datasource will be used
          • PREVIEW (1) - in addition to high resolution data, lower resolution generated data will be used for further zoomed-out viewing
          • EXTENDED_PREVIEW - also uses data from pregenerated tiles for the most zoomed-out views, otherwise behaves the same as PREVIEW. Because the tiles are pregenerated, many custom parameters cannot be used at those levels.
      • mosaickingOrder - Selects the ordering of tiles for mosaicking. Tiles which fit the selected order better will be placed on top:
          • mostRecent - most recent on top
          • leastRecent - least recent on top
          • leastCC - least cloud cover on top
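Putting the attributes together, a layer's advanced settings might look roughly like this (an illustrative reconstruction: field names follow the attribute list above, the values are examples, and this is not an exact schema from the service):

```javascript
// Illustrative shape of a layer's advanced settings, assembled from the
// attribute list above; not an exact schema from the service.
const layerSettings = {
  title: 'My NDVI layer',
  description: 'Shown in WMS/WMTS capabilities',
  defaultStyleName: 'default',
  datasourceDefaults: {
    type: 'S2L1C',
    timeRange: {
      startTime: { type: 'RELATIVE', value: 6, unit: 'DAY' },
      endTime: { type: 'ABSOLUTE', value: '2017-11-20T18:24:25.004200' },
    },
    maxCloudCoverage: 20.0,
    upsampling: 'BICUBIC',
    downsampling: 'NEAREST',
    previewMode: 'PREVIEW',
    mosaickingOrder: 'mostRecent',
  },
};

console.log(layerSettings.datasourceDefaults.mosaickingOrder); // mostRecent
```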

Additional attributes (per datasource)

  • datasourceDefaults ("S2L1C")
      • atmosphericCorrection - Atmospheric correction can be:
          • NONE - no correction
          • DOS1 - simpler and somewhat less accurate, and available only for RGB bands
          • SENCOR - based on Sen2cor and available for all bands, with the exception of band 10, but might not be available for all tiles; in that case the system will use the DOS1 calculation instead for those tiles

We offer two kinds of atmospheric correction options:

  1. Statistical runtime optimised - available globally without additional charge.

    Statistical runtime optimised atmospheric correction is based on official Sen2Cor. However, to make global processing feasible, we run the process on anchor scenes and then apply the statistical difference to the neighbourhood tiles. This approach approximates the overall atmospheric conditions pretty well, but does not take into account local specifics, e.g. clouds and terrain.

    You can adjust the settings in Configuration Utility under the layer's settings (above).
     

  2. “Officially” Sen2Cor corrected data - We use ESA-distributed data within Europe free of charge, but outside of Europe we need to process this additionally, which carries some infrastructure costs (processing, additional storage). Therefore we offer an on-demand service, where we monitor specific places in the world and process the data with Sen2Cor. This service requires an additional payment on top of the overall Sentinel Hub subscription.

    You can choose the Sentinel-2 Level 2A data source in Configuration Utility.

We have developed, for Sentinel-2 data only, a run-time optimised statistical version of atmospheric correction. It is based on Sen2Cor technology, but uses anchor tiles and is not as accurate as the official Sen2Cor run on individual tiles.
With ESA starting to distribute L2A data from the end of March 2017, you now have three options:

  • you can use “statistical run-time optimised atmospheric correction” worldwide for all data by setting ATMFILTER=ATMCOR or by configuring the layer in Configuration Utility
  • you can select L2A datasource in Configuration Utility and access official L2A data (currently available in wider Europe from 28th of March 2017 onward)
  • you can contact us for on-demand L2A generation everywhere in the world

True color pansharpening can be implemented in Custom script in the following way:

weight = (B04 + B03 + B02 * 0.4) / 2.4;

if (weight == 0) {
  return [0, 0, 0];
}
ratio = B08 / weight * 2.5;
return [B04 * ratio, B03 * ratio, B02 * ratio];

If one wants to pansharpen just the RED band, it goes along similar lines:

weight = (B04 + B03 + B02 * 0.4) / 2.4;

if (weight == 0) {
  return [0, 0, 0];
}
ratio = B08 / weight * 2.5;
return [B04 * ratio];

You can add a parameter GEOMETRY as in this example:

https://services.sentinel-hub.com/ogc/wms/<INSTANCE_ID>?SERVICE=WMS&REQUEST=GetMap&VERSION=1.3.0&LAYERS=TRUE_COLOR&MAXCC=20&WIDTH=640&HEIGHT=640&CRS=EPSG:4326
&BBOX=-7.885,38.540,-7.870,38.560&
GEOMETRY=POLYGON((
-7.877244 38.546511,
-7.876377 38.547818,
-7.871950 38.546125,
-7.872611 38.545023,
-7.871241 38.544475,
-7.869831 38.544560,
-7.866011 38.550445,
-7.872323 38.552895,
-7.874112 38.551451,
-7.877110 38.552537,
-7.878791 38.552976,
-7.879413 38.553099,
-7.880600 38.553320,
-7.881314 38.553126,
-7.882678 38.552762,
-7.883951 38.552667,
-7.885064 38.552160,
-7.885370 38.549346,
-7.877244 38.546511))

It is recommended to simplify the geometry before passing it as a parameter, to avoid exceeding the maximum number of characters in the URL.

If one wants to have a transparent background outside of the clipped geometry, one can use the "FORMAT=image/png&TRANSPARENT=1" parameters in the call. Do note however that PNG images are larger in size and will therefore take longer to load.

To get original reflectance data from the satellite, set the custom script accordingly. REFLECTANCE is a physical format, which requires a 32-bit float TIFF. The values are from 0 to 1. See the example below:

 https://services.sentinel-hub.com/ogc/wms/{INSTANCE_ID} ...&LAYERS=B04_REFLECTANCE&FORMAT=image/tiff;depth=32f

Custom script example for B04_REFLECTANCE:

return [B04]

Custom script band input values are already in reflectance (hence the simple script).

We suggest also checking the FAQ about the details of the internal calculation.

You can use the "Custom script" option and type in code along the following lines:

function stretch(val, min, max) {
  return (val - min) / (max - min);
}

return [
  stretch(B04, 0.05, 0.5),
  stretch(B03, 0.05, 0.5),
  stretch(B02, 0.05, 0.5)];

Upsampling and downsampling define the method used for interpolation of the data on non-native scales. E.g. the resolution of Sentinel-2 data (R, G, B and NIR bands) is 10 meters, but on some occasions you may want to look at the data at a higher scale (e.g. 1 m pixel resolution) or a lower scale (e.g. 1000 m pixel resolution).
The default option is "nearest neighbour", which is best for performance. "Bicubic" often looks nicer at higher scales.
Note that the data are always exactly the same - only the interpolation method differs.

An example of the NDVI image of the field:


We would like to serve data as they are, without uncontrolled changes, because it is almost impossible to set color balance to one fitting all places in the world and all groups of users. You can still tweak contrast in several ways.


Figure 1: Raw picture.

You can tweak "Gain" (brightness) to automatically equalize the image.

Figure 2: Gain adjusted to desired values.

You can tweak "Gamma" (contrast) as well.


Figure 3: Gamma adjusted to desired values.

Configure atmospheric correction setting either to DOS-1 or to “Full”.


Figure 4: Atmospheric correction set to DOS-1.

The best way to get information about all the available scenes in specific area is to use WFS service, with TIME parameter defining the relevant time range.

Example is shown here (you need to replace <INSTANCE_ID> with your instance ID in the URL of the request).

Note that you can also use MAXCC parameter (maximum cloud coverage) in this call to filter for cloudless data.

Use the FEATURE_OFFSET parameter to control the starting point within the returned features and the MAXFEATURES parameter to control the maximum number of features returned by a single request.

As a result you will get a list of all available scenes for the chosen location in JSON format (or XML if set so). Some of the dates may be duplicated if there are two scenes available in the area. You should simply ignore these duplications.
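Client-side, ignoring the duplicated dates can be sketched like this (the feature structure follows the WFS GeoJSON response shown in the earlier code example; treat the exact field names as an assumption):

```javascript
// Collect unique acquisition dates from a WFS GetFeature (GeoJSON) response,
// ignoring duplicates caused by two scenes covering the same area.
function uniqueDates(wfsResponse) {
  const dates = new Set();
  (wfsResponse.features || []).forEach(f => dates.add(f.properties.date));
  return Array.from(dates).sort();
}

const response = {
  features: [
    { properties: { date: '2017-03-16' } },
    { properties: { date: '2017-03-16' } }, // duplicate scene, same date
    { properties: { date: '2017-03-26' } },
  ],
};
console.log(uniqueDates(response)); // [ '2017-03-16', '2017-03-26' ]
```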

You can turn off the logo in Configuration Utility - just open "Instance configuration" and uncheck "Show logo":

Custom pixel calculation can be performed by creating a WMS layer with the "custom script" product selected in the Configuration Utility. How to use the script is detailed here.

There are several options, depending on what you would like to do:

  1. You can use a feature info request and point to some location. The response will contain a date field within it.
  2. You can use the WFS service with the same parameters as used in your WMS request (e.g. date, cloud coverage). You will get a list of features representing scenes fitting the criteria.
  3. You can configure the layer to show acquisition dates.
    • Go to the layer of choice.
    • Click the "Advanced" button in the Layer tab to enter advanced parameters dialogue.
    • (optional) You can turn on Help by clicking on "?" top right.
    • Add "additionalData" node with type "Date"
  • Click Save in Advanced parameters dialogue and Layers tab.

 

Images are ordered by “Mosaic order” priority (see Configuration Utility; it can be set for each layer, and you can also set it with a parameter). You can choose either to have the most recent images on top or those with the least cloud coverage. You should also take into account the “maximum cloud coverage” parameter.
In case you want to get the most recent images acquired, you should set maximum cloud coverage to 100% and priority to “most recent”.

Use the colorBlend method:
colorBlend(inputValue, indexArray, outputValueArray)

The returned value is interpolated between the two consecutive values in outputValueArray (which represent RGB colors normalized to [0,1], e.g. pure red is [1,0,0]) based on the inputValue's location in the indexArray.

Note that indexArray and outputValueArray must be the same size. For example:

return colorBlend(B04,
  [0, 0.2, 0.4, 0.6, 0.8, 1],
  [[0,0,0], [0.1,0.2,0.5], [0.25,0.4,0.5], [0.4,0.6,0.5], [0.75,0.8,0.5], [1,1,0.5]]);

If B04 is 0.25, it will interpolate between [0.1,0.2,0.5] and [0.25,0.4,0.5], since those are the colors corresponding to the 0.2 and 0.4 indices in the indexArray nearest to B04.
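To illustrate the interpolation, a minimal stand-alone model of colorBlend (a sketch, not the service's actual implementation) behaves like this:

```javascript
// Piecewise-linear interpolation between RGB colors, mimicking colorBlend:
// find the segment of indexArray containing inputValue and mix the two
// surrounding colors proportionally.
function colorBlendModel(inputValue, indexArray, outputValueArray) {
  if (inputValue <= indexArray[0]) return outputValueArray[0];
  for (let i = 1; i < indexArray.length; i++) {
    if (inputValue <= indexArray[i]) {
      const t = (inputValue - indexArray[i - 1]) / (indexArray[i] - indexArray[i - 1]);
      return outputValueArray[i - 1].map(
        (c, ch) => c + t * (outputValueArray[i][ch] - c));
    }
  }
  return outputValueArray[outputValueArray.length - 1];
}

// 0.25 lies a quarter of the way between the 0.2 and 0.4 indices:
console.log(colorBlendModel(0.25,
  [0, 0.2, 0.4, 0.6, 0.8, 1],
  [[0,0,0], [0.1,0.2,0.5], [0.25,0.4,0.5], [0.4,0.6,0.5], [0.75,0.8,0.5], [1,1,0.5]]));
```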

The general parameters are detailed here.
Custom parameters are detailed here.

Custom scripts

Custom Script integrates the Chrome V8 JavaScript engine, so the vast majority of functions supported there can be used in Custom Script as well.

See more info here.

Description of classes and useful methods

By employing the dynamically interpreted JavaScript language, and providing some specialized functions one can combine the bands of multispectral satellite data in unprecedented ways.

Scripts basically consist of two parts: setup and evaluation.

In the setup part the input and output components are defined: the bands used for the input components, and the resulting array size for the output components.

In the evaluation part the values of each pixel are used to calculate the resulting value. Usually the calculated value is processed with a visualizer before being returned. How visualizers process values is described in the next section.
Code example:

let viz = new HighlightCompressVisualizerSingle();

function evaluatePixel(samples) {
    let val = 1.5 * (samples[0].B08 - samples[0].B04) / (samples[0].B08 + samples[0].B04 + 0.5);
    return viz.process(val);
}

function setup(ds) {
    setInputComponents([ds.B04, ds.B08]);
    setOutputComponentCount(1);
}

Visualizer classes

Visualizers are basically classes with a process method which evaluates the representation value for a pixel from the pixel's band values.
In custom scripts some predefined visualizer classes are already available. They are listed and described here.

class DefaultVisualizer

Visualizer with gamma correction included; gain and offset are also configurable. This class is useful for RGB images.

constructor(minValue=0.0, maxValue=1.0, gain=1.0, offset=0.0, gamma=1.0)

methods:

clamp(num, min, max)

Clamps number num between min and max

process(val)

Returns processed value between 0.0 and 1.0

processList(components)

Returns array of processed values

class ColorGradientVisualizer

The visualizer interpolates colors in defined intervals (valColPairs). It maps color values to the interval between min and max.

constructor(valColPairs, minVal=0.0, maxVal=1.0)

methods:

getColor(val)

Return interpolated color for value using array of boundaries

process(val)

Returns processed value in rgb (0 - 255)

processList(components)

Returns array of processed values(array of RGB arrays)

createRedTemperature(minVal, maxVal)

Returns RedTemperature visualizer

createWhiteGreen(minVal, maxVal)

Returns WhiteGreen visualizer

createBlueRed(minVal, maxVal)

Returns BlueRed visualizer
For predefined value color pairs see constants section below.

class HighlightCompressVisualizer

Automatic highlight adjustment with optional gamma correction. Gain and offset are configurable. This class is useful for RGB images.

constructor(minValue=0.0, maxValue=1.0, gain=1.0, offset=0.0, gamma=1.0)

methods:

clamp(num, min, max)

Clamps num between min and max

process(val)

Returns processed value between 0.0 and 1.0

processList(components)

Returns array of processed values

class ColorMapVisualizer

This visualizer assigns a color to the input value according to a color map. It takes an array of (value, color) pairs which determine the color map. The process function returns the color to which the input value corresponds.

constructor(valColPairs)

methods:

getColorFromValue(val)

Returns corresponding RGB color from color map for value val.

process(val)

Returns processed value as RGB color array with values between 0.0 and 1.0

createDefaultColorMap()

Returns new ColorMapVisualizer([
[-1.0, 0x000000],
[-0.2, 0xFF0000],
[-0.1, 0x9A0000],
[0.0, 0x660000],
[0.1, 0xFFFF33],
[0.2, 0xCCCC33],
[0.3, 0x666600],
[0.4, 0x33FFFF],
[0.5, 0x33CCCC],
[0.6, 0x006666],
[0.7, 0x33FF33],
[0.8, 0x33CC33],
[0.9, 0x006600]
]);

class Identity

Visualizer that passes the value through.
methods:

process(val)

Returns value as single element array

Helper functions

The list below contains already implemented helper functions that can be used in custom scripts.

hex2rgb(hex)

Returns an 8-bit RGB array for the given hex value.

rgb2hex(rgb)

Returns the hex representation of an 8-bit RGB vector.

index(x, y)

Return remote sensing index.
For example NDVI = index(B8A, B04)

inverse(x)

Returns 1 / x.
If x is zero the JAVA_DOUBLE_MAX_VAL constant is returned.

combine(col1, col2, alpha)

Returns the hex color of a convex combination of two hex colors:

returned_color = alpha * col1 + (1 - alpha) * col2
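For illustration, these three helpers could be modelled like this (a sketch, not the service's actual source):

```javascript
// hex color (e.g. 0xFF0000) <-> [r, g, b] with 8-bit channels
function hex2rgb(hex) {
  return [(hex >> 16) & 0xFF, (hex >> 8) & 0xFF, hex & 0xFF];
}

function rgb2hex(rgb) {
  return (rgb[0] << 16) | (rgb[1] << 8) | rgb[2];
}

// Convex combination of two hex colors, channel by channel:
// returned_color = alpha * col1 + (1 - alpha) * col2
function combine(col1, col2, alpha) {
  const a = hex2rgb(col1);
  const b = hex2rgb(col2);
  return rgb2hex(a.map((c, i) => Math.round(alpha * c + (1 - alpha) * b[i])));
}

console.log(combine(0xFF0000, 0x0000FF, 0.5).toString(16)); // 800080
```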

Constants

Predefined constants in the list below are already used in visualizers and are also available externally in custom scripts.

const JAVA_DOUBLE_MAX_VAL = 1.7976931348623157E308;
const blueRed = [
    [1.000, 0x000080],
    [0.875, 0x0000FF],
    [0.625, 0x00FFFF],
    [0.375, 0xFFFF00],
    [0.125, 0xFF0000],
    [0.000, 0x800000]
];

const redTemperature = [
    [1.000, 0x000000],
    [0.525, 0xAE0000],
    [0.300, 0xFF6E00],
    [0.250, 0xFF8600],
    [0.000, 0xFFFFFF]
];

const greenWhite = [
    [1.000, 0x000000],
    [0.600, 0x006600],
    [0.300, 0x80B300],
    [0.000, 0xFFFFFF]
];

To avoid problems with negative values in the GeoTIFF format, we have integrated the DEM in a way that:

  • elevation above sea level is stored as it is (e.g. 0-8,848 meters)
  • elevation below sea level is stored from 65535 downward (e.g. "-10 meters" is stored as 65525)

To implement your own visualization, you can write a script along the following lines, e.g. "terrain if the sea level rises by 2 meters":

let elevation;
if (DEM >= 32768) {
  elevation = DEM - 65535.0;  // decode below-sea-level values
} else {
  elevation = DEM;
}

if (elevation <= 2.0) {
  return [0, 0, 1];  // flooded terrain shown in blue
} else {
  return [elevation / 1000, elevation / 1000, elevation / 1000];
}

By employing the dynamically interpreted JavaScript language, and providing some specialized functions you can combine the bands of multispectral satellite data in unprecedented ways.

Here is an example how to tweak the image with a custom script in case of volcano eruption:

return [
  B04 * 2.5 + Math.max(0, B12 - 0.1),
  B03 * 2.5 + Math.max(0, B11 - 0.1),
  B02 * 2.5
];

Erupting volcano

Etna volcano eruption, dated 16. 3. 2017. Image combined from true colour image, overlaid with SWIR bands 11 and 12. (view on Sentinel Playground)

You can find more examples and how to tweak the images for easier detection of Earth surface changes, clouds, snow, shadow, water etc. in our blog post.

Add the EVALSCRIPT parameter to your request URL with the value of the base64 encoded script, as such: evalscript=cmV0dXJuIFtCMDQqMi41LEIwMyoyLjUsQjAyKjIuNV0%3D
More details are described here
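The encoding step can be reproduced like this (shown with the Node.js Buffer API; in a browser you could use btoa instead):

```javascript
// Base64-encode a custom script and URL-encode the result for use as the
// EVALSCRIPT parameter value.
const script = 'return [B04*2.5,B03*2.5,B02*2.5]';
const base64 = Buffer.from(script).toString('base64');
const evalscriptParam = encodeURIComponent(base64);

console.log(evalscriptParam); // cmV0dXJuIFtCMDQqMi41LEIwMyoyLjUsQjAyKjIuNV0%3D
```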