I am trying to access the new Harmonized Landsat Sentinel (HLS) data through STAC. I have a handful of polygons that I need time-series imagery for, from early 2023 to ~October 2023. I have tried many methods to read the imagery without downloading it (there are a lot of images), and my understanding from this (https://gdalcubes.github.io/source/tutorials/vignettes/gc02_AWS_Sentinel2.html#converting-stac-items-to-image-collections) is that STAC is the most straightforward method.
I either run into authentication issues (which shouldn't be an issue anymore) or an issue getting the data frame or list into a usable format.
I’m not overly familiar with accessing this type of data in R, but I am familiar with manipulating it once it is pulled into R in whatever format I can get.
I have generated my Earthdata login:
library(earthdatalogin)  # provides edl_netrc()

edl_netrc(username = "xxxxxxx",
          password = "xxxxxxxx",
          cloud_config = TRUE)
I also set my GDAL config according to the Earthdata documentation, just in case:
setGDALconfig("GDAL_HTTP_UNSAFESSL", value = "YES")
setGDALconfig("GDAL_HTTP_COOKIEFILE", value = ".rcookies")
setGDALconfig("GDAL_HTTP_COOKIEJAR", value = ".rcookies")
setGDALconfig("GDAL_DISABLE_READDIR_ON_OPEN", value = "EMPTY_DIR")
setGDALconfig("CPL_VSIL_CURL_ALLOWED_EXTENSIONS", value = "TIF")
I can run this search successfully (most of the time):
stac_search1 <- rstac::stac("https://cmr.earthdata.nasa.gov/stac/LPCLOUD/") %>%
  rstac::stac_search(collections = c("HLSS30.v2.0", "HLSL30.v2.0"),
                     datetime = "2023-04-01/2023-06-30",
                     bbox = c(miss["xmin"], miss["ymin"],
                              miss["xmax"], miss["ymax"])) %>%
  rstac::post_request()
and it returns:
> class(stac_search1)
[1] "doc_items" "rstac_doc" "list"
> head(stac_search1)
$type
[1] "FeatureCollection"
$stac_version
[1] "1.0.0"
$numberMatched
[1] 269
$numberReturned
[1] 10
$features
$features[[1]]
###Item
- id: HLS.S30.T12TUT.2023093T182921.v2.0
- collection: HLSS30.v2.0
- bbox: xmin: -113.67207, ymin: 46.83564, xmax: -112.18335, ymax: 47.84737
- datetime: 2023-04-03T18:41:08.454Z
- assets:
B03, Fmask, B12, B11, B09, B08, B01, VZA, B02, B8A, B05, SAA, B04, SZA, B10, B07, VAA, B06, browse, metadata
- item's fields:
assets, bbox, collection, geometry, id, links, properties, stac_extensions, stac_version, type
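For what it's worth, the individual asset hrefs are visible on each returned item, so the URLs themselves seem fine (a quick check using rstac helpers):

# asset names present across the returned items
rstac::items_assets(stac_search1)
# href of one red-band COG on the first item, just to see what the URLs look like
stac_search1$features[[1]]$assets[["B04"]]$href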
However, when I try to convert the stac_search1 rstac_doc to an image collection with stac_image_collection():
stac_search_col1 <- stac_image_collection(stac_search1$features,
                                          property_filter = function(x) {x[["eo:cloud_cover"]] < 20})
I get this error:
Error in data.frame(id = 1:length(bands), name = bands, type = "", offset = NA, :
arguments imply differing number of rows: 2, 0, 1
Timing stopped at: 0 0 0
The stac_search1 object is an rstac_doc and a list, and its features element is a plain list of items. I'm not sure how a STAC item is supposed to be formatted, so I don't know how to change it to meet what stac_image_collection() expects.
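My reading of the gdalcubes documentation is that the call should look roughly like the sketch below, with the assets named explicitly. The asset names are my guesses from the bands listed in the output above, and I'm not sure whether the /vsicurl/ prefix via url_fun is actually needed for HLS:

# guessed HLS asset names (S30 naming); Fmask included for later masking
hls_assets <- c("B02", "B03", "B04", "B8A", "Fmask")

stac_search_col1 <- stac_image_collection(
  stac_search1$features,
  asset_names = hls_assets,
  property_filter = function(x) {x[["eo:cloud_cover"]] < 20},
  url_fun = function(url) paste0("/vsicurl/", url)
)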
I have tried many approaches and run into issues every time.
I've tried using httr::POST and GET (which returns an HTTP 500 error unless I include my authentication inline):
search_body <- list(limit = 179,
                    datetime = datetime,
                    bbox = as.vector(miss),
                    collections = hls_col)
search_req1 <- httr::POST(search_URL, body = search_body, encode = "json") %>%
  httr::content(as = "text") %>%
  jsonlite::fromJSON()
and I get a very similar-looking result with the same number of features, bands, cloud cover, etc.
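From the rstac result (stac_search1) I can at least pull a flat vector of asset URLs (a sketch using rstac::assets_url(), restricted to the red band), but I don't know how to get from a vector of URLs to a cube:

# hrefs of every B04 (red) asset, prefixed with /vsicurl/ so GDAL streams
# them instead of downloading whole files
red_urls <- rstac::assets_url(stac_search1,
                              asset_names = "B04",
                              append_gdalvsi = TRUE)
head(red_urls)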
I then run this shortened function:
process_remote_file <- function(url) {
  temp_file <- tempfile(fileext = ".tif")
  # download the asset to a temp file (credentials redacted)
  httr::GET(url,
            httr::write_disk(temp_file, overwrite = TRUE),
            httr::authenticate(user = "xxxxxxxx", password = "xxxxxxxxxx"))
  raster_layer <- raster::raster(temp_file)
  stars_obj <- stars::st_as_stars(raster_layer)
  stars_obj
}
This is my attempt to convert each file to a raster and then into a stars object, which I could then turn into a data cube. I know my request is (occasionally) going through, but I cannot get the data to a usable point. My end goal is a data cube for time-series analysis, so any help you can offer would be great. Admittedly, R is probably not the best place to do this kind of work; I'm just better with R than Python or Java.
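To be concrete, the end state I'm after is something like the following (a rough sketch assuming the image collection above could actually be built; the UTM zone, 30 m resolution, and monthly aggregation are placeholders, and I'm assuming miss is an st_bbox in EPSG:4326):

library(gdalcubes)
library(sf)

# cube_view() wants the extent in the cube's CRS, so reproject my lon/lat
# bbox into a placeholder UTM zone first
bbox_utm <- st_bbox(st_transform(st_as_sfc(miss), 32612))

v <- cube_view(
  srs = "EPSG:32612",                                   # placeholder UTM zone
  extent = list(left = as.numeric(bbox_utm["xmin"]),
                right = as.numeric(bbox_utm["xmax"]),
                bottom = as.numeric(bbox_utm["ymin"]),
                top = as.numeric(bbox_utm["ymax"]),
                t0 = "2023-04-01", t1 = "2023-06-30"),
  dx = 30, dy = 30, dt = "P1M",                         # 30 m, monthly
  aggregation = "median", resampling = "bilinear"
)

# the data cube I would then use for the time-series analysis
cube <- raster_cube(stac_search_col1, v)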