Reading COG Geospatial data from AWS S3 by window with GDAL

Hi there,

I'm working on moving code from Python to Rust, mostly for geospatial data (satellite data and others), and I need to use a package called GDAL (a widely used library in the earth sciences, with bindings in multiple languages). My question to the community: does anyone have experience reading COG (Cloud Optimized GeoTIFF) data directly with a coordinate window in the Rust GDAL bindings, especially for data on cloud storage like AWS S3? Any suggestion on how to make it work would be greatly appreciated.

My block of code below works fine for getting the data from AWS S3, but after searching through all of the GDAL documentation I couldn't find a way to make a projWin slice part of the process, like the one gdal_translate offers.

Any idea how I can pass the box of coordinates to the read so that I only get the data for the window I need?

Thank you for the help!

The box in the comment below is the correct one for this specific image.
And this is the specific GDAL crate version:

# Cargo.toml
[dependencies]
gdal = "0.15.0"

use gdal::raster::RasterBand;
use gdal::{Dataset, Metadata};
use gdal::config;
use std::path::Path;

pub fn read_cog_tif() {
    config::set_config_option("AWS_REGION", "us-west-2").unwrap();
    config::set_config_option("AWS_SECRET_ACCESS_KEY", "SECRET").unwrap();
    config::set_config_option("AWS_ACCESS_KEY_ID", "KEY_ID").unwrap();
    config::set_config_option("AWS_REQUEST_PAYER", "requester").unwrap();

    let path = Path::new("/vsis3/usgs-landsat/collection02/level-2/standard/oli-tirs/2023/223/072/LC09_L2SP_223072_20230410_20230412_02_T2/LC09_L2SP_223072_20230410_20230412_02_T2_SR_B6.TIF");
    let dataset = Dataset::open(path).unwrap();

    let rasterband: RasterBand = dataset.rasterband(1).unwrap();

    // The coordinate box below is the target block to extract:
    // projWin = [479940.125947, -1909488.358163318, 481823.591327, -1911120.3908325522],
    // projWinSRS = 'EPSG:32622'

    // read_as takes a pixel offset and window size, not georeferenced coordinates
    if let Ok(rv) = rasterband.read_as::<f32>((20, 30), (5, 3), (5, 3), None) {
        println!("{:?}", rv.data);
    }
}
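One note on the windowing question: `read_as` works in pixel/line coordinates, so a projWin given in georeferenced coordinates has to be converted through the dataset's geotransform first (`dataset.geo_transform()` in the gdal crate returns the six-element array). Below is a minimal sketch of that conversion; the geotransform values in `main` are illustrative stand-ins (not read from the real file), and it assumes a north-up image with no rotation terms:

```rust
// Sketch: convert a projWin (ulx, uly, lrx, lry) in the dataset's SRS into
// the pixel offset/size pair that RasterBand::read_as expects.
// A GDAL geotransform gt maps pixel (col, row) to georeferenced (x, y):
//   x = gt[0] + col * gt[1]
//   y = gt[3] + row * gt[5]
// (assuming a north-up image, i.e. gt[2] == gt[4] == 0).
fn projwin_to_pixel_window(
    gt: &[f64; 6],
    ulx: f64,
    uly: f64,
    lrx: f64,
    lry: f64,
) -> (isize, isize, usize, usize) {
    let x_off = ((ulx - gt[0]) / gt[1]).floor() as isize;
    let y_off = ((uly - gt[3]) / gt[5]).floor() as isize;
    let x_size = ((lrx - ulx) / gt[1]).ceil() as usize;
    let y_size = ((lry - uly) / gt[5]).ceil() as usize;
    (x_off, y_off, x_size, y_size)
}

fn main() {
    // Hypothetical 30 m Landsat-style geotransform; in real code this would
    // come from dataset.geo_transform().unwrap().
    let gt = [400_000.0, 30.0, 0.0, -1_900_000.0, 0.0, -30.0];
    let (x, y, w, h) = projwin_to_pixel_window(
        &gt,
        479940.125947,
        -1909488.358163318,
        481823.591327,
        -1911120.3908325522,
    );
    // The read would then be:
    //   rasterband.read_as::<f32>((x, y), (w, h), (w, h), None)
    println!("offset=({x}, {y}), size=({w}, {h})");
}
```

Note that this only works if the projWin is already in the raster's own SRS (as in the comment above, EPSG:32622); reprojecting a box from another SRS would need an additional coordinate transformation.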

A quick search for "translate" in the Rust GDAL docs shows that the Rust wrapper for GDALTranslate() hasn't been implemented yet.

I'm not sure whether the multi-dimensional translate does what you want; even if it does, it doesn't seem that easy to use. Perhaps you could try installing GDAL's command-line utilities and then calling gdal_translate as an external process?

Examples below:
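As a sketch of that external-process route: `-projwin ulx uly lrx lry` and `-projwin_srs` are real gdal_translate options, and the call can be made with `std::process::Command`. The input/output paths here are placeholders, and it assumes gdal_translate is installed and on PATH:

```rust
use std::process::Command;

// Sketch: crop by projWin by shelling out to the gdal_translate CLI.
// -projwin and -projwin_srs are real gdal_translate flags; the source and
// destination paths below are placeholders, not from the original post.
fn translate_args(src: &str, dst: &str, projwin: [f64; 4], srs: &str) -> Vec<String> {
    let mut args = vec!["-projwin".to_string()];
    args.extend(projwin.iter().map(|v| v.to_string()));
    args.push("-projwin_srs".to_string());
    args.push(srs.to_string());
    args.push(src.to_string());
    args.push(dst.to_string());
    args
}

fn main() {
    let args = translate_args(
        "/vsis3/some-bucket/scene.TIF", // placeholder input
        "crop.tif",                     // placeholder output
        [479940.125947, -1909488.358163318, 481823.591327, -1911120.3908325522],
        "EPSG:32622",
    );
    // AWS credentials and region would come from the environment (AWS_* variables).
    match Command::new("gdal_translate").args(&args).status() {
        Ok(status) => println!("gdal_translate exited with {status}"),
        Err(e) => eprintln!("could not run gdal_translate: {e}"),
    }
}
```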


Hi @hax10, thanks for the reply. My problem is that this service will run in a Rust-only environment, so everything needs to be compiled as part of the software.

I searched through all the options in the Rust crates and didn't find an alternative to GDAL (the library is so universal that almost all geo software uses it).
So maybe I will need to implement the wrapper myself? (I'm not at that level yet...)

What kind of environment is this? I ask because GDAL is mostly C++ and the Rust GDAL crate is only a set of wrappers over it, so the environment would have to support pre-compiled non-Rust code.

The Rust GDAL crate developers are active both on GitHub (GitHub - georust/gdal: Rust bindings for GDAL) and on the Georust Discord (GeoRust). Your best bet is to speak to them directly about either solving your problem or contributing code to the crate.

The environment it will be running in is the Internet Computer: a full-stack Rust backend system.

Hi @urschrei I'm there in the Discord channel, and I posted the question. No reply yet, so I came to the bigger community. :wink:

Interesting, I hadn't heard of the Internet Computer before. It runs on WebAssembly, so your Rust code may face some limitations too (not all Rust crates work properly when compiled to WASM).

It's an absolutely great piece of tech: you run a canister (a smart-contract bundle that works like a container) and have RAM, CPU, and state in the same place, with operating costs similar to the cloud but with blockchain-level cryptography and resilience. Quite a beauty. It works quite well with Rust compiled to WASM.

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.