I am exploring language options for processing large geospatial data sets (NetCDF, GeoTIFF). I've heard good things about Rust from other software engineers, and I'm wondering whether it's a good fit for processing/manipulating data, as that wasn't one of the highlighted strengths on the website.
Rust is well-suited for this. It gives you control over every byte of memory, so you can efficiently represent your data sets. It's fast enough that you can process every pixel in a TIFF.
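To make the "process every pixel" point concrete, here's a minimal sketch of per-pixel band math in plain Rust. The hard-coded slices stand in for decoded GeoTIFF bands (in a real pipeline you'd read them via a crate like `tiff` or GDAL bindings, which this example deliberately avoids), and the `ndvi` function is just an illustrative example of a common remote-sensing computation:

```rust
/// Per-pixel NDVI (normalized difference vegetation index):
/// (nir - red) / (nir + red), guarding against division by zero.
fn ndvi(red: &[f32], nir: &[f32]) -> Vec<f32> {
    red.iter()
        .zip(nir)
        .map(|(&r, &n)| {
            let sum = r + n;
            if sum == 0.0 { 0.0 } else { (n - r) / sum }
        })
        .collect()
}

fn main() {
    // Two tiny single-band "rasters" standing in for decoded GeoTIFF bands.
    let red = [0.1_f32, 0.2, 0.3, 0.0];
    let nir = [0.5_f32, 0.6, 0.3, 0.0];
    let out = ndvi(&red, &nir);
    println!("{:?}", out);
}
```

The slice iterators compile down to a tight loop with no bounds checks in the hot path, which is the kind of thing that makes whole-raster passes cheap in Rust.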
I'd say it depends on what your priorities are. If your top priority is fast iteration on your datasets and doing quick experiments, and you don't already know Rust, then I'd wager Rust would be a pretty bad choice.
But if you're looking to build some sort of long-lived data processing pipeline where performance is one of the primary concerns, then yeah, Rust is the best language for writing high-performance stuff like that.
Thanks @bgr360 for the clarification. I am looking into creating a data processing pipeline: ingesting a large amount of satellite data as GeoTIFF, converting it to NetCDF, then possibly 're-creating' the data in GeoTIFF format.