Web app for exploring scientific data sets on compute server


How should I build an interactive browser / search tool / viewer for data stored
on a compute server?


Please bear in mind that I have no experience in modern web development, so my
questions might be naive or misguided and things which may be obvious to web
developers are likely to be unfamiliar to me.

We have a compute server on which we generate and store gigabytes of data
resulting from computationally expensive processes. Among these data are
sequences of monochromatic 3D images produced by an iterative process:
f32s describing the values ascribed to cuboidal voxels in a cuboidal field of
view. (These images are unlikely to exceed 350x350x350 voxels.) We tend to keep
these sequences in huge directory trees with various path components encoding
metadata describing multiple parameters relating to the circumstances under
which the images were produced. (Coming up with a better metadata-handling
scheme is another issue I'd like to address, but it's mostly orthogonal to my
current question.)

We need to view axis-aligned slices of these images, along any of the three
axes, at arbitrary positions and with arbitrary thickness. For this purpose we
have a (supposedly prototype) viewer written in Python which loads the images as
numpy arrays and uses Matplotlib's imshow to display slices, whose orientation,
position and thickness can be modified interactively within the viewer. Using
this viewer across the network
gives a very poor experience, so we tend to poke around the directory structure,
download potentially interesting images to the local machine and run the viewer
there. This process is tedious, error-prone and user-unfriendly in general.
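For concreteness, the core of the prototype's slicing amounts to something like
this (function and variable names here are hypothetical; the real viewer adds
the interactive controls):

```python
import numpy as np

def thick_slice(volume: np.ndarray, axis: int, start: int, thickness: int) -> np.ndarray:
    """Average `thickness` consecutive voxel layers along `axis`,
    producing the 2D array that gets handed to Matplotlib's imshow."""
    index = [slice(None)] * 3
    index[axis] = slice(start, start + thickness)
    return volume[tuple(index)].mean(axis=axis)

# A toy 4x4x4 "image"; real volumes are f32 arrays up to ~350^3 voxels.
vol = np.arange(64, dtype=np.float32).reshape(4, 4, 4)
img = thick_slice(vol, axis=0, start=1, thickness=2)
# img.shape == (4, 4); plt.imshow(img) would then display the slice
```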

We also need to calculate figures of merit (FOM) on the basis of features of
these images, and view graphs of their evolution in a sequence of images
produced by the iterative process.
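To give an idea of what those graphs contain, here is a minimal sketch; the
actual FOMs are domain-specific, and the peak-voxel-value FOM used here is
purely a placeholder:

```python
import numpy as np

def fom_trace(volumes):
    """One figure-of-merit value per iteration, ready to be plotted
    against iteration number (the FOM shown is a placeholder)."""
    return [float(v.max()) for v in volumes]

# A fake 3-iteration sequence of 2x2x2 images
seq = [np.full((2, 2, 2), i, dtype=np.float32) for i in range(3)]
trace = fom_trace(seq)
# trace == [0.0, 1.0, 2.0]
```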

I am looking to improve the ergonomics of viewing the images and their
corresponding FOM graphs, by writing a viewer that can run on users' local
machines. The viewer should allow them to search for relevant images, select
viewing parameters, view the FOM graphs (perhaps triggering their calculation
if they haven't been generated yet), select which FOMs to include in the graph,
adjust the title, etc., and save the resulting image cross-sections or graphs on
the local filesystem.

I was wondering whether a Rust/WASM web application would be a good way to solve
this problem.

While I like learning programming languages, and have enjoyed using many and
varied ones down the years, I decided back in 1999 that I thoroughly dislike
JavaScript, and have watched in horror as it grew from strength to strength on
the web. WASM gives me hope. A few years back, when I tried to do some toy
Rust-WASM projects, I was dismayed by the amount of JS glue that was needed to
make them work, and ran away in disgust. So I'm encouraged to see this

     No JavaScript

     Had enough of JavaScript? So have we.

     Create apps using Sycamore without touching a single line of JS.

displayed prominently on https://sycamore-rs.netlify.app/

But it seems that Sycamore only targets the frontend. I know very little about
building web backends, but I'm vaguely aware that backend frameworks such as
Rocket are often paired tightly with databases via ORMs such as Diesel. I don't
think I need a database: my database is the file system of the compute server,
and the images it contains. I imagine that running some fuzzy searcher (such as
fzf or skim) over the metadata encoded in the paths would allow the user
to identify interesting files to view.
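As a sketch of what I mean by filtering on path-encoded metadata (the directory
layout and tokens below are entirely made up, and fzf/skim would do fuzzier
matching than this exact-token version):

```python
def filter_paths(paths, tokens):
    """Keep paths whose components contain every query token:
    a crude stand-in for fzf/skim-style fuzzy filtering."""
    return [p for p in paths if all(t in p.split("/") for t in tokens)]

# Hypothetical directory layout encoding metadata in path components
paths = [
    "runs/alpha=0.1/iter_0007/image.npy",
    "runs/alpha=0.2/iter_0007/image.npy",
]
hits = filter_paths(paths, ["alpha=0.1"])
# hits == ["runs/alpha=0.1/iter_0007/image.npy"]
```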


In brief, my requirements are

  • in a (web?-)app running on a local machine
  • search for files in a restricted subtree of the compute server's filesystem
  • view 2D slices through 3D data and, optionally, save locally
  • launch computation of derived data on the server, view graphs, save locally
  • No JavaScript, please! (Rust/WASM, no problem)


Can you suggest an appropriate architecture for such a browser?