How can I merge an `index.html` and a `my_fig.svg` into one `.html` file using Rust?

I have the following folder, `my_folder`, which contains two files:

```
my_folder
├── index.html
└── my_fig.svg
```
The index.html contains an `<a>` element whose `href` points to my_fig.svg:

```html
<a href="my_fig"></a>
```

And my_fig.svg has the following content:

```xml
<svg>some code</svg>
```

Is there a way I can merge my_fig.svg into index.html? The expected result looks like:

```html
<svg>some code</svg>
```

I assume you are not asking whether this is generally possible in HTML (it is, and would only require you to copy-paste the SVG into the HTML file), but whether you can do it dynamically at runtime? In that case, have you considered using a template engine like tera for server-side dynamically generated HTML pages? A template engine would allow you to replace the `<a href="my_fig"></a>` element with a template expression like `{{ svg }}`. Upon receiving a request, you can pass the SVG as a string to the template engine, which then renders the proper HTML you can send to your client. I believe this would be much easier than parsing the DOM of your index.html with some library, searching for `<a>` elements with an `href` attribute referencing the SVG, and replacing each element with the contents of the referenced file.
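The substitution idea can be sketched without pulling in tera itself. The snippet below is a minimal, dependency-free illustration of the same idea, using `str::replace` on a `{{ svg }}` placeholder; with tera you would build a `Context` and render the template instead, and the placeholder name `svg` is just an assumption for this example:

```rust
fn main() {
    // Hypothetical template: the `<a href="my_fig"></a>` element has been
    // replaced by a `{{ svg }}` placeholder, as a template engine would expect.
    let template = r#"<html><body>{{ svg }}</body></html>"#;

    // The SVG contents you would read from my_fig.svg (inlined here).
    let svg = "<svg>some code</svg>";

    // With tera you would pass `svg` through a `Context` and render;
    // plain `str::replace` stands in for that here.
    let rendered = template.replace("{{ svg }}", svg);

    assert_eq!(rendered, "<html><body><svg>some code</svg></body></html>");
}
```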


I'm trying to do something like the following:

  1. I input some commands in the terminal.
  2. My computer receives the commands and generates a Rust project.
  3. The project has a dependency on the crate criterion, and it is executed with cargo bench.
  4. After the project finishes running, criterion generates a folder containing index.html and some other .svg files.
  5. Then I try to send the index.html somewhere else, where it will be shown as a styled .html page, and only one .html file can be received there.
  6. After the index.html is sent somewhere else, the href links in it will be invalid. So I have to merge the whole folder into one .html file, which leads to the problem above.
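The merging step in (6) can be sketched with only the standard library, assuming links of the exact form `<a href="name"></a>` as in your snippet. The function name `inline_svgs` and the in-memory file list are my own invention for the example; in the real project you would build the list by reading the criterion folder with `std::fs`:

```rust
/// Replace each `<a href="NAME"></a>` link in `html` with the contents of
/// the corresponding SVG file. `svgs` maps a link name to that file's
/// contents; in the real project you would populate it with
/// `std::fs::read_dir` and `std::fs::read_to_string`.
fn inline_svgs(html: &str, svgs: &[(&str, &str)]) -> String {
    let mut out = html.to_string();
    for (name, contents) in svgs {
        // Build the exact anchor markup we expect and splice in the SVG.
        let link = format!(r#"<a href="{name}"></a>"#);
        out = out.replace(&link, contents);
    }
    out
}

fn main() {
    let index = r#"<div><a href="my_fig"></a></div>"#;
    let merged = inline_svgs(index, &[("my_fig", "<svg>some code</svg>")]);
    assert_eq!(merged, "<div><svg>some code</svg></div>");
}
```

Note this only works if the links appear byte-for-byte as constructed here; for anything less regular, a real HTML parser (as below) is the safer route.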

You could parse the DOM of the HTML file (for example with a crate like scraper), search for the elements you want to replace, and replace them with the right SVGs. Or you could tarball the whole criterion output, send the tarball to the desired destination, and unpack it there.
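The tarball alternative needs no Rust at all; a minimal command-line sketch (the folder name `my_folder` is taken from your post, the archive name is an assumption):

```shell
# Pack the whole criterion output folder into one archive
tar -czf report.tar.gz my_folder

# ...transfer report.tar.gz however you like, then unpack it at the destination:
tar -xzf report.tar.gz
```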

Edit: As far as I could tell, scraper does not allow DOM manipulations. I created a crude example with the tl crate, though I'm not overly happy with the interface (querying works fine but replacing a node in the DOM could be handled better in my opinion, i.e. using Node::Raw feels weird). You may be able to find a better crate or approach:

```toml
# Cargo.toml
[dependencies]
tl = "0.7"
```

```rust
use tl::{Node, NodeHandle};

fn main() {
    let index = r#"<div><a href="my_fig"></a></div>"#;

    let svg = r#"<svg><circle cx="50" cy="50" r="40" fill="blue"/></svg>"#;

    let expect = r#"<div><svg><circle cx="50" cy="50" r="40" fill="blue"/></svg></div>"#;

    let mut dom = tl::parse(index, tl::ParserOptions::default()).unwrap();

    // Note that you could filter for `href="my_fig"` here directly, rather than doing the
    // loop and if statement below, but I assumed this would be closer to your use-case,
    // where you want to replace multiple svg files rather than just one.
    let anchors: Vec<NodeHandle> = dom.query_selector("a[href]").unwrap().collect();

    for anchor in anchors {
        let parser_mut = dom.parser_mut();

        let anchor = anchor.get_mut(parser_mut).unwrap();

        if anchor.as_tag().unwrap().attributes().get("href").unwrap().unwrap() == "my_fig" {
            // Replace the anchor node with the raw SVG markup.
            *anchor = Node::Raw(svg.into());
        }
    }

    assert_eq!(&dom.outer_html(), expect);
}
```