Firefox is much slower than Chrome or Edge on some Rust documentation pages

This is interesting, as Firefox is supposed to be the sponsor of Rust, yet on certain Rust-related pages it is much slower than its competitors.

Example: SmallVec Array Trait

On at least two computers, running Windows 10 (compared with Edge) and macOS (compared with Chrome), the page loads quickly in the competitor browsers.

In the developer tools, script loading appears to be the blocking item.

I don’t know how to diagnose this or how I can help improve Firefox. I already filed a bug report a few weeks ago, but that was about loading local documentation; this time it is about a hosted page.

3 Likes

Huh. As another set of data points (in macOS):

  1. Firefox (developer edition, 66.0b7) loads and renders this page in ~3 s.
  2. Chrome (72.0.3626.109) spins for ~10-30 seconds. Unresponsive tab, eventually resulting in an “Aw snap!” crash.

EDIT: to be clear, this is attempting to load your example SmallVec Array trait link.

I’m opening the documentation page (if you were referring to $ rustup doc) in ~1 second on Debian Stretch with FF 65 (64-bit).
Dell, Core i7 7th gen, 8 GB RAM.
Haven’t tried Chromium, because 1 second is fast enough for me.

Edit: with the example given above, it’s around 3 seconds.

The page loads and renders within a second for me in FF 65 on Windows 10, but the tab then becomes unresponsive and stays that way for quite a long time, for two whole minutes in fact. I wonder how much is due to the 17MB search-index.js?

I hate the web.

8 Likes

I have to be clear.

When I compare the performance, I mean the time from when you click the link until the page becomes functional — at the least, you can click on the links. When measuring this, I count from the time I hit the “refresh” button until it stops spinning.

So my measurement is
Chrome 72.0.3626.109: ~10 seconds + crash (showing the “Aw, snap” page)
Edge 42.17134.1.0: ~3 seconds
Firefox 65.0.1 (64-bit): ~30 seconds.

Machine: Intel Core i5-8500 8GB
OS: Windows 10 version 1803 (OS build 17134.590)

Which version of Rust was used to generate the docs you’re viewing? There has been a regression since the last stable but it should’ve improved a bit with https://github.com/rust-lang/rust/pull/57884

I am not the crate creator, so I have no idea.

My measurements on my PC (Manjaro, Linux 4.19):
Chromium 72.0: approx. 10 seconds
Firefox 65.0: approx. 3 minutes

Actually, my measurements are not network-bound — I’m sure it is the same for local documents, because I have seen this before. So the measurements here are not of the first load (which could take a long time to transfer the file); they are of a “refresh” of the page, where the big JS file should already be cached.

Finally, I figured out that Edge’s success is somewhat “cheating”: the page loads successfully, and you can click on the links and do whatever you want; however, if you close the page within roughly the same time period as Firefox’s hang, you receive an “unresponsive page” notification.

So my guess is that Edge is still loading the JS in the background, but releases the UI so you can interact with the page. However, I still think highly of this behaviour: in a case like this, it appears much better than its competitors. Maybe Edge is the browser I should give another try?

I’m reasonably certain retep was talking about web technology, not network access.

Not much point; it’s dead now, due to be replaced by yet another Chrome skin. Unless you’re on a Mac, your choices now are basically Firefox or Chrome.


To give a clearer indication of what the issue is:

That’s nearly a minute during which the page is just frozen. It’s rendered and scrollable, but it’s otherwise unresponsive. You can’t click on links, select text, etc.

The waterfall display has a huge chunk missing for most of that, with the last 12 seconds taken up by executing search-index.js.

Anyway, the “load then hang” behaviour is probably because the script tag is marked defer. I don’t know if setting async as well would help. The fundamental problem here is that this approach just does not scale. If you navigate to a different page, you get the same hang because the browser has to re-load that whole script. Go back to the first page, same hang again. And yes, it happens even with local copies.

Personally, I’d be happiest if we just burned the whole HTML ecosystem to the ground and started again with something not pathologically awful, but this is what we’re stuck with. I don’t know how, or even if, this can be fixed whilst still keeping a search index and local support.

Switching to a “single page” design would probably help after the initial load, but AFAIK that’s incompatible with local access. Doing server-side searches, same issue. We could start having rustup set up a system service for serving doc pages, but that’s pretty gross and means either not really fixing online docs, or forcing docs.io to do way more work to handle searches.

7 Likes

You don’t. An entire search index that lives inside a user’s browser, rather than on an actual search-index web server, is just a scaling problem waiting for someone to wander into. Local docs without a server should be Ctrl+F-able (or have one giant print.html that you can Ctrl+F to search); anything else is going to run into problems like this.

Of course it can be fixed. One way is to permanently store the search index in the browser (using IndexedDB). This means that subsequent page views will be faster.

Another way is to make the page responsive to user input while it’s still loading. This can be achieved by pausing JS at regular intervals for a few milliseconds – just long enough so the browser can process events and render the page.
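As a rough sketch of that idea (the function names here are illustrative, not rustdoc’s actual code), the index-building loop could be split into chunks, with a yield to the event loop between chunks:

```javascript
// Illustrative sketch: process a large array in small chunks,
// yielding between chunks so the browser can handle input events
// and repaint. setTimeout(..., 0) is a portable way to yield.
function splitIntoChunks(items, chunkSize) {
  const chunks = [];
  for (let i = 0; i < items.length; i += chunkSize) {
    chunks.push(items.slice(i, i + chunkSize));
  }
  return chunks;
}

async function processInChunks(items, chunkSize, handleItem) {
  for (const chunk of splitIntoChunks(items, chunkSize)) {
    chunk.forEach(handleItem);
    // Give the event loop a chance to run pending tasks
    // (clicks, scrolling, rendering) before the next chunk.
    await new Promise(resolve => setTimeout(resolve, 0));
  }
}
```

With the chunk size tuned so that each slice takes only a few milliseconds, the page should stay clickable even while the full index is still being processed.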

And, of course, fixing this bug in Firefox should help.

Another possibility is to disable the search for very big projects, like servo. Instead, we could offer a command-line tool to search the docs.

Note that Github Pages only allows static content, so a full-blown server wouldn’t work for everybody.

2 Likes

Another way I can think of is to break the big JS file into pieces that can be loaded on demand. The browser would load a “bootstrap” part of the file; then, once someone performs a search, the specific piece of code needed can be loaded. This way, the cost does not affect people who don’t search.
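A minimal sketch of that lazy-loading idea (`loadIndex` is a hypothetical stand-in for however the index would actually be fetched, e.g. a dynamic `import()` or an injected script tag):

```javascript
// Illustrative: load the search index only on the first search,
// and cache the in-flight promise so that concurrent searches
// don't trigger a second download.
let searchIndexPromise = null;

function getSearchIndex(loadIndex) {
  if (searchIndexPromise === null) {
    // First search: kick off the (expensive) load exactly once.
    searchIndexPromise = Promise.resolve().then(loadIndex);
  }
  return searchIndexPromise;
}
```

Users who never open the search box would never pay the download or parse cost at all.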

Another big thought: instead of JS, why not use WebAssembly? Would it be quicker to load?

I don’t care whether it’s another Chrome skin or not. For me, the fact is that it provides the best experience for users like me (who don’t search), even though it is just hiding the problem.

1 Like

The point of the “it’s dead now” statement is that the version we’re comparing against Chrome is going to disappear in the near future, so pretty soon there won’t be a noticeable difference between how Chrome handles the index loading versus how Edge does it.

Yes, I think simply setting async on the script tag should and would help. Setting defer (as is actually done) is probably not much use though, since the script tag is the last thing in the body anyway.

Another part of the fix might be to use (HTML5) local storage for the search index, and only reload the data when it has expired (and potentially split the index into multiple parts that expire at different times).
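A sketch of what that caching could look like (assumptions: the storage object is injected rather than hard-coded to `window.localStorage`, so the logic can run outside a browser, and `fetchIndex` stands in for the actual index load):

```javascript
// Illustrative: cache the search index with a saved-at timestamp
// and only refetch once it has expired.
function getCachedIndex(storage, key, maxAgeMs, now, fetchIndex) {
  const raw = storage.getItem(key);
  if (raw !== null) {
    const cached = JSON.parse(raw);
    if (now - cached.savedAt < maxAgeMs) {
      return cached.index; // still fresh: skip the expensive reload
    }
  }
  // Missing or expired: fetch and store with a fresh timestamp.
  const index = fetchIndex();
  storage.setItem(key, JSON.stringify({ savedAt: now, index }));
  return index;
}
```

Note that this only saves the download and parse of the raw file; the browser still has to deserialize the stored index on each page, so it helps most when combined with the other fixes.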

Hey, Firefox developer and Rustacean too here 🙂

These issues are generally worth knowing about. For this I filed https://bugzilla.mozilla.org/show_bug.cgi?id=1530212.

I was hoping that this was the same issue as https://bugzilla.mozilla.org/show_bug.cgi?id=1516780 (for which I wrote a patch a couple days ago), but seems like for this particular page, most time is spent in JS.

For those that have multi-minute waits on this page, I’d appreciate a lot if you could take a profile using Firefox Nightly with the Firefox profiler (https://profiler.firefox.com/) if you had the time (just in case there’s something going on that I’ve missed in my profile).

There are docs at https://perf-html.io/docs/ on how to take profiles and export them. If you can, pasting the link on that bug I just filed would be very useful!

5 Likes

Might I add that a larger, and probably more frequently visited, page is the Iterator trait page. It provides a larger sample to test with.

Right, that’s what https://bugzilla.mozilla.org/show_bug.cgi?id=1516780 is about. That’s a fairly different problem, though: we’re thrashing layout and recomputing styles like crazy when fonts load.

2 Likes

Right. I saw that two profiles were uploaded, and they clearly show that the bottleneck is GC.

Which means Edge’s GC is much more efficient — when refreshing the smallvec page, I can see Edge’s memory usage spike to about 1 GB and then drop back down very quickly.