I have the feeling something is wrong with this small project I have compiled.
I’ve created a small project that generates a fuzzy time from two dates given in seconds. It can be found here: https://gitlab.com/memborg/fuzzy_time
When I time how efficient JS and WASM are, WASM loses by a factor of roughly 10 when a million fuzzy times are generated.
A sample of a single run with a million random dates.
js.fuzzy_time: 1879ms (JS average 0.001096 millisecond)
wasm.fuzzy_time: 15579ms (WASM average 0.014035 millisecond)
For both JS and WASM, the average is the time it takes to generate a single fuzzy time.
Firefox performance snippet
Am I drunk, or is it reasonable to assume that the overhead of marshalling between JS and WASM accounts for this kind of slowdown?
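For what it's worth, here is a minimal sketch of where I suspect the cost goes. Tools like wasm-bindgen copy every returned string out of WASM linear memory and decode it with TextDecoder on each call, so a function that returns a new string a million times pays that copy a million times. The snippet below simulates just that decode step against a pure-JS baseline (this is an illustration of the boundary cost, not code from the fuzzy_time repo; the sample string is made up):

```javascript
// Simulating the per-call string-marshalling cost of a JS -> WASM call
// that returns a string, compared to a pure-JS baseline that returns
// an existing string with no copy.

const encoder = new TextEncoder();
const decoder = new TextDecoder("utf-8");

// A sample result sitting in "linear memory" (a plain byte buffer here).
const bytes = encoder.encode("about 3 hours ago");

const N = 1_000_000;
let s = "";

// Pure JS baseline: hand back an existing string, no copy or decode.
let t0 = performance.now();
for (let i = 0; i < N; i++) s = "about 3 hours ago";
const jsMs = performance.now() - t0;

// Simulated boundary: decode the bytes on every call, as wasm-bindgen
// must do for every string returned from WASM.
t0 = performance.now();
for (let i = 0; i < N; i++) s = decoder.decode(bytes);
const marshalMs = performance.now() - t0;

console.log(`baseline: ${jsMs.toFixed(0)}ms, with per-call decode: ${marshalMs.toFixed(0)}ms`);
```

If the decode loop dominates in a run like this, it would suggest the slowdown is mostly boundary cost rather than the Rust code itself, and batching (e.g. passing all million timestamps in one typed array and returning one buffer) might close the gap.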