Cheating to learn Rust by converting JavaScript to Rust, now I want to improve the code

Alright. I see that. My question was what, if any, is the technical difference between using rustc directly and using cargo which just executes rustc?

I was using rustc herein until folks brought up using cargo.

I'm not following what is special about using cargo?

cargo — or another build tool, but cargo is the standard one for Rust — is practically necessary to write Rust programs that have dependencies, which most non-trivial Rust programs do. Therefore, it is generally recommended to always use Cargo unless you have a special application which requires using something else. In principle, whatever Cargo does can be done by downloading files and running rustc instead — many times, with very long lists of options. People use Cargo because it automates building large programs.

But there is no specific advantage to using Cargo for a single-crate program with no dependencies, as you are currently trying to work with.

The only reason I specifically brought up Cargo at first (and I apologize for contributing to the confusion) is because someone else mentioned profiles, and it's important to understand that profiles are only a Cargo concept, so any advice concerning profiles has to be translated to rustc options if you are using rustc and not Cargo.
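To make that translation concrete: each Cargo profile key maps (roughly) to a rustc `-C` codegen option, so profile advice carries over to plain rustc invocations. A sketch of common mappings, with the approximate rustc equivalents as comments:

```toml
# Cargo.toml profile settings and (roughly) the rustc flags Cargo passes for them
[profile.release]
opt-level = 3           # rustc -C opt-level=3
debug = false           # rustc -C debuginfo=0
overflow-checks = false # rustc -C overflow-checks=off
panic = "abort"         # rustc -C panic=abort
```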

3 Likes

is practically necessary

I'm skeptical about that claim.

I remember reading a repository's documentation on GitHub that said Node.js is a requirement, because the module authors decided to write Node.js-specific code. I took that code and wrote JavaScript-runtime-agnostic code, for the most part changing the code from using node:crypto to the standardized Web Cryptography API.

But there is no specific advantage to using Cargo for a single-crate program with no dependencies, as you are currently trying to work with.

I didn't think so.

The only reason I specifically brought up Cargo at first (and I apologize for contributing to the confusion) is because someone else mentioned profiles,

No worries. I'm not far enough into Rust yet to get into the minutiae to that degree.

So if I understand correctly there's no way to use cargo for building without a manifest file?

Which means learning the entire separate .toml file structure and options?

A naive start

[profile.dev]
opt-level = 1               # Use slightly better optimizations.
overflow-checks = false     # Disable integer overflow checks.
 ... /.cargo/bin/cargo build --release

failed to parse manifest at `/media/user/1234/Cargo.toml`

Caused by:
  no targets specified in the manifest
  either src/lib.rs, src/main.rs, a [lib] section, or [[bin]] section must be present
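For the record, that error is about targets, not the profile keys: Cargo needs either src/main.rs or src/lib.rs on disk, or an explicit [lib]/[[bin]] section. A minimal manifest that satisfies it once src/main.rs exists (package name and version chosen to match the build output shown later in the thread) might look like:

```toml
[package]
name = "permutations"
version = "0.0.0"
edition = "2021"

[profile.dev]
opt-level = 1               # Use slightly better optimizations.
overflow-checks = false     # Disable integer overflow checks.
```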

One of the reasons I started using Deno is to escape necessary dependence on an external configuration file, such as package.json in Node.js.

Network imports and ECMAScript Modules out of the box. Not so in Node.js world, until recently with module syntax detection.

Even more elaborate is a manifest file structure such as Cloudflare's Workerd uses, Cap'n Proto workerd/src/workerd/server/workerd.capnp at main · cloudflare/workerd · GitHub, which to me, is a whole language in itself.

Why not just use JSON?

Except for -Zscript which integrates the manifest and source, yes, because the manifest file is where Cargo starts looking for configuration and things to build, and everything else is located relative to it.

But, generally, Rust projects will have something more to put in the Cargo.toml file, like dependencies or package metadata. As you’ve noted, there isn’t a lot of value in using Cargo if you in fact have no dependencies — other than following convention.

Most Rust users learn Cargo instead of learning rustc options, at least at first.

The source file should be located at src/main.rs next to the Cargo.toml. If you really want to, you can move it by specifying an explicit path, but part of the value of Cargo is that it knows the conventional layout for Rust projects, which does not require any explicit configuration (other than the manifest file existing).

1 Like

Typically you can start out by creating a project via cargo new and that way you can use cargo without needing to learn the structure of that toml file yet.

Do note that by default it also initializes a git repository for your project; there's a flag --vcs none to opt out of that. To learn more options, it can also be useful to read the --help messages of cargo or its subcommands. If you've already made a directory for your project, you can also alternatively use cargo init from within it.

A more modern development is the command cargo add which offers a convenient way of adding dependencies without learning to look up the most recent version of your dependency or to write to the Cargo.toml.

One can get pretty far with those 2 commands already.

1 Like

Alright, created src directory, renamed permutations.rs to main.rs.

 CARGO_HOME=/media/user/1234/rust/.cargo RUSTUP_HOME=/media/user/1234/rust/.rustup /media/user/1234/rust/.cargo/bin/cargo build --release
   Compiling permutations v0.0.0 (/media/user/1234/hermes-builds)
    Finished `release` profile [optimized] target(s) in 0.85s

I'm not heavy on the idea of dependencies. Most of my experiments and tests in JavaScript engines and runtimes involve writing engine- and runtime-agnostic code that can be executed in multiple engines and runtimes using standardized built-ins.

That can only go so far because ECMA-262 doesn't spell out I/O for JavaScript, so out of 10 JavaScript engines, runtimes, interpreters, 8 might implement reading stdin differently, 2 might not have that capability at all, yet still be ECMA-262 conformant.

Programmers in Rust world don't have to deal with that, at all. In fact, most JavaScript programmers don't, either, because most are not experimenting and testing JavaScript in canary/nightly builds - they are "typically" stuck in Node.js, Deno, Bun, QuickJS, Hermes worlds exclusively.

I got pretty far with just rustc, after scraping a start from a random Web site that converted JavaScript to Rust, where no such tool that I am aware of exists in Rust world proper.

Thanks for the tips.

I don't equate a bunch of dependencies as meaning anything. I perceive the macrocosm from the macrocosm.

So if I'm testing out a new programming language (to me) or JavaScript runtime or engine, I take into account all of the opinionated "practices" baked in to the process.

So far, for this particular project, Bytecode Alliance's Javy is rather straightforward for compiling JavaScript to WASM. The whole JavaScript library is baked into the WASM, though it's mainly just JavaScript.

With Static Hermes it's possible to # include ... C in the JavaScript source code. With the bonus that we get C emitted that can be compiled to a standalone native executable, too, with gcc or clang.

With AssemblyScript there are custom types. I'm not enthralled with types.

CARGO_HOME=/media/user/1234/rust/.cargo RUSTUP_HOME=/media/user/1234/rust/.rustup /media/user/1234/rust/.cargo/bin/cargo new
error: the following required arguments were not provided:
  <PATH>

Usage: cargo new <PATH>

For more information, try '--help'.

It's like Columbo. Just one more thing...

Compare Deno. Just the 128 MB deno executable itself. No deno.json or configuration file necessary. Written in Rust. The emitted native executable is 74.9 MB. However, the programmer doesn't have to carry around the Rust toolchain at around 500 MB. Now if only Deno could optimize the build to remove all of the JavaScript library of the deno_core and denort crates that is not used.

 deno compile -A --no-code-cache --no-check --cached-only --no-npm --no-remote --node-modules-dir=false --target=x86_64-unknown-linux-gnu -o module module.ts.js
Compile file:///media/user/1234/module.ts.js to module

Embedded Files

module
└── 1234/module.ts.js (23.84KB)

Files: 25.07KB
Metadata: 1.14KB
Remote modules: 12B
echo '15 5' | ./module
5 of 1307674367999 (0-indexed, factorial 1307674368000) => [0,1,2,3,4,5,6,7,8,9,10,11,14,13,12]

There is a close analogue to this in Rust; the standard library is split into a few parts, and core and alloc are the parts that do not involve IO or anything else requiring communicating with the operating system (except for memory allocation). std is made of re-exports from core and alloc, plus things that do interact with the operating system (IO, file system, threads, time…).

1 Like

I get it. I just don't get the 2025 reasoning behind the near-total omission of I/O from ECMA-262 (except for ECMAScript Modules, which are a live, two-way binding).

With Rust, there's just Rust. With ECMAScript there's dozens of engines, runtimes, and interpreters A list of JavaScript engines, runtimes, interpreters · GitHub.

So when writing runtime agnostic JavaScript (or TypeScript for the type folks), it's necessary to do something like this to read stdin NativeMessagingHosts/nm_host.js at main · guest271314/NativeMessagingHosts · GitHub

import * as process from "node:process";
const runtime = navigator.userAgent;
const buffer = new ArrayBuffer(0, { maxByteLength: 1024 ** 2 });
const view = new DataView(buffer);
const encoder = new TextEncoder();
// const { dirname, filename, url } = import.meta;

let readable, writable, exit; // args

if (runtime.startsWith("Deno")) {
  ({ readable } = Deno.stdin);
  ({ writable } = Deno.stdout);
  ({ exit } = Deno);
  // ({ args } = Deno);
}

if (runtime.startsWith("Node")) {
  readable = process.stdin;
  writable = new WritableStream({
    write(value) {
      process.stdout.write(value);
    }
  });
  ({ exit } = process);
  // ({ argv: args } = process);
}

and like this to write to stdout native-messaging-piper/nm_piper.js at main · guest271314/native-messaging-piper · GitHub

      const command = ["/bin/bash", ["-c", script]];
      // Node.js, Deno, Bun implement subprocesses differently.
      let stream;

      if (runtime.startsWith("Node")) {
        const { Duplex } = await import("node:stream");
        const { spawn } = await import("node:child_process");
        const { stdout, stderr } = spawn(...command);
        stream = Duplex.toWeb(stdout).readable;
      }

      if (runtime.startsWith("Deno")) {
        const subprocess = new Deno.Command(command.shift(), {
          args: command.pop(),
          stdout: "piped",
          stdin: "piped",
        });
        const process = subprocess.spawn();
        process.stdin.close();
        stream = process.stdout;
      }

      if (runtime.startsWith("Bun")) {
        const subprocess = Bun.spawn(command.flat());
        stream = subprocess.stdout;
      }

To me Rust, JavaScript, C, C++, WASM, et al. are just programming tools in the programming toolbox.

No builder only uses one kind of screw, nail, or wood.

There's no I/O at all in JavaScript. Just an observation of that programming language.

Just like the observation that Rust is kind of heavy at 500 MB just to get started.

True, more range in compilation options, and smaller executables. At the expense of carrying around that 500 MB. Before we even get to any crates.

I understand your POV, and have had similar experiences around WASM, for example.

But as others have said, your choices starting out are "learn Cargo" or "learn all the rustc flags", and almost everyone does the former and will give you advice centered around the former. Rust is also designed with the dependency registry in mind, and the standard library is pretty minimal, so most projects have dependencies eventually. Which also means your dependencies probably have transitive dependencies.

If you don't have dependencies, there's less to learn in order to use rustc directly. But don't miss optimization flags (partially covered above), and don't miss --edition either (which I don't think has been mentioned yet[1]).

I suppose there is also an argument for being "that one person who knows rustc in and out" :slight_smile:.

I am not that person, but for kicks, I decided to see what would be involved in compiling this simple program without Cargo.[2]

use hashbrown::HashMap;

fn main() {
    let hm: HashMap<_, _> = [("Hello", "world")].into();
    println!("{hm:?}");
}

Here's what I ended up with, or you could try it yourself without looking.

N.b. based on cargo build --release -v, i.e. I took a significant shortcut.

#!/bin/bash

# Things I had Cargo do anyway, and then I started from there
declare SKIPPED0='Download hashbrown, figure out dependencies and their versions, download those (transitively)'
# Extra things Cargo does that I didn't include below
declare SKIPPED1='Error report tuning:                   --error-format=json --json=diagnostic-rendered-ansi,artifacts,future-incompat --diagnostic-width=190'
declare SKIPPED2='--cfg sanity checks:                   (example) --check-cfg cfg(feature, values("alloc", "default", "fresh-rust", "nightly", "serde", "std"))'
declare SKIPPED3='Multiple versions of the same crate:   -C metadata=fead36859f74ad81 -C extra-filename=-fead36859f74ad81'
declare SKIPPED4='Misc other debug related stuff:        (example) --emit=dep-info,metadata,link'

# Although I called it boilerplate, this involves incorporating some of my global settings
declare BOILERPLATE='-C opt-level=3 -C embed-bitcode=no -C strip=debuginfo -C target-cpu=native -L .'
declare LIBBOILERPLATE="--crate-type lib --cap-lints=allow $BOILERPLATE"
# Note that crate and dependency names can differ:    vvvvvvvvvvvvvv
declare -A LIBS=([foldhash]=foldhash [allocator_api2]=allocator-api2 [equivalent]=equivalent [hashbrown]=hashbrown)
declare -A VERSIONS=([foldhash]=0.1.4 [allocator_api2]=0.2.21 [equivalent]=1.0.1 [hashbrown]=0.15.2)
# And all of these are more things Cargo figures out from Cargo.toml for you
declare -A EDITIONS=([foldhash]=2021 [allocator_api2]=2018 [equivalent]=2015 [hashbrown]=2021)
declare -A CONFIGS=(
  [foldhash]='' \
  [allocator_api2]='--cfg feature="alloc"' \
  [equivalent]='' \
  [hashbrown]='--cfg feature="allocator-api2" --cfg feature="default" --cfg feature="default-hasher" --cfg feature="equivalent" --cfg feature="inline-more" --cfg feature="raw-entry"' \
)
declare -A EXTERNS=(
  [foldhash]='' \
  [allocator_api2]='' \
  [equivalent]='' \
  [hashbrown]='--extern allocator_api2=liballocator_api2.rlib --extern equivalent=libequivalent.rlib --extern foldhash=libfoldhash.rlib' \
)

for LIB in "${!LIBS[@]}"; do
  # I believe in practice you'd have to account for src/lib.rs possibly not being the correct path too
  rustc --crate-name "$LIB" --edition "${EDITIONS[$LIB]}" "downloads/${LIBS[$LIB]}-${VERSIONS[$LIB]}/src/lib.rs" ${CONFIGS["$LIB"]} ${EXTERNS["$LIB"]} $LIBBOILERPLATE
done

rustc --edition 2021 main.rs --crate-type bin --extern hashbrown=libhashbrown.rlib $BOILERPLATE

I may have ended up with a different set of configuration options, etc, if I didn't take the shortcut.

Cargo also does dependency unification based on SemVer, feature unification, and more.

And as I said, I'm not an expert at invoking rustc, so there are quite possibly cleaner approaches, but I imagine you get the point.


And here's how it's done with Cargo.

cargo new hb
cd hb
$EDITOR src/main.rs
cargo add hashbrown
cargo build --release 

  1. the default edition is 2015, which you don't want ↩︎

  2. std's HashMap is a variation of hashbrown::HashMap ↩︎

2 Likes

Yeah, been there, done that in JavaScript world.

I used jQuery a lot before reading the source code and seeing querySelectorAll(); in addition to jQuery, at that time, failing Promises/A+ tests.

So I dove in to JavaScript itself.

The same with packages built exclusively depending on Node.js 5 or 10 years ago that are now still trying to reconcile CommonJS.

I read all of the time, or used to before I got banned from Reddit, of all places, people in r/learnjavascript asking stuff like, what else should I learn before moving on to React?

Well, I ask, have they mastered resizable ArrayBuffer, TypedArray, DataView, bitwise operators?

The only thing React can do is wrap code around what is shipped in the given browser - an abstraction.

My observation is that, generally, the base library, engine, runtime, whatever, contains the capabilities to achieve most any programming goal, without any libraries or dependencies.

This is what I had to do by hand to refactor code written exclusively for Node.js, the 2nd time - after the maintainers updated to Integrity Block Sign Version 2. Now, if they had written the code in a runtime-agnostic manner, without the idea that any external dependencies were necessary, or that the code would only be run by Node.js, I wouldn't have had to write that code and debug line by line by hand. I'm alright with that, though. I kind of understand where to go to adjust the code by hand the next time the maintainers make a breaking change.

In the process I decided to write a WebSocket and HTTP server from scratch - borrowing from somebody else's code, somebody who also originally wrote the code only for Node.js Build/rebuild wbn-bundle.js from webbundle-plugins/packages/rollup-plugin-webbundle/src/index.ts with bun

  1. git clone https://github.com/GoogleChromeLabs/webbundle-plugins
  2. cd webbundle-plugins/packages/rollup-plugin-webbundle
  3. bun install -p
  4. In src/index.ts comment line 18, : EnforcedPlugin, line 32 const opts = await getValidatedOptionsWithDefaults(rawOpts); and lines 65-121, because I will not be using Rollup
  5. Bundle with Bun bun build --target=node --format=esm --sourcemap=none --outfile=webpackage-bundle.js ./webbundle-plugins/packages/rollup-plugin-webbundle/src/index.ts
  6. Create reference to Web Cryptography API that will be used in the code in the bundled script instead of node:crypto directly import { webcrypto } from "node:crypto";
  7. In /node_modules/wbn-sign/lib/utils/utils.js use switch (key.algorithm.name) {
  8. getRawPublicKey becomes an async function for substituting const exportedKey = await webcrypto.subtle.exportKey("spki", publicKey); for publicKey.export({ type: "spki", format: "der" });
  9. In /node_modules/wbn-sign/lib/signers/integrity-block-signer.js use const publicKey = await signingStrategy.getPublicKey(); and [getPublicKeyAttributeName(publicKey)]: await getRawPublicKey(publicKey); verifySignature() also becomes an async function where const algorithm = { name: "Ed25519" }; const isVerified = await webcrypto.subtle.verify(algorithm, publicKey, signature, data); is substituted for const isVerified = crypto2.verify(undefined, data, publicKey, signature);
  10. In /node_modules/wbn-sign/lib/web-bundle-id.js the serialize() function becomes async for return base32Encode(new Uint8Array([...await getRawPublicKey(this.key), ...this.typeSuffix]), "RFC4648", { padding: false }).toLowerCase();; serializeWithIsolatedWebAppOrigin() becomes an async function for return `${this.scheme}${await this.serialize()}/`;; toString() becomes an async function for return `Web Bundle ID: ${await this.serialize()} Isolated Web App Origin: ${await this.serializeWithIsolatedWebAppOrigin()}`;
  11. In src/index.ts export {WebBundleId, bundleIsolatedWebApp};
  12. In index.js, the entry point for how I am creating the SWBN and IWA I get the public and private keys created with Web Cryptography API, and use Web Cryptography API to sign and verify

Now, I'm banned from WICG as a whole, too, for questioning Web Speech API specification, years ago. They still link to the above repository because, I suppose, nobody else had yet dug in to create runtime agnostic code implementing a server in the browser that works. Wasn't a simple matter of throwing some line on a command line. I had to create other workarounds while the maintainers were trying, for whatever reasons they had, to stop me from exploiting the technology the way I wanted to, for my own purposes GitHub - guest271314/isolated-web-app-utilities: Isolated Web App Utilities.

So, I don't mind digging in to the minutae if necessary.

I'm highly skeptical of "best practices" and oceans of dependencies. Make it work with the base library itself; only if you can't should you depend on somebody else's code, which can change at the maintainers' whim.

I tried out the code from the random Web site that converted JavaScript to Rust - after first looking for a JavaScript to Rust converter written deliberately by humans. I located no such library or tool in the wild.

Not a fan of "artificial intelligence" branding. It's just a program.

I guess I can't go back in time to skip the random online JavaScript to Rust converter and write the algorithm from scratch instead.

I'll keep trying to keep Rust in the JavaScript<=>C<=>WASM<=Rust=>native executable algorithm loop I'm experimenting with on my machine.

Thank you all kindly for your input and feedback.

I'll keep reading if y'all got more to say.

Cheers.

that can change at the maintainers' whim.

Not with cargo & crates. Once published, a version can never be unpublished. And you can specify one specific version of the dependency, then you always get exactly that version. Nothing the maintainer can do about it.
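To make the pinning concrete, a dependency line in Cargo.toml can demand one exact published version (the version number here is just the hashbrown release mentioned earlier in the thread):

```toml
[dependencies]
# "=0.15.2" resolves to exactly this published version;
# a bare "0.15.2" would allow any SemVer-compatible 0.15.x upgrade.
hashbrown = "=0.15.2"
```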

I agree with the other people here: the entire Rust ecosystem is built around cargo and rustup. And the Cargo.toml is really easy to learn, and cargo new writes a default project setup for you anyway, which isn't bloated at all. It contains two files and a git setup (the files being src/main.rs and Cargo.toml, each with about 5 lines).

3 Likes

The biggest thing I had to relearn starting with Rust wasn't the borrow checker or traits, but that dependencies don't suck in Rust. You're not tying yourself to any runtime (because you're the runtime) other than the standard library, and most dependencies that can will run without that too. You can't run into a left-pad incident, because yanking a package only prevents that version from getting resolved; if you've already locked that version it's still available. And most impressively, it's rare that I open up a package source and go "I can do this so much nicer myself" (this is partially due to early adopters being better devs, but also Rust is a lot better suited to building libraries).

As a footnote the reason jQuery has a thin wrapper around querySelectorAll() is because the guts of jQuery got added to browsers :yum:. There's a lot of that in JavaScript more recently specifically due to it being good for browsers to include everything they can for performance.

Similarly you can create a project which has a json manifest and a shell script for a build system. Someone might find it useful and will want to depend on it. They’ll probably just fork it to create a Cargo.toml and reorganize your project in a way cargo understands. It cuts both ways!

That said, cargo is probably my favorite thing about Rust. It does dependency version resolution, fetches deps and builds them, builds your project for a target of your choosing, runs tests and examples, and publishes your project to crates.io if you so choose.
All of this without having to wrangle scripts, toolchain files, different build systems and toolchains. The toml format to me is saner than json or xml for a project manifest.

You can choose not to use dependencies in your project; however, the standard library can't cover every use case out there, and eventually you might want to use something written by a domain expert instead of writing everything from scratch.

2 Likes