Rust vs Julia language

I read the below on the Julia language forum and would like to hear the Rust community's opinion about the differences between the two languages and the best place to use each.

In general I'm quite sure I wouldn't want to write low-level software in any language that includes a GC and/or JIT compilation (e.g. Julia, but also Python, Ruby, JavaScript, every .NET and JVM language etc).

To me it seems that the only real raison d'etre of Julia is scientific computing, even if it is more flexible than that, strictly speaking (it is Turing-complete, after all).

As a third point, I likely won't be using Julia for any large scale projects. I wanted to get away from typeless/unityped (also mistakenly known as "dynamically typed") programming languages because, as soon as you try to scale up a code base written in them, you are suddenly faced with a nasty choice. Either you replicate the absent type system with unit tests and/or assertions (a lot of otherwise senseless work on a large code base, on top of any other unit/integration tests you might write), or you take the sizeable chance that some function/method somewhere gets an argument of a "wrong type" (e.g. a string where an int is expected) and the code mysteriously blows up... but only at runtime, likely hours or days after the code was written, which in turn makes debugging much more expensive.
Neither choice is appealing to me.
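
A minimal illustration of that failure mode (hypothetical names); in an untyped language nothing complains until the bad call actually executes:

increment_id(record) = record.id + 1

increment_id((id = 41,))     # 42
increment_id((id = "41",))   # MethodError: no method matching +(::String, ::Int64) -- raised only at run time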
Now, Julia does seem to have something that looks like gradual typing, but I'm not sure how that plays out in the real world. It might be sufficient help in scaling up a code base, or it might not (e.g. because you run into weird corner cases on the boundary of explicitly typed code with untyped code). Its usefulness fully depends here on the implementation, not on the underlying PL theory.

So, with large scale projects and low-level projects being handled better by alternative language ecosystems, what remains?
Toy projects, scientific computing, perhaps even some web dev (assuming Julia is even targeting WASM).
But none of those things is new: all of them can already be done in Rust (low-level + large-scale + web) or Python (SciComp), so Julia will have to come up with something else to convince me personally that it's worthwhile to invest in.

3 Likes

Good explanation! I found Julia interesting, but it's very niche tech. It will probably never become mainstream.

What about the below 2 options in Julia:

  1. `__precompile__()`, so that it behaves more like a compiled language than a scripting language
  2. Optional typing, so types can be declared like x::Int8 or function sinc(x)::Float64 to address the typeless concern and potential conflicts (sketched below)
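
For reference, a rough sketch of what option 2 looks like in practice (hypothetical function name; the annotations are optional and mainly affect dispatch and conversion):

mysinc(x) = x == 0 ? 1.0 : sin(pi * x) / (pi * x)   # untyped method, works for any number

function mysinc(x::Integer)::Float64                 # typed argument plus a declared return type
    return x == 0 ? 1.0 : 0.0
end

function demo()
    x::Int8 = 3    # typed local variable, as in the x::Int8 example; assignments to x convert to Int8
    return x
end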

Rust and Julia have different goals so I don't think a "versus" comparison is helpful. A better discussion is what features of Julia should Rust steal?

17 Likes

You are right. I think @hasanOryx is not asking which is better, but what the pros and cons of each are.

Rust is quite different; I think it is a mix of C++ and Go with some extras not found in any other language.

2 Likes

I'm not sure what you're asking about those features?

If you're asking if they would make me decide to really invest in Julia, the answer is "never say never, but probably not".

1 Like

trying to understand the pros and cons of each one

There are a number of software systems that are essentially a set of hacky shell scripts.
If those projects had access to compilation it surely would be beneficial to them.

On the other hand, most of those projects tend to be old, since newer projects tend to be written in proper programming languages. So for scripting I don't see the benefit in compiling the code.

Gradual typing is a different beast, but I've already said plenty about that above.

I had tried using Julia before I switched to Rust for some scientific computing projects (agent-based simulation for reinforcement learning), but was frustrated by the lack of low-level control and by unpredictable performance and memory usage when using parallel threads. I'd make minor changes and the runtime would increase by 10x. I may not have given Julia enough of a chance, but I found Rust to be much better in this regard. I do miss the REPL from Julia, however.

3 Likes

I think there is a REPL -- rusti -- but I don't know how it works.

I do the vast majority of my programming in Julia (but Rust looks like a very interesting and potentially fun language), and there are definitely features -- as robsmith notes -- in Rust that I would like in Julia.

I am a graduate student studying statistics, so I am very much in Julia's (initial) target audience, without much experience programming for reasons other than smashing numbers together.

The number one feature of Julia that I hope other languages steal:
Multiple dispatch.
Both Rust and Julia use type inference to infer the types of variables within function bodies. Julia combines this with multiple dispatch, so that a different version of the function gets compiled for each combination of argument types that function gets called with.
Simple example (note: I started Julia with --math-mode=fast, otherwise it won't fuse multiply and add instructions on floating point types):

julia> foo(a,b,c) = a * b + c
foo (generic function with 1 method)

julia> @code_native foo(2.0, 3.0, 4.0) # disassemble the function when the arguments are all Float64
	.text
; Function foo {
; Location: REPL[1]:1
; Function +; {
; Location: REPL[1]:1
	vfmadd213sd	%xmm2, %xmm1, %xmm0
;}
	retq
	nopw	%cs:(%rax,%rax)
;}

julia> @code_native foo(2, 3, 4) # integers default to 64 bit on 64 bit systems in Julia
	.text
; Function foo {
; Location: REPL[1]:1
; Function *; {
; Location: REPL[1]:1
	imulq	%rsi, %rdi
;}
; Function +; {
; Location: int.jl:53
	leaq	(%rdi,%rdx), %rax
;}
	retq
	nopl	(%rax)
;}

julia> @code_native foo(2.0, 3.0, 4) # mix doubles and integers; asm promotes the integer
	.text
; Function foo {
; Location: REPL[1]:1
; Function +; {
; Location: promotion.jl:313
; Function promote; {
; Location: promotion.jl:284
; Function _promote; {
; Location: promotion.jl:261
; Function convert; {
; Location: number.jl:7
; Function Type; {
; Location: REPL[1]:1
	vcvtsi2sdq	%rdi, %xmm2, %xmm2
;}}}}}
; Function +; {
; Location: float.jl:395
	vfmadd213sd	%xmm2, %xmm1, %xmm0
;}
	retq
	nopl	(%rax,%rax)
;}

julia> A, B, C = randn(1000,1000), randn(1000,1000), randn(1000,1000);

julia> @code_native foo(A, B, C) # when called with matrices, defaults to BLAS call
	.text
; Function foo {
; Location: REPL[1]:1
	pushq	%r15
	pushq	%r14
	pushq	%r12
	pushq	%rbx
	subq	$56, %rsp
	vxorps	%xmm0, %xmm0, %xmm0
	movq	$0, 16(%rsp)
	movq	%fs:0, %r15
	movq	%rsp, %rcx
; Function *; {
; Location: matmul.jl:141
; Function similar; {
; Location: array.jl:332
; Function Type; {
; Location: boot.jl:404
; Function Type; {
; Location: boot.jl:396
	movabsq	$jl_system_image_data, %rdi
	vmovaps	%xmm0, (%rsp)
	movq	%rsi, 48(%rsp)
	movq	$2, (%rsp)
	movq	-10920(%r15), %rax
	movq	%rax, 8(%rsp)
	movq	%rcx, -10920(%r15)
	movabsq	$jl_alloc_array_2d, %rax
	movq	8(%rsi), %r14
	movq	(%rsi), %rbx
	movq	16(%rsi), %r12
;}}}}
; Function *; {
; Location: array.jl:154
	movq	24(%rbx), %rsi
	movq	32(%r14), %rdx
;}
; Function *; {
; Location: matmul.jl:141
; Function similar; {
; Location: array.jl:332
; Function Type; {
; Location: boot.jl:404
; Function Type; {
; Location: boot.jl:396
	callq	*%rax
;}}}
; Function mul!; {
; Location: matmul.jl:143
	movabsq	$"julia_gemm_wrapper!_-497477299", %r9
	movl	$1308622848, %esi       # imm = 0x4E000000
	movl	$1308622848, %edx       # imm = 0x4E000000
	movq	%rax, %rdi
	movq	%rbx, %rcx
	movq	%r14, %r8
	movq	%rax, 16(%rsp)
	callq	*%r9
;}}
	movq	%rax, 32(%rsp)
	movq	%rax, 16(%rsp)
	movabsq	$"japi1_+_-497477288", %rax
	movabsq	$jl_system_image_data, %rdi
	leaq	32(%rsp), %rsi
	movl	$2, %edx
	movq	%r12, 40(%rsp)
	callq	*%rax
	movq	8(%rsp), %rcx
	movq	%rcx, -10920(%r15)
	addq	$56, %rsp
	popq	%rbx
	popq	%r12
	popq	%r14
	popq	%r15
	retq
	nop
;}

julia> bar(a, b, c) = foo.(a, b, c) # dot broadcasts
bar (generic function with 1 method)

julia> using StaticArrays

julia> static_vector_4 = @SVector randn(4); # vector parametrically typed by length

julia> @code_native bar(static_vector_4, static_vector_4, static_vector_4) # compiled into single `vfmadd`
	.text
; Function bar {
; Location: REPL[6]:1
; Function materialize; {
; Location: broadcast.jl:748
; Function copy; {
; Location: broadcast.jl:24
; Function _broadcast; {
; Location: broadcast.jl:94
; Function macro expansion; {
; Location: broadcast.jl:133
; Function foo; {
; Location: REPL[1]:1
; Function *; {
; Location: REPL[6]:1
	vmovupd	(%rsi), %ymm0
	vmovupd	(%rdx), %ymm1
;}}}}}}
	movq	%rdi, %rax
; Function materialize; {
; Location: broadcast.jl:748
; Function copy; {
; Location: broadcast.jl:24
; Function _broadcast; {
; Location: broadcast.jl:94
; Function macro expansion; {
; Location: broadcast.jl:133
; Function foo; {
; Location: REPL[1]:1
; Function +; {
; Location: float.jl:395
	vfmadd213pd	(%rcx), %ymm0, %ymm1
;}}}}}}
	vmovupd	%ymm1, (%rdi)
	vzeroupper
	retq
	nopl	(%rax,%rax)
;}

The @ designates a macro in Julia. If I understand correctly, Rust's macros are similar (ie, work on Rust's AST), but are designated with a bang (!), which Julia instead uses as a convention to mark functions that mutate their arguments.
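
For concreteness, both conventions in a couple of lines:

v = [3, 1, 2]
sort!(v)     # the `!` is purely a naming convention meaning "this function mutates its argument"
@show v      # the `@` marks a macro call; the macro rewrites the expression before it is compiled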

Dispatches get resolved at compile time whenever possible (ie, they do unless types of inputs are unknown).
In Julia, we do not have to replicate an absent type system. Julia has one, and we can use it to provide alternative method bodies. What if we want different behavior for strings?

julia> foo(a::String, b::String, c::String) = a * b * c # `*` is concatenation for some reason
foo (generic function with 2 methods)

julia> using SIMD # SIMD intrinsics

julia> const _m256d = Vec{4,Float64}
Vec{4,Float64}

julia> foo(a::Vec{W}, b::Vec{W}, c::Vec{W}) where W = fma(a, b, c)
foo (generic function with 3 methods)

julia> a, b = _m256d(2.0), _m256d(3.0); # vbroadcasts

julia> c = vload(_m256d, pointer(C)) # C is a matrix from earlier; vmovupd
4-element Vec{4,Float64}:
Float64⟨-1.5053969768456859, 0.1665546660560075, 0.258014792575286, 0.5408998822694233⟩

julia> @code_native foo(a, b, c)
	.text
; Function foo {
; Location: REPL[16]:1
; Function fma; {
; Location: SIMD.jl:1007
; Function llvmwrap; {
; Location: SIMD.jl:663
; Function llvmwrap; {
; Location: SIMD.jl:663
; Function macro expansion; {
; Location: REPL[16]:1
	vmovupd	(%rsi), %ymm0
	vmovupd	(%rdx), %ymm1
;}}}}
	movq	%rdi, %rax
; Function fma; {
; Location: SIMD.jl:1007
; Function llvmwrap; {
; Location: SIMD.jl:663
; Function llvmwrap; {
; Location: SIMD.jl:663
; Function macro expansion; {
; Location: SIMD.jl:685
	vfmadd213pd	(%rcx), %ymm0, %ymm1
;}}}}
	vmovapd	%ymm1, (%rdi)
	vzeroupper
	retq
	nopl	(%rax,%rax)
;}


julia> foo("Hello", ", ", "world!")
"Hello, world!"

julia> foo(2.0, 3.0, 4.0)
10.0

julia> foo(a, b, c)
4-element Vec{4,Float64}:
Float64⟨4.494603023154314, 6.166554666056007, 6.258014792575286, 6.540899882269423⟩

I'm sure the idea of overloading operators is familiar to everyone here.
But my point is that typing your function's inputs is strictly optional, and you can write as many different versions of the function, with different combinations typed, as you would like.

The only effect of typing the arguments is to change the dispatching behavior, so that if a function needs an argument to be a string, that can be specified. Or perhaps, it only needs an abstract string, so you can more broadly restrict the method to any AbstractString.
It is similarly common to see arguments restricted to AbstractArray, accepting arguments with array-like behavior, with the appropriate methods of the array interface defined (ie, getindex, size, length). Although it is common for these to assume some methods that are not always defined (eg, setindex!, or mul! for in-place multiplication). There are many generic fallback definitions, eg for *, so that this operation will work for any reasonable implementation of any subtype of AbstractArray.
However, the generic fallbacks could be extremely slow. For example, if that AbstractArray is a sparse array, you definitely need to make sure specific versions have been implemented...
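
As a hedged sketch of that last point (made-up function name), here is a method written only against the generic AbstractArray interface; it is correct for a sparse matrix too, it just visits every entry instead of only the stored ones:

using SparseArrays

function mysum(A::AbstractArray)    # only assumes eltype, eachindex and getindex
    s = zero(eltype(A))
    for i in eachindex(A)
        s += A[i]
    end
    return s
end

mysum(randn(1000, 1000))            # fine for a dense matrix
mysum(sprandn(1000, 1000, 0.01))    # correct, but visits all 10^6 entries rather than the ~10^4 stored ones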

Which brings me to a general pattern of Julia's performance. Rather than throwing an error or giving you a warning, it will give you the correct answer, but silently be slow. Having to profile to spot a 10x performance regression that suddenly appeared isn't at all uncommon.
Reasons for this include type inference failures, generic fallbacks that aren't optimized for your types getting used, or maybe you're writing really low level code and got into a fight with the vectorizer over aliasing because the compiler noticed you used a pointer somewhere.
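
A tiny, hypothetical example of the first of those: a type-unstable function still gives the right answer, it just forces the compiler to box the result and de-optimize everything downstream:

unstable(x) = x > 0 ? 1 : 1.0    # returns an Int64 on one branch and a Float64 on the other

# @code_warntype unstable(2) flags the inferred return type Union{Float64, Int64};
# nothing errors or warns at run time, so you only notice when you profile.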

Knowing my colleagues, this is the behavior they would prefer. They don't care much about speed, they just want to get an assignment or project done. If the library they are using is fast, all the better, especially if they are running a simulation that could take days or weeks. But they don't care whether that code was written in Julia, C++, Fortran, or Rust -- they're just glad they didn't have to write it, and can spend their time focusing on other problems. They just want the code they wrote, be it in R, Python, or Julia, to get the correct answer with as little of their time spent as possible.

Rust -- in the eyes of someone who hasn't really used it -- seems optimized for a different set of priorities, but ones I really appreciate: it is aimed at making these high performance libraries, and is perhaps the fastest and most convenient tool for quickly developing a library that is fast, reliable, and light on bugs.
I'm interested in Rust because I enjoy developing code. So even while there isn't much numerical computing done in Rust, it is at the top of the languages I'd be interested to learn, since it seems to bring the most interesting new (to me) ideas to the table -- and it seems great for what I like to do.

~~

Another thing I really like about Julia is the ability to abuse the JIT by specializing code very heavily on the types involved (often automated by LLVM, sometimes supplemented by heavy use of metaprogramming / generated functions that construct ASTs based on the type parameters of the arguments).
For that reason, I'd be concerned that I couldn't write code as fast (in run time performance) in any other language.
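
As a hedged sketch of what I mean by specializing on types (made-up function; assumes a non-empty tuple), here is a generated function whose body runs once per concrete argument type, with N read from the tuple's type so the loop is fully unrolled:

@generated function tuple_sum(t::NTuple{N,Float64}) where {N}
    ex = :(t[1])               # build an expression tree at "compile time" for this N...
    for i in 2:N
        ex = :($ex + t[$i])
    end
    return ex                  # ...and return it as the method body that gets compiled
end

tuple_sum((1.0, 2.0, 3.0))     # compiles down to three loads and two adds for NTuple{3,Float64}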

~~

Finally, Julia is not great for statically compiling at the moment, although there is a library under active development. If you want that, definitely go Rust. If a REPL is very important to you, Julia's is quite powerful and convenient (featuring autocompletion; history search that lets you type, say, @b and jump through previous lines, across sessions, that began that way; automatic stripping of prompts and old output when you paste a REPL session into another REPL; and syntax highlighting through the OhMyREPL library).

~~

Anyway, I'm still unfamiliar with Rust, and I'm a Julia evangelist who won't shut up about it, so take everything I say here with a hefty dose of salt.

13 Likes

Thanks a lot for the extensive feedback.

Great posts. Have to say I agree with all of it. My 2 cents: if you are a researcher developing new algorithms, Julia (and Python) are likely going to be the first things you reach for. Also, if you are doing a lot of day-to-day ad-hoc data exploration or exploratory analysis, I'd think Julia and Python would be the most productive.

Putting those algorithms or analyses into large code bases and/or data pipelines and infrastructure, for me, would be the domain of Rust. It's a bit against the grain of "solving the two language problem", but I personally spend way more time reading code, refactoring code, and trying to optimize. I find I am more productive in languages like Go and Rust than I am in Ruby or Julia or Python when it comes to refactoring.

That said, I hope Julia comes to dominate data science and I'll be doing all I can to make that a reality. I also hope Rust comes to dominate infrastructure. In my ideal world, I'd build my infrastructure and data engineering with Rust, and do my data science-y stuff in Julia... maybe alternating between the two depending on the task.

8 Likes

I like the idea that Julia is based on, but I don't like the fact that they claim it is "as fast as C", which is obviously a bad attempt at advertising.

I have experience coding in Python, C and C++, and Matlab, and I am lately learning to code in Rust. In all cases, an experienced user can make a piece of code run very fast. What differs, though, is the level of skill needed to reach a certain level of speed. Thus, an experienced coder in Julia can surely write code that runs at the speed of an inexperienced C coder's code. But, as you can imagine, that is not a fair comparison.

Where Rust shines is in the fact that a coder only needs a reasonable amount of skill to write concurrent and fast algorithms. This code will also be deterministically fast (after taking a bit more care). Julia appears to aim for something similar, with the difference that the coder needs even less skill to get working code vs Rust for data processing tasks. The problem will be that the Julia code, even for experienced coders, will not run at deterministic speeds. This means that you cannot write a mission-critical algorithm like an autonomous vehicle application in Julia, ever. The GC will glitch and your car will crash and burn. With Rust, real-time processing can be guaranteed when following certain requirements.

This is why I probably won't learn Julia, because I intend to use Rust for real-time embedded signal processing. Where I need data processing as part of the prototyping, I'll just use Python because that's what I already know. I also doubt that Julia has the scripting prowess of Python for controlling other Windows applications through a COM interface, for instance.

In short, Julia has a lot of promise in replacing R. I don't see Julia as a real competitor for Python scripting-wise. For data science, Julia and R will battle it out and Julia will likely win. For Rust, well you shouldn't write prototype code in a systems programming language, rather prototype in Julia (or Python for me), and then re-write into Rust for deploying the application.

6 Likes

For sure.
In Julia you can basically opt-in to really understanding the type system... the kind of understanding that will help a team build fast code.
In Rust you must pay the fee at the door (to some extent). There is no putting off lifetimes; you have to get to some level of proficiency before you can be productive.

What this means is that a team can have higher degrees of confidence that code written by other members will meet the standard of the compiler.... there is no hiding from that. Whereas in many languages, Julia among them but certainly not the worst, a team needs to be a bit more careful in gaining consensus on certain "safe" or "fast" patterns.

That said, it bears remembering, as was implied in this thread, that the "target" for Julia was, I think, first and foremost, academics and scientists... people whose core competency does not sit in software development. So paying a high up-front cost just to get some code running simply won't work for certain populations.... and you can see that the design of the language reflects a lot of that ethos. In this regard, Julia really is providing something incredible.
Just the same though, i think Rust has and is doing the same, and really providing something incredible.
If I had my choice I'd do all my work in Rust, and I'd do all my machine learning work in Rust... but those communities take a looong time to build.

7 Likes

It would be good to have a bunch more machine learning crates, but not so as to replicate Julia or Python, IMO. I tried to find some for a deep learning expert friend recently, and the state of basic ML crates in Rust is not great.

I do, however, find it problematic that languages tend to want to copy each other's strengths. It's the classic engineering mistake of "more features are better". Steve Jobs showed us that "less is more", as long as that "less" works extremely well. Hence, I'd rather that Rust crates be written to speed up core ML and other tasks, so they can be called from within Julia or Python.
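
As a hedged sketch of that calling pattern (library name, symbol, and signature are all made up), a Rust crate built as a cdylib and exposing an extern "C" function could be used from Julia roughly like this:

# Assumes the Rust side defines  #[no_mangle] pub extern "C" fn dot(a: *const f64, b: *const f64, n: usize) -> f64
# and is compiled into a shared library called libfastml.
function rust_dot(a::Vector{Float64}, b::Vector{Float64})
    length(a) == length(b) || throw(DimensionMismatch("vectors must have equal length"))
    ccall((:dot, "libfastml"), Float64,
          (Ptr{Float64}, Ptr{Float64}, Csize_t),
          a, b, length(a))
end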

ML and other data processing can be done straight in Rust, but then you'd need to have incremental compilation working well and you'd need a very good interactive data visualisation crate. Python is great for this with matplotlib and plotly - you'd generate hundreds of data outputs, incrementally tweaking the code until the correct design is found.

I'd prefer Rust to never lose its comparative advantage in secure embedded and IoT coding, and the WebAssembly work ties in closely with these fundamentals. Chasing the moving target of ML is something I don't think is practical at this time, and it's not what Rust is best at. I would, however, say that ML inferencing can be a strength of Rust, meaning that the final ML designs should be ported from Julia/Python to Rust for deployment. I'm excited about the RISC-V machine learning work and hope that Rust can target the RISC-V extensions for this (using Cranelift, not LLVM), since domain-specific architecture is right up Rust's alley. That way, Rust can run 10x faster than anything else when you give it the right hardware to run on (RISC-V with ML extensions).

Using Rust for deployment and acceleration of core tasks then works to complement Julia, not compete with it.

2 Likes

One of the design purposes of Julia was to not need a second language to create the most performance demanding parts of a program.

1 Like

How's the compilation of Julia getting along? Is it working as intended? Also, if they wanted to not need a second language, they would have had to stay away from the GC. I don't see how one can optimise Julia code to be as performant as optimised Rust code given that the programmer does not have equal low-level control.

It's like that other language called Pony: Pony promises the sun and the moon, but it doesn't actually end up delivering on that. Julia has strong backing, but physics is physics, and some design decisions make certain targets harder to meet.

I think the "two language problem" needs to be put in context for what the target audience of Julia still is... academics.

The trade-off with Julia is that you still need to put in work to really optimize, and to understand the type system... but that amount of work should be far less than what you'd have to do in R or Python, while not constraining the expressiveness of the language/program. OTOH, the amount of work to write your program in Julia should be far less than in C++ or Rust. So Julia is trying to hit the balance between development productivity and program efficiency.

But there are other goals the Julia community has and I don't think these revolve around having the fastest running programs.

This is a very good statement of what I think these goals are, by the author of the DifferentialEquations packages: Why Numba and Cython are not substitutes for Julia