Rust web development hosting

So my shared hosting plan is going to expire soon and I am considering my options.

I don't have much running on my server: just a static website, a client-side JavaScript app, and email with my custom domain.

In the past I wanted to try web development with Node.js and after that with Rust, but unfortunately that is not really possible with a shared hosting plan. Therefore I am considering the cheapest plan at Digital Ocean now, to be able to play with Rust in the future.

I have found that I can set up my email exactly like it is set up now, so that's good.

Is it a good idea to move to Digital Ocean? Are there drawbacks? Are there things I should be aware of before taking the plunge? Do you know of any resources on setting up Rust applications on Digital Ocean?

Eventually I would also like to set up a blog. I guess it will be difficult to make that in Rust, so I would have to use another language/framework. Is it hard to have Rust applications running alongside something like Django, or even WordPress (if I don't feel like coding it myself)? Are there resources on how to set something like that up?

Lots of questions, but I want to make sure I am making the right choice :slight_smile:


I just went with Amazon Web Services the other day. They offered a 12-month free, limited service for first-timers.

WRT running multiple things on one server: I just started dabbling with Docker. It seems an ideal way to have multiple services on the same hardware. With Rust binaries it shouldn't matter too much, since they are compiled and most of the dependencies are statically compiled in, but if you are going to run Python/Ruby-based services, using Docker is highly recommended! You can use Docker with Amazon or with Digital Ocean.


Another option is Scaleway, which provides a 4-core ARMv7 server with 2 GB RAM and a 50 GB SSD for only €3/month. I have a few instances on which I'm hosting several services (email, git, a website written in Node.js + MongoDB), and it works great :slight_smile:

I've recently managed to cross-compile my web framework written in Rust, edge, to use it on a Scaleway server, using the arm-unknown-linux-gnueabihf target. I initially struggled with an "Illegal instruction" error when running rustup for armv7-unknown-linux-gnueabihf. I found out that "ARMv7" is generally understood to mean ARMv7 with NEON support, and since the ARM cores Scaleway uses do not have NEON, you need to use the arm-unknown-linux-gnueabihf target instead.
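For reference, cross-compiling for that target also needs a cross linker configured in Cargo. A minimal sketch, assuming the arm-linux-gnueabihf GCC toolchain is installed (e.g. the gcc-arm-linux-gnueabihf package on Debian/Ubuntu) and the target was added with `rustup target add arm-unknown-linux-gnueabihf`:

```toml
# .cargo/config
[target.arm-unknown-linux-gnueabihf]
linker = "arm-linux-gnueabihf-gcc"
```

With that in place, `cargo build --target arm-unknown-linux-gnueabihf` produces a binary you can copy to the server.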


Just to throw in another option: I'm using VPSes from hosters who rent out guaranteed hardware (cores, RAM, bandwidth), which has been working quite well for me.
netcup is currently my way to go: 24 GB RAM, 8 cores and a 320 GB SSD (RAID 10) for €23.89/month, though that's probably uninteresting outside the EU. (And it seems you can't get the 12-month discount outside of Germany.)
Thanks to KVM I also haven't had any hassle from the virtualisation (as there used to be before KVM, because of root restrictions). Of course you then have to take care of the system yourself.


Have you used digital ocean in the past? How do they compare?

Do you deploy manually, or were you able to set up some sort of auto-deployment? :slight_smile:

Actually, Scaleway servers are dedicated servers, so you have to take care of the whole operating system yourself.

Automated deployment via SSH is easily possible if you use e.g. Travis-CI and add the SSH private key as an encrypted file. You then only need to scp the files and restart the application, however you intend to do that.
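That Travis CI approach might look something like the following `.travis.yml` fragment. This is only a sketch: the host, paths and service name are placeholders, and the key file would be added with `travis encrypt-file deploy_key`.

```yaml
after_success:
  - chmod 600 deploy_key
  - scp -i deploy_key target/release/myapp deploy@example.com:/srv/myapp/myapp
  - ssh -i deploy_key deploy@example.com 'sudo systemctl restart myapp'
```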

Isn't that also the case with a VPS like Digital Ocean?

It seems like Scaleway offers way more performance for even less money. Is it more complicated than Digital Ocean to set up? Let's say, if I want to set up

  • My email with TXT/MX records to forward to my Gmail
  • A database like PostgreSQL
  • And Rust

I haven't, so I cannot compare. On Scaleway you can use prebuilt images to install Wordpress or Node.js for instance, but in your case you would need to install the SMTP server and database yourself.

If you want to run a binary compiled from Rust you don't need to install Rust on the target machine though, since Rust binaries are statically compiled.

Not entirely correct.
They aren't fully statically compiled: they still depend on libc and, I think, pthread.
I think what you meant is that the Rust standard library is statically linked into the binary.

As for creating the server, I don't know; at Scaleway you just choose an image and hit "start server".
For the software side, well, as you said yourself, it's practically the same as with a VPS.

Ah ok :slight_smile:

Thanks for all the info, I am going to do some further research after my exams and then choose one. I am already convinced I should move away from shared hosting now if I want to play with Rust.

Regarding deployment once again: I would recommend using dynamic_reload to write one daemon that keeps running and uses FastCGI or similar to accept requests from a proxy server (e.g. nginx, which handles certificates, HTTP/2 and so on). The daemon then uses the mentioned dynamic_reload library to call a function, named handle or whatever, in a shared library (.so on GNU/Linux), which is the artifact you actually deploy.
Don't forget to actually build that kind of dynamic library then.

You would then only need to overwrite that library by scp-ing the one built by Travis CI to your server; the daemon picks it up when you tell it to reload the library. You could use any library that supports listening for file changes, add a special request that triggers the reload, or simply reload every hour or so.
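The reloadable-library idea above might be sketched like this. The `handle` function name, its signature and the status-code return are all hypothetical choices for illustration; the real interface is whatever you agree on between the daemon and the library. A small `main` stands in for the daemon so the sketch runs on its own (the real crate would be built with `crate-type = ["cdylib"]` in Cargo.toml and have no `main`):

```rust
// Hypothetical request handler that would live in the reloadable
// shared library. #[no_mangle] + extern "C" give it a stable symbol
// name the daemon can look up after each reload.
#[no_mangle]
pub extern "C" fn handle(path: *const u8, len: usize) -> i32 {
    // Interpret the raw bytes passed in by the daemon as a UTF-8 path.
    let bytes = unsafe { std::slice::from_raw_parts(path, len) };
    match std::str::from_utf8(bytes) {
        Ok("/") => 200, // known route
        _ => 404,       // everything else
    }
}

// Tiny driver standing in for the daemon, so the sketch is runnable.
fn main() {
    let p = "/";
    println!("{}", handle(p.as_ptr(), p.len()));
}
```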


That's very interesting, I will definitely look into that! Thanks :slight_smile:

I would recommend looking around on sites that list cheap VPS offers.

I'm running Arch Linux on my netcup server. I can highly recommend them: you get a 2-core Xeon machine with 6 GB RAM and a 120 GB SSD for €8.50 a month.
They seem to use PCIe SSDs, because I'm getting 1.6 GB/s I/O speed, which is quite fantastic and 8 times faster than my old $10 droplet.

Update: I hadn't seen that @proc already mentioned them.

You can statically compile Rust binaries using musl. See Taking Rust everywhere with rustup | Rust Blog

This is not always possible, however. For example, if you are planning on using the glutin crate, the build will succeed but the program will fail at runtime, because dynamic library loading does not work with musl. At least not on my machine:

thread '<main>' panicked at 'Could not build window: NoBackendAvailable(LibraryOpenError(OpenError { kind: Library, detail: "\"Dynamic loading not supported\", \"Dynamic loading not supported\"" }))', ../src/libcore/

I've used both DigitalOcean and AWS for hosting, and have been happy with both of them. For small personal projects, I think DO is a better choice because it is cheaper and simpler. As long as you don't have a need for some of the things you get on AWS like security groups, cloud load balancers, etc., DO works great.

My personal preference is to run CoreOS nodes and run all my applications with Docker containers. If you're not already familiar with the container ecosystem, there could be a big learning curve involved at first, but it's such a great way to deploy software that I don't think you will regret it.

For Rust programs, I would go one of two ways: Compile the program statically against musl libc (I use GitHub - clux/muslrust: Docker environment for building musl based static linux rust binaries for this) and then just add the resulting binary to an otherwise empty Docker image (using the "scratch" base image) or, if that's not possible, compile the program along with its dynamic dependencies in a Docker image with a "full" OS like Debian, and then just deploy that.


That makes me more confident that Digital Ocean is a good bet for my use case.

I have never used Docker, so I will probably have some reading to do after my exams :slight_smile:

> compile the program along with its dynamic dependencies in a Docker image with a "full" OS like Debian, and then just deploy that

Are you saying I should avoid using the compiled binary from my Linux laptop or Travis? If so, why is that?

No, you can use a binary produced elsewhere. The difference I was talking about is between a statically linked binary and a dynamically linked binary. It can be difficult to figure out exactly which dynamic libraries also need to be in the image for it to run. It's usually easiest to just use a full Linux distro as your base image. When your binary is completely statically linked, you can have only the binary in the Docker image without any sort of OS and it will work.

Basically, the difference between this Dockerfile:

FROM scratch
ENTRYPOINT ["/myprog"]
COPY myprog /myprog

for a statically linked binary, and this Dockerfile:

FROM debian
ENTRYPOINT ["/myprog"]
COPY myprog /myprog

for a dynamically linked binary. And the latter only works if your only dynamic dependency is glibc. If you have other dependencies, you will have to install them in the image as well. For example, if you needed OpenSSL:

FROM debian
ENTRYPOINT ["/myprog"]
RUN apt-get update && apt-get -y install openssl && rm -rf /var/lib/apt/lists/*
COPY myprog /myprog

And then you may run into issues if the versions of the dynamic libraries in the image aren't the same as the ones you compiled the binary against, in which case you might want to just compile the binary inside a containerized environment too. It gets hairy quickly, so if you're able to statically link everything, it's a big win in terms of the complexity of packaging it with Docker.
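One way to compile inside a containerized environment while keeping the toolchain out of the final image is a multi-stage Dockerfile. A sketch, assuming the official rust image and a placeholder binary name `myprog`:

FROM rust:1 AS build
WORKDIR /src
COPY . .
RUN cargo build --release

FROM debian:stable-slim
RUN apt-get update && apt-get -y install openssl && rm -rf /var/lib/apt/lists/*
COPY --from=build /src/target/release/myprog /myprog
ENTRYPOINT ["/myprog"]

Because both stages derive from compatible Debian bases, the glibc and OpenSSL versions the binary was built against match the ones in the runtime image.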

Take a look at my Rust image for Docker for an idea how to install Rust itself inside a container, along with a few common libraries you're likely to need: GitHub - jimmycuadra/docker-rust: DEPRECATED. Use instead.

I see, thanks for the information. This is very helpful :slight_smile: