My Rocket implementation is slower than Flask

My journey with Rust has led me to tackle increasingly complex projects, re-implementing things I originally built while learning Python.

My brother owns a pizzeria, and I developed an app for the waiters. The database isn't huge at all, as one might expect for a restaurant.

The existing app uses Flask and Postgres. As a first step, I rewrote the first APIs in Rocket to retrieve table data and stress-tested them against the Flask server with thousands of queries. I fully expected Rocket to outperform Flask, but the results were surprising:

Flask  : 3.141975164413452 s
Rocket : 23.789050102233887 s

To create my Rust web server with Rocket, Diesel, and PostgreSQL, I followed this guide.

I started by using diesel-cli to generate the schema for my database. Then, I created the model for the first table like this:

use diesel::prelude::*;
use rocket::serde::{Serialize, Deserialize};
use uuid::Uuid;

use crate::schema::dishes;

#[derive(Queryable, Debug, Identifiable, Serialize, Deserialize)]
#[diesel(table_name = dishes)]
pub struct Dish {
    pub id: Uuid,
    pub price: Option<f32>,
    pub enabled: Option<bool>,
}

The schema code generated by diesel-cli:

diesel::table! {
    dishes (id) {
        id -> Uuid,
        price -> Nullable<Float4>,
        enabled -> Nullable<Bool>,
    }
}

Finally, the service:

use diesel::prelude::*;

use rest_ws::{establish_connection, models::Dish};

use rocket::serde::json::{json, Value};

pub fn get_dishes() -> Value {
    use rest_ws::schema::dishes::dsl::*;
    let connection = &mut establish_connection();
    let results: Vec<Dish> = dishes
        .load::<Dish>(connection)
        .expect("Error loading dishes");

    json!(results)
}

The establish_connection function is defined in the crate root (src/lib.rs):

pub mod models;
pub mod schema;

use diesel::pg::PgConnection;
use diesel::prelude::*;
use dotenvy::dotenv;
use std::env;

pub fn establish_connection() -> PgConnection {
    dotenv().ok();

    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    PgConnection::establish(&database_url)
        .unwrap_or_else(|_| panic!("Error connecting to {}", database_url))
}
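
For completeness: with dotenvy, the DATABASE_URL read by establish_connection can come from a .env file in the project root (the URL below is a placeholder):

```
DATABASE_URL=postgres://username:password@localhost/pizzeria_db
```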

The routes, in src/routes/dishes.rs:


use rocket::serde::json::Value;

use crate::services;

#[get("/dishes")]
pub fn get_dishes() -> Value {
    services::dishes::get_dishes()
}

Finally, my main.rs:

#[macro_use] extern crate rocket;

mod services;
mod routes;
use routes::dishes::get_dishes;

#[launch]
fn rocket() -> _ {
    rocket::build().mount("/", routes![index, get_dishes])
}

I use the command

cargo build --release

And then execute the resulting binary to start the server. It works.

I have to say, it seems overly complex just to create a web server that could potentially be implemented in 4 lines of Python! :rofl:

Regarding the bottleneck, I've identified a significant one (which occurred to me as I was writing this): each time a request is made, a new connection is established. This doesn't happen in my Flask implementation, where the connection is created when the server starts and subsequent requests reuse it. This could provide a significant speed boost, but I'm not sure where to start to achieve it :sweat_smile:
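
To get a feel for how much per-request connection setup can dominate, here is a self-contained sketch in plain Rust (no web framework). The 5 ms "handshake" is a made-up stand-in for the real TCP/TLS/authentication round-trips a fresh Postgres connection pays; the shapes of the two loops mirror connect-per-request vs. connect-once-and-reuse:

```rust
use std::thread::sleep;
use std::time::{Duration, Instant};

// Hypothetical stand-in for a real database connection.
pub struct Connection;

pub fn establish_connection() -> Connection {
    sleep(Duration::from_millis(5)); // simulated handshake cost
    Connection
}

pub fn run_query(_conn: &Connection) {
    sleep(Duration::from_micros(100)); // simulated query cost
}

/// Time `requests` queries opening a fresh connection each time
/// (connect-per-request) vs. reusing a single connection.
/// Returns (per_request, reused).
pub fn compare(requests: usize) -> (Duration, Duration) {
    // Fresh connection for every request.
    let start = Instant::now();
    for _ in 0..requests {
        let conn = establish_connection();
        run_query(&conn);
    }
    let per_request = start.elapsed();

    // One connection, reused for every request.
    let start = Instant::now();
    let conn = establish_connection();
    for _ in 0..requests {
        run_query(&conn);
    }
    let reused = start.elapsed();

    (per_request, reused)
}

fn main() {
    let (per_request, reused) = compare(200);
    println!("per-request: {per_request:?}, reused: {reused:?}");
}
```

With these made-up costs the per-request loop pays the handshake on every iteration, so its total grows with the handshake cost times the request count, while the reused loop pays it once.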

Do you see any other problems in the code? Is the connection issue enough to explain the 7x performance difference?



Yes, and yes. It's common practice in web development to use the singleton pattern for things such as database connections: create the connection once and reuse it across requests. You should check Rocket's documentation on how to handle this.
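
A minimal sketch of that idea in plain Rust, using std::sync::OnceLock and a hypothetical Connection type standing in for diesel's PgConnection (the URL is a placeholder): the connection is built on first access, and every later call gets the same instance.

```rust
use std::sync::OnceLock;

// Hypothetical stand-in for a real database connection.
pub struct Connection {
    pub url: String,
}

fn connect(url: &str) -> Connection {
    // A real implementation would open the network connection here.
    Connection { url: url.to_string() }
}

// The "singleton": initialized on first access, then reused
// for the lifetime of the process.
static CONNECTION: OnceLock<Connection> = OnceLock::new();

pub fn connection() -> &'static Connection {
    CONNECTION.get_or_init(|| connect("postgres://localhost/pizzeria"))
}

fn main() {
    let first = connection() as *const Connection;
    let second = connection() as *const Connection;
    assert!(std::ptr::eq(first, second)); // same instance both times
    println!("reusing connection to {}", connection().url);
}
```

Note that diesel queries actually need `&mut PgConnection`, so a shared immutable singleton isn't enough in practice; that's why web apps usually share a *pool* of connections (e.g. r2d2), which hands each request its own mutable connection.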


Building one endpoint isn't really indicative of how complex it is to build a web server. Believe me, when you have to maintain tens or hundreds of endpoints, you will love Rust.

Rocket provides Managed State, which you can use to share a connection pool such as the one r2d2 provides.

Rocket also has an abstraction specifically for databases, including diesel.


I implemented rocket_sync_db_pools with diesel_postgres_pool (based on r2d2), and the performance is now much better.
Still, Flask wins by 0.2 seconds (over 1000 requests).
I now think raw request handling is not where Rust overpowers Python; as soon as I add some real backend work, it should leave Python behind.

Thanks for the hint!

