Batch 0.1 - A distributed task queue library


#1

A couple of days ago, I released v0.1 of batch, a distributed task queue library (think Celery or Resque).

This library allows you to send tasks to a RabbitMQ broker, so that workers can pull them and execute the associated handlers. It leverages the futures and tokio-core crates to provide asynchronous I/O operations. It tries to be as safe as possible while still feeling productive and expressive.

// Crate setup assumed from the crates mentioned above; the exact
// layout may differ, see the user guide for the canonical example.
#[macro_use]
extern crate batch;
extern crate futures;
extern crate tokio_core;
#[macro_use]
extern crate serde_derive;

use batch::{job, ClientBuilder};
use futures::Future;
use tokio_core::reactor::Core;

#[derive(Serialize, Deserialize, Task)]
#[task_routing_key = "hello-world"]
struct SayHello {
    to: String,
}

fn main() {
    let mut core = Core::new().unwrap();
    let client = ClientBuilder::new()
        .connection_url("amqp://localhost/%2f")
        .handle(core.handle())
        .build();
    let send = client.and_then(|client| {
        job(SayHello { to: "Ferris".into() }).send(&client)
    });
    core.run(send).unwrap();
}

See the examples or the user guide for more information.


#2

This is quite cool. I didn’t do anything too intensive, but I read the guide and browsed the source. In the past, working with similar libraries, I’ve needed to check the status of a task (running, failed, etc.), save the results/errors, and kill a running task. Is this something you anticipate supporting?


#3

Thanks for the kind words!

Being able to see past/current/pending tasks is definitely something I’ve planned to add, but I’d like to add some workflow primitives (e.g. chains, groups) before adding a dashboard to monitor the queues.

I’ll try to open more issues & milestones on GitHub to gather feedback.