Two-way (AKA request-response) channel

Hi

I like the idea of mpsc: having independent threads with data flowing between them (when the data really only needs to go one way).

This feature saves us from using a Mutex, at the cost of a little memory.

In some situations, I want to request some data from another thread, so I would like a two-way pipe, similar to the HTTP request-response pattern, only between threads instead of IP addresses.

I am pretty sure this pattern has a name, and probably a crate for it, but I don't know what to search for.

Something like the following:

let (server, client) = reqres::channel();
let client1 = client.clone();
let client2 = client.clone();

thread::spawn(move || {
    for req in server {
        req.send(req.read().to_uppercase());
    }
});

thread::spawn(move || {
    client1.send("hello");
    println!("{}", client1.recv().unwrap()); // HELLO
});

thread::spawn(move || {
    client2.send("hello");
    println!("{}", client2.recv().unwrap()); // HELLO
});

Please point me in the right direction on this.

How about a pair of mpsc channels, one for requests and one for responses?
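
For what it's worth, here is a minimal sketch of that idea using std::sync::mpsc; the req_/resp_ names and the uppercase handler are just placeholders:

use std::sync::mpsc;
use std::thread;

fn main() {
    let (req_tx, req_rx) = mpsc::channel::<String>();
    let (resp_tx, resp_rx) = mpsc::channel::<String>();

    // "server" thread: read each request, send a response back
    thread::spawn(move || {
        for req in req_rx {
            resp_tx.send(req.to_uppercase()).unwrap();
        }
    });

    // "client": one request, then wait for the response
    req_tx.send("hello".to_string()).unwrap();
    println!("{}", resp_rx.recv().unwrap()); // HELLO
}

One caveat: with a single shared response channel, every client clone receives from the same resp_rx, so replies are not tied to a particular caller; the reply-channel-per-request pattern in the next post addresses that.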

I love this idea for a channel API. The usual way I would do this is to send a reply channel with the request, like so:

use std::sync::mpsc;

// one channel for requests; a fresh one per request for the reply
let (tx, rx) = mpsc::channel();
let (reply_tx, reply_rx) = mpsc::channel();
// ...
tx.send((foo, reply_tx)).unwrap();
let reply = reply_rx.recv().unwrap();
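
For context, here is the same pattern as a small self-contained program, with a server thread that answers each request on the reply channel it arrived with; the uppercase handler is just an illustration:

use std::sync::mpsc;
use std::thread;

fn main() {
    // each request carries its own reply sender
    let (tx, rx) = mpsc::channel::<(String, mpsc::Sender<String>)>();

    thread::spawn(move || {
        for (req, reply_tx) in rx {
            // answer on the channel that came with this request
            let _ = reply_tx.send(req.to_uppercase());
        }
    });

    // each caller makes a fresh reply channel per request, so responses cannot get mixed up
    let (reply_tx, reply_rx) = mpsc::channel();
    tx.send(("hello".to_string(), reply_tx)).unwrap();
    println!("{}", reply_rx.recv().unwrap()); // HELLO
}

Because tx is an ordinary mpsc::Sender, it can be cloned into as many client threads as needed, and each clone still gets its reply on its own reply_rx.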

But I would love a request-response API like so:

let (tx, rx) = channel();
// ...
let reply = tx.call(foo).unwrap();

If there isn't one, I will try to make a crate that manages the separate channels in the background.

Would the channel accept a handler as a closure?
And the reply would contain the first received item, not the channel itself.

Oh no, I was thinking of a convenience wrapper around the "reply channel" pattern I was talking about, where channel.call(foo) is a convenience method that sends foo and waits for a reply. Alternatively, one could call channel.send(foo), which would return the reply channel to recv on later.
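
As a rough sketch of what such a wrapper could look like on top of mpsc: the Caller type and its methods below are made up for illustration, with error handling reduced to unwrap:

use std::sync::mpsc;
use std::thread;

// hypothetical wrapper around the "reply channel" pattern
struct Caller<Req, Resp> {
    tx: mpsc::Sender<(Req, mpsc::Sender<Resp>)>,
}

impl<Req, Resp> Caller<Req, Resp> {
    // send a request and hand back the reply channel to recv on later
    fn send(&self, req: Req) -> mpsc::Receiver<Resp> {
        let (reply_tx, reply_rx) = mpsc::channel();
        self.tx.send((req, reply_tx)).unwrap();
        reply_rx
    }

    // send a request and block until the reply arrives
    fn call(&self, req: Req) -> Resp {
        self.send(req).recv().unwrap()
    }
}

fn main() {
    let (tx, rx) = mpsc::channel::<(String, mpsc::Sender<String>)>();

    // server loop: apply a handler to each request and reply on its channel
    thread::spawn(move || {
        for (req, reply_tx) in rx {
            let _ = reply_tx.send(req.to_uppercase());
        }
    });

    let client = Caller { tx };
    println!("{}", client.call("hello".to_string())); // HELLO
}

The send variant that returns the Receiver also allows the fire-now, wait-later style mentioned above.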

This blog post does something similar:

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.