Applying Tower layers when running on Hyper

Hi, I'm having trouble getting Tower layers (timeouts, for example) to apply correctly when running on Hyper 0.13.

Here is the code I'm using to reproduce the issue:

use std::future::Future;
use std::net::SocketAddr;
use std::pin::Pin;
use std::task::{Context, Poll};
use std::time::Duration;

use hyper::{Body, http, Request, Response, Server};
use hyper::service::Service;
use tokio::time::delay_for;
use tower::ServiceBuilder;

#[derive(Debug)]
pub struct Svc;

impl Service<Request<Body>> for Svc {
    type Response = Response<Body>;
    type Error = http::Error;
    type Future = Pin<Box<dyn Future<Output = Result<Self::Response, Self::Error>> + Send>>;

    fn poll_ready(&mut self, _cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> { Poll::Ready(Ok(())) }

    fn call(&mut self, _: Request<Body>) -> Self::Future {
        let body = Body::from("ack!");
        let rsp = Response::builder().status(200).body(body).unwrap();

        let fut = async {
            // Simulate a slow handler; with a 2-second timeout configured,
            // this 5-second delay should trip it.
            delay_for(Duration::from_secs(5)).await;
            Ok(rsp)
        };
        Box::pin(fut)
    }
}

#[derive(Debug)]
pub struct MakeSvc;

impl<T> Service<T> for MakeSvc {
    type Response = Svc;
    type Error = std::io::Error;
    type Future = Pin<Box<dyn Future<Output = Result<Self::Response, Self::Error>> + Send>>;

    fn poll_ready(&mut self, _cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> { Poll::Ready(Ok(())) }

    fn call(&mut self, _: T) -> Self::Future {
        let fut = async { Ok(Svc) };
        Box::pin(fut)
    }
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    env_logger::init();

    let listen_addr = "127.0.0.1:8088"
        .parse::<SocketAddr>()
        .expect("could not parse listen address");

    let svc = ServiceBuilder::new()
        .timeout(Duration::from_secs(2))
        .service(MakeSvc);

    let server = Server::bind(&listen_addr).serve(svc);
    println!("Listening on http://{}", listen_addr);
    server.await?;
    Ok(())
}

If I move the delay_for(Duration::from_secs(5)).await; into call in the MakeSvc impl, the timeout works as expected, but that won't actually account for the time spent processing the request in Svc.

Any ideas on what I might be doing wrong?
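
For context on where the layer ends up: Server::serve takes a make-service, so the ServiceBuilder in main wraps MakeSvc, not Svc. A sketch of what the builder effectively produces, assuming tower 0.3 (where TimeoutLayer builds a tower::timeout::Timeout around whatever it wraps):

// Roughly what ServiceBuilder::new().timeout(..).service(MakeSvc) builds:
// a Timeout around the *make* service. The 2-second budget covers the
// future returned by MakeSvc::call (constructing an Svc) and is gone by
// the time Svc::call handles a request, which is why moving delay_for
// into MakeSvc trips it while delaying inside Svc does not.
let make_svc = tower::timeout::Timeout::new(MakeSvc, Duration::from_secs(2));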

What are you trying to do? It sounds like something related to counting computation time toward a timeout? Perhaps you could use Instants instead?

Apologies, I should have been clearer in my question. The delay_for is there to simulate a long-running operation and trigger the timeout.
The problem I am having is that the timeout configuration seems to be ignored unless the delay_for is in MakeSvc.
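
For reference, a minimal sketch of one way to get the timeout applied per request, assuming tower 0.3 alongside hyper 0.13: construct the Timeout around Svc inside MakeSvc::call, so each connection's service is wrapped, and pass MakeSvc to serve directly. Timeout's error type is tower's BoxError, which hyper accepts:

use tower::timeout::Timeout;

impl<T> Service<T> for MakeSvc {
    type Response = Timeout<Svc>;
    type Error = std::io::Error;
    type Future = Pin<Box<dyn Future<Output = Result<Self::Response, Self::Error>> + Send>>;

    fn poll_ready(&mut self, _cx: &mut Context<'_>) -> Poll<Result<(), Self::Error>> {
        Poll::Ready(Ok(()))
    }

    fn call(&mut self, _: T) -> Self::Future {
        // Wrap each connection's Svc in the 2-second timeout, so the budget
        // now covers the future returned by Svc::call (request handling,
        // including the simulated 5-second delay).
        let svc = Timeout::new(Svc, Duration::from_secs(2));
        Box::pin(async move { Ok(svc) })
    }
}

// ...and in main, serve the make-service directly:
// let server = Server::bind(&listen_addr).serve(MakeSvc);

When the inner future overruns the budget, Timeout should resolve to an Elapsed error, which hyper surfaces as a connection error rather than a response.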
