How to skip HTTP headers in a response?

I'm trying to download files without any 3rd-party libs

use std::fs::File;
use std::io::{Read, Write};
use std::net::TcpStream;
use std::net::ToSocketAddrs;
use std::time::Duration;

fn main() {
    let resp = get().unwrap();
    let mut output = File::create("image.png").unwrap();
    output.write_all(&resp).unwrap();
}

fn get() -> std::io::Result<Vec<u8>> {
    let host = ""; // host:port pair (redacted)
    let ip_lookup = host.to_socket_addrs()?.next().unwrap();
    let mut socket = TcpStream::connect_timeout(&ip_lookup, Duration::from_millis(5000))?;
    // HTTP/1.1 requires a Host header; Connection: close lets read_to_end terminate.
    let request = format!("GET /image.png HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n");
    socket.write_all(request.as_bytes())?;
    let mut resp: Vec<u8> = Vec::new();
    socket.read_to_end(&mut resp)?;
    Ok(resp)
}

In the downloaded image I see the HTTP headers.

How can I skip the headers and write only the image? Thanks for your answers.

There's one newline between each header, but two newlines after the last one, so you can read until you find a double-newline. (Note that newlines here are actually two characters: an \r followed by an \n.)
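A minimal sketch of that split (the helper name `split_body` is mine, not standard): scan the raw bytes for the first `\r\n\r\n` and take everything after it.

```rust
/// Return the bytes after the first "\r\n\r\n" separator, if any.
/// A sketch only: it assumes the whole response is already in memory
/// and does nothing about compression or chunked encoding.
fn split_body(resp: &[u8]) -> Option<&[u8]> {
    resp.windows(4)
        .position(|w| w == b"\r\n\r\n")
        // The body starts 4 bytes past the start of the separator.
        .map(|pos| &resp[pos + 4..])
}
```

With the `resp` buffer from your `get()`, you would then write `split_body(&resp)` to the file instead of the whole buffer.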

This would work for your particular example, but there are various ways this is insufficient in general:

  1. The server might respond with a compressed response, in which case you would need to decompress it. The headers tell you whether the response is compressed.
  2. The server might respond using a chunked encoding if it doesn't know the Content-Length up front. In this case, getting the raw contents is quite difficult.

Handling all of these details is why we have 3rd party libs for this :slight_smile:


You need to read those headers, line by line, to figure out what the response is (it could be an error) and how big the payload actually is. Then read that many payload bytes.
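As a sketch of that line-by-line approach (assuming a plain response with a Content-Length header, no compression or chunked encoding — the cases mentioned above are deliberately not handled):

```rust
use std::io::{self, BufRead, BufReader, Read};

/// Read an HTTP response from any stream: parse the status line, scan the
/// headers for Content-Length, then read exactly that many payload bytes.
fn read_response<R: Read>(stream: R) -> io::Result<(u16, Vec<u8>)> {
    let mut reader = BufReader::new(stream);

    // Status line, e.g. "HTTP/1.1 200 OK" — the second word is the status code.
    let mut status_line = String::new();
    reader.read_line(&mut status_line)?;
    let status: u16 = status_line
        .split_whitespace()
        .nth(1)
        .and_then(|s| s.parse().ok())
        .ok_or_else(|| io::Error::new(io::ErrorKind::InvalidData, "bad status line"))?;

    // Header lines until the blank line that ends the header block.
    let mut content_length = 0usize;
    loop {
        let mut line = String::new();
        reader.read_line(&mut line)?;
        let line = line.trim_end();
        if line.is_empty() {
            break;
        }
        // Header names are case-insensitive.
        let lower = line.to_ascii_lowercase();
        if let Some(value) = lower.strip_prefix("content-length:") {
            content_length = value.trim().parse().unwrap_or(0);
        }
    }

    // Read exactly Content-Length payload bytes.
    let mut body = vec![0u8; content_length];
    reader.read_exact(&mut body)?;
    Ok((status, body))
}
```

You would call it with the `TcpStream` right after writing the request, and check `status == 200` before saving the body.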

All of which means writing an HTTP response parser. See An overview of HTTP - HTTP | MDN

All in all it would be much easier to use a crate that already does all that for you. For example reqwest - Rust. Unless you really want to get into all the details of HTTP for educational reasons. Do you?

reqwest increased the size of the binary about 10x: stdlib only is 336 KB, reqwest with the blocking feature is 3.1 MB. Now I think using curl is a good idea.

What are you running this on? Is 3.1 MB really a problem? Have you taken steps to minimise the binary size?
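For reference, the usual size-minimising release settings look roughly like this (exact savings vary a lot by project):

```toml
# Cargo.toml
[profile.release]
opt-level = "z"     # optimise for size rather than speed
lto = true          # link-time optimisation across crates
codegen-units = 1   # better optimisation, slower compile
strip = true        # strip debug symbols (Rust 1.59+)
panic = "abort"     # drop the unwinding machinery
```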

This topic was automatically closed 90 days after the last reply. We invite you to open a new topic if you have further questions or comments.