I'm learning Rust, especially with SQLite. I wanted to learn how to store a file into an SQLite database and came up with the code below. I would be happy if you could do a simple code review. Is my code acceptable, or is there something I should do better?
use rusqlite::{Connection, DatabaseName};
use std::{
    fs::{metadata, File},
    io::{Read, Write},
};

fn main() {
    let file_path = r"C:\temp\document.pdf";
    let mut f = File::open(file_path).unwrap();
    let f_len = metadata(&file_path).unwrap().len();

    let db_path = r"C:\temp\sqlite.db";
    let mut db = Connection::open(db_path).unwrap();
    let db = db.transaction().unwrap();

    // Create a blob with the size of the file that I want to store
    db.execute(
        "INSERT INTO documents (data) VALUES (ZEROBLOB(?1))",
        [f_len],
    )
    .unwrap();
    let row_id = db.last_insert_rowid();

    // Open the blob for writing
    let mut blob = db
        .blob_open(DatabaseName::Main, "documents", "data", row_id, false)
        .unwrap();

    // Write the file to the blob in 1 KiB chunks
    let mut buffer: Vec<u8> = vec![0; 1024];
    loop {
        let bytes_read = f.read(&mut buffer).unwrap();
        if bytes_read == 0 {
            break;
        }
        // Only write the bytes that were actually read, not the whole buffer
        blob.write_all(&buffer[..bytes_read]).unwrap();
    }
    blob.close().unwrap();

    let _ = db.commit().unwrap();
}
The 1 KiB read buffer is relatively small. You could easily increase it to 128 kibibytes or more; for a simple program like this, I would suggest using 1 mebibyte, e.g. something like:
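    let mut buffer: Vec<u8> = vec![0; 1024 * 1024]; // 1 MiB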
You already have a file handle in f, so there's no need for the separate metadata(&file_path) lookup; you can ask the handle for its length directly.
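For instance, using std's File::metadata on the existing handle:

    let f_len = f.metadata().unwrap().len();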
How large is document.pdf? Since it's a PDF file, I would say not too large, in which case I would prefer to read the whole file into memory and write it directly, rather than doing the dance to get a handle to the blob.
So long as it's not more than, say, dozens of MB -- which seems plausible if you're putting it in a DB -- then you might as well just std::fs::read it and save yourself the extra effort of doing the buffer management.
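A minimal sketch of that version, assuming the same documents table and file paths as in the question (error handling kept as unwrap() for brevity):

    use rusqlite::Connection;

    fn main() {
        // Read the whole file into memory in one call
        let data = std::fs::read(r"C:\temp\document.pdf").unwrap();

        let db = Connection::open(r"C:\temp\sqlite.db").unwrap();
        // A Vec<u8> parameter is bound as a BLOB, so there is no
        // ZEROBLOB/blob_open round trip and no manual chunking
        db.execute("INSERT INTO documents (data) VALUES (?1)", [data])
            .unwrap();
    }

data.len() also gives you the length, so the separate metadata lookup disappears as well.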
Note that if you're going to use a larger buffer size like that, I suggest writing it in a way that emphasizes its binary structure, rather than having it in decimal.
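For comparison:

    let mut buffer: Vec<u8> = vec![0; 1024 * 1024]; // the 1 MiB intent is obvious at a glance
    // rather than the decimal spelling, where it is not:
    // let mut buffer: Vec<u8> = vec![0; 1048576];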