I am a bit lost on how to create a handler that takes a JSON request body as a payload via a POST request and saves it to postgres.
Basically, in postgres I have a database table called people with:
 Column | Type
--------+--------
 id     | bigint
 person | jsonb
I am planning to send a request body with:
{
    "name": "test",
    "age": 30
}
Currently in main.rs I have:
#[derive(Deserialize, Serialize)]
struct Person {
    name: String,
    age: std::num::NonZeroU8,
    #[serde(flatten)]
    extra: Map<String, Value>,
}
// Main function in question
async fn add_person(
    Extension(pool): Extension<PgPool>,
    axum::Json(payload): axum::Json<Person>,
) -> Result<axum::Json<Vec<Json<Person>>>, (StatusCode, String)> {
    sqlx::query("INSERT INTO people (person) values ($1)")
        .bind(payload) // This is where I am lost
        .execute(&pool)
        .await
        .ok();
    sqlx::query_scalar("SELECT (person) FROM people")
        .fetch_all(&pool)
        .await
        .map(|item| axum::Json(item))
        .map_err(internal_error)
}
I am lost on how to handle the payload: how do I take the request body and pass it over to postgres so it gets inserted into the table?
Do you want the age to be part of the JSON field in the table?
You can just bind the individual fields of the struct. sqlx has a derive macro for types, but I don't think it will work with the way you have things set up here.
You can use sqlx's JSON type to bind the hash map.
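For example, here's a minimal sketch of both options, assuming the Person struct and PgPool from your snippet (the name and age columns in the first query are hypothetical, since your table only has id and person):

use sqlx::{types::Json, PgPool};

// Hypothetical helper showing the two binding styles.
async fn insert_examples(pool: &PgPool, payload: &Person) -> Result<(), sqlx::Error> {
    // 1. Bind individual fields (assumes hypothetical name/age columns).
    sqlx::query("INSERT INTO people (name, age) VALUES ($1, $2)")
        .bind(&payload.name)
        // Postgres has no unsigned integer types, so widen the NonZeroU8.
        .bind(i16::from(payload.age.get()))
        .execute(pool)
        .await?;

    // 2. Bind the whole struct as jsonb via sqlx's Json wrapper,
    //    which works for anything implementing Serialize.
    sqlx::query("INSERT INTO people (person) VALUES ($1)")
        .bind(Json(payload))
        .execute(pool)
        .await?;
    Ok(())
}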
The way you're putting JSON in the database is a little odd, especially since it seems to have a standard structure. But I'll just assume you have a good reason to be doing that.
I went ahead and split the Person struct into two types: one representing the API JSON, and one representing the database row. You don't necessarily need the database row struct, but sqlx provides a derive macro (FromRow) that makes fetching rows into struct representations easier, so I think it makes things a little clearer to do that.
use axum::{routing::post, Extension};
use hyper::StatusCode;
use serde::{Deserialize, Serialize};
use serde_json::{Map, Value};
use sqlx::{types::Json, FromRow, PgPool};

// The JSON sent to and from the endpoint.
#[derive(Deserialize, Serialize)]
struct PersonBody {
    name: String,
    age: std::num::NonZeroU8,
}

// A representation of the table's schema.
#[derive(FromRow)]
struct PersonRow {
    // The id column is bigint, which maps to i64 in sqlx.
    id: i64,
    person: Json<Map<String, Value>>,
}

async fn add_person(
    Extension(pool): Extension<PgPool>,
    axum::Json(payload): axum::Json<PersonBody>,
) -> Result<axum::Json<Vec<PersonBody>>, (StatusCode, String)> {
    sqlx::query("INSERT INTO people (person) VALUES ($1)")
        // Bind the (Serializable) payload via sqlx::types::Json.
        .bind(Json(payload))
        .execute(&pool)
        .await
        // Propagate insert failures instead of silently ignoring them.
        .map_err(|err| (StatusCode::INTERNAL_SERVER_ERROR, err.to_string()))?;
    sqlx::query_as("SELECT id, person FROM people")
        .fetch_all(&pool)
        .await
        // Map the error from the query.
        .map_err(|err| (StatusCode::INTERNAL_SERVER_ERROR, err.to_string()))
        .and_then(|p: Vec<PersonRow>| {
            p.into_iter()
                .map(|row| {
                    // Deserialize the JSON value from the database into `PersonBody`.
                    serde_json::from_value(Value::Object(row.person.0))
                        // Map the serde_json error so it matches the outer result's error type.
                        .map_err(|err| (StatusCode::INTERNAL_SERVER_ERROR, err.to_string()))
                })
                // Collect all of the mapped bodies, stopping if we encounter an error.
                .collect::<Result<Vec<PersonBody>, _>>()
        })
        // Wrap the resulting array so it gets encoded as JSON.
        .map(axum::Json)
}

#[tokio::main]
async fn main() {
    let pool = PgPool::connect("postgresql://sample:nopassword@localhost/sample")
        .await
        .unwrap();
    let app = axum::Router::new()
        .route("/person/add", post(add_person))
        .layer(Extension(pool));
    axum::Server::bind(&"0.0.0.0:3000".parse().unwrap())
        .serve(app.into_make_service())
        .await
        .unwrap();
}
You could definitely still fetch just the person JSON, and that would simplify the error handling at the end a bit too.
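One way that might look, as a sketch reusing the imports and PersonBody from above:

async fn list_people(pool: &PgPool) -> Result<axum::Json<Vec<PersonBody>>, (StatusCode, String)> {
    // Fetch only the jsonb column, so no PersonRow struct is needed.
    let people: Vec<Json<Map<String, Value>>> = sqlx::query_scalar("SELECT person FROM people")
        .fetch_all(pool)
        .await
        .map_err(|err| (StatusCode::INTERNAL_SERVER_ERROR, err.to_string()))?;
    people
        .into_iter()
        // Deserialize each stored object straight into the API type.
        .map(|person| {
            serde_json::from_value(Value::Object(person.0))
                .map_err(|err| (StatusCode::INTERNAL_SERVER_ERROR, err.to_string()))
        })
        .collect::<Result<Vec<PersonBody>, _>>()
        .map(axum::Json)
}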
It appears the way I was using sqlx::Json was off.
Never thought of creating two separate types for the JSON endpoint and for the table schema. Thanks for the tip.
By the way, what did you mean by "The way you're putting JSON in the database is a little odd"?
The end goal is to create a database table with an id column as uuid, a user column to perform authentication, and a data column as jsonb that will be used to handle a blob of JSON data of various structures. For the data I plan to use serde's `Map<String, Value>`. Do you think that would work?
Well, I just mean that generally the goal of a relational database schema is to reflect what data you need to store. For example, you have "name" and "age" in your sample code getting stored in the JSON data in Postgres. But you could just as easily add name and age columns to the table in the database. That will generally use disk space more efficiently and be easier to query against than using JSON.
If you have data that doesn't have a very consistent structure but still needs to get stored in the database, then JSON columns make a lot of sense. If all your JSON has the same structure, you might want to add columns to your table instead.
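For instance, a sketch assuming hypothetical name (text) and age (smallint) columns instead of the jsonb column:

use sqlx::PgPool;

#[derive(sqlx::FromRow)]
struct PersonColumns {
    name: String,
    // Postgres has no u8, so smallint maps to i16 here.
    age: i16,
}

async fn add_person_flat(pool: &PgPool, name: &str, age: i16) -> Result<Vec<PersonColumns>, sqlx::Error> {
    sqlx::query("INSERT INTO people (name, age) VALUES ($1, $2)")
        .bind(name)
        .bind(age)
        .execute(pool)
        .await?;
    // Typed columns decode directly into the struct, with no serde_json step.
    sqlx::query_as("SELECT name, age FROM people")
        .fetch_all(pool)
        .await
}

Filtering, say with WHERE age > $1, is also more straightforward against real columns than against jsonb.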
The example I used is a simplification of the actual data. For the actual data, the structure of the payload needs to be flexible, because it is irregular and will change.